diff --git a/contrib/attic/arbiter/README.md b/contrib/attic/arbiter/README.md deleted file mode 100644 index 67124f30a..000000000 --- a/contrib/attic/arbiter/README.md +++ /dev/null @@ -1,45 +0,0 @@ -# Arbiter - -A tool dedicated to tuning (hyperparameter optimization) of machine learning models. Part of the DL4J Suite of Machine Learning / Deep Learning tools for the enterprise. - - -## Modules -Arbiter contains the following modules: - -- arbiter-core: Defines the API and core functionality, and also contains functionality for the Arbiter UI -- arbiter-deeplearning4j: For hyperparameter optimization of DL4J models (MultiLayerNetwork and ComputationGraph networks) - - -## Hyperparameter Optimization Functionality - -The open-source version of Arbiter currently defines two methods of hyperparameter optimization: - -- Grid search -- Random search - -For optimization of complex models such as neural networks (those with more than a few hyperparameters), random search is superior to grid search, though Bayesian hyperparameter optimization schemes can outperform both. -For a comparison of random and grid search methods, see [Random Search for Hyper-parameter Optimization (Bergstra and Bengio, 2012)](http://www.jmlr.org/papers/volume13/bergstra12a/bergstra12a.pdf). - -### Core Concepts and Classes in Arbiter for Hyperparameter Optimization - -In order to conduct hyperparameter optimization in Arbiter, it is necessary for the user to understand and define the following: - -- **Parameter Space**: A ```ParameterSpace<P>``` specifies the type and allowable values of hyperparameters for a model configuration of type ```P```. For example, ```P``` could be a MultiLayerConfiguration for DL4J -- **Candidate Generator**: A ```CandidateGenerator``` is used to generate candidate model configurations of some type ```C```. The following implementations are defined in arbiter-core: - - ```RandomSearchCandidateGenerator``` - - ```GridSearchCandidateGenerator``` -- **Score Function**: A ```ScoreFunction``` is used to score a model of type ```M``` given data of type ```D```. For example, in DL4J a score function might be used to calculate the classification accuracy from a DataSetIterator - - A key concept here is that the score is a single numerical (double precision) value that we either want to minimize or maximize - this is the goal of hyperparameter optimization -- **Termination Conditions**: One or more ```TerminationCondition``` instances must be provided to the ```OptimizationConfiguration```. ```TerminationCondition``` instances are used to control when hyperparameter optimization should be stopped. Some built-in termination conditions: - - ```MaxCandidatesCondition```: Terminate if more than the specified number of candidate hyperparameter configurations have been executed - - ```MaxTimeCondition```: Terminate after a specified amount of time has elapsed since starting the optimization -- **Result Saver**: The ```ResultSaver``` interface is used to specify how the results of each hyperparameter optimization run should be saved. For example, whether saving should be done to local disk, to a database, to HDFS, or simply stored in memory. - - Note that the ```ResultSaver.saveModel``` method returns a ```ResultReference``` object, which provides a mechanism for re-loading both the model and score from wherever it may be saved. -- **Optimization Configuration**: An ```OptimizationConfiguration``` ties together the above configuration options in a fluent (builder) pattern. 
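The interaction of these pieces can be sketched in a small, self-contained miniature. Everything below is hypothetical: `RandomSearchSketch` and its methods are illustrative stand-ins, not the real Arbiter API. A leaf parameter space maps a value in [0,1] to a learning rate, a random-search generator draws candidates from it, a score function rates each candidate, and a `MaxCandidatesCondition`-style limit stops the search:

```java
import java.util.Random;

// Hypothetical miniature of the concepts described above (not the Arbiter API):
// a parameter space maps values in [0,1] to hyperparameters, a random-search
// generator proposes candidates, a score function rates them, and a
// maximum-candidate-count termination condition stops the search.
public class RandomSearchSketch {

    // Leaf "parameter space": maps v in [0,1] to a learning rate in [min, max]
    static double learningRate(double v, double min, double max) {
        return min + v * (max - min);
    }

    // Toy "score function": distance from an optimum of 0.01 that the search
    // does not know about. Lower is better, i.e. we minimize this value.
    static double score(double lr) {
        return Math.abs(lr - 0.01);
    }

    // Random search with a MaxCandidatesCondition-style limit; returns the best candidate found
    static double search(long rngSeed, int maxCandidates) {
        Random rng = new Random(rngSeed); // analogue of setting the generator's RNG seed
        double bestLr = Double.NaN;
        double bestScore = Double.POSITIVE_INFINITY;
        for (int i = 0; i < maxCandidates; i++) {
            double lr = learningRate(rng.nextDouble(), 1e-4, 1e-1); // propose a candidate
            double s = score(lr);                                   // score it
            if (s < bestScore) {                                    // keep the best result
                bestScore = s;
                bestLr = lr;
            }
        }
        return bestLr;
    }

    public static void main(String[] args) {
        System.out.println("Best learning rate found: " + search(42L, 200));
    }
}
```

In actual Arbiter usage these roles are filled by a `ParameterSpace`, a `RandomSearchCandidateGenerator`, a `ScoreFunction` and a `MaxCandidatesCondition`, wired together in an `OptimizationConfiguration` and executed by an `OptimizationRunner`.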
-- **Candidate Executor**: The ```CandidateExecutor``` interface provides a layer of abstraction between the configuration and execution of each instance of learning. Currently, the only option is the ```LocalCandidateExecutor```, which is used to execute learning on a single machine (in the current JVM). In principle, other execution methods (for example, on Spark or cloud computing machines) could be implemented. -- **Optimization Runner**: The ```OptimizationRunner``` uses an ```OptimizationConfiguration``` and a ```CandidateExecutor``` to actually run the optimization, and save the results. - - -### Optimization of DeepLearning4J Models - -(This section: forthcoming) diff --git a/contrib/attic/arbiter/arbiter-core/pom.xml b/contrib/attic/arbiter/arbiter-core/pom.xml deleted file mode 100644 index b23631008..000000000 --- a/contrib/attic/arbiter/arbiter-core/pom.xml +++ /dev/null @@ -1,89 +0,0 @@ - - - - - - 4.0.0 - - - org.deeplearning4j - arbiter - 1.0.0-SNAPSHOT - - - arbiter-core - - arbiter-core - - - - org.nd4j - nd4j-api - - - com.google.code.findbugs - * - - - - - org.apache.commons - commons-lang3 - - - org.apache.commons - commons-math3 - - - org.slf4j - slf4j-api - - - joda-time - joda-time - - - - org.nd4j - jackson - - - org.nd4j - guava - - - commons-codec - commons-codec - ${commons-codec.version} - - - - - - test-nd4j-native - - - test-nd4j-cuda-11.0 - - - diff --git a/contrib/attic/arbiter/arbiter-core/src/assembly/bin.xml b/contrib/attic/arbiter/arbiter-core/src/assembly/bin.xml deleted file mode 100644 index bf1779543..000000000 --- a/contrib/attic/arbiter/arbiter-core/src/assembly/bin.xml +++ /dev/null @@ -1,95 +0,0 @@ - - - - bin - - - tar.gz - - - - - - - lib - - *:jar:* - - - *:sources - - - - - - - - - readme.txt - - - - - src/main/resources/bin/ - bin - - arbiter - - unix - 0755 - - - - examples - examples - - - - - - - - target - ./ - - *.jar - - - - - - \ No newline at end of file diff --git 
a/contrib/attic/arbiter/arbiter-core/src/main/java/org/deeplearning4j/arbiter/optimize/api/AbstractParameterSpace.java b/contrib/attic/arbiter/arbiter-core/src/main/java/org/deeplearning4j/arbiter/optimize/api/AbstractParameterSpace.java deleted file mode 100644 index 6591349f7..000000000 --- a/contrib/attic/arbiter/arbiter-core/src/main/java/org/deeplearning4j/arbiter/optimize/api/AbstractParameterSpace.java +++ /dev/null @@ -1,76 +0,0 @@ -/* - * ****************************************************************************** - * * Copyright (c) 2021 Deeplearning4j Contributors - * * - * * This program and the accompanying materials are made available under the - * * terms of the Apache License, Version 2.0 which is available at - * * https://www.apache.org/licenses/LICENSE-2.0. - * * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * * License for the specific language governing permissions and limitations - * * under the License. - * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ - -package org.deeplearning4j.arbiter.optimize.api; - -import java.lang.reflect.Field; -import java.util.ArrayList; -import java.util.LinkedHashMap; -import java.util.List; -import java.util.Map; - -/** - * Created by Alex on 23/07/2017. - */ -public abstract class AbstractParameterSpace<T> implements ParameterSpace<T> { - - @Override - public Map<String, ParameterSpace> getNestedSpaces() { - Map<String, ParameterSpace> m = new LinkedHashMap<>(); - - //Need to manually build and walk the class hierarchy... - Class<?> currClass = this.getClass(); - List<Class<?>> classHierarchy = new ArrayList<>(); - while (currClass != Object.class) { - classHierarchy.add(currClass); - currClass = currClass.getSuperclass(); - } - - for (int i = classHierarchy.size() - 1; i >= 0; i--) { - //Use reflection here to avoid a mass of boilerplate code... - Field[] allFields = classHierarchy.get(i).getDeclaredFields(); - - for (Field f : allFields) { - - String name = f.getName(); - Class<?> fieldClass = f.getType(); - boolean isParamSpacefield = ParameterSpace.class.isAssignableFrom(fieldClass); - - if (!isParamSpacefield) { - continue; - } - - f.setAccessible(true); - - ParameterSpace p; - try { - p = (ParameterSpace) f.get(this); - } catch (IllegalAccessException e) { - throw new RuntimeException(e); - } - - if (p != null) { - m.put(name, p); - } - } - } - - return m; - } - -} diff --git a/contrib/attic/arbiter/arbiter-core/src/main/java/org/deeplearning4j/arbiter/optimize/api/Candidate.java b/contrib/attic/arbiter/arbiter-core/src/main/java/org/deeplearning4j/arbiter/optimize/api/Candidate.java deleted file mode 100644 index 7aa986768..000000000 --- a/contrib/attic/arbiter/arbiter-core/src/main/java/org/deeplearning4j/arbiter/optimize/api/Candidate.java +++ /dev/null @@ -1,59 +0,0 @@ -/* - * ****************************************************************************** - * * Copyright (c) 2021 Deeplearning4j Contributors - * * - * * This program and the accompanying materials are made available under the - * * terms of the Apache License, Version 2.0 which is available at - * * https://www.apache.org/licenses/LICENSE-2.0. - * * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * * License for the specific language governing permissions and limitations - * * under the License. 
- * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ - -package org.deeplearning4j.arbiter.optimize.api; - -import lombok.AllArgsConstructor; -import lombok.Data; -import org.deeplearning4j.arbiter.optimize.generator.util.SerializedSupplier; -import org.nd4j.common.function.Supplier; - -import java.io.Serializable; -import java.util.Map; - -/** - * Candidate: a proposed hyperparameter configuration. - * Also includes a map for data parameters, to configure things like data preprocessing, etc. - */ -@Data -@AllArgsConstructor -public class Candidate<C> implements Serializable { - - private Supplier<C> supplier; - private int index; - private double[] flatParameters; - private Map<String, Object> dataParameters; - private Exception exception; - - public Candidate(C value, int index, double[] flatParameters, Map<String, Object> dataParameters, Exception e) { - this(new SerializedSupplier<>(value), index, flatParameters, dataParameters, e); - } - - public Candidate(C value, int index, double[] flatParameters) { - this(new SerializedSupplier<>(value), index, flatParameters); - } - - public Candidate(Supplier<C> value, int index, double[] flatParameters) { - this(value, index, flatParameters, null, null); - } - - public C getValue(){ - return supplier.get(); - } - -} diff --git a/contrib/attic/arbiter/arbiter-core/src/main/java/org/deeplearning4j/arbiter/optimize/api/CandidateGenerator.java b/contrib/attic/arbiter/arbiter-core/src/main/java/org/deeplearning4j/arbiter/optimize/api/CandidateGenerator.java deleted file mode 100644 index 91dda0521..000000000 --- a/contrib/attic/arbiter/arbiter-core/src/main/java/org/deeplearning4j/arbiter/optimize/api/CandidateGenerator.java +++ /dev/null @@ -1,67 +0,0 @@ -/* - * ****************************************************************************** - * * Copyright (c) 2021 Deeplearning4j Contributors - * * - * * This program and the accompanying materials are made available under the - * * terms 
of the Apache License, Version 2.0 which is available at - * * https://www.apache.org/licenses/LICENSE-2.0. - * * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * * License for the specific language governing permissions and limitations - * * under the License. - * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ - -package org.deeplearning4j.arbiter.optimize.api; - -import org.nd4j.shade.jackson.annotation.JsonInclude; -import org.nd4j.shade.jackson.annotation.JsonTypeInfo; - -/** - * A CandidateGenerator proposes candidates (i.e., hyperparameter configurations) for evaluation. - * This abstraction allows for different ways of generating the next configuration to test; for example, - * random search, grid search, Bayesian optimization methods, etc. - * - * @author Alex Black - */ -@JsonInclude(JsonInclude.Include.NON_NULL) -@JsonTypeInfo(use = JsonTypeInfo.Id.CLASS, include = JsonTypeInfo.As.PROPERTY, property = "@class") -public interface CandidateGenerator { - - /** - * Is this candidate generator able to generate more candidates? This will always return true in some - * cases, but some search strategies have a limit (grid search, for example) - */ - boolean hasMoreCandidates(); - - /** - * Generate a candidate hyperparameter configuration - */ - Candidate getCandidate(); - - /** - * Report results for the candidate generator. 
- * - * @param result The results to report - */ - void reportResults(OptimizationResult result); - - /** - * @return Get the parameter space for this candidate generator - */ - ParameterSpace getParameterSpace(); - - /** - * @param rngSeed Set the random number generator seed for the candidate generator - */ - void setRngSeed(long rngSeed); - - /** - * @return The type (class) of the generated candidates - */ - Class getCandidateType(); -} diff --git a/contrib/attic/arbiter/arbiter-core/src/main/java/org/deeplearning4j/arbiter/optimize/api/OptimizationResult.java b/contrib/attic/arbiter/arbiter-core/src/main/java/org/deeplearning4j/arbiter/optimize/api/OptimizationResult.java deleted file mode 100644 index 9e0316bdf..000000000 --- a/contrib/attic/arbiter/arbiter-core/src/main/java/org/deeplearning4j/arbiter/optimize/api/OptimizationResult.java +++ /dev/null @@ -1,62 +0,0 @@ -/* - * ****************************************************************************** - * * Copyright (c) 2021 Deeplearning4j Contributors - * * - * * This program and the accompanying materials are made available under the - * * terms of the Apache License, Version 2.0 which is available at - * * https://www.apache.org/licenses/LICENSE-2.0. - * * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * * License for the specific language governing permissions and limitations - * * under the License. 
- * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ - -package org.deeplearning4j.arbiter.optimize.api; - -import lombok.Data; -import org.deeplearning4j.arbiter.optimize.api.saving.ResultReference; -import org.deeplearning4j.arbiter.optimize.runner.CandidateInfo; -import org.nd4j.shade.jackson.annotation.JsonIgnoreProperties; -import org.nd4j.shade.jackson.annotation.JsonProperty; -import org.nd4j.shade.jackson.annotation.JsonTypeInfo; - -import java.io.Serializable; - -/** - * An optimization result represents the results of an optimization run, including the candidate configuration, the - * trained model, the score for that model, and index of the model - * - * @author Alex Black - */ -@Data -@JsonTypeInfo(use = JsonTypeInfo.Id.CLASS, include = JsonTypeInfo.As.PROPERTY, property = "@class") -@JsonIgnoreProperties({"resultReference"}) -public class OptimizationResult implements Serializable { - @JsonProperty - private Candidate candidate; - @JsonProperty - private Double score; - @JsonProperty - private int index; - @JsonProperty - private Object modelSpecificResults; - @JsonProperty - private CandidateInfo candidateInfo; - private ResultReference resultReference; - - - public OptimizationResult(Candidate candidate, Double score, int index, Object modelSpecificResults, - CandidateInfo candidateInfo, ResultReference resultReference) { - this.candidate = candidate; - this.score = score; - this.index = index; - this.modelSpecificResults = modelSpecificResults; - this.candidateInfo = candidateInfo; - this.resultReference = resultReference; - } -} diff --git a/contrib/attic/arbiter/arbiter-core/src/main/java/org/deeplearning4j/arbiter/optimize/api/ParameterSpace.java b/contrib/attic/arbiter/arbiter-core/src/main/java/org/deeplearning4j/arbiter/optimize/api/ParameterSpace.java deleted file mode 100644 index ca3731cf4..000000000 --- 
a/contrib/attic/arbiter/arbiter-core/src/main/java/org/deeplearning4j/arbiter/optimize/api/ParameterSpace.java +++ /dev/null @@ -1,83 +0,0 @@ -/* - * ****************************************************************************** - * * Copyright (c) 2021 Deeplearning4j Contributors - * * - * * This program and the accompanying materials are made available under the - * * terms of the Apache License, Version 2.0 which is available at - * * https://www.apache.org/licenses/LICENSE-2.0. - * * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * * License for the specific language governing permissions and limitations - * * under the License. - * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ - -package org.deeplearning4j.arbiter.optimize.api; - -import org.nd4j.shade.jackson.annotation.JsonIgnore; -import org.nd4j.shade.jackson.annotation.JsonInclude; -import org.nd4j.shade.jackson.annotation.JsonTypeInfo; - -import java.util.List; -import java.util.Map; - -/** - * ParameterSpace: defines the acceptable ranges of values a given parameter may take. - * Note that parameter spaces can be simple (like {@code ParameterSpace<Double>}) or complicated, including - * multiple nested ParameterSpaces - * - * @author Alex Black - */ -@JsonInclude(JsonInclude.Include.NON_NULL) -@JsonTypeInfo(use = JsonTypeInfo.Id.CLASS, include = JsonTypeInfo.As.PROPERTY, property = "@class") -public interface ParameterSpace<P> { - - /** - * Generate a candidate given a set of values. These values are then mapped to a specific candidate, using some - * mapping function (such as the prior probability distribution) - * - * @param parameterValues A set of values, each in the range [0,1], of length {@link #numParameters()} - */ - P getValue(double[] parameterValues); - - /** - * Get the total number of parameters (hyperparameters) to be optimized. This includes optional parameters from - * different parameter subspaces. (Thus, not every parameter may be used in every candidate) - * - * @return Number of hyperparameters to be optimized - */ - int numParameters(); - - /** - * Collect a list of parameters, recursively. Note that leaf parameters are parameters that do not have any - * nested parameter spaces - */ - List<ParameterSpace> collectLeaves(); - - /** - * Get a list of nested parameter spaces by name. Note that the returned parameter spaces may in turn have further - * nested parameter spaces. The map should be empty for leaf parameter spaces - * - * @return A map of nested parameter spaces - */ - Map<String, ParameterSpace> getNestedSpaces(); - - /** - * Is this ParameterSpace a leaf? (i.e., one that does not contain other ParameterSpaces internally) - */ - @JsonIgnore - boolean isLeaf(); - - /** - * For leaf ParameterSpaces: set the indices of the leaf ParameterSpace. - * Expects input of length {@link #numParameters()}. Throws exception if {@link #isLeaf()} is false. - * - * @param indices Indices to set. Length should equal {@link #numParameters()} - */ - void setIndices(int... 
indices); - -} diff --git a/contrib/attic/arbiter/arbiter-core/src/main/java/org/deeplearning4j/arbiter/optimize/api/TaskCreator.java b/contrib/attic/arbiter/arbiter-core/src/main/java/org/deeplearning4j/arbiter/optimize/api/TaskCreator.java deleted file mode 100644 index 6be295349..000000000 --- a/contrib/attic/arbiter/arbiter-core/src/main/java/org/deeplearning4j/arbiter/optimize/api/TaskCreator.java +++ /dev/null @@ -1,64 +0,0 @@ -/* - * ****************************************************************************** - * * Copyright (c) 2021 Deeplearning4j Contributors - * * - * * This program and the accompanying materials are made available under the - * * terms of the Apache License, Version 2.0 which is available at - * * https://www.apache.org/licenses/LICENSE-2.0. - * * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * * License for the specific language governing permissions and limitations - * * under the License. 
- * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ - -package org.deeplearning4j.arbiter.optimize.api; - -import org.deeplearning4j.arbiter.optimize.api.data.DataProvider; -import org.deeplearning4j.arbiter.optimize.api.data.DataSource; -import org.deeplearning4j.arbiter.optimize.api.score.ScoreFunction; -import org.deeplearning4j.arbiter.optimize.runner.IOptimizationRunner; -import org.deeplearning4j.arbiter.optimize.runner.listener.StatusListener; - -import java.util.List; -import java.util.Properties; -import java.util.concurrent.Callable; - -/** - * The TaskCreator is used to take a candidate configuration, data provider and score function, and create something - * that can be executed as a Callable - * - * @author Alex Black - */ -public interface TaskCreator { - - /** - * Generate a callable that can be executed to conduct the training of this model (given the model configuration) - * - * @param candidate Candidate (model) configuration to be trained - * @param dataProvider DataProvider, for the data - * @param scoreFunction Score function to be used to evaluate the model - * @param statusListeners Status listeners, that can be used for callbacks (to UI, for example) - * @return A callable that returns an OptimizationResult, once optimization is complete - */ - @Deprecated - Callable<OptimizationResult> create(Candidate candidate, DataProvider dataProvider, ScoreFunction scoreFunction, - List<StatusListener> statusListeners, IOptimizationRunner runner); - - /** - * Generate a callable that can be executed to conduct the training of this model (given the model configuration) - * - * @param candidate Candidate (model) configuration to be trained - * @param dataSource Data source - * @param dataSourceProperties Properties (may be null) for the data source - * @param scoreFunction Score function to be used to evaluate the model - * @param statusListeners Status listeners, that can be used for callbacks (to UI, for example) - * @return A callable that returns an OptimizationResult, once optimization is complete - */ - Callable<OptimizationResult> create(Candidate candidate, Class<? extends DataSource> dataSource, Properties dataSourceProperties, - ScoreFunction scoreFunction, List<StatusListener> statusListeners, IOptimizationRunner runner); -} diff --git a/contrib/attic/arbiter/arbiter-core/src/main/java/org/deeplearning4j/arbiter/optimize/api/TaskCreatorProvider.java b/contrib/attic/arbiter/arbiter-core/src/main/java/org/deeplearning4j/arbiter/optimize/api/TaskCreatorProvider.java deleted file mode 100644 index b9b0c0cde..000000000 --- a/contrib/attic/arbiter/arbiter-core/src/main/java/org/deeplearning4j/arbiter/optimize/api/TaskCreatorProvider.java +++ /dev/null @@ -1,45 +0,0 @@ -/* - * ****************************************************************************** - * * Copyright (c) 2021 Deeplearning4j Contributors - * * - * * This program and the accompanying materials are made available under the - * * terms of the Apache License, Version 2.0 which is available at - * * https://www.apache.org/licenses/LICENSE-2.0. - * * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * * License for the specific language governing permissions and limitations - * * under the License. 
- * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ - -package org.deeplearning4j.arbiter.optimize.api; - -import java.util.HashMap; -import java.util.Map; - -public class TaskCreatorProvider { - - private static Map<Class<? extends ParameterSpace>, Class<? extends TaskCreator>> map = new HashMap<>(); - - public synchronized static TaskCreator defaultTaskCreatorFor(Class<? extends ParameterSpace> paramSpaceClass){ - Class<? extends TaskCreator> c = map.get(paramSpaceClass); - try { - if(c == null){ - return null; - } - return c.newInstance(); - } catch (Exception e){ - throw new RuntimeException("Could not create new instance of task creator class: " + c + " - missing no-arg constructor?", e); - } - } - - public synchronized static void registerDefaultTaskCreatorClass(Class<? extends ParameterSpace> spaceClass, - Class<? extends TaskCreator> creatorClass){ - map.put(spaceClass, creatorClass); - } - -} diff --git a/contrib/attic/arbiter/arbiter-core/src/main/java/org/deeplearning4j/arbiter/optimize/api/adapter/ParameterSpaceAdapter.java b/contrib/attic/arbiter/arbiter-core/src/main/java/org/deeplearning4j/arbiter/optimize/api/adapter/ParameterSpaceAdapter.java deleted file mode 100644 index 5fb513f13..000000000 --- a/contrib/attic/arbiter/arbiter-core/src/main/java/org/deeplearning4j/arbiter/optimize/api/adapter/ParameterSpaceAdapter.java +++ /dev/null @@ -1,84 +0,0 @@ -/* - * ****************************************************************************** - * * Copyright (c) 2021 Deeplearning4j Contributors - * * - * * This program and the accompanying materials are made available under the - * * terms of the Apache License, Version 2.0 which is available at - * * https://www.apache.org/licenses/LICENSE-2.0. - * * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. 
See the - * * License for the specific language governing permissions and limitations - * * under the License. - * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ - -package org.deeplearning4j.arbiter.optimize.api.adapter; - -import lombok.AllArgsConstructor; -import org.deeplearning4j.arbiter.optimize.api.ParameterSpace; - -import java.util.Collections; -import java.util.List; -import java.util.Map; - -/** - * An abstract class used for adapting one type into another. Subclasses of this need merely implement three simple methods - * - * @param <F> Type to convert from - * @param <T> Type to convert to - * @author Alex Black - */ -@AllArgsConstructor -public abstract class ParameterSpaceAdapter<F, T> implements ParameterSpace<T> { - - - protected abstract T convertValue(F from); - - protected abstract ParameterSpace<F> underlying(); - - protected abstract String underlyingName(); - - - @Override - public T getValue(double[] parameterValues) { - return convertValue(underlying().getValue(parameterValues)); - } - - @Override - public int numParameters() { - return underlying().numParameters(); - } - - @Override - public List<ParameterSpace> collectLeaves() { - ParameterSpace p = underlying(); - if(p.isLeaf()){ - return Collections.singletonList(p); - } - return underlying().collectLeaves(); - } - - @Override - public Map<String, ParameterSpace> getNestedSpaces() { - return Collections.singletonMap(underlyingName(), (ParameterSpace)underlying()); - } - - @Override - public boolean isLeaf() { - return false; //Underlying may be a leaf, however - } - - @Override - public void setIndices(int... indices) { - underlying().setIndices(indices); - } - - @Override - public String toString() { - return underlying().toString(); - } -} diff --git a/contrib/attic/arbiter/arbiter-core/src/main/java/org/deeplearning4j/arbiter/optimize/api/data/DataProvider.java b/contrib/attic/arbiter/arbiter-core/src/main/java/org/deeplearning4j/arbiter/optimize/api/data/DataProvider.java deleted file mode 100644 index 60f4e7ed8..000000000 --- a/contrib/attic/arbiter/arbiter-core/src/main/java/org/deeplearning4j/arbiter/optimize/api/data/DataProvider.java +++ /dev/null @@ -1,56 +0,0 @@ -/* - * ****************************************************************************** - * * Copyright (c) 2021 Deeplearning4j Contributors - * * - * * This program and the accompanying materials are made available under the - * * terms of the Apache License, Version 2.0 which is available at - * * https://www.apache.org/licenses/LICENSE-2.0. - * * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * * License for the specific language governing permissions and limitations - * * under the License. 
- * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ - -package org.deeplearning4j.arbiter.optimize.api.data; - -import org.nd4j.shade.jackson.annotation.JsonInclude; -import org.nd4j.shade.jackson.annotation.JsonTypeInfo; - -import java.io.Serializable; -import java.util.Map; - -/** - * DataProvider interface abstracts out the providing of data - * @deprecated Use {@link DataSource} - */ -@JsonInclude(JsonInclude.Include.NON_NULL) -@JsonTypeInfo(use = JsonTypeInfo.Id.CLASS, include = JsonTypeInfo.As.PROPERTY, property = "@class") -@Deprecated -public interface DataProvider extends Serializable { - - /** - * Get training data given some parameters for the data. - * Data parameters map is used to specify things like batch - * size and data preprocessing - * - * @param dataParameters Parameters for data. May be null or empty for default data - * @return training data - */ - Object trainData(Map<String, Object> dataParameters); - - /** - * Get test data given some parameters for the data. Data parameters map is used to specify things like batch - * size and data preprocessing - * - * @param dataParameters Parameters for data. May be null or empty for default data - * @return test data - */ - Object testData(Map<String, Object> dataParameters); - - Class<?> getDataType(); -} diff --git a/contrib/attic/arbiter/arbiter-core/src/main/java/org/deeplearning4j/arbiter/optimize/api/data/DataSetIteratorFactoryProvider.java b/contrib/attic/arbiter/arbiter-core/src/main/java/org/deeplearning4j/arbiter/optimize/api/data/DataSetIteratorFactoryProvider.java deleted file mode 100644 index 63f0de495..000000000 --- a/contrib/attic/arbiter/arbiter-core/src/main/java/org/deeplearning4j/arbiter/optimize/api/data/DataSetIteratorFactoryProvider.java +++ /dev/null @@ -1,91 +0,0 @@ -/* - * ****************************************************************************** - * * Copyright (c) 2021 Deeplearning4j Contributors - * * - * * This program and the accompanying materials are made available under the - * * terms of the Apache License, Version 2.0 which is available at - * * https://www.apache.org/licenses/LICENSE-2.0. - * * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * * License for the specific language governing permissions and limitations - * * under the License. - * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ - -package org.deeplearning4j.arbiter.optimize.api.data; - -import lombok.Data; -import org.nd4j.linalg.dataset.api.iterator.DataSetIteratorFactory; - -import java.util.Map; - -/** - * This is a {@link DataProvider} for - * an {@link DataSetIteratorFactory} which, - * based on the key {@link DataSetIteratorFactoryProvider#FACTORY_KEY}, - * will create a {@link org.nd4j.linalg.dataset.api.iterator.DataSetIterator} - * for use with arbiter. - * - * This {@link DataProvider} is mainly meant for use in command-line-driven - * applications. 
- * - * @author Adam Gibson - */ -@Data -public class DataSetIteratorFactoryProvider implements DataProvider { - - public final static String FACTORY_KEY = "org.deeplearning4j.arbiter.data.data.factory"; - - /** - * Get training data given some parameters for the data. - * Data parameters map is used to specify things like batch - * size and data preprocessing - * - * @param dataParameters Parameters for data. May be null or empty for default data - * @return training data - */ - @Override - public DataSetIteratorFactory trainData(Map<String, Object> dataParameters) { - return create(dataParameters); - } - - /** - * Get test data given some parameters for the data. Data parameters map - * is used to specify things like batch - * size and data preprocessing - * - * @param dataParameters Parameters for data. May be null or empty for default data - * @return test data - */ - @Override - public DataSetIteratorFactory testData(Map<String, Object> dataParameters) { - return create(dataParameters); - } - - @Override - public Class<?> getDataType() { - return DataSetIteratorFactory.class; - } - - private DataSetIteratorFactory create(Map<String, Object> dataParameters) { - if (dataParameters == null) - throw new IllegalArgumentException( - "Data parameters is null. Please specify a class name to create a dataset iterator."); - if (!dataParameters.containsKey(FACTORY_KEY)) - throw new IllegalArgumentException( - "No data set iterator factory class found.
Please specify a class name with key " - + FACTORY_KEY); - String value = dataParameters.get(FACTORY_KEY).toString(); - try { - Class<? extends DataSetIteratorFactory> clazz = - (Class<? extends DataSetIteratorFactory>) Class.forName(value); - return clazz.newInstance(); - } catch (Exception e) { - throw new RuntimeException("Could not create DataSetIteratorFactory instance - missing no-arg constructor?", e); - } - } -} diff --git a/contrib/attic/arbiter/arbiter-core/src/main/java/org/deeplearning4j/arbiter/optimize/api/data/DataSource.java b/contrib/attic/arbiter/arbiter-core/src/main/java/org/deeplearning4j/arbiter/optimize/api/data/DataSource.java deleted file mode 100644 index dc2b6effc..000000000 --- a/contrib/attic/arbiter/arbiter-core/src/main/java/org/deeplearning4j/arbiter/optimize/api/data/DataSource.java +++ /dev/null @@ -1,59 +0,0 @@ -/* - * ****************************************************************************** - * * Copyright (c) 2021 Deeplearning4j Contributors - * * - * * This program and the accompanying materials are made available under the - * * terms of the Apache License, Version 2.0 which is available at - * * https://www.apache.org/licenses/LICENSE-2.0. - * * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * * License for the specific language governing permissions and limitations - * * under the License. - * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ - -package org.deeplearning4j.arbiter.optimize.api.data; - -import java.io.Serializable; -import java.util.Properties; - -/** - * DataSource: defines where the data should come from for training and testing.
- * Note that implementations must have a no-argument constructor - * - * @author Alex Black - */ -public interface DataSource extends Serializable { - - /** - * Configure the current data source with the specified properties - * Note: These properties are fixed for the training instance, and are optionally provided by the user - * at the configuration stage. - * The properties could be anything - and are usually specific to each DataSource implementation. - * For example, values such as batch size could be set using these properties - * @param properties Properties to apply to the data source instance - */ - void configure(Properties properties); - - /** - * Get training data to be used for the optimization. Usually a DataSetIterator or MultiDataSetIterator - */ - Object trainData(); - - /** - * Get test data to be used for the optimization. Usually a DataSetIterator or MultiDataSetIterator - */ - Object testData(); - - /** - * The type of data returned by {@link #trainData()} and {@link #testData()}. - * Usually DataSetIterator or MultiDataSetIterator - * @return Class of the objects returned by trainData and testData - */ - Class<?> getDataType(); - -} diff --git a/contrib/attic/arbiter/arbiter-core/src/main/java/org/deeplearning4j/arbiter/optimize/api/evaluation/ModelEvaluator.java b/contrib/attic/arbiter/arbiter-core/src/main/java/org/deeplearning4j/arbiter/optimize/api/evaluation/ModelEvaluator.java deleted file mode 100644 index 108b2fb9f..000000000 --- a/contrib/attic/arbiter/arbiter-core/src/main/java/org/deeplearning4j/arbiter/optimize/api/evaluation/ModelEvaluator.java +++ /dev/null @@ -1,42 +0,0 @@ -/* - * ****************************************************************************** - * * Copyright (c) 2021 Deeplearning4j Contributors - * * - * * This program and the accompanying materials are made available under the - * * terms of the Apache License, Version 2.0 which is available at - * * https://www.apache.org/licenses/LICENSE-2.0.
- * * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * * License for the specific language governing permissions and limitations - * * under the License. - * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ - -package org.deeplearning4j.arbiter.optimize.api.evaluation; - -import org.deeplearning4j.arbiter.optimize.api.data.DataProvider; - -import java.io.Serializable; -import java.util.List; - -/** - * ModelEvaluator: Used to conduct additional evaluation. - * For example, this may be classification performance on a test set or similar - */ -public interface ModelEvaluator extends Serializable { - Object evaluateModel(Object model, DataProvider dataProvider); - - /** - * @return The model types supported by this class - */ - List<Class<?>> getSupportedModelTypes(); - - /** - * @return The datatypes supported by this class - */ - List<Class<?>> getSupportedDataTypes(); -} diff --git a/contrib/attic/arbiter/arbiter-core/src/main/java/org/deeplearning4j/arbiter/optimize/api/saving/InMemoryResultSaver.java b/contrib/attic/arbiter/arbiter-core/src/main/java/org/deeplearning4j/arbiter/optimize/api/saving/InMemoryResultSaver.java deleted file mode 100644 index 8f75d1196..000000000 --- a/contrib/attic/arbiter/arbiter-core/src/main/java/org/deeplearning4j/arbiter/optimize/api/saving/InMemoryResultSaver.java +++ /dev/null @@ -1,65 +0,0 @@ -/* - * ****************************************************************************** - * * Copyright (c) 2021 Deeplearning4j Contributors - * * - * * This program and the accompanying materials are made available under the - * * terms of the Apache License, Version 2.0 which is available at - * * https://www.apache.org/licenses/LICENSE-2.0.
- * * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * * License for the specific language governing permissions and limitations - * * under the License. - * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ - -package org.deeplearning4j.arbiter.optimize.api.saving; - -import lombok.AllArgsConstructor; -import lombok.NoArgsConstructor; -import org.deeplearning4j.arbiter.optimize.api.OptimizationResult; - -import java.io.IOException; -import java.util.Collections; -import java.util.List; - -/** - * A simple class to store optimization results in-memory. - * Not recommended for large (or a large number of) models. - */ -@NoArgsConstructor -public class InMemoryResultSaver implements ResultSaver { - @Override - public ResultReference saveModel(OptimizationResult result, Object modelResult) throws IOException { - return new InMemoryResult(result, modelResult); - } - - @Override - public List> getSupportedCandidateTypes() { - return Collections.>singletonList(Object.class); - } - - @Override - public List> getSupportedModelTypes() { - return Collections.>singletonList(Object.class); - } - - @AllArgsConstructor - private static class InMemoryResult implements ResultReference { - private OptimizationResult result; - private Object modelResult; - - @Override - public OptimizationResult getResult() throws IOException { - return result; - } - - @Override - public Object getResultModel() throws IOException { - return modelResult; - } - } -} diff --git a/contrib/attic/arbiter/arbiter-core/src/main/java/org/deeplearning4j/arbiter/optimize/api/saving/ResultReference.java b/contrib/attic/arbiter/arbiter-core/src/main/java/org/deeplearning4j/arbiter/optimize/api/saving/ResultReference.java deleted file mode 100644 
index d05c34baf..000000000 --- a/contrib/attic/arbiter/arbiter-core/src/main/java/org/deeplearning4j/arbiter/optimize/api/saving/ResultReference.java +++ /dev/null @@ -1,39 +0,0 @@ -/* - * ****************************************************************************** - * * Copyright (c) 2021 Deeplearning4j Contributors - * * - * * This program and the accompanying materials are made available under the - * * terms of the Apache License, Version 2.0 which is available at - * * https://www.apache.org/licenses/LICENSE-2.0. - * * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * * License for the specific language governing permissions and limitations - * * under the License. - * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ - -package org.deeplearning4j.arbiter.optimize.api.saving; - -import org.deeplearning4j.arbiter.optimize.api.OptimizationResult; -import org.nd4j.shade.jackson.annotation.JsonTypeInfo; - -import java.io.IOException; - -/** - * Idea: We can't store all results in memory in general (might have thousands of candidates with millions of - * parameters each) - * So instead: return a reference to the saved result. 
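The rationale above — persist each result and hand back only a lightweight handle that can re-load it on demand — can be sketched standalone. The `Reference` and `MemoryStore` names below are hypothetical illustrations, not Arbiter API:

```java
import java.util.ArrayList;
import java.util.List;

public class ResultReferenceSketch {
    // Minimal analogue of ResultReference: a handle that re-loads the result later.
    public interface Reference {
        double getScore();
        Object getResultModel();
    }

    // Analogue of a ResultSaver backed by a store: the optimization loop keeps
    // only the cheap references, not the (potentially large) models themselves.
    public static class MemoryStore {
        private final List<Object> models = new ArrayList<>();
        private final List<Double> scores = new ArrayList<>();

        public Reference save(double score, Object model) {
            models.add(model);
            scores.add(score);
            final int idx = models.size() - 1;
            return new Reference() {
                public double getScore() { return scores.get(idx); }
                public Object getResultModel() { return models.get(idx); }
            };
        }
    }

    public static void main(String[] args) {
        MemoryStore store = new MemoryStore();
        Reference ref = store.save(0.93, "model-bytes");
        // The caller holds only the reference; the model is fetched on demand.
        System.out.println(ref.getScore() + " " + ref.getResultModel());
    }
}
```

A disk- or database-backed saver would follow the same shape, with the reference holding a path or row key instead of a list index.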
Idea is that the result may be saved to disk or a database, - * and we can easily load it back into memory (if/when required) using the getResult() method - */ -@JsonTypeInfo(use = JsonTypeInfo.Id.CLASS, include = JsonTypeInfo.As.PROPERTY, property = "@class") -public interface ResultReference { - - OptimizationResult getResult() throws IOException; - - Object getResultModel() throws IOException; - -} diff --git a/contrib/attic/arbiter/arbiter-core/src/main/java/org/deeplearning4j/arbiter/optimize/api/saving/ResultSaver.java b/contrib/attic/arbiter/arbiter-core/src/main/java/org/deeplearning4j/arbiter/optimize/api/saving/ResultSaver.java deleted file mode 100644 index 66a430baf..000000000 --- a/contrib/attic/arbiter/arbiter-core/src/main/java/org/deeplearning4j/arbiter/optimize/api/saving/ResultSaver.java +++ /dev/null @@ -1,59 +0,0 @@ -/* - * ****************************************************************************** - * * Copyright (c) 2021 Deeplearning4j Contributors - * * - * * This program and the accompanying materials are made available under the - * * terms of the Apache License, Version 2.0 which is available at - * * https://www.apache.org/licenses/LICENSE-2.0. - * * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * * License for the specific language governing permissions and limitations - * * under the License. 
- * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ - -package org.deeplearning4j.arbiter.optimize.api.saving; - -import org.deeplearning4j.arbiter.optimize.api.OptimizationResult; -import org.nd4j.shade.jackson.annotation.JsonInclude; -import org.nd4j.shade.jackson.annotation.JsonTypeInfo; - -import java.io.IOException; -import java.util.List; - -/** - * The ResultSaver interface provides a means of saving models in such a way that they can be loaded back into memory later, - * regardless of where/how they are saved. - * - * @author Alex Black - */ -@JsonInclude(JsonInclude.Include.NON_NULL) -@JsonTypeInfo(use = JsonTypeInfo.Id.CLASS, include = JsonTypeInfo.As.PROPERTY, property = "@class") -public interface ResultSaver { - - /** - * Save the model (including configuration and any additional evaluation/results) - * - * @param result Optimization result for the model to save - * @param modelResult Model result to save - * @return ResultReference, such that the result can be loaded back into memory - * @throws IOException If IO error occurs during model saving - */ - ResultReference saveModel(OptimizationResult result, Object modelResult) throws IOException; - - /** - * @return The candidate types supported by this class - */ - List> getSupportedCandidateTypes(); - - /** - * @return The model types supported by this class - */ - List> getSupportedModelTypes(); - - -} diff --git a/contrib/attic/arbiter/arbiter-core/src/main/java/org/deeplearning4j/arbiter/optimize/api/score/ScoreFunction.java b/contrib/attic/arbiter/arbiter-core/src/main/java/org/deeplearning4j/arbiter/optimize/api/score/ScoreFunction.java deleted file mode 100644 index 62b6f74fe..000000000 --- a/contrib/attic/arbiter/arbiter-core/src/main/java/org/deeplearning4j/arbiter/optimize/api/score/ScoreFunction.java +++ /dev/null @@ -1,77 +0,0 @@ -/* - * 
****************************************************************************** - * * Copyright (c) 2021 Deeplearning4j Contributors - * * - * * This program and the accompanying materials are made available under the - * * terms of the Apache License, Version 2.0 which is available at - * * https://www.apache.org/licenses/LICENSE-2.0. - * * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * * License for the specific language governing permissions and limitations - * * under the License. - * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ - -package org.deeplearning4j.arbiter.optimize.api.score; - -import org.deeplearning4j.arbiter.optimize.api.data.DataProvider; -import org.deeplearning4j.arbiter.optimize.api.data.DataSource; -import org.nd4j.shade.jackson.annotation.JsonInclude; -import org.nd4j.shade.jackson.annotation.JsonTypeInfo; - -import java.io.Serializable; -import java.util.List; -import java.util.Map; -import java.util.Properties; - -/** - * ScoreFunction defines the objective of hyperparameter optimization. - * Specifically, it is used to calculate a score for a given model, relative to the data set provided - * in the configuration. 
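Because a score is a single double that is either minimized or maximized, choosing the best candidate reduces to one comparison driven by the `minimize()` flag. A standalone sketch of how an optimizer consumes that flag:

```java
public class BestScoreSketch {
    // Returns the index of the best score, honoring the minimize/maximize flag,
    // mirroring how ScoreFunction.minimize() is consumed by the optimization runner.
    public static int bestIndex(double[] scores, boolean minimize) {
        int best = 0;
        for (int i = 1; i < scores.length; i++) {
            boolean better = minimize ? scores[i] < scores[best] : scores[i] > scores[best];
            if (better) best = i;
        }
        return best;
    }

    public static void main(String[] args) {
        double[] scores = {0.42, 0.17, 0.55};
        System.out.println(bestIndex(scores, true));   // minimizing a loss: index 1
        System.out.println(bestIndex(scores, false));  // maximizing e.g. accuracy: index 2
    }
}
```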
- * - */ -@JsonInclude(JsonInclude.Include.NON_NULL) -@JsonTypeInfo(use = JsonTypeInfo.Id.CLASS, include = JsonTypeInfo.As.PROPERTY, property = "@class") -public interface ScoreFunction extends Serializable { - - /** - * Calculate and return the score, for the given model and data provider - * - * @param model Model to score - * @param dataProvider Data provider - data to use - * @param dataParameters Parameters for data - * @return Calculated score - */ - double score(Object model, DataProvider dataProvider, Map<String, Object> dataParameters); - - /** - * Calculate and return the score, for the given model and data source - * - * @param model Model to score - * @param dataSource Data source - * @param dataSourceProperties data source properties - * @return Calculated score - */ - double score(Object model, Class<? extends DataSource> dataSource, Properties dataSourceProperties); - - /** - * Should this score function be minimized or maximized? - * - * @return true if score should be minimized, false if score should be maximized - */ - boolean minimize(); - - /** - * @return The model types supported by this class - */ - List<Class<?>> getSupportedModelTypes(); - - /** - * @return The data types supported by this class - */ - List<Class<?>> getSupportedDataTypes(); -} diff --git a/contrib/attic/arbiter/arbiter-core/src/main/java/org/deeplearning4j/arbiter/optimize/api/termination/MaxCandidatesCondition.java b/contrib/attic/arbiter/arbiter-core/src/main/java/org/deeplearning4j/arbiter/optimize/api/termination/MaxCandidatesCondition.java deleted file mode 100644 index 7c49e9ff3..000000000 --- a/contrib/attic/arbiter/arbiter-core/src/main/java/org/deeplearning4j/arbiter/optimize/api/termination/MaxCandidatesCondition.java +++ /dev/null @@ -1,52 +0,0 @@ -/* - * ****************************************************************************** - * * Copyright (c) 2021 Deeplearning4j Contributors - * * - * * This program and the accompanying materials are made available under the - * * terms of the Apache License, Version 2.0
which is available at - * * https://www.apache.org/licenses/LICENSE-2.0. - * * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * * License for the specific language governing permissions and limitations - * * under the License. - * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ - -package org.deeplearning4j.arbiter.optimize.api.termination; - -import lombok.AllArgsConstructor; -import lombok.Data; -import lombok.NoArgsConstructor; -import org.deeplearning4j.arbiter.optimize.runner.IOptimizationRunner; -import org.nd4j.shade.jackson.annotation.JsonProperty; - -/** - * Terminate hyperparameter search when the number of candidates exceeds a specified value. - * Note that this is counted as number of completed candidates, plus number of failed candidates. 
- */ -@AllArgsConstructor -@NoArgsConstructor -@Data -public class MaxCandidatesCondition implements TerminationCondition { - @JsonProperty - private int maxCandidates; - - @Override - public void initialize(IOptimizationRunner optimizationRunner) { - //No op - } - - @Override - public boolean terminate(IOptimizationRunner optimizationRunner) { - return optimizationRunner.numCandidatesCompleted() + optimizationRunner.numCandidatesFailed() >= maxCandidates; - } - - @Override - public String toString() { - return "MaxCandidatesCondition(" + maxCandidates + ")"; - } -} diff --git a/contrib/attic/arbiter/arbiter-core/src/main/java/org/deeplearning4j/arbiter/optimize/api/termination/MaxTimeCondition.java b/contrib/attic/arbiter/arbiter-core/src/main/java/org/deeplearning4j/arbiter/optimize/api/termination/MaxTimeCondition.java deleted file mode 100644 index 05846ffc4..000000000 --- a/contrib/attic/arbiter/arbiter-core/src/main/java/org/deeplearning4j/arbiter/optimize/api/termination/MaxTimeCondition.java +++ /dev/null @@ -1,83 +0,0 @@ -/* - * ****************************************************************************** - * * Copyright (c) 2021 Deeplearning4j Contributors - * * - * * This program and the accompanying materials are made available under the - * * terms of the Apache License, Version 2.0 which is available at - * * https://www.apache.org/licenses/LICENSE-2.0. - * * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * * License for the specific language governing permissions and limitations - * * under the License. 
- * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ - -package org.deeplearning4j.arbiter.optimize.api.termination; - -import lombok.Data; -import lombok.NoArgsConstructor; -import org.deeplearning4j.arbiter.optimize.runner.IOptimizationRunner; -import org.joda.time.format.DateTimeFormat; -import org.joda.time.format.DateTimeFormatter; -import org.nd4j.shade.jackson.annotation.JsonProperty; - -import java.util.concurrent.TimeUnit; - -/** - * Terminate hyperparameter optimization after - * a fixed amount of time has passed - * @author Alex Black - */ -@NoArgsConstructor -@Data -public class MaxTimeCondition implements TerminationCondition { - private static final DateTimeFormatter formatter = DateTimeFormat.forPattern("dd-MMM HH:mm ZZ"); - - private long duration; - private TimeUnit timeUnit; - private long startTime; - private long endTime; - - - private MaxTimeCondition(@JsonProperty("duration") long duration, @JsonProperty("timeUnit") TimeUnit timeUnit, - @JsonProperty("startTime") long startTime, @JsonProperty("endTime") long endTime) { - this.duration = duration; - this.timeUnit = timeUnit; - this.startTime = startTime; - this.endTime = endTime; - } - - /** - * @param duration Duration of time - * @param timeUnit Unit that the duration is specified in - */ - public MaxTimeCondition(long duration, TimeUnit timeUnit) { - this.duration = duration; - this.timeUnit = timeUnit; - } - - @Override - public void initialize(IOptimizationRunner optimizationRunner) { - startTime = System.currentTimeMillis(); - this.endTime = startTime + timeUnit.toMillis(duration); - } - - @Override - public boolean terminate(IOptimizationRunner optimizationRunner) { - return System.currentTimeMillis() >= endTime; - } - - @Override - public String toString() { - if (startTime > 0) { - return "MaxTimeCondition(" + duration + "," + timeUnit + ",start=\"" + formatter.print(startTime) - + "\",end=\"" + 
formatter.print(endTime) + "\")"; - } else { - return "MaxTimeCondition(" + duration + "," + timeUnit + ")"; - } - } -} diff --git a/contrib/attic/arbiter/arbiter-core/src/main/java/org/deeplearning4j/arbiter/optimize/api/termination/TerminationCondition.java b/contrib/attic/arbiter/arbiter-core/src/main/java/org/deeplearning4j/arbiter/optimize/api/termination/TerminationCondition.java deleted file mode 100644 index fb3aa7487..000000000 --- a/contrib/attic/arbiter/arbiter-core/src/main/java/org/deeplearning4j/arbiter/optimize/api/termination/TerminationCondition.java +++ /dev/null @@ -1,47 +0,0 @@ -/* - * ****************************************************************************** - * * Copyright (c) 2021 Deeplearning4j Contributors - * * - * * This program and the accompanying materials are made available under the - * * terms of the Apache License, Version 2.0 which is available at - * * https://www.apache.org/licenses/LICENSE-2.0. - * * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * * License for the specific language governing permissions and limitations - * * under the License. - * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ - -package org.deeplearning4j.arbiter.optimize.api.termination; - - -import org.deeplearning4j.arbiter.optimize.runner.IOptimizationRunner; -import org.nd4j.shade.jackson.annotation.JsonInclude; -import org.nd4j.shade.jackson.annotation.JsonTypeInfo; - -/** - * Global termination condition for conducting hyperparameter optimization. - * Termination conditions are used to determine if/when the optimization should stop.
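Beyond the built-in `MaxCandidatesCondition` and `MaxTimeCondition`, this contract is easy to extend. A hedged sketch of a condition that stops after a run of consecutive failed candidates — the `ConsecutiveFailuresCondition` name is hypothetical, and the boolean argument stands in for the state a real implementation would query from `IOptimizationRunner` inside `terminate()`:

```java
public class ConsecutiveFailuresSketch {
    // Analogue of a custom TerminationCondition keyed on consecutive failures.
    public static class ConsecutiveFailuresCondition {
        private final int maxConsecutiveFailures;
        private int streak = 0;

        public ConsecutiveFailuresCondition(int maxConsecutiveFailures) {
            this.maxConsecutiveFailures = maxConsecutiveFailures;
        }

        // Called once per candidate outcome; returns true when the search should stop.
        public boolean update(boolean candidateFailed) {
            streak = candidateFailed ? streak + 1 : 0;
            return streak >= maxConsecutiveFailures;
        }
    }

    public static void main(String[] args) {
        ConsecutiveFailuresCondition cond = new ConsecutiveFailuresCondition(3);
        boolean stop = false;
        boolean[] outcomes = {true, true, false, true, true, true}; // fail, fail, ok, fail, fail, fail
        for (boolean failed : outcomes) stop = cond.update(failed);
        System.out.println(stop); // three consecutive failures at the end -> true
    }
}
```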
- */ -@JsonTypeInfo(use = JsonTypeInfo.Id.CLASS, include = JsonTypeInfo.As.PROPERTY, property = "@class") -@JsonInclude(JsonInclude.Include.NON_NULL) -public interface TerminationCondition { - - /** - * Initialize the termination condition (such as starting timers, etc). - */ - void initialize(IOptimizationRunner optimizationRunner); - - /** - * Determine whether optimization should be terminated - * - * @param optimizationRunner Optimization runner - * @return true if learning should be terminated, false otherwise - */ - boolean terminate(IOptimizationRunner optimizationRunner); - -} diff --git a/contrib/attic/arbiter/arbiter-core/src/main/java/org/deeplearning4j/arbiter/optimize/config/OptimizationConfiguration.java b/contrib/attic/arbiter/arbiter-core/src/main/java/org/deeplearning4j/arbiter/optimize/config/OptimizationConfiguration.java deleted file mode 100644 index db70ad318..000000000 --- a/contrib/attic/arbiter/arbiter-core/src/main/java/org/deeplearning4j/arbiter/optimize/config/OptimizationConfiguration.java +++ /dev/null @@ -1,223 +0,0 @@ -/* - * ****************************************************************************** - * * Copyright (c) 2021 Deeplearning4j Contributors - * * - * * This program and the accompanying materials are made available under the - * * terms of the Apache License, Version 2.0 which is available at - * * https://www.apache.org/licenses/LICENSE-2.0. - * * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * * License for the specific language governing permissions and limitations - * * under the License. 
- * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ - -package org.deeplearning4j.arbiter.optimize.config; - -import lombok.*; -import org.deeplearning4j.arbiter.optimize.api.CandidateGenerator; -import org.deeplearning4j.arbiter.optimize.api.data.DataProvider; -import org.deeplearning4j.arbiter.optimize.api.data.DataSource; -import org.deeplearning4j.arbiter.optimize.api.saving.ResultSaver; -import org.deeplearning4j.arbiter.optimize.api.score.ScoreFunction; -import org.deeplearning4j.arbiter.optimize.api.termination.TerminationCondition; -import org.deeplearning4j.arbiter.optimize.serde.jackson.JsonMapper; -import org.nd4j.shade.jackson.annotation.JsonTypeInfo; -import org.nd4j.shade.jackson.core.JsonProcessingException; -import org.nd4j.shade.jackson.databind.annotation.JsonSerialize; - -import java.io.IOException; -import java.util.Arrays; -import java.util.List; -import java.util.Properties; - -/** - * OptimizationConfiguration ties together all of the various - * components (such as data, score functions, result saving etc) - * required to execute hyperparameter optimization. 
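One validation the configuration performs at build time is checking that the supplied `DataSource` class exposes a public no-argument constructor, so candidates can instantiate it later. That reflection check can be shown standalone (the `GoodSource`/`BadSource` classes are illustrative only):

```java
public class NoArgCtorCheck {
    public static class GoodSource { public GoodSource() {} }
    public static class BadSource { public BadSource(int x) {} }

    // Mirrors the build-time check in OptimizationConfiguration: fail fast if
    // the class has no public no-arg constructor to instantiate it with.
    public static boolean hasPublicNoArgConstructor(Class<?> clazz) {
        try {
            clazz.getConstructor();
            return true;
        } catch (NoSuchMethodException e) {
            return false;
        }
    }

    public static void main(String[] args) {
        System.out.println(hasPublicNoArgConstructor(GoodSource.class)); // true
        System.out.println(hasPublicNoArgConstructor(BadSource.class));  // false
    }
}
```

Failing at configuration time is deliberate: a missing constructor would otherwise only surface mid-search, on a worker, after candidates had already run.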
- * - * @author Alex Black - */ -@Data -@NoArgsConstructor -@EqualsAndHashCode(exclude = {"dataProvider", "terminationConditions", "candidateGenerator", "resultSaver"}) -@JsonTypeInfo(use = JsonTypeInfo.Id.CLASS, include = JsonTypeInfo.As.PROPERTY, property = "@class") -public class OptimizationConfiguration { - @JsonSerialize - private DataProvider dataProvider; - @JsonSerialize - private Class dataSource; - @JsonSerialize - private Properties dataSourceProperties; - @JsonSerialize - private CandidateGenerator candidateGenerator; - @JsonSerialize - private ResultSaver resultSaver; - @JsonSerialize - private ScoreFunction scoreFunction; - @JsonSerialize - private List terminationConditions; - @JsonSerialize - private Long rngSeed; - - @Getter - @Setter - private long executionStartTime; - - - private OptimizationConfiguration(Builder builder) { - this.dataProvider = builder.dataProvider; - this.dataSource = builder.dataSource; - this.dataSourceProperties = builder.dataSourceProperties; - this.candidateGenerator = builder.candidateGenerator; - this.resultSaver = builder.resultSaver; - this.scoreFunction = builder.scoreFunction; - this.terminationConditions = builder.terminationConditions; - this.rngSeed = builder.rngSeed; - - if (rngSeed != null) - candidateGenerator.setRngSeed(rngSeed); - - //Validate the configuration: data types, score types, etc - //TODO - - //Validate that the dataSource has a no-arg constructor - if(dataSource != null){ - try{ - dataSource.getConstructor(); - } catch (NoSuchMethodException e){ - throw new IllegalStateException("Data source class " + dataSource.getName() + " does not have a public no-argument constructor"); - } - } - } - - public static class Builder { - - private DataProvider dataProvider; - private Class dataSource; - private Properties dataSourceProperties; - private CandidateGenerator candidateGenerator; - private ResultSaver resultSaver; - private ScoreFunction scoreFunction; - private List terminationConditions; - private 
Long rngSeed; - - /** - * @deprecated Use {@link #dataSource(Class, Properties)} - */ - @Deprecated - public Builder dataProvider(DataProvider dataProvider) { - this.dataProvider = dataProvider; - return this; - } - - /** - * DataSource: defines where the data should come from for training and testing. - * Note that implementations must have a no-argument constructor - * @param dataSource Class for the data source - * @param dataSourceProperties May be null. Properties for configuring the data source - */ - public Builder dataSource(Class dataSource, Properties dataSourceProperties){ - this.dataSource = dataSource; - this.dataSourceProperties = dataSourceProperties; - return this; - } - - public Builder candidateGenerator(CandidateGenerator candidateGenerator) { - this.candidateGenerator = candidateGenerator; - return this; - } - - public Builder modelSaver(ResultSaver resultSaver) { - this.resultSaver = resultSaver; - return this; - } - - public Builder scoreFunction(ScoreFunction scoreFunction) { - this.scoreFunction = scoreFunction; - return this; - } - - /** - * Termination conditions to use - * @param conditions - * @return - */ - public Builder terminationConditions(TerminationCondition... 
conditions) { - terminationConditions = Arrays.asList(conditions); - return this; - } - - public Builder terminationConditions(List<TerminationCondition> terminationConditions) { - this.terminationConditions = terminationConditions; - return this; - } - - public Builder rngSeed(long rngSeed) { - this.rngSeed = rngSeed; - return this; - } - - public OptimizationConfiguration build() { - return new OptimizationConfiguration(this); - } - } - - - /** - * Create an optimization configuration from YAML - * @param yaml the YAML to create the config from - * @see OptimizationConfiguration - */ - public static OptimizationConfiguration fromYaml(String yaml) { - try { - return JsonMapper.getYamlMapper().readValue(yaml, OptimizationConfiguration.class); - } catch (IOException e) { - throw new RuntimeException(e); - } - } - - /** - * Create an optimization configuration from JSON - * @param json the JSON to create the config from - * @see OptimizationConfiguration - */ - public static OptimizationConfiguration fromJson(String json) { - try { - return JsonMapper.getMapper().readValue(json, OptimizationConfiguration.class); - } catch (IOException e) { - throw new RuntimeException(e); - } - } - - /** - * Return a JSON representation of this optimization configuration - * - * @return JSON representation of this configuration - */ - public String toJson() { - try { - return JsonMapper.getMapper().writeValueAsString(this); - } catch (JsonProcessingException e) { - throw new RuntimeException(e); - } - } - - /** - * Return a YAML representation of this optimization configuration - * - * @return YAML representation of this configuration - */ - public String toYaml() { - try { - return JsonMapper.getYamlMapper().writeValueAsString(this); - } catch (JsonProcessingException e) { - throw new RuntimeException(e); - } - } -} diff --git a/contrib/attic/arbiter/arbiter-core/src/main/java/org/deeplearning4j/arbiter/optimize/distribution/DegenerateIntegerDistribution.java
b/contrib/attic/arbiter/arbiter-core/src/main/java/org/deeplearning4j/arbiter/optimize/distribution/DegenerateIntegerDistribution.java deleted file mode 100644 index 6a5551e35..000000000 --- a/contrib/attic/arbiter/arbiter-core/src/main/java/org/deeplearning4j/arbiter/optimize/distribution/DegenerateIntegerDistribution.java +++ /dev/null @@ -1,99 +0,0 @@ -/* - * ****************************************************************************** - * * Copyright (c) 2021 Deeplearning4j Contributors - * * - * * This program and the accompanying materials are made available under the - * * terms of the Apache License, Version 2.0 which is available at - * * https://www.apache.org/licenses/LICENSE-2.0. - * * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * * License for the specific language governing permissions and limitations - * * under the License. - * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ - -package org.deeplearning4j.arbiter.optimize.distribution; - -import org.apache.commons.math3.distribution.IntegerDistribution; -import org.apache.commons.math3.exception.NumberIsTooLargeException; -import org.apache.commons.math3.exception.OutOfRangeException; - -import java.util.Arrays; - -/** - * Degenerate distribution: i.e., integer "distribution" that is just a fixed value - */ -public class DegenerateIntegerDistribution implements IntegerDistribution { - private int value; - - public DegenerateIntegerDistribution(int value) { - this.value = value; - } - - - @Override - public double probability(int x) { - return (x == value ? 1.0 : 0.0); - } - - @Override - public double cumulativeProbability(int x) { - return (x >= value ? 
1.0 : 0.0); - } - - @Override - public double cumulativeProbability(int x0, int x1) throws NumberIsTooLargeException { - return (value >= x0 && value <= x1 ? 1.0 : 0.0); - } - - @Override - public int inverseCumulativeProbability(double p) throws OutOfRangeException { - throw new UnsupportedOperationException(); - } - - @Override - public double getNumericalMean() { - return value; - } - - @Override - public double getNumericalVariance() { - return 0; - } - - @Override - public int getSupportLowerBound() { - return value; - } - - @Override - public int getSupportUpperBound() { - return value; - } - - @Override - public boolean isSupportConnected() { - return true; - } - - @Override - public void reseedRandomGenerator(long seed) { - //no op - } - - @Override - public int sample() { - return value; - } - - @Override - public int[] sample(int sampleSize) { - int[] out = new int[sampleSize]; - Arrays.fill(out, value); - return out; - } -} diff --git a/contrib/attic/arbiter/arbiter-core/src/main/java/org/deeplearning4j/arbiter/optimize/distribution/DistributionUtils.java b/contrib/attic/arbiter/arbiter-core/src/main/java/org/deeplearning4j/arbiter/optimize/distribution/DistributionUtils.java deleted file mode 100644 index 3edef5674..000000000 --- a/contrib/attic/arbiter/arbiter-core/src/main/java/org/deeplearning4j/arbiter/optimize/distribution/DistributionUtils.java +++ /dev/null @@ -1,151 +0,0 @@ -/* - * ****************************************************************************** - * * Copyright (c) 2021 Deeplearning4j Contributors - * * - * * This program and the accompanying materials are made available under the - * * terms of the Apache License, Version 2.0 which is available at - * * https://www.apache.org/licenses/LICENSE-2.0. - * * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. 
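The degenerate distribution above is a point mass: the pmf is 1 at the fixed value and 0 elsewhere, and the CDF is a step function jumping from 0 to 1 at that value. A minimal standalone sketch of that contract (not the deleted class itself, which depends on Commons Math):

```java
// Standalone sketch of a degenerate (point-mass) integer distribution:
// all probability mass sits on a single fixed value.
public class DegenerateSketch {
    private final int value;

    public DegenerateSketch(int value) {
        this.value = value;
    }

    // pmf: 1 at the fixed value, 0 everywhere else
    public double probability(int x) {
        return x == value ? 1.0 : 0.0;
    }

    // cdf: step function jumping from 0 to 1 at the fixed value
    public double cumulativeProbability(int x) {
        return x >= value ? 1.0 : 0.0;
    }

    // every sample is the fixed value, so mean = value and variance = 0
    public int sample() {
        return value;
    }
}
```

This is why the deleted class can implement `sample(int)` with a simple `Arrays.fill`: every draw is identical.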
See the - * * License for the specific language governing permissions and limitations - * * under the License. - * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ - -package org.deeplearning4j.arbiter.optimize.distribution; - -import org.apache.commons.math3.distribution.*; - -/** - * Utilities for Apache Commons Math distributions, which provide no equals, hashCode or toString - * methods and do not implement Serializable, all of which makes unit testing quite difficult. - * - * @author Alex Black - */ -public class DistributionUtils { - - private DistributionUtils() {} - - - public static boolean distributionsEqual(RealDistribution a, RealDistribution b) { - if (a.getClass() != b.getClass()) - return false; - Class c = a.getClass(); - if (c == BetaDistribution.class) { - BetaDistribution ba = (BetaDistribution) a; - BetaDistribution bb = (BetaDistribution) b; - - return ba.getAlpha() == bb.getAlpha() && ba.getBeta() == bb.getBeta(); - } else if (c == CauchyDistribution.class) { - CauchyDistribution ca = (CauchyDistribution) a; - CauchyDistribution cb = (CauchyDistribution) b; - return ca.getMedian() == cb.getMedian() && ca.getScale() == cb.getScale(); - } else if (c == ChiSquaredDistribution.class) { - ChiSquaredDistribution ca = (ChiSquaredDistribution) a; - ChiSquaredDistribution cb = (ChiSquaredDistribution) b; - return ca.getDegreesOfFreedom() == cb.getDegreesOfFreedom(); - } else if (c == ExponentialDistribution.class) { - ExponentialDistribution ea = (ExponentialDistribution) a; - ExponentialDistribution eb = (ExponentialDistribution) b; - return ea.getMean() == eb.getMean(); - } else if (c == FDistribution.class) { - FDistribution fa = (FDistribution) a; - FDistribution fb = (FDistribution) b; - return fa.getNumeratorDegreesOfFreedom() == fb.getNumeratorDegreesOfFreedom() - && fa.getDenominatorDegreesOfFreedom() == fb.getDenominatorDegreesOfFreedom(); - } else if
(c == GammaDistribution.class) { - GammaDistribution ga = (GammaDistribution) a; - GammaDistribution gb = (GammaDistribution) b; - return ga.getShape() == gb.getShape() && ga.getScale() == gb.getScale(); - } else if (c == LevyDistribution.class) { - LevyDistribution la = (LevyDistribution) a; - LevyDistribution lb = (LevyDistribution) b; - return la.getLocation() == lb.getLocation() && la.getScale() == lb.getScale(); - } else if (c == LogNormalDistribution.class) { - LogNormalDistribution la = (LogNormalDistribution) a; - LogNormalDistribution lb = (LogNormalDistribution) b; - return la.getScale() == lb.getScale() && la.getShape() == lb.getShape(); - } else if (c == NormalDistribution.class) { - NormalDistribution na = (NormalDistribution) a; - NormalDistribution nb = (NormalDistribution) b; - return na.getMean() == nb.getMean() && na.getStandardDeviation() == nb.getStandardDeviation(); - } else if (c == ParetoDistribution.class) { - ParetoDistribution pa = (ParetoDistribution) a; - ParetoDistribution pb = (ParetoDistribution) b; - return pa.getScale() == pb.getScale() && pa.getShape() == pb.getShape(); - } else if (c == TDistribution.class) { - TDistribution ta = (TDistribution) a; - TDistribution tb = (TDistribution) b; - return ta.getDegreesOfFreedom() == tb.getDegreesOfFreedom(); - } else if (c == TriangularDistribution.class) { - TriangularDistribution ta = (TriangularDistribution) a; - TriangularDistribution tb = (TriangularDistribution) b; - return ta.getSupportLowerBound() == tb.getSupportLowerBound() - && ta.getSupportUpperBound() == tb.getSupportUpperBound() && ta.getMode() == tb.getMode(); - } else if (c == UniformRealDistribution.class) { - UniformRealDistribution ua = (UniformRealDistribution) a; - UniformRealDistribution ub = (UniformRealDistribution) b; - return ua.getSupportLowerBound() == ub.getSupportLowerBound() - && ua.getSupportUpperBound() == ub.getSupportUpperBound(); - } else if (c == WeibullDistribution.class) { - WeibullDistribution wa = 
(WeibullDistribution) a; - WeibullDistribution wb = (WeibullDistribution) b; - return wa.getShape() == wb.getShape() && wa.getScale() == wb.getScale(); - } else if (c == LogUniformDistribution.class ){ - LogUniformDistribution lu_a = (LogUniformDistribution)a; - LogUniformDistribution lu_b = (LogUniformDistribution)b; - return lu_a.getMin() == lu_b.getMin() && lu_a.getMax() == lu_b.getMax(); - } else { - throw new UnsupportedOperationException("Unknown or not supported RealDistribution: " + c); - } - } - - public static boolean distributionEquals(IntegerDistribution a, IntegerDistribution b) { - if (a.getClass() != b.getClass()) - return false; - Class c = a.getClass(); - - if (c == BinomialDistribution.class) { - BinomialDistribution ba = (BinomialDistribution) a; - BinomialDistribution bb = (BinomialDistribution) b; - return ba.getNumberOfTrials() == bb.getNumberOfTrials() - && ba.getProbabilityOfSuccess() == bb.getProbabilityOfSuccess(); - } else if (c == GeometricDistribution.class) { - GeometricDistribution ga = (GeometricDistribution) a; - GeometricDistribution gb = (GeometricDistribution) b; - return ga.getProbabilityOfSuccess() == gb.getProbabilityOfSuccess(); - } else if (c == HypergeometricDistribution.class) { - HypergeometricDistribution ha = (HypergeometricDistribution) a; - HypergeometricDistribution hb = (HypergeometricDistribution) b; - return ha.getPopulationSize() == hb.getPopulationSize() - && ha.getNumberOfSuccesses() == hb.getNumberOfSuccesses() - && ha.getSampleSize() == hb.getSampleSize(); - } else if (c == PascalDistribution.class) { - PascalDistribution pa = (PascalDistribution) a; - PascalDistribution pb = (PascalDistribution) b; - return pa.getNumberOfSuccesses() == pb.getNumberOfSuccesses() - && pa.getProbabilityOfSuccess() == pb.getProbabilityOfSuccess(); - } else if (c == PoissonDistribution.class) { - PoissonDistribution pa = (PoissonDistribution) a; - PoissonDistribution pb = (PoissonDistribution) b; - return pa.getMean() == 
pb.getMean(); - } else if (c == UniformIntegerDistribution.class) { - UniformIntegerDistribution ua = (UniformIntegerDistribution) a; - UniformIntegerDistribution ub = (UniformIntegerDistribution) b; - return ua.getSupportLowerBound() == ub.getSupportLowerBound() - && ua.getSupportUpperBound() == ub.getSupportUpperBound(); - } else if (c == ZipfDistribution.class) { - ZipfDistribution za = (ZipfDistribution) a; - ZipfDistribution zb = (ZipfDistribution) b; - return za.getNumberOfElements() == zb.getNumberOfElements() && za.getExponent() == zb.getExponent(); - } else { - throw new UnsupportedOperationException("Unknown or not supported IntegerDistribution: " + c); - } - - } -} diff --git a/contrib/attic/arbiter/arbiter-core/src/main/java/org/deeplearning4j/arbiter/optimize/distribution/LogUniformDistribution.java b/contrib/attic/arbiter/arbiter-core/src/main/java/org/deeplearning4j/arbiter/optimize/distribution/LogUniformDistribution.java deleted file mode 100644 index 018048a48..000000000 --- a/contrib/attic/arbiter/arbiter-core/src/main/java/org/deeplearning4j/arbiter/optimize/distribution/LogUniformDistribution.java +++ /dev/null @@ -1,157 +0,0 @@ -/* - * ****************************************************************************** - * * Copyright (c) 2021 Deeplearning4j Contributors - * * - * * This program and the accompanying materials are made available under the - * * terms of the Apache License, Version 2.0 which is available at - * * https://www.apache.org/licenses/LICENSE-2.0. - * * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * * License for the specific language governing permissions and limitations - * * under the License.
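The equality helpers above exist because Commons Math distribution classes do not override equals(), so equality has to be defined parameter-by-parameter for each concrete class. A minimal sketch of the idea, using a hypothetical NormalParams value type in place of a real Commons Math distribution:

```java
// Hypothetical value type standing in for a Commons Math distribution class,
// which lacks equals()/hashCode(). Equality is defined by comparing the
// defining parameters, mirroring what DistributionUtils does per class.
public class NormalParams {
    final double mean;
    final double sd;

    public NormalParams(double mean, double sd) {
        this.mean = mean;
        this.sd = sd;
    }

    // same-class check is assumed already done by the caller (as in
    // distributionsEqual); only the defining parameters are compared
    public static boolean distributionsEqual(NormalParams a, NormalParams b) {
        return a.mean == b.mean && a.sd == b.sd;
    }
}
```

The same pattern repeats for every supported class, which is why both helpers are long if/else chains over `getClass()`.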
- * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ - -package org.deeplearning4j.arbiter.optimize.distribution; - -import org.nd4j.shade.guava.base.Preconditions; -import lombok.Getter; -import org.apache.commons.math3.distribution.RealDistribution; -import org.apache.commons.math3.exception.NumberIsTooLargeException; -import org.apache.commons.math3.exception.OutOfRangeException; - -import java.util.Random; - -/** - * Log uniform distribution, with support in range [min, max] for min > 0 - * - * Reference: https://www.vosesoftware.com/riskwiki/LogUniformdistribution.php - * - * @author Alex Black - */ -public class LogUniformDistribution implements RealDistribution { - - @Getter private final double min; - @Getter private final double max; - - private final double logMin; - private final double logMax; - - private transient Random rng = new Random(); - - /** - * - * @param min Minimum value - * @param max Maximum value - */ - public LogUniformDistribution(double min, double max) { - Preconditions.checkArgument(min > 0, "Minimum must be > 0. Got: " + min); - Preconditions.checkArgument(max > min, "Maximum must be > min. 
Got: (min, max)=(" - + min + "," + max + ")"); - this.min = min; - this.max = max; - - this.logMin = Math.log(min); - this.logMax = Math.log(max); - } - - @Override - public double probability(double x) { - if(x < min || x > max){ - return 0; - } - - return 1.0 / (x * (logMax - logMin)); - } - - @Override - public double density(double x) { - return probability(x); - } - - @Override - public double cumulativeProbability(double x) { - if(x <= min){ - return 0.0; - } else if(x >= max){ - return 1.0; - } - - return (Math.log(x)-logMin)/(logMax-logMin); - } - - @Override - public double cumulativeProbability(double x0, double x1) throws NumberIsTooLargeException { - return cumulativeProbability(x1) - cumulativeProbability(x0); - } - - @Override - public double inverseCumulativeProbability(double p) throws OutOfRangeException { - Preconditions.checkArgument(p >= 0 && p <= 1, "Invalid input: " + p); - return Math.exp(p * (logMax-logMin) + logMin); - } - - @Override - public double getNumericalMean() { - return (max-min)/(logMax-logMin); - } - - @Override - public double getNumericalVariance() { - double d1 = (logMax-logMin)*(max*max - min*min) - 2*(max-min)*(max-min); - return d1 / (2*Math.pow(logMax-logMin, 2.0)); - } - - @Override - public double getSupportLowerBound() { - return min; - } - - @Override - public double getSupportUpperBound() { - return max; - } - - @Override - public boolean isSupportLowerBoundInclusive() { - return true; - } - - @Override - public boolean isSupportUpperBoundInclusive() { - return true; - } - - @Override - public boolean isSupportConnected() { - return true; - } - - @Override - public void reseedRandomGenerator(long seed) { - rng.setSeed(seed); - } - - @Override - public double sample() { - return inverseCumulativeProbability(rng.nextDouble()); - } - - @Override - public double[] sample(int sampleSize) { - double[] d = new double[sampleSize]; - for( int i=0; i<sampleSize; i++ ){ - d[i] = sample(); - } - return d; - } -} - * @param <T> Type of candidates to generate - */ -@Data -@EqualsAndHashCode(exclude =
{"rng", "candidateCounter"}) -public abstract class BaseCandidateGenerator implements CandidateGenerator { - protected ParameterSpace parameterSpace; - protected AtomicInteger candidateCounter = new AtomicInteger(0); - protected SynchronizedRandomGenerator rng = new SynchronizedRandomGenerator(new JDKRandomGenerator()); - protected Map dataParameters; - protected boolean initDone = false; - - public BaseCandidateGenerator(ParameterSpace parameterSpace, Map dataParameters, - boolean initDone) { - this.parameterSpace = parameterSpace; - this.dataParameters = dataParameters; - this.initDone = initDone; - } - - protected void initialize() { - if(!initDone) { - //First: collect leaf parameter spaces objects and remove duplicates - List noDuplicatesList = LeafUtils.getUniqueObjects(parameterSpace.collectLeaves()); - - //Second: assign each a number - int i = 0; - for (ParameterSpace ps : noDuplicatesList) { - int np = ps.numParameters(); - if (np == 1) { - ps.setIndices(i++); - } else { - int[] values = new int[np]; - for (int j = 0; j < np; j++) - values[j] = i++; - ps.setIndices(values); - } - } - initDone = true; - } - } - - @Override - public ParameterSpace getParameterSpace() { - return parameterSpace; - } - - @Override - public void reportResults(OptimizationResult result) { - //No op - } - - @Override - public void setRngSeed(long rngSeed) { - rng.setSeed(rngSeed); - } -} diff --git a/contrib/attic/arbiter/arbiter-core/src/main/java/org/deeplearning4j/arbiter/optimize/generator/GeneticSearchCandidateGenerator.java b/contrib/attic/arbiter/arbiter-core/src/main/java/org/deeplearning4j/arbiter/optimize/generator/GeneticSearchCandidateGenerator.java deleted file mode 100644 index 105a8cafb..000000000 --- a/contrib/attic/arbiter/arbiter-core/src/main/java/org/deeplearning4j/arbiter/optimize/generator/GeneticSearchCandidateGenerator.java +++ /dev/null @@ -1,189 +0,0 @@ -/* - * ****************************************************************************** - * * Copyright 
(c) 2021 Deeplearning4j Contributors - * * - * * This program and the accompanying materials are made available under the - * * terms of the Apache License, Version 2.0 which is available at - * * https://www.apache.org/licenses/LICENSE-2.0. - * * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * * License for the specific language governing permissions and limitations - * * under the License. - * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ - -package org.deeplearning4j.arbiter.optimize.generator; - -import lombok.Getter; -import lombok.extern.slf4j.Slf4j; -import org.deeplearning4j.arbiter.optimize.api.Candidate; -import org.deeplearning4j.arbiter.optimize.api.OptimizationResult; -import org.deeplearning4j.arbiter.optimize.api.ParameterSpace; -import org.deeplearning4j.arbiter.optimize.api.score.ScoreFunction; -import org.deeplearning4j.arbiter.optimize.generator.genetic.Chromosome; -import org.deeplearning4j.arbiter.optimize.generator.genetic.ChromosomeFactory; -import org.deeplearning4j.arbiter.optimize.generator.genetic.exceptions.GeneticGenerationException; -import org.deeplearning4j.arbiter.optimize.generator.genetic.population.EmptyPopulationInitializer; -import org.deeplearning4j.arbiter.optimize.generator.genetic.population.PopulationInitializer; -import org.deeplearning4j.arbiter.optimize.generator.genetic.population.PopulationModel; -import org.deeplearning4j.arbiter.optimize.generator.genetic.selection.GeneticSelectionOperator; -import org.deeplearning4j.arbiter.optimize.generator.genetic.selection.SelectionOperator; - -import java.util.Map; - -/** - * Uses a genetic algorithm to generate candidates. 
- * - * @author Alexandre Boulanger - */ -@Slf4j -public class GeneticSearchCandidateGenerator extends BaseCandidateGenerator { - - @Getter - protected final PopulationModel populationModel; - - protected final ChromosomeFactory chromosomeFactory; - protected final SelectionOperator selectionOperator; - - protected boolean hasMoreCandidates = true; - - public static class Builder { - protected final ParameterSpace parameterSpace; - - protected Map dataParameters; - protected boolean initDone; - protected boolean minimizeScore; - protected PopulationModel populationModel; - protected ChromosomeFactory chromosomeFactory; - protected SelectionOperator selectionOperator; - - /** - * @param parameterSpace ParameterSpace from which to generate candidates - * @param scoreFunction The score function that will be used in the OptimizationConfiguration - */ - public Builder(ParameterSpace parameterSpace, ScoreFunction scoreFunction) { - this.parameterSpace = parameterSpace; - this.minimizeScore = scoreFunction.minimize(); - } - - /** - * @param populationModel The PopulationModel instance to use. - */ - public Builder populationModel(PopulationModel populationModel) { - this.populationModel = populationModel; - return this; - } - - /** - * @param selectionOperator The SelectionOperator to use. 
Default is GeneticSelectionOperator - */ - public Builder selectionOperator(SelectionOperator selectionOperator) { - this.selectionOperator = selectionOperator; - return this; - } - - public Builder dataParameters(Map dataParameters) { - - this.dataParameters = dataParameters; - return this; - } - - public GeneticSearchCandidateGenerator.Builder initDone(boolean initDone) { - this.initDone = initDone; - return this; - } - - /** - * @param chromosomeFactory The ChromosomeFactory to use - */ - public Builder chromosomeFactory(ChromosomeFactory chromosomeFactory) { - this.chromosomeFactory = chromosomeFactory; - return this; - } - - public GeneticSearchCandidateGenerator build() { - if (populationModel == null) { - PopulationInitializer defaultPopulationInitializer = new EmptyPopulationInitializer(); - populationModel = new PopulationModel.Builder().populationInitializer(defaultPopulationInitializer) - .build(); - } - - if (chromosomeFactory == null) { - chromosomeFactory = new ChromosomeFactory(); - } - - if (selectionOperator == null) { - selectionOperator = new GeneticSelectionOperator.Builder().build(); - } - - return new GeneticSearchCandidateGenerator(this); - } - } - - private GeneticSearchCandidateGenerator(Builder builder) { - super(builder.parameterSpace, builder.dataParameters, builder.initDone); - - initialize(); - - chromosomeFactory = builder.chromosomeFactory; - populationModel = builder.populationModel; - selectionOperator = builder.selectionOperator; - - chromosomeFactory.initializeInstance(builder.parameterSpace.numParameters()); - populationModel.initializeInstance(builder.minimizeScore); - selectionOperator.initializeInstance(populationModel, chromosomeFactory); - - } - - @Override - public boolean hasMoreCandidates() { - return hasMoreCandidates; - } - - @Override - public Candidate getCandidate() { - - double[] values = null; - Object value = null; - Exception e = null; - - try { - values = selectionOperator.buildNextGenes(); - value = 
parameterSpace.getValue(values); - } catch (GeneticGenerationException e2) { - log.warn("Error generating candidate", e2); - e = e2; - hasMoreCandidates = false; - } catch (Exception e2) { - log.warn("Error getting configuration for candidate", e2); - e = e2; - } - - return new Candidate(value, candidateCounter.getAndIncrement(), values, dataParameters, e); - } - - @Override - public Class getCandidateType() { - return null; - } - - @Override - public String toString() { - return "GeneticSearchCandidateGenerator"; - } - - @Override - public void reportResults(OptimizationResult result) { - if (result.getScore() == null) { - return; - } - - Chromosome newChromosome = chromosomeFactory.createChromosome(result.getCandidate().getFlatParameters(), - result.getScore()); - populationModel.add(newChromosome); - } -} diff --git a/contrib/attic/arbiter/arbiter-core/src/main/java/org/deeplearning4j/arbiter/optimize/generator/GridSearchCandidateGenerator.java b/contrib/attic/arbiter/arbiter-core/src/main/java/org/deeplearning4j/arbiter/optimize/generator/GridSearchCandidateGenerator.java deleted file mode 100644 index adf04e389..000000000 --- a/contrib/attic/arbiter/arbiter-core/src/main/java/org/deeplearning4j/arbiter/optimize/generator/GridSearchCandidateGenerator.java +++ /dev/null @@ -1,234 +0,0 @@ -/* - * ****************************************************************************** - * * Copyright (c) 2021 Deeplearning4j Contributors - * * - * * This program and the accompanying materials are made available under the - * * terms of the Apache License, Version 2.0 which is available at - * * https://www.apache.org/licenses/LICENSE-2.0. - * * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * * License for the specific language governing permissions and limitations - * * under the License. 
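GeneticSearchCandidateGenerator above participates in a generate, evaluate, report cycle: getCandidate() produces a flat parameter vector, the runner scores the resulting model, and reportResults() feeds the score back so the population can evolve. A simplified sketch of that loop, with hypothetical stand-in types rather than the Arbiter API:

```java
import java.util.function.ToDoubleFunction;

// Hedged sketch of the generate -> evaluate -> report cycle that a genetic
// candidate generator participates in. Generator is a simplified stand-in
// for the Arbiter CandidateGenerator, not the real interface.
public class SearchLoopSketch {
    interface Generator {
        double[] next();                            // flat hyperparameter vector in [0,1]
        void report(double[] genes, double score);  // score fed back, e.g. into the population
    }

    // run n iterations, returning the best (lowest) score seen
    public static double run(Generator g, ToDoubleFunction<double[]> scoreFn, int n) {
        double best = Double.POSITIVE_INFINITY;
        for (int i = 0; i < n; i++) {
            double[] genes = g.next();
            double score = scoreFn.applyAsDouble(genes);
            g.report(genes, score);  // genetic search uses this to evolve new candidates
            best = Math.min(best, score);
        }
        return best;
    }
}
```

The feedback step is what distinguishes genetic search from grid and random search, whose reportResults() implementations are no-ops.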
- * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ - -package org.deeplearning4j.arbiter.optimize.generator; - -import lombok.EqualsAndHashCode; -import lombok.Getter; -import lombok.extern.slf4j.Slf4j; -import org.apache.commons.math3.random.RandomAdaptor; -import org.deeplearning4j.arbiter.optimize.api.Candidate; -import org.deeplearning4j.arbiter.optimize.api.ParameterSpace; -import org.deeplearning4j.arbiter.optimize.parameter.FixedValue; -import org.deeplearning4j.arbiter.optimize.parameter.discrete.DiscreteParameterSpace; -import org.deeplearning4j.arbiter.optimize.parameter.integer.IntegerParameterSpace; -import org.deeplearning4j.arbiter.util.LeafUtils; -import org.nd4j.shade.jackson.annotation.JsonIgnoreProperties; -import org.nd4j.shade.jackson.annotation.JsonProperty; - -import java.util.*; -import java.util.concurrent.ConcurrentLinkedQueue; - - -/** - * GridSearchCandidateGenerator: generates candidates in an exhaustive grid search manner.
- * Note that:
- * - For discrete parameters: the grid size (# values to check per hyperparameter) is equal to the number of values for - * that hyperparameter
- * - For integer parameters: the grid size is equal to {@code min(discretizationCount,max-min+1)}. Some integer ranges can - * be large, and we don't necessarily want to exhaustively search them. {@code discretizationCount} is a constructor argument
- * - For continuous parameters: the grid size is equal to {@code discretizationCount}.
- * In all cases, the minimum, maximum and gridSize-2 values between the min/max will be generated.
- * Also note that: if a probability distribution is provided for continuous hyperparameters, this will be taken into account - * when generating candidates. This allows the grid for a hyperparameter to be non-linear: i.e., for example, linear in log space - * - * @author Alex Black - */ -@Slf4j -@EqualsAndHashCode(exclude = {"order"}, callSuper = true) -@JsonIgnoreProperties({"numValuesPerParam", "totalNumCandidates", "order", "candidateCounter", "rng", "candidate"}) -public class GridSearchCandidateGenerator extends BaseCandidateGenerator { - - /** - * In what order should candidates be generated?
- * Sequential: generate candidates in order. The first hyperparameter will be changed most rapidly, and the last - * will be changed least rapidly.
- * RandomOrder: generate candidates in a random order
- * In both cases, the same candidates will be generated; only the order of generation is different - */ - public enum Mode { - Sequential, RandomOrder - } - - private final int discretizationCount; - private final Mode mode; - - private int[] numValuesPerParam; - @Getter - private int totalNumCandidates; - private Queue order; - - /** - * @param parameterSpace ParameterSpace from which to generate candidates - * @param discretizationCount For continuous parameters: into how many values should we discretize them? - * For example, suppose a continuous parameter is in range [0,1] with 3 bins: - * use [0.0, 0.5, 1.0]. The minimum and maximum values are always included. - * @param mode {@link GridSearchCandidateGenerator.Mode} specifies the order - * in which candidates should be generated. - */ - public GridSearchCandidateGenerator(@JsonProperty("parameterSpace") ParameterSpace parameterSpace, - @JsonProperty("discretizationCount") int discretizationCount, @JsonProperty("mode") Mode mode, - @JsonProperty("dataParameters") Map dataParameters, - @JsonProperty("initDone") boolean initDone) { - super(parameterSpace, dataParameters, initDone); - this.discretizationCount = discretizationCount; - this.mode = mode; - initialize(); - } - - /** - * @param parameterSpace ParameterSpace from which to generate candidates - * @param discretizationCount For continuous parameters: into how many values should we discretize them? - * For example, suppose a continuous parameter is in range [0,1] with 3 bins: - * use [0.0, 0.5, 1.0]. The minimum and maximum values are always included. - * @param mode {@link GridSearchCandidateGenerator.Mode} specifies the order - * in which candidates should be generated. 
- */ - public GridSearchCandidateGenerator(ParameterSpace parameterSpace, int discretizationCount, Mode mode, - Map dataParameters){ - this(parameterSpace, discretizationCount, mode, dataParameters, false); - } - - @Override - protected void initialize() { - super.initialize(); - - List leaves = LeafUtils.getUniqueObjects(parameterSpace.collectLeaves()); - int nParams = leaves.size(); - - //Work out for each parameter: is it continuous or discrete? - // for grid search: discrete values are grid-searchable as-is - // continuous values: discretize using 'discretizationCount' bins - // integer values: use min(max-min+1, discretizationCount) values. i.e., discretize if necessary - numValuesPerParam = new int[nParams]; - long searchSize = 1; - for (int i = 0; i < nParams; i++) { - ParameterSpace ps = leaves.get(i); - if (ps instanceof DiscreteParameterSpace) { - DiscreteParameterSpace dps = (DiscreteParameterSpace) ps; - numValuesPerParam[i] = dps.numValues(); - } else if (ps instanceof IntegerParameterSpace) { - IntegerParameterSpace ips = (IntegerParameterSpace) ps; - int min = ips.getMin(); - int max = ips.getMax(); - //Discretize, as some integer ranges are much too large to search (i.e., num. neural network units, between 100 and 1000) - numValuesPerParam[i] = Math.min(max - min + 1, discretizationCount); - } else if (ps instanceof FixedValue){ - numValuesPerParam[i] = 1; - } else { - numValuesPerParam[i] = discretizationCount; - } - searchSize *= numValuesPerParam[i]; - } - - if (searchSize >= Integer.MAX_VALUE) - throw new IllegalStateException("Invalid search: cannot process search with " + searchSize - + " candidates > Integer.MAX_VALUE"); //TODO find a more reasonable upper bound? 
- - order = new ConcurrentLinkedQueue<>(); - - totalNumCandidates = (int) searchSize; - switch (mode) { - case Sequential: - for (int i = 0; i < totalNumCandidates; i++) { - order.add(i); - } - break; - case RandomOrder: - List tempList = new ArrayList<>(totalNumCandidates); - for (int i = 0; i < totalNumCandidates; i++) { - tempList.add(i); - } - - Collections.shuffle(tempList, new RandomAdaptor(rng)); - order.addAll(tempList); - break; - default: - throw new RuntimeException(); - } - - } - - @Override - public boolean hasMoreCandidates() { - return !order.isEmpty(); - } - - @Override - public Candidate getCandidate() { - int next = order.remove(); - - //Next: max integer (candidate number) to values - double[] values = indexToValues(numValuesPerParam, next, totalNumCandidates); - - Object value = null; - Exception e = null; - try { - value = parameterSpace.getValue(values); - } catch (Exception e2) { - log.warn("Error getting configuration for candidate", e2); - e = e2; - } - - return new Candidate(value, candidateCounter.getAndIncrement(), values, dataParameters, e); - } - - @Override - public Class getCandidateType() { - return null; - } - - public static double[] indexToValues(int[] numValuesPerParam, int candidateIdx, int product) { - //How? first map to index of num possible values. 
Then: to double values in range 0 to 1 - // 0-> [0,0,0], 1-> [1,0,0], 2-> [2,0,0], 3-> [0,1,0] etc - //Based on: Nd4j Shape.ind2sub - - int countNon1 = 0; - for( int i : numValuesPerParam) - if(i > 1) - countNon1++; - - int denom = product; - int num = candidateIdx; - int[] index = new int[numValuesPerParam.length]; - - for (int i = index.length - 1; i >= 0; i--) { - denom /= numValuesPerParam[i]; - index[i] = num / denom; - num %= denom; - } - - //Now: convert indexes to values in range [0,1] - //min value -> 0 - //max value -> 1 - double[] out = new double[countNon1]; - int outIdx = 0; - for (int i = 0; i < numValuesPerParam.length; i++) { - if (numValuesPerParam[i] > 1){ - out[outIdx++] = index[i] / ((double) (numValuesPerParam[i] - 1)); - } - } - - return out; - } - - @Override - public String toString() { - return "GridSearchCandidateGenerator(mode=" + mode + ")"; - } -} diff --git a/contrib/attic/arbiter/arbiter-core/src/main/java/org/deeplearning4j/arbiter/optimize/generator/RandomSearchGenerator.java b/contrib/attic/arbiter/arbiter-core/src/main/java/org/deeplearning4j/arbiter/optimize/generator/RandomSearchGenerator.java deleted file mode 100644 index a6774e69e..000000000 --- a/contrib/attic/arbiter/arbiter-core/src/main/java/org/deeplearning4j/arbiter/optimize/generator/RandomSearchGenerator.java +++ /dev/null @@ -1,95 +0,0 @@ -/* - * ****************************************************************************** - * * Copyright (c) 2021 Deeplearning4j Contributors - * * - * * This program and the accompanying materials are made available under the - * * terms of the Apache License, Version 2.0 which is available at - * * https://www.apache.org/licenses/LICENSE-2.0. - * * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. 
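indexToValues above is a mixed-radix decode (like Nd4j's Shape.ind2sub): the single candidate index is unpacked into one sub-index per hyperparameter, and each sub-index is then rescaled so the minimum maps to 0.0 and the maximum to 1.0. A standalone restatement of the same logic:

```java
// Standalone restatement of GridSearchCandidateGenerator.indexToValues:
// mixed-radix decoding of a candidate index, followed by scaling into [0, 1].
public class IndexDecodeSketch {
    public static double[] indexToValues(int[] numValuesPerParam, int candidateIdx, int product) {
        // unpack candidateIdx into one sub-index per parameter
        // (first parameter varies fastest: 0->[0,0], 1->[1,0], 2->[2,0], 3->[0,1], ...)
        int denom = product;
        int num = candidateIdx;
        int[] index = new int[numValuesPerParam.length];
        for (int i = index.length - 1; i >= 0; i--) {
            denom /= numValuesPerParam[i];
            index[i] = num / denom;
            num %= denom;
        }

        // scale each sub-index so min -> 0.0 and max -> 1.0; parameters with a
        // single value carry no information and are dropped from the output
        int countNon1 = 0;
        for (int v : numValuesPerParam)
            if (v > 1)
                countNon1++;

        double[] out = new double[countNon1];
        int outIdx = 0;
        for (int i = 0; i < numValuesPerParam.length; i++)
            if (numValuesPerParam[i] > 1)
                out[outIdx++] = index[i] / (double) (numValuesPerParam[i] - 1);
        return out;
    }
}
```

With radices {3, 2} (6 candidates total), candidate index 3 decodes to sub-indices [0, 1] and hence the vector [0.0, 1.0].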
See the - * * License for the specific language governing permissions and limitations - * * under the License. - * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ - -package org.deeplearning4j.arbiter.optimize.generator; - -import lombok.EqualsAndHashCode; -import lombok.extern.slf4j.Slf4j; -import org.deeplearning4j.arbiter.optimize.api.Candidate; -import org.deeplearning4j.arbiter.optimize.api.ParameterSpace; -import org.nd4j.shade.jackson.annotation.JsonCreator; -import org.nd4j.shade.jackson.annotation.JsonIgnoreProperties; -import org.nd4j.shade.jackson.annotation.JsonProperty; - -import java.util.Map; - -/** - * RandomSearchGenerator: generates candidates at random.
- * Note: if a probability distribution is provided for continuous hyperparameters, - * this will be taken into account - * when generating candidates. This allows the search to be weighted more towards - * certain values according to a probability - * density. For example: generate samples for learning rate according to log uniform distribution - * - * @author Alex Black - */ -@Slf4j -@EqualsAndHashCode(callSuper = true) -@JsonIgnoreProperties({"numValuesPerParam", "totalNumCandidates", "order", "candidateCounter", "rng", "candidate"}) -public class RandomSearchGenerator extends BaseCandidateGenerator { - - @JsonCreator - public RandomSearchGenerator(@JsonProperty("parameterSpace") ParameterSpace parameterSpace, - @JsonProperty("dataParameters") Map dataParameters, - @JsonProperty("initDone") boolean initDone) { - super(parameterSpace, dataParameters, initDone); - initialize(); - } - - public RandomSearchGenerator(ParameterSpace parameterSpace, Map dataParameters){ - this(parameterSpace, dataParameters, false); - } - - public RandomSearchGenerator(ParameterSpace parameterSpace){ - this(parameterSpace, null, false); - } - - - @Override - public boolean hasMoreCandidates() { - return true; - } - - @Override - public Candidate getCandidate() { - double[] randomValues = new double[parameterSpace.numParameters()]; - for (int i = 0; i < randomValues.length; i++) - randomValues[i] = rng.nextDouble(); - - Object value = null; - Exception e = null; - try { - value = parameterSpace.getValue(randomValues); - } catch (Exception e2) { - log.warn("Error getting configuration for candidate", e2); - e = e2; - } - - return new Candidate(value, candidateCounter.getAndIncrement(), randomValues, dataParameters, e); - } - - @Override - public Class getCandidateType() { - return null; - } - - @Override - public String toString() { - return "RandomSearchGenerator"; - } -} diff --git 
a/contrib/attic/arbiter/arbiter-core/src/main/java/org/deeplearning4j/arbiter/optimize/generator/genetic/Chromosome.java b/contrib/attic/arbiter/arbiter-core/src/main/java/org/deeplearning4j/arbiter/optimize/generator/genetic/Chromosome.java deleted file mode 100644 index d41bc9b2c..000000000 --- a/contrib/attic/arbiter/arbiter-core/src/main/java/org/deeplearning4j/arbiter/optimize/generator/genetic/Chromosome.java +++ /dev/null @@ -1,44 +0,0 @@ -/* - * ****************************************************************************** - * * Copyright (c) 2021 Deeplearning4j Contributors - * * - * * This program and the accompanying materials are made available under the - * * terms of the Apache License, Version 2.0 which is available at - * * https://www.apache.org/licenses/LICENSE-2.0. - * * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * * License for the specific language governing permissions and limitations - * * under the License. - * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ - -package org.deeplearning4j.arbiter.optimize.generator.genetic; - -import lombok.Data; - -/** - * Candidates are stored as Chromosome in the population model - * - * @author Alexandre Boulanger - */ -@Data -public class Chromosome { - /** - * The fitness score of the genes. - */ - protected final double fitness; - - /** - * The genes. 
- */ - protected final double[] genes; - - public Chromosome(double[] genes, double fitness) { - this.genes = genes; - this.fitness = fitness; - } -} diff --git a/contrib/attic/arbiter/arbiter-core/src/main/java/org/deeplearning4j/arbiter/optimize/generator/genetic/ChromosomeFactory.java b/contrib/attic/arbiter/arbiter-core/src/main/java/org/deeplearning4j/arbiter/optimize/generator/genetic/ChromosomeFactory.java deleted file mode 100644 index b9b170b61..000000000 --- a/contrib/attic/arbiter/arbiter-core/src/main/java/org/deeplearning4j/arbiter/optimize/generator/genetic/ChromosomeFactory.java +++ /dev/null @@ -1,53 +0,0 @@ -/* - * ****************************************************************************** - * * Copyright (c) 2021 Deeplearning4j Contributors - * * - * * This program and the accompanying materials are made available under the - * * terms of the Apache License, Version 2.0 which is available at - * * https://www.apache.org/licenses/LICENSE-2.0. - * * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * * License for the specific language governing permissions and limitations - * * under the License. - * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ - -package org.deeplearning4j.arbiter.optimize.generator.genetic; - -/** - * A factory that builds new chromosomes. Used by the GeneticSearchCandidateGenerator. - * - * @author Alexandre Boulanger - */ -public class ChromosomeFactory { - private int chromosomeLength; - - /** - * Called by the GeneticSearchCandidateGenerator. 
- */ - public void initializeInstance(int chromosomeLength) { - this.chromosomeLength = chromosomeLength; - } - - /** - * Create a new instance of a Chromosome - * - * @param genes The genes - * @param fitness The fitness score - * @return A new instance of Chromosome - */ - public Chromosome createChromosome(double[] genes, double fitness) { - return new Chromosome(genes, fitness); - } - - /** - * @return The number of genes in a chromosome - */ - public int getChromosomeLength() { - return chromosomeLength; - } -} diff --git a/contrib/attic/arbiter/arbiter-core/src/main/java/org/deeplearning4j/arbiter/optimize/generator/genetic/crossover/ArithmeticCrossover.java b/contrib/attic/arbiter/arbiter-core/src/main/java/org/deeplearning4j/arbiter/optimize/generator/genetic/crossover/ArithmeticCrossover.java deleted file mode 100644 index 509f69465..000000000 --- a/contrib/attic/arbiter/arbiter-core/src/main/java/org/deeplearning4j/arbiter/optimize/generator/genetic/crossover/ArithmeticCrossover.java +++ /dev/null @@ -1,122 +0,0 @@ -/* - * ****************************************************************************** - * * Copyright (c) 2021 Deeplearning4j Contributors - * * - * * This program and the accompanying materials are made available under the - * * terms of the Apache License, Version 2.0 which is available at - * * https://www.apache.org/licenses/LICENSE-2.0. - * * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * * License for the specific language governing permissions and limitations - * * under the License. 
- * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ - -package org.deeplearning4j.arbiter.optimize.generator.genetic.crossover; - -import org.apache.commons.math3.random.JDKRandomGenerator; -import org.apache.commons.math3.random.RandomGenerator; -import org.apache.commons.math3.random.SynchronizedRandomGenerator; -import org.deeplearning4j.arbiter.optimize.generator.genetic.crossover.parentselection.RandomTwoParentSelection; -import org.deeplearning4j.arbiter.optimize.generator.genetic.crossover.parentselection.TwoParentSelection; -import org.nd4j.common.base.Preconditions; - -/** - * A crossover operator that linearly combines the genes of two parents.
- * When a crossover is generated (with probability equal to the crossover rate), each gene is a linear combination of the corresponding genes of the parents. - *<br>

- * t*parentA + (1-t)*parentB, where t is [0, 1] and different for each gene. - * - * @author Alexandre Boulanger - */ -public class ArithmeticCrossover extends TwoParentsCrossoverOperator { - private static final double DEFAULT_CROSSOVER_RATE = 0.85; - - private final double crossoverRate; - private final RandomGenerator rng; - - public static class Builder { - private double crossoverRate = DEFAULT_CROSSOVER_RATE; - private RandomGenerator rng; - private TwoParentSelection parentSelection; - - /** - * The probability that the operator generates a crossover (default 0.85). - * - * @param rate A value between 0.0 and 1.0 - */ - public Builder crossoverRate(double rate) { - Preconditions.checkState(rate >= 0.0 && rate <= 1.0, "Rate must be between 0.0 and 1.0, got %s", rate); - - this.crossoverRate = rate; - return this; - } - - /** - * Use a supplied RandomGenerator - * - * @param rng An instance of RandomGenerator - */ - public Builder randomGenerator(RandomGenerator rng) { - this.rng = rng; - return this; - } - - /** - * The parent selection behavior. Default is random parent selection. - * - * @param parentSelection An instance of TwoParentSelection - */ - public Builder parentSelection(TwoParentSelection parentSelection) { - this.parentSelection = parentSelection; - return this; - } - - public ArithmeticCrossover build() { - if (rng == null) { - rng = new SynchronizedRandomGenerator(new JDKRandomGenerator()); - } - - if (parentSelection == null) { - parentSelection = new RandomTwoParentSelection(); - } - - return new ArithmeticCrossover(this); - } - } - - private ArithmeticCrossover(ArithmeticCrossover.Builder builder) { - super(builder.parentSelection); - - this.crossoverRate = builder.crossoverRate; - this.rng = builder.rng; - } - - /** - * Has a probability crossoverRate of performing the crossover where each gene is a linear combination of:
- * t*parentA + (1-t)*parentB, where t is [0, 1] and different for each gene.
- * Otherwise, returns the genes of a random parent. - * - * @return The crossover result. See {@link CrossoverResult}. - */ - @Override - public CrossoverResult crossover() { - double[][] parents = parentSelection.selectParents(); - - double[] offspringValues = new double[parents[0].length]; - - if (rng.nextDouble() < crossoverRate) { - for (int i = 0; i < offspringValues.length; ++i) { - double t = rng.nextDouble(); - offspringValues[i] = t * parents[0][i] + (1.0 - t) * parents[1][i]; - } - return new CrossoverResult(true, offspringValues); - } - - return new CrossoverResult(false, parents[0]); - } -} diff --git a/contrib/attic/arbiter/arbiter-core/src/main/java/org/deeplearning4j/arbiter/optimize/generator/genetic/crossover/CrossoverOperator.java b/contrib/attic/arbiter/arbiter-core/src/main/java/org/deeplearning4j/arbiter/optimize/generator/genetic/crossover/CrossoverOperator.java deleted file mode 100644 index 9b9e89ba5..000000000 --- a/contrib/attic/arbiter/arbiter-core/src/main/java/org/deeplearning4j/arbiter/optimize/generator/genetic/crossover/CrossoverOperator.java +++ /dev/null @@ -1,47 +0,0 @@ -/* - * ****************************************************************************** - * * Copyright (c) 2021 Deeplearning4j Contributors - * * - * * This program and the accompanying materials are made available under the - * * terms of the Apache License, Version 2.0 which is available at - * * https://www.apache.org/licenses/LICENSE-2.0. - * * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * * License for the specific language governing permissions and limitations - * * under the License. 
- * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ - -package org.deeplearning4j.arbiter.optimize.generator.genetic.crossover; - -import org.deeplearning4j.arbiter.optimize.generator.genetic.population.PopulationModel; - -/** - * Abstract class for all crossover operators - * - * @author Alexandre Boulanger - */ -public abstract class CrossoverOperator { - protected PopulationModel populationModel; - - /** - * Will be called by the selection operator once the population model is instantiated. - */ - public void initializeInstance(PopulationModel populationModel) { - this.populationModel = populationModel; - } - - /** - * Performs the crossover - * - * @return The crossover result. See {@link CrossoverResult}. - */ - public abstract CrossoverResult crossover(); - - - -} diff --git a/contrib/attic/arbiter/arbiter-core/src/main/java/org/deeplearning4j/arbiter/optimize/generator/genetic/crossover/CrossoverResult.java b/contrib/attic/arbiter/arbiter-core/src/main/java/org/deeplearning4j/arbiter/optimize/generator/genetic/crossover/CrossoverResult.java deleted file mode 100644 index 335841ae2..000000000 --- a/contrib/attic/arbiter/arbiter-core/src/main/java/org/deeplearning4j/arbiter/optimize/generator/genetic/crossover/CrossoverResult.java +++ /dev/null @@ -1,45 +0,0 @@ -/* - * ****************************************************************************** - * * Copyright (c) 2021 Deeplearning4j Contributors - * * - * * This program and the accompanying materials are made available under the - * * terms of the Apache License, Version 2.0 which is available at - * * https://www.apache.org/licenses/LICENSE-2.0. - * * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. 
See the - * * License for the specific language governing permissions and limitations - * * under the License. - * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ - -package org.deeplearning4j.arbiter.optimize.generator.genetic.crossover; - -import lombok.Data; - -/** - * Returned by a crossover operator - * - * @author Alexandre Boulanger - */ -@Data -public class CrossoverResult { - /** - * If false, there was no crossover and the operator simply returned the genes of a random parent. - * If true, the genes are the result of a crossover. - */ - private final boolean isModified; - - /** - * The genes returned by the operator. - */ - private final double[] genes; - - public CrossoverResult(boolean isModified, double[] genes) { - this.isModified = isModified; - this.genes = genes; - } -} diff --git a/contrib/attic/arbiter/arbiter-core/src/main/java/org/deeplearning4j/arbiter/optimize/generator/genetic/crossover/KPointCrossover.java b/contrib/attic/arbiter/arbiter-core/src/main/java/org/deeplearning4j/arbiter/optimize/generator/genetic/crossover/KPointCrossover.java deleted file mode 100644 index 4c6ed28fe..000000000 --- a/contrib/attic/arbiter/arbiter-core/src/main/java/org/deeplearning4j/arbiter/optimize/generator/genetic/crossover/KPointCrossover.java +++ /dev/null @@ -1,180 +0,0 @@ -/* - * ****************************************************************************** - * * Copyright (c) 2021 Deeplearning4j Contributors - * * - * * This program and the accompanying materials are made available under the - * * terms of the Apache License, Version 2.0 which is available at - * * https://www.apache.org/licenses/LICENSE-2.0. - * * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. 
See the - * * License for the specific language governing permissions and limitations - * * under the License. - * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ - -package org.deeplearning4j.arbiter.optimize.generator.genetic.crossover; - -import org.apache.commons.math3.random.JDKRandomGenerator; -import org.apache.commons.math3.random.RandomGenerator; -import org.apache.commons.math3.random.SynchronizedRandomGenerator; -import org.deeplearning4j.arbiter.optimize.generator.genetic.crossover.parentselection.RandomTwoParentSelection; -import org.deeplearning4j.arbiter.optimize.generator.genetic.crossover.parentselection.TwoParentSelection; -import org.deeplearning4j.arbiter.optimize.generator.genetic.crossover.utils.CrossoverPointsGenerator; -import org.nd4j.common.base.Preconditions; - -import java.util.Deque; - -/** -* The K-Point crossover will select at random multiple crossover points.
-* Each gene comes from one of the two parents. Each time a crossover point is reached, the parent is switched. -*/ -public class KPointCrossover extends TwoParentsCrossoverOperator { - private static final double DEFAULT_CROSSOVER_RATE = 0.85; - private static final int DEFAULT_MIN_CROSSOVER = 1; - private static final int DEFAULT_MAX_CROSSOVER = 4; - - private final double crossoverRate; - private final int minCrossovers; - private final int maxCrossovers; - - private final RandomGenerator rng; - - public static class Builder { - private double crossoverRate = DEFAULT_CROSSOVER_RATE; - private int minCrossovers = DEFAULT_MIN_CROSSOVER; - private int maxCrossovers = DEFAULT_MAX_CROSSOVER; - private RandomGenerator rng; - private TwoParentSelection parentSelection; - - /** - * The probability that the operator generates a crossover (default 0.85). - * - * @param rate A value between 0.0 and 1.0 - */ - public Builder crossoverRate(double rate) { - Preconditions.checkState(rate >= 0.0 && rate <= 1.0, "Rate must be between 0.0 and 1.0, got %s", rate); - - this.crossoverRate = rate; - return this; - } - - /** - * The number of crossovers points (default is min 1, max 4) - * - * @param min The minimum number - * @param max The maximum number - */ - public Builder numCrossovers(int min, int max) { - Preconditions.checkState(max >= 0 && min >= 0, "Min and max must be positive"); - Preconditions.checkState(max >= min, "Max must be greater or equal to min"); - - this.minCrossovers = min; - this.maxCrossovers = max; - return this; - } - - /** - * Use a fixed number of crossover points - * - * @param num The number of crossovers - */ - public Builder numCrossovers(int num) { - Preconditions.checkState(num >= 0, "Num must be positive"); - - this.minCrossovers = num; - this.maxCrossovers = num; - return this; - } - - /** - * Use a supplied RandomGenerator - * - * @param rng An instance of RandomGenerator - */ - public Builder randomGenerator(RandomGenerator rng) { - this.rng = 
rng; - return this; - } - - /** - * The parent selection behavior. Default is random parent selection. - * - * @param parentSelection An instance of TwoParentSelection - */ - public Builder parentSelection(TwoParentSelection parentSelection) { - this.parentSelection = parentSelection; - return this; - } - - public KPointCrossover build() { - if (rng == null) { - rng = new SynchronizedRandomGenerator(new JDKRandomGenerator()); - } - - if (parentSelection == null) { - parentSelection = new RandomTwoParentSelection(); - } - - return new KPointCrossover(this); - } - } - - private CrossoverPointsGenerator crossoverPointsGenerator; - - private KPointCrossover(KPointCrossover.Builder builder) { - super(builder.parentSelection); - - this.crossoverRate = builder.crossoverRate; - this.maxCrossovers = builder.maxCrossovers; - this.minCrossovers = builder.minCrossovers; - this.rng = builder.rng; - } - - private CrossoverPointsGenerator getCrossoverPointsGenerator(int chromosomeLength) { - if (crossoverPointsGenerator == null) { - crossoverPointsGenerator = - new CrossoverPointsGenerator(chromosomeLength, minCrossovers, maxCrossovers, rng); - } - - return crossoverPointsGenerator; - } - - /** - * Has a probability crossoverRate of performing the crossover where the operator will select at random multiple crossover points.
- * Each gene comes from one of the two parents. Each time a crossover point is reached, the parent is switched.
- * Otherwise, returns the genes of a random parent. - * - * @return The crossover result. See {@link CrossoverResult}. - */ - @Override - public CrossoverResult crossover() { - double[][] parents = parentSelection.selectParents(); - - boolean isModified = false; - double[] resultGenes = parents[0]; - - if (rng.nextDouble() < crossoverRate) { - // Select crossover points - Deque crossoverPoints = getCrossoverPointsGenerator(parents[0].length).getCrossoverPoints(); - - // Crossover - resultGenes = new double[parents[0].length]; - int currentParent = 0; - int nextCrossover = crossoverPoints.pop(); - for (int i = 0; i < resultGenes.length; ++i) { - if (i == nextCrossover) { - currentParent = currentParent == 0 ? 1 : 0; - nextCrossover = crossoverPoints.pop(); - } - resultGenes[i] = parents[currentParent][i]; - } - isModified = true; - } - - return new CrossoverResult(isModified, resultGenes); - } -} diff --git a/contrib/attic/arbiter/arbiter-core/src/main/java/org/deeplearning4j/arbiter/optimize/generator/genetic/crossover/SinglePointCrossover.java b/contrib/attic/arbiter/arbiter-core/src/main/java/org/deeplearning4j/arbiter/optimize/generator/genetic/crossover/SinglePointCrossover.java deleted file mode 100644 index 65f63deb2..000000000 --- a/contrib/attic/arbiter/arbiter-core/src/main/java/org/deeplearning4j/arbiter/optimize/generator/genetic/crossover/SinglePointCrossover.java +++ /dev/null @@ -1,125 +0,0 @@ -/* - * ****************************************************************************** - * * Copyright (c) 2021 Deeplearning4j Contributors - * * - * * This program and the accompanying materials are made available under the - * * terms of the Apache License, Version 2.0 which is available at - * * https://www.apache.org/licenses/LICENSE-2.0. 
- * * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * * License for the specific language governing permissions and limitations - * * under the License. - * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ - -package org.deeplearning4j.arbiter.optimize.generator.genetic.crossover; - -import org.apache.commons.math3.random.JDKRandomGenerator; -import org.apache.commons.math3.random.RandomGenerator; -import org.apache.commons.math3.random.SynchronizedRandomGenerator; -import org.deeplearning4j.arbiter.optimize.generator.genetic.crossover.parentselection.RandomTwoParentSelection; -import org.deeplearning4j.arbiter.optimize.generator.genetic.crossover.parentselection.TwoParentSelection; -import org.nd4j.common.base.Preconditions; - -/** - * The single point crossover will select a random point: every gene before that point comes from one parent, - * and every gene at or after that point comes from the other parent. - * - * @author Alexandre Boulanger - */ -public class SinglePointCrossover extends TwoParentsCrossoverOperator { - private static final double DEFAULT_CROSSOVER_RATE = 0.85; - - private final RandomGenerator rng; - private final double crossoverRate; - - public static class Builder { - private double crossoverRate = DEFAULT_CROSSOVER_RATE; - private RandomGenerator rng; - private TwoParentSelection parentSelection; - - /** - * The probability that the operator generates a crossover (default 0.85).
- * - * @param rate A value between 0.0 and 1.0 - */ - public Builder crossoverRate(double rate) { - Preconditions.checkState(rate >= 0.0 && rate <= 1.0, "Rate must be between 0.0 and 1.0, got %s", rate); - - this.crossoverRate = rate; - return this; - } - - /** - * Use a supplied RandomGenerator - * - * @param rng An instance of RandomGenerator - */ - public Builder randomGenerator(RandomGenerator rng) { - this.rng = rng; - return this; - } - - /** - * The parent selection behavior. Default is random parent selection. - * - * @param parentSelection An instance of TwoParentSelection - */ - public Builder parentSelection(TwoParentSelection parentSelection) { - this.parentSelection = parentSelection; - return this; - } - - public SinglePointCrossover build() { - if (rng == null) { - rng = new SynchronizedRandomGenerator(new JDKRandomGenerator()); - } - - if (parentSelection == null) { - parentSelection = new RandomTwoParentSelection(); - } - - return new SinglePointCrossover(this); - } - } - - private SinglePointCrossover(SinglePointCrossover.Builder builder) { - super(builder.parentSelection); - - this.crossoverRate = builder.crossoverRate; - this.rng = builder.rng; - } - - /** - * Has a probability crossoverRate of performing the crossover where the operator will select a random crossover point.
- * Each gene before this point comes from one of the two parents and each gene at or after this point comes from the other parent. - * Otherwise, returns the genes of a random parent. - * - * @return The crossover result. See {@link CrossoverResult}. - */ - public CrossoverResult crossover() { - double[][] parents = parentSelection.selectParents(); - - boolean isModified = false; - double[] resultGenes = parents[0]; - - if (rng.nextDouble() < crossoverRate) { - int chromosomeLength = parents[0].length; - - // Crossover - resultGenes = new double[chromosomeLength]; - - int crossoverPoint = rng.nextInt(chromosomeLength); - for (int i = 0; i < resultGenes.length; ++i) { - resultGenes[i] = ((i < crossoverPoint) ? parents[0] : parents[1])[i]; - } - isModified = true; - } - - return new CrossoverResult(isModified, resultGenes); - } -} diff --git a/contrib/attic/arbiter/arbiter-core/src/main/java/org/deeplearning4j/arbiter/optimize/generator/genetic/crossover/TwoParentsCrossoverOperator.java b/contrib/attic/arbiter/arbiter-core/src/main/java/org/deeplearning4j/arbiter/optimize/generator/genetic/crossover/TwoParentsCrossoverOperator.java deleted file mode 100644 index 070bb06ad..000000000 --- a/contrib/attic/arbiter/arbiter-core/src/main/java/org/deeplearning4j/arbiter/optimize/generator/genetic/crossover/TwoParentsCrossoverOperator.java +++ /dev/null @@ -1,48 +0,0 @@ -/* - * ****************************************************************************** - * * Copyright (c) 2021 Deeplearning4j Contributors - * * - * * This program and the accompanying materials are made available under the - * * terms of the Apache License, Version 2.0 which is available at - * * https://www.apache.org/licenses/LICENSE-2.0. - * * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. 
See the - * * License for the specific language governing permissions and limitations - * * under the License. - * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ - -package org.deeplearning4j.arbiter.optimize.generator.genetic.crossover; - -import org.deeplearning4j.arbiter.optimize.generator.genetic.crossover.parentselection.TwoParentSelection; -import org.deeplearning4j.arbiter.optimize.generator.genetic.population.PopulationModel; - -/** - * Abstract class for all crossover operators that applies to two parents. - * - * @author Alexandre Boulanger - */ -public abstract class TwoParentsCrossoverOperator extends CrossoverOperator { - - protected final TwoParentSelection parentSelection; - - /** - * @param parentSelection A parent selection that selects two parents. - */ - protected TwoParentsCrossoverOperator(TwoParentSelection parentSelection) { - this.parentSelection = parentSelection; - } - - /** - * Will be called by the selection operator once the population model is instantiated. 
- */ - @Override - public void initializeInstance(PopulationModel populationModel) { - super.initializeInstance(populationModel); - parentSelection.initializeInstance(populationModel.getPopulation()); - } -} diff --git a/contrib/attic/arbiter/arbiter-core/src/main/java/org/deeplearning4j/arbiter/optimize/generator/genetic/crossover/UniformCrossover.java b/contrib/attic/arbiter/arbiter-core/src/main/java/org/deeplearning4j/arbiter/optimize/generator/genetic/crossover/UniformCrossover.java deleted file mode 100644 index 3831310b9..000000000 --- a/contrib/attic/arbiter/arbiter-core/src/main/java/org/deeplearning4j/arbiter/optimize/generator/genetic/crossover/UniformCrossover.java +++ /dev/null @@ -1,138 +0,0 @@ -/* - * ****************************************************************************** - * * Copyright (c) 2021 Deeplearning4j Contributors - * * - * * This program and the accompanying materials are made available under the - * * terms of the Apache License, Version 2.0 which is available at - * * https://www.apache.org/licenses/LICENSE-2.0. - * * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * * License for the specific language governing permissions and limitations - * * under the License. 
- * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ - -package org.deeplearning4j.arbiter.optimize.generator.genetic.crossover; - -import org.apache.commons.math3.random.JDKRandomGenerator; -import org.apache.commons.math3.random.RandomGenerator; -import org.apache.commons.math3.random.SynchronizedRandomGenerator; -import org.deeplearning4j.arbiter.optimize.generator.genetic.crossover.parentselection.RandomTwoParentSelection; -import org.deeplearning4j.arbiter.optimize.generator.genetic.crossover.parentselection.TwoParentSelection; -import org.nd4j.common.base.Preconditions; - -/** - * The uniform crossover will, for each gene, randomly select the parent that donates the gene. - * - * @author Alexandre Boulanger - */ -public class UniformCrossover extends TwoParentsCrossoverOperator { - private static final double DEFAULT_CROSSOVER_RATE = 0.85; - private static final double DEFAULT_PARENT_BIAS_FACTOR = 0.5; - - private final double crossoverRate; - private final double parentBiasFactor; - private final RandomGenerator rng; - - public static class Builder { - private double crossoverRate = DEFAULT_CROSSOVER_RATE; - private double parentBiasFactor = DEFAULT_PARENT_BIAS_FACTOR; - private RandomGenerator rng; - private TwoParentSelection parentSelection; - - /** - * The probability that the operator generates a crossover (default 0.85). - * - * @param rate A value between 0.0 and 1.0 - */ - public Builder crossoverRate(double rate) { - Preconditions.checkState(rate >= 0.0 && rate <= 1.0, "Rate must be between 0.0 and 1.0, got %s", rate); - - this.crossoverRate = rate; - return this; - } - - /** - * A factor that will introduce a bias in the parent selection.
- * - * @param factor In the range [0, 1]. A factor of 0 will always select the second parent, while 1 will always select the first. The default is 0.5 (no bias). - */ - public Builder parentBiasFactor(double factor) { - Preconditions.checkState(factor >= 0.0 && factor <= 1.0, "Factor must be between 0.0 and 1.0, got %s", - factor); - - this.parentBiasFactor = factor; - return this; - } - - /** - * Use a supplied RandomGenerator - * - * @param rng An instance of RandomGenerator - */ - public Builder randomGenerator(RandomGenerator rng) { - this.rng = rng; - return this; - } - - /** - * The parent selection behavior. Default is random parent selection. - * - * @param parentSelection An instance of TwoParentSelection - */ - public Builder parentSelection(TwoParentSelection parentSelection) { - this.parentSelection = parentSelection; - return this; - } - - public UniformCrossover build() { - if (rng == null) { - rng = new SynchronizedRandomGenerator(new JDKRandomGenerator()); - } - if (parentSelection == null) { - parentSelection = new RandomTwoParentSelection(); - } - return new UniformCrossover(this); - } - } - - private UniformCrossover(UniformCrossover.Builder builder) { - super(builder.parentSelection); - - this.crossoverRate = builder.crossoverRate; - this.parentBiasFactor = builder.parentBiasFactor; - this.rng = builder.rng; - } - - /** - * Has a probability crossoverRate of performing the crossover where the operator will select randomly which parent donates the gene.<br>
- * One of the parents may be favored if the bias differs from 0.5. - * Otherwise, returns the genes of the first selected parent unchanged. - * - * @return The crossover result. See {@link CrossoverResult}. - */ - @Override - public CrossoverResult crossover() { - // select the parents - double[][] parents = parentSelection.selectParents(); - - double[] resultGenes = parents[0]; - boolean isModified = false; - - if (rng.nextDouble() < crossoverRate) { - // Crossover: each gene is drawn from the parent chosen by the bias factor - resultGenes = new double[parents[0].length]; - - for (int i = 0; i < resultGenes.length; ++i) { - resultGenes[i] = ((rng.nextDouble() < parentBiasFactor) ? parents[0] : parents[1])[i]; - } - isModified = true; - } - - return new CrossoverResult(isModified, resultGenes); - } -} diff --git a/contrib/attic/arbiter/arbiter-core/src/main/java/org/deeplearning4j/arbiter/optimize/generator/genetic/crossover/parentselection/ParentSelection.java b/contrib/attic/arbiter/arbiter-core/src/main/java/org/deeplearning4j/arbiter/optimize/generator/genetic/crossover/parentselection/ParentSelection.java deleted file mode 100644 index 179de0184..000000000 --- a/contrib/attic/arbiter/arbiter-core/src/main/java/org/deeplearning4j/arbiter/optimize/generator/genetic/crossover/parentselection/ParentSelection.java +++ /dev/null @@ -1,46 +0,0 @@ -/* - * ****************************************************************************** - * * Copyright (c) 2021 Deeplearning4j Contributors - * * - * * This program and the accompanying materials are made available under the - * * terms of the Apache License, Version 2.0 which is available at - * * https://www.apache.org/licenses/LICENSE-2.0. - * * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * * License for the specific language governing permissions and limitations - * * under the License.
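The per-gene donor selection above can be sketched standalone. This is a minimal, hypothetical re-implementation (the class name `UniformCrossoverSketch` is invented, and `java.util.Random` stands in for commons-math's `RandomGenerator`), not the library's API:

```java
import java.util.Arrays;
import java.util.Random;

// Hypothetical standalone sketch of the uniform crossover logic: with
// probability crossoverRate a child is built gene by gene, each gene taken
// from the first parent with probability parentBiasFactor, otherwise from
// the second. When no crossover occurs, the first parent passes through.
public class UniformCrossoverSketch {
    public static double[] crossover(double[] first, double[] second,
                                     double crossoverRate, double parentBiasFactor, Random rng) {
        if (rng.nextDouble() >= crossoverRate) {
            return first.clone(); // no crossover: first parent's genes unchanged
        }
        double[] child = new double[first.length];
        for (int i = 0; i < child.length; ++i) {
            child[i] = (rng.nextDouble() < parentBiasFactor) ? first[i] : second[i];
        }
        return child;
    }

    public static void main(String[] args) {
        double[] a = {0, 0, 0, 0};
        double[] b = {1, 1, 1, 1};
        // crossoverRate = 1.0 forces a crossover; each gene comes out as 0 or 1
        System.out.println(Arrays.toString(crossover(a, b, 1.0, 0.5, new Random(42))));
    }
}
```

With a bias factor of 0.5 each parent donates roughly half the genes; pushing it toward 1 makes the child resemble the first parent.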
- * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ - -package org.deeplearning4j.arbiter.optimize.generator.genetic.crossover.parentselection; - -import org.deeplearning4j.arbiter.optimize.generator.genetic.Chromosome; - -import java.util.List; - -/** - * Abstract class for all parent selection behaviors - * - * @author Alexandre Boulanger - */ -public abstract class ParentSelection { - protected List<Chromosome> population; - - /** - * Will be called by the crossover operator once the population model is instantiated. - */ - public void initializeInstance(List<Chromosome> population) { - this.population = population; - } - - /** - * Performs the parent selection - * - * @return An array of the parents' genes. The outer array indexes the parents; each inner array holds that parent's genes. - */ - public abstract double[][] selectParents(); -} diff --git a/contrib/attic/arbiter/arbiter-core/src/main/java/org/deeplearning4j/arbiter/optimize/generator/genetic/crossover/parentselection/RandomTwoParentSelection.java b/contrib/attic/arbiter/arbiter-core/src/main/java/org/deeplearning4j/arbiter/optimize/generator/genetic/crossover/parentselection/RandomTwoParentSelection.java deleted file mode 100644 index 465d2de4e..000000000 --- a/contrib/attic/arbiter/arbiter-core/src/main/java/org/deeplearning4j/arbiter/optimize/generator/genetic/crossover/parentselection/RandomTwoParentSelection.java +++ /dev/null @@ -1,67 +0,0 @@ -/* - * ****************************************************************************** - * * Copyright (c) 2021 Deeplearning4j Contributors - * * - * * This program and the accompanying materials are made available under the - * * terms of the Apache License, Version 2.0 which is available at - * * https://www.apache.org/licenses/LICENSE-2.0.
- * * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * * License for the specific language governing permissions and limitations - * * under the License. - * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ - -package org.deeplearning4j.arbiter.optimize.generator.genetic.crossover.parentselection; - -import org.apache.commons.math3.random.JDKRandomGenerator; -import org.apache.commons.math3.random.RandomGenerator; -import org.apache.commons.math3.random.SynchronizedRandomGenerator; - -/** - * A parent selection behavior that returns two random parents. - * - * @author Alexandre Boulanger - */ -public class RandomTwoParentSelection extends TwoParentSelection { - - private final RandomGenerator rng; - - public RandomTwoParentSelection() { - this(new SynchronizedRandomGenerator(new JDKRandomGenerator())); - } - - /** - * Use a supplied RandomGenerator - * - * @param rng An instance of RandomGenerator - */ - public RandomTwoParentSelection(RandomGenerator rng) { - this.rng = rng; - } - - /** - * Selects two random parents - * - * @return An array of the parents' genes. The outer array indexes the parents; each inner array holds that parent's genes.
- */ - @Override - public double[][] selectParents() { - double[][] parents = new double[2][]; - - int parent1Idx = rng.nextInt(population.size()); - int parent2Idx; - // the retry loop assumes the population holds at least two chromosomes - do { - parent2Idx = rng.nextInt(population.size()); - } while (parent1Idx == parent2Idx); - - parents[0] = population.get(parent1Idx).getGenes(); - parents[1] = population.get(parent2Idx).getGenes(); - - return parents; - } -} diff --git a/contrib/attic/arbiter/arbiter-core/src/main/java/org/deeplearning4j/arbiter/optimize/generator/genetic/crossover/parentselection/TwoParentSelection.java b/contrib/attic/arbiter/arbiter-core/src/main/java/org/deeplearning4j/arbiter/optimize/generator/genetic/crossover/parentselection/TwoParentSelection.java deleted file mode 100644 index 0f274ec48..000000000 --- a/contrib/attic/arbiter/arbiter-core/src/main/java/org/deeplearning4j/arbiter/optimize/generator/genetic/crossover/parentselection/TwoParentSelection.java +++ /dev/null @@ -1,27 +0,0 @@ -/* - * ****************************************************************************** - * * Copyright (c) 2021 Deeplearning4j Contributors - * * - * * This program and the accompanying materials are made available under the - * * terms of the Apache License, Version 2.0 which is available at - * * https://www.apache.org/licenses/LICENSE-2.0. - * * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * * License for the specific language governing permissions and limitations - * * under the License. - * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ - -package org.deeplearning4j.arbiter.optimize.generator.genetic.crossover.parentselection; - -/** - * Abstract class for all parent selection behaviors that select two parents.
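The distinct-pair draw in selectParents() can be sketched on its own. This is a hypothetical, self-contained helper (invented class and method names, `java.util.Random` instead of commons-math), only meant to illustrate the retry loop:

```java
import java.util.Random;

// Hypothetical sketch of the draw used by RandomTwoParentSelection: pick two
// distinct indexes by redrawing the second until it differs from the first.
// The loop only terminates when the population holds at least two members.
public class TwoParentPickSketch {
    public static int[] pickTwoDistinct(int populationSize, Random rng) {
        if (populationSize < 2) {
            throw new IllegalArgumentException("Need at least two chromosomes");
        }
        int first = rng.nextInt(populationSize);
        int second;
        do {
            second = rng.nextInt(populationSize); // redraw until distinct
        } while (second == first);
        return new int[] {first, second};
    }

    public static void main(String[] args) {
        int[] picked = pickTwoDistinct(5, new Random(7));
        System.out.println(picked[0] + " " + picked[1]);
    }
}
```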
- * - * @author Alexandre Boulanger - */ -public abstract class TwoParentSelection extends ParentSelection { -} diff --git a/contrib/attic/arbiter/arbiter-core/src/main/java/org/deeplearning4j/arbiter/optimize/generator/genetic/crossover/utils/CrossoverPointsGenerator.java b/contrib/attic/arbiter/arbiter-core/src/main/java/org/deeplearning4j/arbiter/optimize/generator/genetic/crossover/utils/CrossoverPointsGenerator.java deleted file mode 100644 index 885d9281b..000000000 --- a/contrib/attic/arbiter/arbiter-core/src/main/java/org/deeplearning4j/arbiter/optimize/generator/genetic/crossover/utils/CrossoverPointsGenerator.java +++ /dev/null @@ -1,70 +0,0 @@ -/* - * ****************************************************************************** - * * Copyright (c) 2021 Deeplearning4j Contributors - * * - * * This program and the accompanying materials are made available under the - * * terms of the Apache License, Version 2.0 which is available at - * * https://www.apache.org/licenses/LICENSE-2.0. - * * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * * License for the specific language governing permissions and limitations - * * under the License. 
- * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ - -package org.deeplearning4j.arbiter.optimize.generator.genetic.crossover.utils; - -import org.apache.commons.math3.random.RandomGenerator; -import org.deeplearning4j.arbiter.optimize.generator.genetic.crossover.KPointCrossover; - -import java.util.*; - -/** - * A helper class used by {@link KPointCrossover} to generate the crossover points - * - * @author Alexandre Boulanger - */ -public class CrossoverPointsGenerator { - private final int minCrossovers; - private final int maxCrossovers; - private final RandomGenerator rng; - private List<Integer> parameterIndexes; - - /** - * Constructor - * - * @param chromosomeLength The number of genes - * @param minCrossovers The minimum number of crossover points to generate - * @param maxCrossovers The maximum number of crossover points to generate - * @param rng A RandomGenerator instance - */ - public CrossoverPointsGenerator(int chromosomeLength, int minCrossovers, int maxCrossovers, RandomGenerator rng) { - this.minCrossovers = minCrossovers; - this.maxCrossovers = maxCrossovers; - this.rng = rng; - parameterIndexes = new ArrayList<>(); - for (int i = 0; i < chromosomeLength; ++i) { - parameterIndexes.add(i); - } - } - - /** - * Generate a list of crossover points.
- * - * @return An ordered deque of crossover point indexes, with Integer.MAX_VALUE appended as the last element - */ - public Deque<Integer> getCrossoverPoints() { - Collections.shuffle(parameterIndexes); - List<Integer> crossoverPointList = - parameterIndexes.subList(0, rng.nextInt(maxCrossovers - minCrossovers + 1) + minCrossovers); // +1: maxCrossovers is inclusive, and nextInt(0) would throw when min == max - Collections.sort(crossoverPointList); - Deque<Integer> crossoverPoints = new ArrayDeque<>(crossoverPointList); - crossoverPoints.add(Integer.MAX_VALUE); - - return crossoverPoints; - } -} diff --git a/contrib/attic/arbiter/arbiter-core/src/main/java/org/deeplearning4j/arbiter/optimize/generator/genetic/culling/CullOperator.java b/contrib/attic/arbiter/arbiter-core/src/main/java/org/deeplearning4j/arbiter/optimize/generator/genetic/culling/CullOperator.java deleted file mode 100644 index e18dd077b..000000000 --- a/contrib/attic/arbiter/arbiter-core/src/main/java/org/deeplearning4j/arbiter/optimize/generator/genetic/culling/CullOperator.java +++ /dev/null @@ -1,43 +0,0 @@ -/* - * ****************************************************************************** - * * Copyright (c) 2021 Deeplearning4j Contributors - * * - * * This program and the accompanying materials are made available under the - * * terms of the Apache License, Version 2.0 which is available at - * * https://www.apache.org/licenses/LICENSE-2.0. - * * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * * License for the specific language governing permissions and limitations - * * under the License.
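The shuffle/subset/sort/sentinel pipeline of the points generator can be shown self-contained. This is a hypothetical sketch (invented names, `java.util.Random` substituted for `RandomGenerator`), not the library class itself:

```java
import java.util.ArrayDeque;
import java.util.ArrayList;
import java.util.Collections;
import java.util.Deque;
import java.util.List;
import java.util.Random;

// Hypothetical standalone sketch of a k-point crossover point generator:
// shuffle the gene indexes, keep between min and max of them (max inclusive),
// sort them, and append Integer.MAX_VALUE as a sentinel so consumers can
// always peek a "next" crossover point without bounds checks.
public class CrossoverPointsSketch {
    public static Deque<Integer> crossoverPoints(int chromosomeLength, int min, int max, Random rng) {
        List<Integer> indexes = new ArrayList<>();
        for (int i = 0; i < chromosomeLength; ++i) {
            indexes.add(i);
        }
        Collections.shuffle(indexes, rng);
        int count = min + rng.nextInt(max - min + 1); // +1 keeps max reachable
        List<Integer> points = new ArrayList<>(indexes.subList(0, count));
        Collections.sort(points);
        Deque<Integer> result = new ArrayDeque<>(points);
        result.add(Integer.MAX_VALUE); // sentinel past the last gene
        return result;
    }

    public static void main(String[] args) {
        System.out.println(crossoverPoints(10, 2, 4, new Random(3)));
    }
}
```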
- * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ - -package org.deeplearning4j.arbiter.optimize.generator.genetic.culling; - -import org.deeplearning4j.arbiter.optimize.generator.genetic.population.PopulationModel; - -/** - * The cull operator will remove from the population the least desirable chromosomes. - * - * @author Alexandre Boulanger - */ -public interface CullOperator { - /** - * Will be called by the population model once created. - */ - void initializeInstance(PopulationModel populationModel); - - /** - * Cull the population to the culled size. - */ - void cullPopulation(); - - /** - * @return The target population size after culling. - */ - int getCulledSize(); -} diff --git a/contrib/attic/arbiter/arbiter-core/src/main/java/org/deeplearning4j/arbiter/optimize/generator/genetic/culling/LeastFitCullOperator.java b/contrib/attic/arbiter/arbiter-core/src/main/java/org/deeplearning4j/arbiter/optimize/generator/genetic/culling/LeastFitCullOperator.java deleted file mode 100644 index b8474eb18..000000000 --- a/contrib/attic/arbiter/arbiter-core/src/main/java/org/deeplearning4j/arbiter/optimize/generator/genetic/culling/LeastFitCullOperator.java +++ /dev/null @@ -1,52 +0,0 @@ -/* - * ****************************************************************************** - * * Copyright (c) 2021 Deeplearning4j Contributors - * * - * * This program and the accompanying materials are made available under the - * * terms of the Apache License, Version 2.0 which is available at - * * https://www.apache.org/licenses/LICENSE-2.0. - * * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * * License for the specific language governing permissions and limitations - * * under the License.
- * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ - -package org.deeplearning4j.arbiter.optimize.generator.genetic.culling; - -/** - * An elitist cull operator that discards the chromosomes with the worst fitness while keeping the best ones. - * - * @author Alexandre Boulanger - */ -public class LeastFitCullOperator extends RatioCullOperator { - - /** - * The default cull ratio is 1/3. - */ - public LeastFitCullOperator() { - super(); - } - - /** - * @param cullRatio The ratio of the maximum population size to be culled.
- * For example, a ratio of 1/3 on a population with a maximum size of 30 will cull back a given population to 20. - */ - public LeastFitCullOperator(double cullRatio) { - super(cullRatio); - } - - /** - * Will discard the chromosomes with the worst fitness until the population size falls back to the culled size. - */ - @Override - public void cullPopulation() { - // the population is kept sorted best-first, so the worst are at the end - while (population.size() > culledSize) { - population.remove(population.size() - 1); - } - } -} diff --git a/contrib/attic/arbiter/arbiter-core/src/main/java/org/deeplearning4j/arbiter/optimize/generator/genetic/culling/RatioCullOperator.java b/contrib/attic/arbiter/arbiter-core/src/main/java/org/deeplearning4j/arbiter/optimize/generator/genetic/culling/RatioCullOperator.java deleted file mode 100644 index ac8722c6b..000000000 --- a/contrib/attic/arbiter/arbiter-core/src/main/java/org/deeplearning4j/arbiter/optimize/generator/genetic/culling/RatioCullOperator.java +++ /dev/null @@ -1,72 +0,0 @@ -/* - * ****************************************************************************** - * * Copyright (c) 2021 Deeplearning4j Contributors - * * - * * This program and the accompanying materials are made available under the - * * terms of the Apache License, Version 2.0 which is available at - * * https://www.apache.org/licenses/LICENSE-2.0. - * * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * * License for the specific language governing permissions and limitations - * * under the License.
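The "ratio 1/3 of 30 culls back to 20" arithmetic can be checked with a tiny hypothetical helper (the class name is invented; it mirrors the rounding `(int)(size * (1 - ratio) + 0.5)` used by the ratio-based cull operator):

```java
// Hypothetical helper mirroring the culled-size arithmetic: the population
// is culled back to round(populationSize * (1 - cullRatio)).
public class CulledSizeSketch {
    public static int culledSize(int populationSize, double cullRatio) {
        return (int) (populationSize * (1.0 - cullRatio) + 0.5);
    }

    public static void main(String[] args) {
        // ratio 1/3 on a maximum population of 30 culls back to 20
        System.out.println(culledSize(30, 1.0 / 3.0));
    }
}
```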
- * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ - -package org.deeplearning4j.arbiter.optimize.generator.genetic.culling; - -import org.deeplearning4j.arbiter.optimize.generator.genetic.Chromosome; -import org.deeplearning4j.arbiter.optimize.generator.genetic.population.PopulationModel; -import org.nd4j.common.base.Preconditions; - -import java.util.List; - -/** - * An abstract base for cull operators that cull back the population to a ratio of its maximum size. - * - * @author Alexandre Boulanger - */ -public abstract class RatioCullOperator implements CullOperator { - private static final double DEFAULT_CULL_RATIO = 1.0 / 3.0; - protected int culledSize; - protected List<Chromosome> population; - protected final double cullRatio; - - /** - * @param cullRatio The ratio of the maximum population size to be culled.
- * For example, a ratio of 1/3 on a population with a maximum size of 30 will cull back a given population to 20. - */ - public RatioCullOperator(double cullRatio) { - Preconditions.checkState(cullRatio >= 0.0 && cullRatio <= 1.0, "Cull ratio must be between 0.0 and 1.0, got %s", - cullRatio); - - this.cullRatio = cullRatio; - } - - /** - * The default cull ratio is 1/3 - */ - public RatioCullOperator() { - this(DEFAULT_CULL_RATIO); - } - - /** - * Will be called by the population model once created. - */ - public void initializeInstance(PopulationModel populationModel) { - this.population = populationModel.getPopulation(); - culledSize = (int) (populationModel.getPopulationSize() * (1.0 - cullRatio) + 0.5); - } - - /** - * @return The target population size after culling. - */ - @Override - public int getCulledSize() { - return culledSize; - } - -} diff --git a/contrib/attic/arbiter/arbiter-core/src/main/java/org/deeplearning4j/arbiter/optimize/generator/genetic/exceptions/GeneticGenerationException.java b/contrib/attic/arbiter/arbiter-core/src/main/java/org/deeplearning4j/arbiter/optimize/generator/genetic/exceptions/GeneticGenerationException.java deleted file mode 100644 index 681473930..000000000 --- a/contrib/attic/arbiter/arbiter-core/src/main/java/org/deeplearning4j/arbiter/optimize/generator/genetic/exceptions/GeneticGenerationException.java +++ /dev/null @@ -1,25 +0,0 @@ -/* - * ****************************************************************************** - * * Copyright (c) 2021 Deeplearning4j Contributors - * * - * * This program and the accompanying materials are made available under the - * * terms of the Apache License, Version 2.0 which is available at - * * https://www.apache.org/licenses/LICENSE-2.0. - * * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. 
See the - * * License for the specific language governing permissions and limitations - * * under the License. - * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ - -package org.deeplearning4j.arbiter.optimize.generator.genetic.exceptions; - -public class GeneticGenerationException extends RuntimeException { - public GeneticGenerationException(String message) { - super(message); - } -} diff --git a/contrib/attic/arbiter/arbiter-core/src/main/java/org/deeplearning4j/arbiter/optimize/generator/genetic/mutation/MutationOperator.java b/contrib/attic/arbiter/arbiter-core/src/main/java/org/deeplearning4j/arbiter/optimize/generator/genetic/mutation/MutationOperator.java deleted file mode 100644 index 9b298cb4b..000000000 --- a/contrib/attic/arbiter/arbiter-core/src/main/java/org/deeplearning4j/arbiter/optimize/generator/genetic/mutation/MutationOperator.java +++ /dev/null @@ -1,35 +0,0 @@ -/* - * ****************************************************************************** - * * Copyright (c) 2021 Deeplearning4j Contributors - * * - * * This program and the accompanying materials are made available under the - * * terms of the Apache License, Version 2.0 which is available at - * * https://www.apache.org/licenses/LICENSE-2.0. - * * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * * License for the specific language governing permissions and limitations - * * under the License. - * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ - -package org.deeplearning4j.arbiter.optimize.generator.genetic.mutation; - -/** - * The mutation operator will apply a mutation to the given genes. 
- * - * @author Alexandre Boulanger - */ -public interface MutationOperator { - - /** - * Performs a mutation. - * - * @param genes The genes to be mutated - * @return True if the genes were mutated, otherwise false. - */ - boolean mutate(double[] genes); -} diff --git a/contrib/attic/arbiter/arbiter-core/src/main/java/org/deeplearning4j/arbiter/optimize/generator/genetic/mutation/RandomMutationOperator.java b/contrib/attic/arbiter/arbiter-core/src/main/java/org/deeplearning4j/arbiter/optimize/generator/genetic/mutation/RandomMutationOperator.java deleted file mode 100644 index 69bdd791b..000000000 --- a/contrib/attic/arbiter/arbiter-core/src/main/java/org/deeplearning4j/arbiter/optimize/generator/genetic/mutation/RandomMutationOperator.java +++ /dev/null @@ -1,95 +0,0 @@ -/* - * ****************************************************************************** - * * Copyright (c) 2021 Deeplearning4j Contributors - * * - * * This program and the accompanying materials are made available under the - * * terms of the Apache License, Version 2.0 which is available at - * * https://www.apache.org/licenses/LICENSE-2.0. - * * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * * License for the specific language governing permissions and limitations - * * under the License. 
- * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ - -package org.deeplearning4j.arbiter.optimize.generator.genetic.mutation; - -import org.apache.commons.math3.random.JDKRandomGenerator; -import org.apache.commons.math3.random.RandomGenerator; -import org.apache.commons.math3.random.SynchronizedRandomGenerator; -import org.nd4j.common.base.Preconditions; - -/** - * A mutation operator in which each gene is independently mutated with a fixed probability (the mutation rate). - * - * @author Alexandre Boulanger - */ -public class RandomMutationOperator implements MutationOperator { - private static final double DEFAULT_MUTATION_RATE = 0.005; - - private final double mutationRate; - private final RandomGenerator rng; - - public static class Builder { - private double mutationRate = DEFAULT_MUTATION_RATE; - private RandomGenerator rng; - - /** - * Each gene will have this probability of being mutated. - * - * @param rate The mutation rate. (default 0.005) - */ - public Builder mutationRate(double rate) { - Preconditions.checkState(rate >= 0.0 && rate <= 1.0, "Rate must be between 0.0 and 1.0, got %s", rate); - - this.mutationRate = rate; - return this; - } - - /** - * Use a supplied RandomGenerator - * - * @param rng An instance of RandomGenerator - */ - public Builder randomGenerator(RandomGenerator rng) { - this.rng = rng; - return this; - } - - public RandomMutationOperator build() { - if (rng == null) { - rng = new SynchronizedRandomGenerator(new JDKRandomGenerator()); - } - return new RandomMutationOperator(this); - } - } - - private RandomMutationOperator(RandomMutationOperator.Builder builder) { - this.mutationRate = builder.mutationRate; - this.rng = builder.rng; - } - - /** - * Performs the mutation. Each gene has a mutation rate probability of being mutated. - * - * @param genes The genes to be mutated - * @return True if the genes were mutated, otherwise false.
- */ - @Override - public boolean mutate(double[] genes) { - boolean hasMutation = false; - - for (int i = 0; i < genes.length; ++i) { - if (rng.nextDouble() < mutationRate) { - genes[i] = rng.nextDouble(); - hasMutation = true; - } - } - - return hasMutation; - } -} diff --git a/contrib/attic/arbiter/arbiter-core/src/main/java/org/deeplearning4j/arbiter/optimize/generator/genetic/population/EmptyPopulationInitializer.java b/contrib/attic/arbiter/arbiter-core/src/main/java/org/deeplearning4j/arbiter/optimize/generator/genetic/population/EmptyPopulationInitializer.java deleted file mode 100644 index 363c34c17..000000000 --- a/contrib/attic/arbiter/arbiter-core/src/main/java/org/deeplearning4j/arbiter/optimize/generator/genetic/population/EmptyPopulationInitializer.java +++ /dev/null @@ -1,43 +0,0 @@ -/* - * ****************************************************************************** - * * Copyright (c) 2021 Deeplearning4j Contributors - * * - * * This program and the accompanying materials are made available under the - * * terms of the Apache License, Version 2.0 which is available at - * * https://www.apache.org/licenses/LICENSE-2.0. - * * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * * License for the specific language governing permissions and limitations - * * under the License. - * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ - -package org.deeplearning4j.arbiter.optimize.generator.genetic.population; - -import org.deeplearning4j.arbiter.optimize.generator.genetic.Chromosome; - -import java.util.ArrayList; -import java.util.List; - -/** - * A population initializer that builds an empty population.
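The per-gene loop of RandomMutationOperator.mutate() above can be sketched standalone. This is a hypothetical helper (invented class name, `java.util.Random` in place of commons-math), illustrating the technique only:

```java
import java.util.Random;

// Hypothetical standalone sketch of random mutation: each gene is
// independently replaced by a fresh random value in [0, 1) with probability
// mutationRate; returns whether anything actually changed.
public class RandomMutationSketch {
    public static boolean mutate(double[] genes, double mutationRate, Random rng) {
        boolean hasMutation = false;
        for (int i = 0; i < genes.length; ++i) {
            if (rng.nextDouble() < mutationRate) {
                genes[i] = rng.nextDouble();
                hasMutation = true;
            }
        }
        return hasMutation;
    }

    public static void main(String[] args) {
        double[] genes = {0.25, 0.25, 0.25};
        System.out.println(mutate(genes, 1.0, new Random(1))); // rate 1: always mutates
    }
}
```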
- * - * @author Alexandre Boulanger - */ -public class EmptyPopulationInitializer implements PopulationInitializer { - - /** - * Initialize an empty population - * - * @param size The maximum size of the population. - * @return The initialized population. - */ - @Override - public List<Chromosome> getInitializedPopulation(int size) { - return new ArrayList<>(size); - } -} diff --git a/contrib/attic/arbiter/arbiter-core/src/main/java/org/deeplearning4j/arbiter/optimize/generator/genetic/population/PopulationInitializer.java b/contrib/attic/arbiter/arbiter-core/src/main/java/org/deeplearning4j/arbiter/optimize/generator/genetic/population/PopulationInitializer.java deleted file mode 100644 index 41d3c7e63..000000000 --- a/contrib/attic/arbiter/arbiter-core/src/main/java/org/deeplearning4j/arbiter/optimize/generator/genetic/population/PopulationInitializer.java +++ /dev/null @@ -1,38 +0,0 @@ -/* - * ****************************************************************************** - * * Copyright (c) 2021 Deeplearning4j Contributors - * * - * * This program and the accompanying materials are made available under the - * * terms of the Apache License, Version 2.0 which is available at - * * https://www.apache.org/licenses/LICENSE-2.0. - * * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * * License for the specific language governing permissions and limitations - * * under the License. - * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ - -package org.deeplearning4j.arbiter.optimize.generator.genetic.population; - -import org.deeplearning4j.arbiter.optimize.generator.genetic.Chromosome; - -import java.util.List; - -/** - * An initializer that constructs the population used by the population model.
- * - * @author Alexandre Boulanger - */ -public interface PopulationInitializer { - /** - * Called by the population model to construct the population - * - * @param size The maximum size of the population - * @return An initialized population - */ - List<Chromosome> getInitializedPopulation(int size); -} diff --git a/contrib/attic/arbiter/arbiter-core/src/main/java/org/deeplearning4j/arbiter/optimize/generator/genetic/population/PopulationListener.java b/contrib/attic/arbiter/arbiter-core/src/main/java/org/deeplearning4j/arbiter/optimize/generator/genetic/population/PopulationListener.java deleted file mode 100644 index efd542944..000000000 --- a/contrib/attic/arbiter/arbiter-core/src/main/java/org/deeplearning4j/arbiter/optimize/generator/genetic/population/PopulationListener.java +++ /dev/null @@ -1,37 +0,0 @@ -/* - * ****************************************************************************** - * * Copyright (c) 2021 Deeplearning4j Contributors - * * - * * This program and the accompanying materials are made available under the - * * terms of the Apache License, Version 2.0 which is available at - * * https://www.apache.org/licenses/LICENSE-2.0. - * * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * * License for the specific language governing permissions and limitations - * * under the License. - * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ - -package org.deeplearning4j.arbiter.optimize.generator.genetic.population; - -import org.deeplearning4j.arbiter.optimize.generator.genetic.Chromosome; - -import java.util.List; - -/** - * A listener that is called when the population changes.
- * - * @author Alexandre Boulanger - */ -public interface PopulationListener { - /** - * Called after the population has changed. - * - * @param population The population after it has changed. - */ - void onChanged(List<Chromosome> population); -} diff --git a/contrib/attic/arbiter/arbiter-core/src/main/java/org/deeplearning4j/arbiter/optimize/generator/genetic/population/PopulationModel.java b/contrib/attic/arbiter/arbiter-core/src/main/java/org/deeplearning4j/arbiter/optimize/generator/genetic/population/PopulationModel.java deleted file mode 100644 index 4234dab58..000000000 --- a/contrib/attic/arbiter/arbiter-core/src/main/java/org/deeplearning4j/arbiter/optimize/generator/genetic/population/PopulationModel.java +++ /dev/null @@ -1,184 +0,0 @@ -/* - * ****************************************************************************** - * * Copyright (c) 2021 Deeplearning4j Contributors - * * - * * This program and the accompanying materials are made available under the - * * terms of the Apache License, Version 2.0 which is available at - * * https://www.apache.org/licenses/LICENSE-2.0. - * * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * * License for the specific language governing permissions and limitations - * * under the License.
- * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ - -package org.deeplearning4j.arbiter.optimize.generator.genetic.population; - -import lombok.Getter; -import org.deeplearning4j.arbiter.optimize.generator.genetic.Chromosome; -import org.deeplearning4j.arbiter.optimize.generator.genetic.culling.CullOperator; -import org.deeplearning4j.arbiter.optimize.generator.genetic.culling.LeastFitCullOperator; - -import java.util.ArrayList; -import java.util.Collections; -import java.util.Comparator; -import java.util.List; - -/** - * The population model handles all aspects of the population (initialization, additions and culling) - * - * @author Alexandre Boulanger - */ -public class PopulationModel { - private static final int DEFAULT_POPULATION_SIZE = 30; - - private final CullOperator cullOperator; - private final List<PopulationListener> populationListeners = new ArrayList<>(); - private Comparator<Chromosome> chromosomeComparator; - - /** - * The maximum population size - */ - @Getter - private final int populationSize; - - /** - * The population - */ - @Getter - public final List<Chromosome> population; - - /** - * A comparator used when higher fitness value is better - */ - public static class MaximizeScoreComparator implements Comparator<Chromosome> { - @Override - public int compare(Chromosome lhs, Chromosome rhs) { - return -Double.compare(lhs.getFitness(), rhs.getFitness()); - } - } - - /** - * A comparator used when lower fitness value is better - */ - public static class MinimizeScoreComparator implements Comparator<Chromosome> { - @Override - public int compare(Chromosome lhs, Chromosome rhs) { - return Double.compare(lhs.getFitness(), rhs.getFitness()); - } - } - - public static class Builder { - private int populationSize = DEFAULT_POPULATION_SIZE; - private PopulationInitializer populationInitializer; - private CullOperator cullOperator; - - /** - * Use an alternate population initialization behavior. Default is empty population.
- * - * @param populationInitializer An instance of PopulationInitializer - */ - public Builder populationInitializer(PopulationInitializer populationInitializer) { - this.populationInitializer = populationInitializer; - return this; - } - - /** - * The maximum population size.
- * If using ratio-based culling, using a population with a culled size of around 1.5 to 2 times the number of genes generally gives good results. - * (e.g. for a chromosome having 10 genes, the culled size should be between 15 and 20; with a cull ratio of 1/3, the population size should be set to between 23 and 30, since 15 / (1 - 1/3) = 22.5, rounded up to 23) - * - * @param size The maximum size of the population - */ - public Builder populationSize(int size) { - populationSize = size; - return this; - } - - /** - * Use an alternate cull operator behavior. Default is least-fit culling. - * - * @param cullOperator An instance of a CullOperator - */ - public Builder cullOperator(CullOperator cullOperator) { - this.cullOperator = cullOperator; - return this; - } - - public PopulationModel build() { - if (cullOperator == null) { - cullOperator = new LeastFitCullOperator(); - } - - if (populationInitializer == null) { - populationInitializer = new EmptyPopulationInitializer(); - } - - return new PopulationModel(this); - } - - } - - public PopulationModel(PopulationModel.Builder builder) { - populationSize = builder.populationSize; - population = new ArrayList<>(builder.populationSize); - PopulationInitializer populationInitializer = builder.populationInitializer; - - List<Chromosome> initializedPopulation = populationInitializer.getInitializedPopulation(populationSize); - population.clear(); - population.addAll(initializedPopulation); - - cullOperator = builder.cullOperator; - cullOperator.initializeInstance(this); - } - - /** - * Called by the GeneticSearchCandidateGenerator - */ - public void initializeInstance(boolean minimizeScore) { - chromosomeComparator = minimizeScore ? 
new MinimizeScoreComparator() : new MaximizeScoreComparator(); - } - - /** - * Add a PopulationListener to the list of change listeners - * @param listener A PopulationListener instance - */ - public void addListener(PopulationListener listener) { - populationListeners.add(listener); - } - - /** - * Add a Chromosome to the population and call the PopulationListeners. Culling may be triggered. - * - * @param element The chromosome to be added - */ - public void add(Chromosome element) { - if (population.size() == populationSize) { - cullOperator.cullPopulation(); - } - - population.add(element); - - Collections.sort(population, chromosomeComparator); - - triggerPopulationChangedListeners(population); - } - - /** - * @return Return false when the population is below the culled size, otherwise true.
- * Used by the selection operator to know if the population is still too small and should generate random genes. - */ - public boolean isReadyToBreed() { - return population.size() >= cullOperator.getCulledSize(); - } - - private void triggerPopulationChangedListeners(List population) { - for (PopulationListener listener : populationListeners) { - listener.onChanged(population); - } - } -} diff --git a/contrib/attic/arbiter/arbiter-core/src/main/java/org/deeplearning4j/arbiter/optimize/generator/genetic/selection/GeneticSelectionOperator.java b/contrib/attic/arbiter/arbiter-core/src/main/java/org/deeplearning4j/arbiter/optimize/generator/genetic/selection/GeneticSelectionOperator.java deleted file mode 100644 index b46a974ac..000000000 --- a/contrib/attic/arbiter/arbiter-core/src/main/java/org/deeplearning4j/arbiter/optimize/generator/genetic/selection/GeneticSelectionOperator.java +++ /dev/null @@ -1,199 +0,0 @@ -/* - * ****************************************************************************** - * * Copyright (c) 2021 Deeplearning4j Contributors - * * - * * This program and the accompanying materials are made available under the - * * terms of the Apache License, Version 2.0 which is available at - * * https://www.apache.org/licenses/LICENSE-2.0. - * * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * * License for the specific language governing permissions and limitations - * * under the License. 
- * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ - -package org.deeplearning4j.arbiter.optimize.generator.genetic.selection; - -import org.apache.commons.math3.random.JDKRandomGenerator; -import org.apache.commons.math3.random.RandomGenerator; -import org.apache.commons.math3.random.SynchronizedRandomGenerator; -import org.deeplearning4j.arbiter.optimize.generator.genetic.ChromosomeFactory; -import org.deeplearning4j.arbiter.optimize.generator.genetic.crossover.CrossoverOperator; -import org.deeplearning4j.arbiter.optimize.generator.genetic.crossover.CrossoverResult; -import org.deeplearning4j.arbiter.optimize.generator.genetic.crossover.SinglePointCrossover; -import org.deeplearning4j.arbiter.optimize.generator.genetic.exceptions.GeneticGenerationException; -import org.deeplearning4j.arbiter.optimize.generator.genetic.mutation.MutationOperator; -import org.deeplearning4j.arbiter.optimize.generator.genetic.mutation.RandomMutationOperator; -import org.deeplearning4j.arbiter.optimize.generator.genetic.population.PopulationModel; - -import java.util.Arrays; - -/** - * A selection operator that will generate random genes initially. Once the population has reached the culled size, - * it will start to generate offspring from parents selected in the population. 
- * - * @author Alexandre Boulanger - */ -public class GeneticSelectionOperator extends SelectionOperator { - - private final static int PREVIOUS_GENES_TO_KEEP = 100; - private final static int MAX_NUM_GENERATION_ATTEMPTS = 1024; - - private final CrossoverOperator crossoverOperator; - private final MutationOperator mutationOperator; - private final RandomGenerator rng; - private double[][] previousGenes = new double[PREVIOUS_GENES_TO_KEEP][]; - private int previousGenesIdx = 0; - - public static class Builder { - private ChromosomeFactory chromosomeFactory; - private PopulationModel populationModel; - private CrossoverOperator crossoverOperator; - private MutationOperator mutationOperator; - private RandomGenerator rng; - - /** - * Use an alternate crossover behavior. Default is SinglePointCrossover. - * - * @param crossoverOperator An instance of CrossoverOperator - */ - public Builder crossoverOperator(CrossoverOperator crossoverOperator) { - this.crossoverOperator = crossoverOperator; - return this; - } - - /** - * Use an alternate mutation behavior. Default is RandomMutationOperator. 
- * - * @param mutationOperator An instance of MutationOperator - */ - public Builder mutationOperator(MutationOperator mutationOperator) { - this.mutationOperator = mutationOperator; - return this; - } - - /** - * Use a supplied RandomGenerator - * - * @param rng An instance of RandomGenerator - */ - public Builder randomGenerator(RandomGenerator rng) { - this.rng = rng; - return this; - } - - public GeneticSelectionOperator build() { - if (crossoverOperator == null) { - crossoverOperator = new SinglePointCrossover.Builder().build(); - } - - if (mutationOperator == null) { - mutationOperator = new RandomMutationOperator.Builder().build(); - } - - if (rng == null) { - rng = new SynchronizedRandomGenerator(new JDKRandomGenerator()); - } - - return new GeneticSelectionOperator(crossoverOperator, mutationOperator, rng); - } - } - - private GeneticSelectionOperator(CrossoverOperator crossoverOperator, MutationOperator mutationOperator, - RandomGenerator rng) { - this.crossoverOperator = crossoverOperator; - this.mutationOperator = mutationOperator; - this.rng = rng; - } - - /** - * Called by GeneticSearchCandidateGenerator - */ - @Override - public void initializeInstance(PopulationModel populationModel, ChromosomeFactory chromosomeFactory) { - super.initializeInstance(populationModel, chromosomeFactory); - crossoverOperator.initializeInstance(populationModel); - } - - /** - * Build a new set of genes. Has two distinct modes of operation - *
- * 1. While the population is not yet ready to breed, a new set of random genes is generated. - * 2. Once the population is ready to breed, an offspring of selected parents is built using the crossover and mutation operators. - *
- * @return Returns the generated set of genes - * @throws GeneticGenerationException If buildNextGenes() can't generate a set that has not already been tried, - * or if the crossover and the mutation operators can't generate a set, - * this exception is thrown. - */ - @Override - public double[] buildNextGenes() { - double[] result; - - boolean hasAlreadyBeenTried; - int attemptsRemaining = MAX_NUM_GENERATION_ATTEMPTS; - do { - if (populationModel.isReadyToBreed()) { - result = buildOffspring(); - } else { - result = buildRandomGenes(); - } - - hasAlreadyBeenTried = hasAlreadyBeenTried(result); - if (hasAlreadyBeenTried && --attemptsRemaining == 0) { - throw new GeneticGenerationException("Failed to generate a set of genes not already tried."); - } - } while (hasAlreadyBeenTried); - - previousGenes[previousGenesIdx] = result; - previousGenesIdx = ++previousGenesIdx % previousGenes.length; - - return result; - } - - private boolean hasAlreadyBeenTried(double[] genes) { - for (int i = 0; i < previousGenes.length; ++i) { - double[] current = previousGenes[i]; - if (current != null && Arrays.equals(current, genes)) { - return true; - } - } - - return false; - } - - private double[] buildOffspring() { - double[] offspringValues; - - boolean isModified; - int attemptsRemaining = MAX_NUM_GENERATION_ATTEMPTS; - do { - CrossoverResult crossoverResult = crossoverOperator.crossover(); - offspringValues = crossoverResult.getGenes(); - isModified = crossoverResult.isModified(); - isModified |= mutationOperator.mutate(offspringValues); - - if (!isModified && --attemptsRemaining == 0) { - throw new GeneticGenerationException( - String.format("Crossover and mutation operators failed to generate a new set of genes after %s attempts.", - MAX_NUM_GENERATION_ATTEMPTS)); - } - } while (!isModified); - - return offspringValues; - } - - private double[] buildRandomGenes() { - double[] randomValues = new double[chromosomeFactory.getChromosomeLength()]; - for (int i = 0; i < 
randomValues.length; ++i) { - randomValues[i] = rng.nextDouble(); - } - - return randomValues; - } - -} diff --git a/contrib/attic/arbiter/arbiter-core/src/main/java/org/deeplearning4j/arbiter/optimize/generator/genetic/selection/SelectionOperator.java b/contrib/attic/arbiter/arbiter-core/src/main/java/org/deeplearning4j/arbiter/optimize/generator/genetic/selection/SelectionOperator.java deleted file mode 100644 index 082a759d2..000000000 --- a/contrib/attic/arbiter/arbiter-core/src/main/java/org/deeplearning4j/arbiter/optimize/generator/genetic/selection/SelectionOperator.java +++ /dev/null @@ -1,46 +0,0 @@ -/* - * ****************************************************************************** - * * Copyright (c) 2021 Deeplearning4j Contributors - * * - * * This program and the accompanying materials are made available under the - * * terms of the Apache License, Version 2.0 which is available at - * * https://www.apache.org/licenses/LICENSE-2.0. - * * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * * License for the specific language governing permissions and limitations - * * under the License. - * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ - -package org.deeplearning4j.arbiter.optimize.generator.genetic.selection; - -import org.deeplearning4j.arbiter.optimize.generator.genetic.ChromosomeFactory; -import org.deeplearning4j.arbiter.optimize.generator.genetic.population.PopulationModel; - -/** - * An abstract class for all selection operators. Used by the GeneticSearchCandidateGenerator to generate new candidates. 
- * - * @author Alexandre Boulanger - */ -public abstract class SelectionOperator { - protected PopulationModel populationModel; - protected ChromosomeFactory chromosomeFactory; - - /** - * Called by GeneticSearchCandidateGenerator - */ - public void initializeInstance(PopulationModel populationModel, ChromosomeFactory chromosomeFactory) { - - this.populationModel = populationModel; - this.chromosomeFactory = chromosomeFactory; - } - - /** - * Generate a new set of genes. - */ - public abstract double[] buildNextGenes(); -} diff --git a/contrib/attic/arbiter/arbiter-core/src/main/java/org/deeplearning4j/arbiter/optimize/generator/util/SerializedSupplier.java b/contrib/attic/arbiter/arbiter-core/src/main/java/org/deeplearning4j/arbiter/optimize/generator/util/SerializedSupplier.java deleted file mode 100644 index ad5aa8b5a..000000000 --- a/contrib/attic/arbiter/arbiter-core/src/main/java/org/deeplearning4j/arbiter/optimize/generator/util/SerializedSupplier.java +++ /dev/null @@ -1,48 +0,0 @@ -/* - * ****************************************************************************** - * * Copyright (c) 2021 Deeplearning4j Contributors - * * - * * This program and the accompanying materials are made available under the - * * terms of the Apache License, Version 2.0 which is available at - * * https://www.apache.org/licenses/LICENSE-2.0. - * * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * * License for the specific language governing permissions and limitations - * * under the License. 
- * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ - -package org.deeplearning4j.arbiter.optimize.generator.util; - -import org.nd4j.common.function.Supplier; - -import java.io.*; - -public class SerializedSupplier implements Serializable, Supplier { - - private byte[] asBytes; - - public SerializedSupplier(T obj){ - try(ByteArrayOutputStream baos = new ByteArrayOutputStream(); ObjectOutputStream oos = new ObjectOutputStream(baos)){ - oos.writeObject(obj); - oos.flush(); - oos.close(); - asBytes = baos.toByteArray(); - } catch (Exception e){ - throw new RuntimeException("Error serializing object - must be serializable",e); - } - } - - @Override - public T get() { - try(ObjectInputStream ois = new ObjectInputStream(new ByteArrayInputStream(asBytes))){ - return (T)ois.readObject(); - } catch (Exception e){ - throw new RuntimeException("Error deserializing object",e); - } - } -} diff --git a/contrib/attic/arbiter/arbiter-core/src/main/java/org/deeplearning4j/arbiter/optimize/parameter/BooleanSpace.java b/contrib/attic/arbiter/arbiter-core/src/main/java/org/deeplearning4j/arbiter/optimize/parameter/BooleanSpace.java deleted file mode 100644 index 8fccb25fa..000000000 --- a/contrib/attic/arbiter/arbiter-core/src/main/java/org/deeplearning4j/arbiter/optimize/parameter/BooleanSpace.java +++ /dev/null @@ -1,78 +0,0 @@ -/* - * ****************************************************************************** - * * Copyright (c) 2021 Deeplearning4j Contributors - * * - * * This program and the accompanying materials are made available under the - * * terms of the Apache License, Version 2.0 which is available at - * * https://www.apache.org/licenses/LICENSE-2.0. - * * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. 
See the - * * License for the specific language governing permissions and limitations - * * under the License. - * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ - -package org.deeplearning4j.arbiter.optimize.parameter; - -import lombok.EqualsAndHashCode; -import org.deeplearning4j.arbiter.optimize.api.ParameterSpace; - -import java.util.Collections; -import java.util.List; -import java.util.Map; - -/** - * BooleanSpace is a {@code ParameterSpace<Boolean>}; defines {True, False} as a parameter space. - * If the argument to getValue is less than or equal to 0.5 it returns True, else False - * - * @author susaneraly - */ -@EqualsAndHashCode -public class BooleanSpace implements ParameterSpace<Boolean> { - private int index = -1; - - @Override - public Boolean getValue(double[] input) { - if (index == -1) { - throw new IllegalStateException("Cannot get value: ParameterSpace index has not been set"); - } - if (input[index] <= 0.5) return Boolean.TRUE; - else return Boolean.FALSE; - } - - @Override - public int numParameters() { - return 1; - } - - @Override - public List<ParameterSpace> collectLeaves() { - return Collections.singletonList((ParameterSpace) this); - } - - @Override - public Map<String, ParameterSpace> getNestedSpaces() { - return Collections.emptyMap(); - } - - @Override - public boolean isLeaf() { - return true; - } - - @Override - public void setIndices(int... 
indices) { - if (indices == null || indices.length != 1) - throw new IllegalArgumentException("Invalid index"); - this.index = indices[0]; - } - - @Override - public String toString() { - return "BooleanSpace()"; - } -} diff --git a/contrib/attic/arbiter/arbiter-core/src/main/java/org/deeplearning4j/arbiter/optimize/parameter/FixedValue.java b/contrib/attic/arbiter/arbiter-core/src/main/java/org/deeplearning4j/arbiter/optimize/parameter/FixedValue.java deleted file mode 100644 index 9ee607956..000000000 --- a/contrib/attic/arbiter/arbiter-core/src/main/java/org/deeplearning4j/arbiter/optimize/parameter/FixedValue.java +++ /dev/null @@ -1,92 +0,0 @@ -/* - * ****************************************************************************** - * * Copyright (c) 2021 Deeplearning4j Contributors - * * - * * This program and the accompanying materials are made available under the - * * terms of the Apache License, Version 2.0 which is available at - * * https://www.apache.org/licenses/LICENSE-2.0. - * * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * * License for the specific language governing permissions and limitations - * * under the License. 
- * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ - -package org.deeplearning4j.arbiter.optimize.parameter; - -import lombok.EqualsAndHashCode; -import lombok.Getter; -import org.deeplearning4j.arbiter.optimize.api.ParameterSpace; -import org.deeplearning4j.arbiter.optimize.serde.jackson.FixedValueDeserializer; -import org.deeplearning4j.arbiter.optimize.serde.jackson.FixedValueSerializer; -import org.deeplearning4j.arbiter.util.ObjectUtils; -import org.nd4j.shade.jackson.annotation.JsonCreator; -import org.nd4j.shade.jackson.annotation.JsonProperty; -import org.nd4j.shade.jackson.annotation.JsonTypeInfo; -import org.nd4j.shade.jackson.databind.annotation.JsonDeserialize; -import org.nd4j.shade.jackson.databind.annotation.JsonSerialize; - -import java.util.Collections; -import java.util.List; -import java.util.Map; - -/** - * FixedValue is a ParameterSpace that defines only a single fixed value - * - * @param Type of (fixed) value - */ -@EqualsAndHashCode -@JsonSerialize(using = FixedValueSerializer.class) -@JsonDeserialize(using = FixedValueDeserializer.class) -@JsonTypeInfo(use = JsonTypeInfo.Id.CLASS, include = JsonTypeInfo.As.PROPERTY, property = "@class") -public class FixedValue implements ParameterSpace { - @Getter - private Object value; - private int index; - - @JsonCreator - public FixedValue(@JsonProperty("value") T value) { - this.value = value; - } - - @Override - public String toString() { - return "FixedValue(" + ObjectUtils.valueToString(value) + ")"; - } - - @Override - public T getValue(double[] input) { - return (T) value; - } - - @Override - public int numParameters() { - return 0; - } - - @Override - public List collectLeaves() { - return Collections.emptyList(); - } - - @Override - public Map getNestedSpaces() { - return Collections.emptyMap(); - } - - @Override - public boolean isLeaf() { - return true; - } - - @Override - public void setIndices(int... 
indices) { - if (indices != null && indices.length != 0) - throw new IllegalArgumentException( - "Invalid call: FixedValue ParameterSpace " + "should not be given an index (0 params)"); - } -} diff --git a/contrib/attic/arbiter/arbiter-core/src/main/java/org/deeplearning4j/arbiter/optimize/parameter/continuous/ContinuousParameterSpace.java b/contrib/attic/arbiter/arbiter-core/src/main/java/org/deeplearning4j/arbiter/optimize/parameter/continuous/ContinuousParameterSpace.java deleted file mode 100644 index 4fa056744..000000000 --- a/contrib/attic/arbiter/arbiter-core/src/main/java/org/deeplearning4j/arbiter/optimize/parameter/continuous/ContinuousParameterSpace.java +++ /dev/null @@ -1,137 +0,0 @@ -/* - * ****************************************************************************** - * * Copyright (c) 2021 Deeplearning4j Contributors - * * - * * This program and the accompanying materials are made available under the - * * terms of the Apache License, Version 2.0 which is available at - * * https://www.apache.org/licenses/LICENSE-2.0. - * * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * * License for the specific language governing permissions and limitations - * * under the License. 
- * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ - -package org.deeplearning4j.arbiter.optimize.parameter.continuous; - -import org.apache.commons.math3.distribution.RealDistribution; -import org.apache.commons.math3.distribution.UniformRealDistribution; -import org.deeplearning4j.arbiter.optimize.api.ParameterSpace; -import org.deeplearning4j.arbiter.optimize.distribution.DistributionUtils; -import org.deeplearning4j.arbiter.optimize.serde.jackson.RealDistributionDeserializer; -import org.deeplearning4j.arbiter.optimize.serde.jackson.RealDistributionSerializer; -import org.nd4j.shade.jackson.annotation.JsonProperty; -import org.nd4j.shade.jackson.databind.annotation.JsonDeserialize; -import org.nd4j.shade.jackson.databind.annotation.JsonSerialize; - -import java.util.Collections; -import java.util.List; -import java.util.Map; - -/** - * ContinuousParameterSpace is a {@code ParameterSpace<Double>} that (optionally) takes an Apache Commons - * {@link RealDistribution} when used for random sampling (such as in a RandomSearchCandidateGenerator) - * - * @author Alex Black - */ -public class ContinuousParameterSpace implements ParameterSpace<Double> { - - //Need to use custom serializers/deserializers for commons RealDistribution instances - @JsonSerialize(using = RealDistributionSerializer.class) - @JsonDeserialize(using = RealDistributionDeserializer.class) - private RealDistribution distribution; - private int index = -1; - - /** - * ContinuousParameterSpace with a uniform distribution between the minimum and maximum values - * - * @param min Minimum value that can be generated - * @param max Maximum value that can be generated - */ - public ContinuousParameterSpace(double min, double max) { - this(new UniformRealDistribution(min, max)); - } - - /** - * ContinuousParameterSpace with a specified probability distribution. 
The provided distribution defines the min/max - * values, and (for random search, etc) will be used when generating random values - * - * @param distribution Distribution to sample from - */ - public ContinuousParameterSpace(@JsonProperty("distribution") RealDistribution distribution) { - this.distribution = distribution; - } - - - @Override - public Double getValue(double[] input) { - if (index == -1) { - throw new IllegalStateException("Cannot get value: ParameterSpace index has not been set"); - } - return distribution.inverseCumulativeProbability(input[index]); - } - - @Override - public int numParameters() { - return 1; - } - - @Override - public List collectLeaves() { - return Collections.singletonList((ParameterSpace) this); - } - - @Override - public Map getNestedSpaces() { - return Collections.emptyMap(); - } - - @Override - public boolean isLeaf() { - return true; - } - - @Override - public void setIndices(int... indices) { - if (indices == null || indices.length != 1) { - throw new IllegalArgumentException("Invalid index"); - } - this.index = indices[0]; - } - - - @Override - public String toString() { - if (distribution instanceof UniformRealDistribution) { - return "ContinuousParameterSpace(min=" + distribution.getSupportLowerBound() + ",max=" - + distribution.getSupportUpperBound() + ")"; - } else { - return "ContinuousParameterSpace(" + distribution + ")"; - } - } - - public boolean equals(Object o) { - if (o == this) - return true; - if (!(o instanceof ContinuousParameterSpace)) - return false; - final ContinuousParameterSpace other = (ContinuousParameterSpace) o; - if (distribution == null ? other.distribution != null - : !DistributionUtils.distributionsEqual(distribution, other.distribution)) - return false; - - return this.index == other.index; - } - - public int hashCode() { - final int PRIME = 59; - int result = 1; - result = result * PRIME + (distribution == null ? 
43 : distribution.getClass().hashCode()); - result = result * PRIME + this.index; - return result; - } -} diff --git a/contrib/attic/arbiter/arbiter-core/src/main/java/org/deeplearning4j/arbiter/optimize/parameter/discrete/DiscreteParameterSpace.java b/contrib/attic/arbiter/arbiter-core/src/main/java/org/deeplearning4j/arbiter/optimize/parameter/discrete/DiscreteParameterSpace.java deleted file mode 100644 index 5e113b371..000000000 --- a/contrib/attic/arbiter/arbiter-core/src/main/java/org/deeplearning4j/arbiter/optimize/parameter/discrete/DiscreteParameterSpace.java +++ /dev/null @@ -1,114 +0,0 @@ -/* - * ****************************************************************************** - * * Copyright (c) 2021 Deeplearning4j Contributors - * * - * * This program and the accompanying materials are made available under the - * * terms of the Apache License, Version 2.0 which is available at - * * https://www.apache.org/licenses/LICENSE-2.0. - * * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * * License for the specific language governing permissions and limitations - * * under the License. - * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ - -package org.deeplearning4j.arbiter.optimize.parameter.discrete; - -import lombok.EqualsAndHashCode; -import org.deeplearning4j.arbiter.optimize.api.ParameterSpace; -import org.deeplearning4j.arbiter.util.ObjectUtils; -import org.nd4j.shade.jackson.annotation.JsonProperty; -import org.nd4j.shade.jackson.databind.annotation.JsonSerialize; - -import java.util.*; - -/** - * A DiscreteParameterSpace is used for a set of un-ordered values - * - * @param
<P>
Parameter type - * @author Alex Black - */ -@EqualsAndHashCode -public class DiscreteParameterSpace
<P>
implements ParameterSpace
<P>
{ - - @JsonSerialize - private List
<P>
values; - private int index = -1; - - public DiscreteParameterSpace(@JsonProperty("values") P... values) { - if (values != null) - this.values = Arrays.asList(values); - } - - public DiscreteParameterSpace(Collection
<P>
values) { - this.values = new ArrayList<>(values); - } - - public int numValues() { - return values.size(); - } - - @Override - public P getValue(double[] input) { - if (index == -1) { - throw new IllegalStateException("Cannot get value: ParameterSpace index has not been set"); - } - if (values == null) - throw new IllegalStateException("Values are null."); - //Map a value in range [0,1] to one of the list of values - //First value: [0,width], second: (width,2*width], third: (3*width,4*width] etc - int size = values.size(); - if (size == 1) - return values.get(0); - double width = 1.0 / size; - int val = (int) (input[index] / width); - return values.get(Math.min(val, size - 1)); - } - - @Override - public int numParameters() { - return 1; - } - - @Override - public List collectLeaves() { - return Collections.singletonList((ParameterSpace) this); - } - - @Override - public Map getNestedSpaces() { - return Collections.emptyMap(); - } - - @Override - public boolean isLeaf() { - return true; - } - - @Override - public void setIndices(int... indices) { - if (indices == null || indices.length != 1) { - throw new IllegalArgumentException("Invalid index"); - } - this.index = indices[0]; - } - - @Override - public String toString() { - StringBuilder sb = new StringBuilder(); - sb.append("DiscreteParameterSpace("); - int n = values.size(); - for (int i = 0; i < n; i++) { - P value = values.get(i); - sb.append(ObjectUtils.valueToString(value)); - sb.append((i == n - 1 ? 
")" : ",")); - } - return sb.toString(); - } - - -} diff --git a/contrib/attic/arbiter/arbiter-core/src/main/java/org/deeplearning4j/arbiter/optimize/parameter/integer/IntegerParameterSpace.java b/contrib/attic/arbiter/arbiter-core/src/main/java/org/deeplearning4j/arbiter/optimize/parameter/integer/IntegerParameterSpace.java deleted file mode 100644 index ea9a1f784..000000000 --- a/contrib/attic/arbiter/arbiter-core/src/main/java/org/deeplearning4j/arbiter/optimize/parameter/integer/IntegerParameterSpace.java +++ /dev/null @@ -1,151 +0,0 @@ -/* - * ****************************************************************************** - * * Copyright (c) 2021 Deeplearning4j Contributors - * * - * * This program and the accompanying materials are made available under the - * * terms of the Apache License, Version 2.0 which is available at - * * https://www.apache.org/licenses/LICENSE-2.0. - * * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * * License for the specific language governing permissions and limitations - * * under the License. 
- * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ - -package org.deeplearning4j.arbiter.optimize.parameter.integer; - -import lombok.NoArgsConstructor; -import org.apache.commons.math3.distribution.IntegerDistribution; -import org.apache.commons.math3.distribution.UniformIntegerDistribution; -import org.deeplearning4j.arbiter.optimize.api.ParameterSpace; -import org.deeplearning4j.arbiter.optimize.distribution.DistributionUtils; -import org.deeplearning4j.arbiter.optimize.serde.jackson.IntegerDistributionDeserializer; -import org.deeplearning4j.arbiter.optimize.serde.jackson.IntegerDistributionSerializer; -import org.nd4j.shade.jackson.annotation.JsonCreator; -import org.nd4j.shade.jackson.annotation.JsonIgnoreProperties; -import org.nd4j.shade.jackson.annotation.JsonProperty; -import org.nd4j.shade.jackson.databind.annotation.JsonDeserialize; -import org.nd4j.shade.jackson.databind.annotation.JsonSerialize; - -import java.util.Collections; -import java.util.List; -import java.util.Map; - -/** - * IntegerParameterSpace is a {@code ParameterSpace<Integer>}; i.e., defines an ordered space of integers between - * some minimum and maximum value - * - * @author Alex Black - */ -@JsonIgnoreProperties({"min", "max"}) -@NoArgsConstructor -public class IntegerParameterSpace implements ParameterSpace<Integer> { - - @JsonSerialize(using = IntegerDistributionSerializer.class) - @JsonDeserialize(using = IntegerDistributionDeserializer.class) - private IntegerDistribution distribution; - private int index = -1; - - /** - * Create an IntegerParameterSpace with a uniform distribution between the specified min/max (inclusive) - * - * @param min Min value, inclusive - * @param max Max value, inclusive - */ - public IntegerParameterSpace(int min, int max) { - this(new UniformIntegerDistribution(min, max)); - } - - /** - * Create an IntegerParameterSpace from the given IntegerDistribution - * - * @param distribution 
Distribution to use - */ - @JsonCreator - public IntegerParameterSpace(@JsonProperty("distribution") IntegerDistribution distribution) { - this.distribution = distribution; - } - - public int getMin() { - return distribution.getSupportLowerBound(); - } - - public int getMax() { - return distribution.getSupportUpperBound(); - } - - @Override - public Integer getValue(double[] input) { - if (index == -1) { - throw new IllegalStateException("Cannot get value: ParameterSpace index has not been set"); - } - return distribution.inverseCumulativeProbability(input[index]); - } - - @Override - public int numParameters() { - return 1; - } - - @Override - public List collectLeaves() { - return Collections.singletonList((ParameterSpace) this); - } - - @Override - public Map getNestedSpaces() { - return Collections.emptyMap(); - } - - @Override - public boolean isLeaf() { - return true; - } - - @Override - public void setIndices(int... indices) { - if (indices == null || indices.length != 1) - throw new IllegalArgumentException("Invalid index"); - this.index = indices[0]; - } - - @Override - public String toString() { - if (distribution instanceof UniformIntegerDistribution) { - return "IntegerParameterSpace(min=" + distribution.getSupportLowerBound() + ",max=" - + distribution.getSupportUpperBound() + ")"; - } else { - return "IntegerParameterSpace(" + distribution + ")"; - } - } - - public boolean equals(Object o) { - if (o == this) - return true; - if (!(o instanceof IntegerParameterSpace)) - return false; - final IntegerParameterSpace other = (IntegerParameterSpace) o; - if (!other.canEqual(this)) - return false; - if (distribution == null ? other.distribution != null - : !DistributionUtils.distributionEquals(distribution, other.distribution)) - return false; - return this.index == other.index; - } - - public int hashCode() { - final int PRIME = 59; - int result = 1; - result = result * PRIME + (distribution == null ? 
43 : distribution.getClass().hashCode()); - result = result * PRIME + this.index; - return result; - } - - protected boolean canEqual(Object other) { - return other instanceof IntegerParameterSpace; - } -} diff --git a/contrib/attic/arbiter/arbiter-core/src/main/java/org/deeplearning4j/arbiter/optimize/parameter/math/MathOp.java b/contrib/attic/arbiter/arbiter-core/src/main/java/org/deeplearning4j/arbiter/optimize/parameter/math/MathOp.java deleted file mode 100644 index 632cf4bcf..000000000 --- a/contrib/attic/arbiter/arbiter-core/src/main/java/org/deeplearning4j/arbiter/optimize/parameter/math/MathOp.java +++ /dev/null @@ -1,71 +0,0 @@ -/* - * ****************************************************************************** - * * Copyright (c) 2021 Deeplearning4j Contributors - * * - * * This program and the accompanying materials are made available under the - * * terms of the Apache License, Version 2.0 which is available at - * * https://www.apache.org/licenses/LICENSE-2.0. - * * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * * License for the specific language governing permissions and limitations - * * under the License. - * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ - -package org.deeplearning4j.arbiter.optimize.parameter.math; - -import org.deeplearning4j.arbiter.optimize.api.AbstractParameterSpace; -import org.deeplearning4j.arbiter.optimize.api.ParameterSpace; - -import java.util.List; - -/** - * A simple parameter space that implements scalar mathematical operations on another parameter space. This allows you - * to do things like Y = X * 2, where X is a parameter space. 
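The IntegerParameterSpace above maps a uniform [0,1] draw through the distribution's inverse cumulative probability in `getValue`. A minimal standalone sketch of that mapping for a uniform integer distribution — names here are hypothetical, and the real class delegates to Commons Math's `UniformIntegerDistribution.inverseCumulativeProbability` rather than computing this directly:

```java
// Standalone sketch: map a uniform [0,1] draw to an integer in [min, max]
// via the inverse CDF of a uniform integer distribution. Hypothetical names;
// not the Arbiter or Commons Math API.
class UniformIntSpace {
    private final int min;
    private final int max;

    UniformIntSpace(int min, int max) {
        if (min > max) throw new IllegalArgumentException("min > max");
        this.min = min;
        this.max = max;
    }

    // Inverse CDF: each of the (max - min + 1) integers gets an equal
    // slice of [0, 1); a draw of exactly 1.0 is clamped to max.
    int value(double u) {
        int n = max - min + 1;
        int idx = (int) Math.floor(u * n);
        if (idx >= n) idx = n - 1; // clamp u == 1.0
        return min + idx;
    }
}
```

For instance, a space over layer sizes 16..256 maps u = 0.0 to 16, u = 1.0 to 256, and intermediate draws proportionally in between.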
For example, a layer size hyperparameter could be set - * using this to 2x the size of the previous layer - * - * @param <T> Type of the parameter space - * @author Alex Black - */ -public class MathOp<T extends Number> extends AbstractParameterSpace<T> { - - private ParameterSpace<T> parameterSpace; - private Op op; - private T scalar; - - public MathOp(ParameterSpace<T> parameterSpace, Op op, T scalar){ - this.parameterSpace = parameterSpace; - this.op = op; - this.scalar = scalar; - } - - @Override - public T getValue(double[] parameterValues) { - T u = parameterSpace.getValue(parameterValues); - return op.doOp(u, scalar); - } - - @Override - public int numParameters() { - return parameterSpace.numParameters(); - } - - @Override - public List<ParameterSpace> collectLeaves() { - return parameterSpace.collectLeaves(); - } - - @Override - public boolean isLeaf() { - return false; - } - - @Override - public void setIndices(int... indices) { - parameterSpace.setIndices(indices); - } -} diff --git a/contrib/attic/arbiter/arbiter-core/src/main/java/org/deeplearning4j/arbiter/optimize/parameter/math/Op.java b/contrib/attic/arbiter/arbiter-core/src/main/java/org/deeplearning4j/arbiter/optimize/parameter/math/Op.java deleted file mode 100644 index 9faae0182..000000000 --- a/contrib/attic/arbiter/arbiter-core/src/main/java/org/deeplearning4j/arbiter/optimize/parameter/math/Op.java +++ /dev/null @@ -1,78 +0,0 @@ -/* - * ****************************************************************************** - * * Copyright (c) 2021 Deeplearning4j Contributors - * * - * * This program and the accompanying materials are made available under the - * * terms of the Apache License, Version 2.0 which is available at - * * https://www.apache.org/licenses/LICENSE-2.0. - * * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the - * * License for the specific language governing permissions and limitations - * * under the License. - * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ - -package org.deeplearning4j.arbiter.optimize.parameter.math; - -public enum Op { - ADD, SUB, MUL, DIV; - - - //Package private - <T extends Number> T doOp(T first, T second){ - if(first instanceof Integer || first instanceof Long){ - long result; - switch (this){ - case ADD: - result = first.longValue() + second.longValue(); - break; - case SUB: - result = first.longValue() - second.longValue(); - break; - case MUL: - result = first.longValue() * second.longValue(); - break; - case DIV: - result = first.longValue() / second.longValue(); - break; - default: - throw new UnsupportedOperationException("Unknown op: " + this); - } - if(first instanceof Long){ - return (T)Long.valueOf(result); - } else { - return (T)Integer.valueOf((int)result); - } - } else if(first instanceof Double || first instanceof Float){ - double result; - switch (this){ - case ADD: - result = first.doubleValue() + second.doubleValue(); - break; - case SUB: - result = first.doubleValue() - second.doubleValue(); - break; - case MUL: - result = first.doubleValue() * second.doubleValue(); - break; - case DIV: - result = first.doubleValue() / second.doubleValue(); - break; - default: - throw new UnsupportedOperationException("Unknown op: " + this); - } - if(first instanceof Double){ - return (T)Double.valueOf(result); - } else { - return (T)Float.valueOf((float)result); - } - } else { - throw new UnsupportedOperationException("Not supported type: only Integer, Long, Double, Float supported" + - " here.
Got type: " + first.getClass()); - } - } -} diff --git a/contrib/attic/arbiter/arbiter-core/src/main/java/org/deeplearning4j/arbiter/optimize/parameter/math/PairMathOp.java b/contrib/attic/arbiter/arbiter-core/src/main/java/org/deeplearning4j/arbiter/optimize/parameter/math/PairMathOp.java deleted file mode 100644 index 24a6afa26..000000000 --- a/contrib/attic/arbiter/arbiter-core/src/main/java/org/deeplearning4j/arbiter/optimize/parameter/math/PairMathOp.java +++ /dev/null @@ -1,81 +0,0 @@ -/* - * ****************************************************************************** - * * Copyright (c) 2021 Deeplearning4j Contributors - * * - * * This program and the accompanying materials are made available under the - * * terms of the Apache License, Version 2.0 which is available at - * * https://www.apache.org/licenses/LICENSE-2.0. - * * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * * License for the specific language governing permissions and limitations - * * under the License. - * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ - -package org.deeplearning4j.arbiter.optimize.parameter.math; - -import org.deeplearning4j.arbiter.optimize.api.AbstractParameterSpace; -import org.deeplearning4j.arbiter.optimize.api.ParameterSpace; - -import java.util.ArrayList; -import java.util.Arrays; -import java.util.List; - -/** - * A simple parameter space that implements pairwise mathematical operations on another parameter space. This allows you - * to do things like Z = X + Y, where X and Y are parameter spaces. 
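The PairMathOp described above combines two parameter spaces (Z = X + Y), with each child consuming its own slice of the flat vector of hyperparameter draws (see its `setIndices` and `numParameters`). A standalone sketch of that composition idea, with hypothetical names rather than the Arbiter API:

```java
import java.util.function.IntBinaryOperator;

// Standalone sketch of the PairMathOp idea: each leaf consumes one slot of
// the flat draw vector, and the pair node combines its children's values
// with a binary operation. Hypothetical names; not the Arbiter API.
interface IntSpace {
    int value(double[] draws);   // evaluate against the flat vector of [0,1] draws
    int numParameters();         // how many slots of the vector this space consumes
}

final class UniformLeaf implements IntSpace {
    private final int min, max, index;
    UniformLeaf(int min, int max, int index) {
        this.min = min; this.max = max; this.index = index;
    }
    // Inverse CDF of a uniform integer distribution on [min, max]
    public int value(double[] draws) {
        int n = max - min + 1;
        int idx = Math.min((int) Math.floor(draws[index] * n), n - 1);
        return min + idx;
    }
    public int numParameters() { return 1; }
}

final class PairOp implements IntSpace {
    private final IntSpace first, second;
    private final IntBinaryOperator op;
    PairOp(IntSpace first, IntSpace second, IntBinaryOperator op) {
        this.first = first; this.second = second; this.op = op;
    }
    // Z = op(X, Y): each child reads its own slice of the draw vector
    public int value(double[] draws) {
        return op.applyAsInt(first.value(draws), second.value(draws));
    }
    // Mirrors PairMathOp.numParameters(): the pair consumes n1 + n2 slots
    public int numParameters() {
        return first.numParameters() + second.numParameters();
    }
}
```

A sum space over two one-slot leaves thus reports two parameters and evaluates both children against the same flat draw vector, which is the same bookkeeping PairMathOp does via `Arrays.copyOfRange` in `setIndices`.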
- * - * @param <T> Type of the parameter space - * @author Alex Black - */ -public class PairMathOp<T extends Number> extends AbstractParameterSpace<T> { - - private ParameterSpace<T> first; - private ParameterSpace<T> second; - private Op op; - - public PairMathOp(ParameterSpace<T> first, ParameterSpace<T> second, Op op){ - this.first = first; - this.second = second; - this.op = op; - } - - @Override - public T getValue(double[] parameterValues) { - T f = first.getValue(parameterValues); - T s = second.getValue(parameterValues); - return op.doOp(f, s); - } - - @Override - public int numParameters() { - return first.numParameters() + second.numParameters(); - } - - @Override - public List<ParameterSpace> collectLeaves() { - List<ParameterSpace> l = new ArrayList<>(); - l.addAll(first.collectLeaves()); - l.addAll(second.collectLeaves()); - return l; - } - - @Override - public boolean isLeaf() { - return false; - } - - @Override - public void setIndices(int... indices) { - int n1 = first.numParameters(); - int n2 = second.numParameters(); - int[] s1 = Arrays.copyOfRange(indices, 0, n1); - int[] s2 = Arrays.copyOfRange(indices, n1, n1+n2); - first.setIndices(s1); - second.setIndices(s2); - } -} diff --git a/contrib/attic/arbiter/arbiter-core/src/main/java/org/deeplearning4j/arbiter/optimize/runner/BaseOptimizationRunner.java b/contrib/attic/arbiter/arbiter-core/src/main/java/org/deeplearning4j/arbiter/optimize/runner/BaseOptimizationRunner.java deleted file mode 100644 index 292c64467..000000000 --- a/contrib/attic/arbiter/arbiter-core/src/main/java/org/deeplearning4j/arbiter/optimize/runner/BaseOptimizationRunner.java +++ /dev/null @@ -1,381 +0,0 @@ -/* - * ****************************************************************************** - * * Copyright (c) 2021 Deeplearning4j Contributors - * * - * * This program and the accompanying materials are made available under the - * * terms of the Apache License, Version 2.0 which is available at - * * https://www.apache.org/licenses/LICENSE-2.0.
- * * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * * License for the specific language governing permissions and limitations - * * under the License. - * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ - -package org.deeplearning4j.arbiter.optimize.runner; - -import org.nd4j.shade.guava.util.concurrent.ListenableFuture; -import lombok.AllArgsConstructor; -import lombok.Data; -import lombok.extern.slf4j.Slf4j; -import org.apache.commons.lang3.exception.ExceptionUtils; -import org.deeplearning4j.arbiter.optimize.api.Candidate; -import org.deeplearning4j.arbiter.optimize.api.OptimizationResult; -import org.deeplearning4j.arbiter.optimize.api.data.DataProvider; -import org.deeplearning4j.arbiter.optimize.api.data.DataSource; -import org.deeplearning4j.arbiter.optimize.api.saving.ResultReference; -import org.deeplearning4j.arbiter.optimize.api.score.ScoreFunction; -import org.deeplearning4j.arbiter.optimize.api.termination.TerminationCondition; -import org.deeplearning4j.arbiter.optimize.config.OptimizationConfiguration; -import org.deeplearning4j.arbiter.optimize.runner.listener.StatusListener; - -import java.util.*; -import java.util.concurrent.*; -import java.util.concurrent.atomic.AtomicInteger; -import java.util.concurrent.atomic.AtomicLong; - -/** - * BaseOptimization runner: responsible for scheduling tasks, saving results using the result saver, etc. 
- * - * @author Alex Black - */ -@Slf4j -public abstract class BaseOptimizationRunner implements IOptimizationRunner { - private static final int POLLING_FREQUENCY = 1; - private static final TimeUnit POLLING_FREQUENCY_UNIT = TimeUnit.SECONDS; - - protected OptimizationConfiguration config; - protected Queue> queuedFutures = new ConcurrentLinkedQueue<>(); - protected BlockingQueue> completedFutures = new LinkedBlockingQueue<>(); - protected AtomicInteger totalCandidateCount = new AtomicInteger(); - protected AtomicInteger numCandidatesCompleted = new AtomicInteger(); - protected AtomicInteger numCandidatesFailed = new AtomicInteger(); - protected Double bestScore = null; - protected Long bestScoreTime = null; - protected AtomicInteger bestScoreCandidateIndex = new AtomicInteger(-1); - protected List allResults = new ArrayList<>(); - - protected Map currentStatus = new ConcurrentHashMap<>(); //TODO: better design possible? - - protected ExecutorService futureListenerExecutor; - - protected List statusListeners = new ArrayList<>(); - - - protected BaseOptimizationRunner(OptimizationConfiguration config) { - this.config = config; - - if (config.getTerminationConditions() == null || config.getTerminationConditions().size() == 0) { - throw new IllegalArgumentException("Cannot create BaseOptimizationRunner without TerminationConditions (" - + "termination conditions are null or empty)"); - } - - } - - protected void init() { - futureListenerExecutor = Executors.newFixedThreadPool(maxConcurrentTasks(), new ThreadFactory() { - private AtomicLong counter = new AtomicLong(0); - - @Override - public Thread newThread(Runnable r) { - Thread t = Executors.defaultThreadFactory().newThread(r); - t.setDaemon(true); - t.setName("ArbiterOptimizationRunner-" + counter.getAndIncrement()); - return t; - } - }); - } - - /** - * - */ - @Override - public void execute() { - log.info("{}: execution started", this.getClass().getSimpleName()); - 
config.setExecutionStartTime(System.currentTimeMillis()); - for (StatusListener listener : statusListeners) { - listener.onInitialization(this); - } - - //Initialize termination conditions (start timers, etc) - for (TerminationCondition c : config.getTerminationConditions()) { - c.initialize(this); - } - - //Queue initial tasks: - List> tempList = new ArrayList<>(100); - while (true) { - //Otherwise: add tasks if required - Future future = null; - try { - future = completedFutures.poll(POLLING_FREQUENCY, POLLING_FREQUENCY_UNIT); - } catch (InterruptedException e) { - //No op? - } - if (future != null) { - tempList.add(future); - } - completedFutures.drainTo(tempList); - - //Process results (if any) - for (Future f : tempList) { - queuedFutures.remove(f); - processReturnedTask(f); - } - - if (tempList.size() > 0) { - for (StatusListener sl : statusListeners) { - sl.onRunnerStatusChange(this); - } - } - tempList.clear(); - - //Check termination conditions: - if (terminate()) { - shutdown(true); - break; - } - - //Add additional tasks - while (config.getCandidateGenerator().hasMoreCandidates() && queuedFutures.size() < maxConcurrentTasks()) { - Candidate candidate = config.getCandidateGenerator().getCandidate(); - CandidateInfo status; - if (candidate.getException() != null) { - //Failed on generation... 
- status = processFailedCandidates(candidate); - } else { - long created = System.currentTimeMillis(); - ListenableFuture f; - if(config.getDataSource() != null){ - f = execute(candidate, config.getDataSource(), config.getDataSourceProperties(), config.getScoreFunction()); - } else { - f = execute(candidate, config.getDataProvider(), config.getScoreFunction()); - } - f.addListener(new OnCompletionListener(f), futureListenerExecutor); - queuedFutures.add(f); - totalCandidateCount.getAndIncrement(); - - status = new CandidateInfo(candidate.getIndex(), CandidateStatus.Created, null, - created, null, null, candidate.getFlatParameters(), null); - currentStatus.put(candidate.getIndex(), status); - } - - for (StatusListener listener : statusListeners) { - listener.onCandidateStatusChange(status, this, null); - } - } - } - - //Process any final (completed) tasks: - completedFutures.drainTo(tempList); - for (Future f : tempList) { - queuedFutures.remove(f); - processReturnedTask(f); - } - tempList.clear(); - - log.info("Optimization runner: execution complete"); - for (StatusListener listener : statusListeners) { - listener.onShutdown(this); - } - } - - - private CandidateInfo processFailedCandidates(Candidate candidate) { - //In case the candidate fails during the creation of the candidate - - long time = System.currentTimeMillis(); - String stackTrace = ExceptionUtils.getStackTrace(candidate.getException()); - CandidateInfo newStatus = new CandidateInfo(candidate.getIndex(), CandidateStatus.Failed, null, time, time, - time, candidate.getFlatParameters(), stackTrace); - currentStatus.put(candidate.getIndex(), newStatus); - - return newStatus; - } - - /** - * Process returned task (either completed or failed - */ - private void processReturnedTask(Future future) { - long currentTime = System.currentTimeMillis(); - OptimizationResult result; - try { - result = future.get(100, TimeUnit.MILLISECONDS); - } catch (InterruptedException e) { - throw new 
RuntimeException("Unexpected InterruptedException thrown for task", e); - } catch (ExecutionException e) { - //Note that most of the time, an OptimizationResult is returned even for an exception - //This is just to handle any that are missed there (or, by implementations that don't properly do this) - log.warn("Task failed", e); - - numCandidatesFailed.getAndIncrement(); - return; - } catch (TimeoutException e) { - throw new RuntimeException(e); //TODO - } - - //Update internal status: - CandidateInfo status = currentStatus.get(result.getIndex()); - CandidateInfo newStatus = new CandidateInfo(result.getIndex(), result.getCandidateInfo().getCandidateStatus(), - result.getScore(), status.getCreatedTime(), result.getCandidateInfo().getStartTime(), - currentTime, status.getFlatParams(), result.getCandidateInfo().getExceptionStackTrace()); - currentStatus.put(result.getIndex(), newStatus); - - //Listeners (on complete, etc) should be executed in underlying task - - - if (result.getCandidateInfo().getCandidateStatus() == CandidateStatus.Failed) { - log.info("Task {} failed during execution: {}", result.getIndex(), result.getCandidateInfo().getExceptionStackTrace()); - numCandidatesFailed.getAndIncrement(); - } else { - - //Report completion to candidate generator - config.getCandidateGenerator().reportResults(result); - - Double score = result.getScore(); - log.info("Completed task {}, score = {}", result.getIndex(), result.getScore()); - - boolean minimize = config.getScoreFunction().minimize(); - if (score != null && (bestScore == null - || ((minimize && score < bestScore) || (!minimize && score > bestScore)))) { - if (bestScore == null) { - log.info("New best score: {} (first completed model)", score); - } else { - int idx = result.getIndex(); - int lastBestIdx = bestScoreCandidateIndex.get(); - log.info("New best score: {}, model {} (prev={}, model {})", score, idx, bestScore, lastBestIdx); - } - bestScore = score; - bestScoreTime = System.currentTimeMillis(); - 
bestScoreCandidateIndex.set(result.getIndex()); - } - numCandidatesCompleted.getAndIncrement(); - - //Model saving is done in the optimization tasks, to avoid CUDA threading issues - ResultReference resultReference = result.getResultReference(); - - if (resultReference != null) - allResults.add(resultReference); - } - } - - @Override - public int numCandidatesTotal() { - return totalCandidateCount.get(); - } - - @Override - public int numCandidatesCompleted() { - return numCandidatesCompleted.get(); - } - - @Override - public int numCandidatesFailed() { - return numCandidatesFailed.get(); - } - - @Override - public int numCandidatesQueued() { - return queuedFutures.size(); - } - - @Override - public Double bestScore() { - return bestScore; - } - - @Override - public Long bestScoreTime() { - return bestScoreTime; - } - - @Override - public int bestScoreCandidateIndex() { - return bestScoreCandidateIndex.get(); - } - - @Override - public List getResults() { - return new ArrayList<>(allResults); - } - - @Override - public OptimizationConfiguration getConfiguration() { - return config; - } - - - @Override - public void addListeners(StatusListener... listeners) { - for (StatusListener l : listeners) { - if (!statusListeners.contains(l)) { - statusListeners.add(l); - } - } - } - - @Override - public void removeListeners(StatusListener... 
listeners) { - for (StatusListener l : listeners) { - statusListeners.remove(l); - } - } - - @Override - public void removeAllListeners() { - statusListeners.clear(); - } - - @Override - public List getCandidateStatus() { - return new ArrayList<>(currentStatus.values()); - } - - private boolean terminate() { - for (TerminationCondition c : config.getTerminationConditions()) { - if (c.terminate(this)) { - log.info("BaseOptimizationRunner global termination condition hit: {}", c); - return true; - } - } - return false; - } - - @AllArgsConstructor - @Data - private class FutureDetails { - private final Future future; - private final long startTime; - private final int index; - } - - @AllArgsConstructor - private class OnCompletionListener implements Runnable { - private Future future; - - @Override - public void run() { - completedFutures.add(future); - } - } - - - protected abstract int maxConcurrentTasks(); - - @Deprecated - protected abstract ListenableFuture execute(Candidate candidate, DataProvider dataProvider, - ScoreFunction scoreFunction); - @Deprecated - protected abstract List> execute(List candidates, - DataProvider dataProvider, ScoreFunction scoreFunction); - - protected abstract ListenableFuture execute(Candidate candidate, Class dataSource, - Properties dataSourceProperties, ScoreFunction scoreFunction); - - protected abstract List> execute(List candidates, Class dataSource, - Properties dataSourceProperties, ScoreFunction scoreFunction); -} diff --git a/contrib/attic/arbiter/arbiter-core/src/main/java/org/deeplearning4j/arbiter/optimize/runner/CandidateInfo.java b/contrib/attic/arbiter/arbiter-core/src/main/java/org/deeplearning4j/arbiter/optimize/runner/CandidateInfo.java deleted file mode 100644 index e179a61f3..000000000 --- a/contrib/attic/arbiter/arbiter-core/src/main/java/org/deeplearning4j/arbiter/optimize/runner/CandidateInfo.java +++ /dev/null @@ -1,43 +0,0 @@ -/* - * 
****************************************************************************** - * * Copyright (c) 2021 Deeplearning4j Contributors - * * - * * This program and the accompanying materials are made available under the - * * terms of the Apache License, Version 2.0 which is available at - * * https://www.apache.org/licenses/LICENSE-2.0. - * * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * * License for the specific language governing permissions and limitations - * * under the License. - * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ - -package org.deeplearning4j.arbiter.optimize.runner; - -import lombok.AllArgsConstructor; -import lombok.Data; - -/** - * Simple helper class to store status of a candidate that is/has been/will be executed - */ -@AllArgsConstructor -@Data -public class CandidateInfo { - - public CandidateInfo() { - //No arg constructor for Jackson - } - - private int index; - private CandidateStatus candidateStatus; - private Double score; - private long createdTime; - private Long startTime; - private Long endTime; - private double[] flatParams; //Same as parameters in Candidate class - private String exceptionStackTrace; -} diff --git a/contrib/attic/arbiter/arbiter-core/src/main/java/org/deeplearning4j/arbiter/optimize/runner/CandidateStatus.java b/contrib/attic/arbiter/arbiter-core/src/main/java/org/deeplearning4j/arbiter/optimize/runner/CandidateStatus.java deleted file mode 100644 index 45d490a82..000000000 --- a/contrib/attic/arbiter/arbiter-core/src/main/java/org/deeplearning4j/arbiter/optimize/runner/CandidateStatus.java +++ /dev/null @@ -1,26 +0,0 @@ -/* - * ****************************************************************************** - * * Copyright (c) 2021 
Deeplearning4j Contributors - * * - * * This program and the accompanying materials are made available under the - * * terms of the Apache License, Version 2.0 which is available at - * * https://www.apache.org/licenses/LICENSE-2.0. - * * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * * License for the specific language governing permissions and limitations - * * under the License. - * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ - -package org.deeplearning4j.arbiter.optimize.runner; - -/** - * Status for candidates - */ -public enum CandidateStatus { - Created, Running, Complete, Failed, Cancelled -} diff --git a/contrib/attic/arbiter/arbiter-core/src/main/java/org/deeplearning4j/arbiter/optimize/runner/IOptimizationRunner.java b/contrib/attic/arbiter/arbiter-core/src/main/java/org/deeplearning4j/arbiter/optimize/runner/IOptimizationRunner.java deleted file mode 100644 index 9511720b1..000000000 --- a/contrib/attic/arbiter/arbiter-core/src/main/java/org/deeplearning4j/arbiter/optimize/runner/IOptimizationRunner.java +++ /dev/null @@ -1,69 +0,0 @@ -/* - * ****************************************************************************** - * * Copyright (c) 2021 Deeplearning4j Contributors - * * - * * This program and the accompanying materials are made available under the - * * terms of the Apache License, Version 2.0 which is available at - * * https://www.apache.org/licenses/LICENSE-2.0. - * * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. 
See the - * * License for the specific language governing permissions and limitations - * * under the License. - * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ - -package org.deeplearning4j.arbiter.optimize.runner; - -import org.deeplearning4j.arbiter.optimize.api.saving.ResultReference; -import org.deeplearning4j.arbiter.optimize.config.OptimizationConfiguration; -import org.deeplearning4j.arbiter.optimize.runner.listener.StatusListener; -import org.nd4j.shade.jackson.annotation.JsonTypeInfo; - -import java.util.List; - -@JsonTypeInfo(use = JsonTypeInfo.Id.CLASS, include = JsonTypeInfo.As.PROPERTY, property = "@class") -public interface IOptimizationRunner { - - void execute(); - - /** Total number of candidates: created (scheduled), completed and failed */ - int numCandidatesTotal(); - - int numCandidatesCompleted(); - - int numCandidatesFailed(); - - /** Number of candidates running or queued */ - int numCandidatesQueued(); - - /** Best score found so far */ - Double bestScore(); - - /** Time that the best score was found at, or 0 if no jobs have completed successfully */ - Long bestScoreTime(); - - /** Index of the best scoring candidate, or -1 if no candidate has scored yet*/ - int bestScoreCandidateIndex(); - - List getResults(); - - OptimizationConfiguration getConfiguration(); - - void addListeners(StatusListener... listeners); - - void removeListeners(StatusListener... listeners); - - void removeAllListeners(); - - List getCandidateStatus(); - - /** - * @param awaitCompletion If true: await completion of currently scheduled tasks. 
If false: shutdown immediately, - * cancelling any currently executing tasks - */ - void shutdown(boolean awaitCompletion); -} diff --git a/contrib/attic/arbiter/arbiter-core/src/main/java/org/deeplearning4j/arbiter/optimize/runner/LocalOptimizationRunner.java b/contrib/attic/arbiter/arbiter-core/src/main/java/org/deeplearning4j/arbiter/optimize/runner/LocalOptimizationRunner.java deleted file mode 100644 index 24a7546c5..000000000 --- a/contrib/attic/arbiter/arbiter-core/src/main/java/org/deeplearning4j/arbiter/optimize/runner/LocalOptimizationRunner.java +++ /dev/null @@ -1,152 +0,0 @@ -/* - * ****************************************************************************** - * * Copyright (c) 2021 Deeplearning4j Contributors - * * - * * This program and the accompanying materials are made available under the - * * terms of the Apache License, Version 2.0 which is available at - * * https://www.apache.org/licenses/LICENSE-2.0. - * * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * * License for the specific language governing permissions and limitations - * * under the License. 
- * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ - -package org.deeplearning4j.arbiter.optimize.runner; - -import org.nd4j.shade.guava.util.concurrent.ListenableFuture; -import org.nd4j.shade.guava.util.concurrent.ListeningExecutorService; -import org.nd4j.shade.guava.util.concurrent.MoreExecutors; -import lombok.Setter; -import org.deeplearning4j.arbiter.optimize.api.*; -import org.deeplearning4j.arbiter.optimize.api.data.DataProvider; -import org.deeplearning4j.arbiter.optimize.api.data.DataSource; -import org.deeplearning4j.arbiter.optimize.api.score.ScoreFunction; -import org.deeplearning4j.arbiter.optimize.config.OptimizationConfiguration; - -import java.util.ArrayList; -import java.util.Collections; -import java.util.List; -import java.util.Properties; -import java.util.concurrent.*; -import java.util.concurrent.atomic.AtomicLong; - -/** - * LocalOptimizationRunner: execute hyperparameter optimization - * locally (on current machine, in current JVM). 
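BaseOptimizationRunner's `processReturnedTask` (above) keeps the best score by comparing each completed candidate's score against the current best in the direction given by the score function's `minimize()` flag, treating the first completed score as the initial best. A standalone sketch of that update rule, with hypothetical names rather than the Arbiter API:

```java
// Standalone sketch of the best-score update rule used in
// BaseOptimizationRunner.processReturnedTask: a new score replaces the
// current best if it is the first score seen, or if it improves in the
// direction given by the minimize flag. Hypothetical names.
class BestScoreTracker {
    private final boolean minimize;
    private Double bestScore = null; // null until the first candidate completes

    BestScoreTracker(boolean minimize) { this.minimize = minimize; }

    // Returns true if this candidate's score became the new best
    boolean report(Double score) {
        if (score == null) return false; // failed/unscored candidates are ignored
        if (bestScore == null || (minimize ? score < bestScore : score > bestScore)) {
            bestScore = score;
            return true;
        }
        return false;
    }

    Double best() { return bestScore; }
}
```

With `minimize = true` (e.g. a loss-based score function), a score of 0.3 displaces an earlier best of 0.5, while 0.7 is ignored; with `minimize = false` the comparison flips, matching the `(minimize && score < bestScore) || (!minimize && score > bestScore)` test in the runner.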
- * - * @author Alex Black - */ -public class LocalOptimizationRunner extends BaseOptimizationRunner { - - public static final int DEFAULT_MAX_CONCURRENT_TASKS = 1; - - private final int maxConcurrentTasks; - - private TaskCreator taskCreator; - private ListeningExecutorService executor; - @Setter - private long shutdownMaxWaitMS = 2L * 24 * 60 * 60 * 1000; - - public LocalOptimizationRunner(OptimizationConfiguration config){ - this(config, null); - } - - public LocalOptimizationRunner(OptimizationConfiguration config, TaskCreator taskCreator) { - this(DEFAULT_MAX_CONCURRENT_TASKS, config, taskCreator); - } - - public LocalOptimizationRunner(int maxConcurrentTasks, OptimizationConfiguration config){ - this(maxConcurrentTasks, config, null); - } - - public LocalOptimizationRunner(int maxConcurrentTasks, OptimizationConfiguration config, TaskCreator taskCreator) { - super(config); - if (maxConcurrentTasks <= 0) - throw new IllegalArgumentException("maxConcurrentTasks must be > 0 (got: " + maxConcurrentTasks + ")"); - this.maxConcurrentTasks = maxConcurrentTasks; - - if(taskCreator == null){ - Class psClass = config.getCandidateGenerator().getParameterSpace().getClass(); - taskCreator = TaskCreatorProvider.defaultTaskCreatorFor(psClass); - if(taskCreator == null){ - throw new IllegalStateException("No TaskCreator was provided and a default TaskCreator cannot be " + - "inferred for ParameterSpace class " + psClass.getName() + ". 
Please provide a TaskCreator " + - "via the LocalOptimizationRunner constructor"); - } - } - - this.taskCreator = taskCreator; - - ExecutorService exec = Executors.newFixedThreadPool(maxConcurrentTasks, new ThreadFactory() { - private AtomicLong counter = new AtomicLong(0); - - @Override - public Thread newThread(Runnable r) { - Thread t = Executors.defaultThreadFactory().newThread(r); - t.setDaemon(true); - t.setName("LocalCandidateExecutor-" + counter.getAndIncrement()); - return t; - } - }); - executor = MoreExecutors.listeningDecorator(exec); - - init(); - } - - @Override - protected int maxConcurrentTasks() { - return maxConcurrentTasks; - } - - @Override - protected ListenableFuture execute(Candidate candidate, DataProvider dataProvider, - ScoreFunction scoreFunction) { - return execute(Collections.singletonList(candidate), dataProvider, scoreFunction).get(0); - } - - @Override - protected List> execute(List candidates, DataProvider dataProvider, - ScoreFunction scoreFunction) { - List> list = new ArrayList<>(candidates.size()); - for (Candidate candidate : candidates) { - Callable task = - taskCreator.create(candidate, dataProvider, scoreFunction, statusListeners, this); - list.add(executor.submit(task)); - } - return list; - } - - @Override - protected ListenableFuture execute(Candidate candidate, Class dataSource, Properties dataSourceProperties, ScoreFunction scoreFunction) { - return execute(Collections.singletonList(candidate), dataSource, dataSourceProperties, scoreFunction).get(0); - } - - @Override - protected List> execute(List candidates, Class dataSource, Properties dataSourceProperties, ScoreFunction scoreFunction) { - List> list = new ArrayList<>(candidates.size()); - for (Candidate candidate : candidates) { - Callable task = taskCreator.create(candidate, dataSource, dataSourceProperties, scoreFunction, statusListeners, this); - list.add(executor.submit(task)); - } - return list; - } - - @Override - public void shutdown(boolean awaitTermination) 
{ - if(awaitTermination){ - try { - executor.shutdown(); - executor.awaitTermination(shutdownMaxWaitMS, TimeUnit.MILLISECONDS); - } catch (InterruptedException e){ - throw new RuntimeException(e); - } - } else { - executor.shutdownNow(); - } - } -} diff --git a/contrib/attic/arbiter/arbiter-core/src/main/java/org/deeplearning4j/arbiter/optimize/runner/listener/BaseStatusListener.java b/contrib/attic/arbiter/arbiter-core/src/main/java/org/deeplearning4j/arbiter/optimize/runner/listener/BaseStatusListener.java deleted file mode 100644 index 11c101483..000000000 --- a/contrib/attic/arbiter/arbiter-core/src/main/java/org/deeplearning4j/arbiter/optimize/runner/listener/BaseStatusListener.java +++ /dev/null @@ -1,56 +0,0 @@ -/* - * ****************************************************************************** - * * Copyright (c) 2021 Deeplearning4j Contributors - * * - * * This program and the accompanying materials are made available under the - * * terms of the Apache License, Version 2.0 which is available at - * * https://www.apache.org/licenses/LICENSE-2.0. - * * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * * License for the specific language governing permissions and limitations - * * under the License. - * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ - -package org.deeplearning4j.arbiter.optimize.runner.listener; - -import org.deeplearning4j.arbiter.optimize.api.OptimizationResult; -import org.deeplearning4j.arbiter.optimize.runner.CandidateInfo; -import org.deeplearning4j.arbiter.optimize.runner.IOptimizationRunner; - -/** - * BaseStatusListener: implements all methods of {@link StatusListener} as no-op. 
- * Users can extend this and override only the methods actually required - * - * @author Alex Black - */ -public abstract class BaseStatusListener implements StatusListener{ - @Override - public void onInitialization(IOptimizationRunner runner) { - //No op - } - - @Override - public void onShutdown(IOptimizationRunner runner) { - //No op - } - - @Override - public void onRunnerStatusChange(IOptimizationRunner runner) { - //No op - } - - @Override - public void onCandidateStatusChange(CandidateInfo candidateInfo, IOptimizationRunner runner, OptimizationResult result) { - //No op - } - - @Override - public void onCandidateIteration(CandidateInfo candidateInfo, Object candidate, int iteration) { - //No op - } -} diff --git a/contrib/attic/arbiter/arbiter-core/src/main/java/org/deeplearning4j/arbiter/optimize/runner/listener/StatusChangeType.java b/contrib/attic/arbiter/arbiter-core/src/main/java/org/deeplearning4j/arbiter/optimize/runner/listener/StatusChangeType.java deleted file mode 100644 index ea19d59dd..000000000 --- a/contrib/attic/arbiter/arbiter-core/src/main/java/org/deeplearning4j/arbiter/optimize/runner/listener/StatusChangeType.java +++ /dev/null @@ -1,28 +0,0 @@ -/* - * ****************************************************************************** - * * Copyright (c) 2021 Deeplearning4j Contributors - * * - * * This program and the accompanying materials are made available under the - * * terms of the Apache License, Version 2.0 which is available at - * * https://www.apache.org/licenses/LICENSE-2.0. - * * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * * License for the specific language governing permissions and limitations - * * under the License. 
- * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ - -package org.deeplearning4j.arbiter.optimize.runner.listener; - -/** - * Created by Alex on 20/07/2017. - */ -public enum StatusChangeType { - - CandidateCompleted, CandidateFailed, CandidateNewScheduled, CandidateNewBestScore - -} diff --git a/contrib/attic/arbiter/arbiter-core/src/main/java/org/deeplearning4j/arbiter/optimize/runner/listener/StatusListener.java b/contrib/attic/arbiter/arbiter-core/src/main/java/org/deeplearning4j/arbiter/optimize/runner/listener/StatusListener.java deleted file mode 100644 index 256c8a2df..000000000 --- a/contrib/attic/arbiter/arbiter-core/src/main/java/org/deeplearning4j/arbiter/optimize/runner/listener/StatusListener.java +++ /dev/null @@ -1,62 +0,0 @@ -/* - * ****************************************************************************** - * * Copyright (c) 2021 Deeplearning4j Contributors - * * - * * This program and the accompanying materials are made available under the - * * terms of the Apache License, Version 2.0 which is available at - * * https://www.apache.org/licenses/LICENSE-2.0. - * * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * * License for the specific language governing permissions and limitations - * * under the License. 
- * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ - -package org.deeplearning4j.arbiter.optimize.runner.listener; - -import org.deeplearning4j.arbiter.optimize.api.OptimizationResult; -import org.deeplearning4j.arbiter.optimize.runner.CandidateInfo; -import org.deeplearning4j.arbiter.optimize.runner.IOptimizationRunner; - -/** - * The StatusListener interface is used to inspect/track the status of execution, both for individual candidates, - * and for the optimization runner overall. - * - * @author Alex Black - */ -public interface StatusListener { - - /** Called when the optimization runner starts execution */ - void onInitialization(IOptimizationRunner runner); - - /** Called when the optimization runner terminates */ - void onShutdown(IOptimizationRunner runner); - - /** Called when any of the summary stats change, for the optimization runner: - * number scheduled, number completed, number failed, best score, etc. */ - void onRunnerStatusChange(IOptimizationRunner runner); - - /** - * Called when the status of a candidate changes: for example, created, completed, or failed. - * - * @param candidateInfo Candidate information - * @param runner Optimization runner calling this method - * @param result Optimization result. May be null. - */ - void onCandidateStatusChange(CandidateInfo candidateInfo, IOptimizationRunner runner, OptimizationResult result); - - /** - * This method may be called by tasks as they are executing. 
The intent of this method is to report partial results, - * such as different stages of learning, or scores/evaluations so far - * - * @param candidateInfo Candidate information - * @param candidate Current candidate value/configuration - * @param iteration Current iteration number - */ - void onCandidateIteration(CandidateInfo candidateInfo, Object candidate, int iteration); - -} diff --git a/contrib/attic/arbiter/arbiter-core/src/main/java/org/deeplearning4j/arbiter/optimize/runner/listener/impl/LoggingStatusListener.java b/contrib/attic/arbiter/arbiter-core/src/main/java/org/deeplearning4j/arbiter/optimize/runner/listener/impl/LoggingStatusListener.java deleted file mode 100644 index 1b82b9b1a..000000000 --- a/contrib/attic/arbiter/arbiter-core/src/main/java/org/deeplearning4j/arbiter/optimize/runner/listener/impl/LoggingStatusListener.java +++ /dev/null @@ -1,59 +0,0 @@ -/* - * ****************************************************************************** - * * Copyright (c) 2021 Deeplearning4j Contributors - * * - * * This program and the accompanying materials are made available under the - * * terms of the Apache License, Version 2.0 which is available at - * * https://www.apache.org/licenses/LICENSE-2.0. - * * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * * License for the specific language governing permissions and limitations - * * under the License. 
- * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ - -package org.deeplearning4j.arbiter.optimize.runner.listener.impl; - -import lombok.extern.slf4j.Slf4j; -import org.deeplearning4j.arbiter.optimize.api.OptimizationResult; -import org.deeplearning4j.arbiter.optimize.runner.CandidateInfo; -import org.deeplearning4j.arbiter.optimize.runner.IOptimizationRunner; -import org.deeplearning4j.arbiter.optimize.runner.listener.StatusListener; - -/** - * Created by Alex on 20/07/2017. - */ -@Slf4j -public class LoggingStatusListener implements StatusListener { - - - @Override - public void onInitialization(IOptimizationRunner runner) { - log.info("Optimization runner: initialized"); - } - - @Override - public void onShutdown(IOptimizationRunner runner) { - log.info("Optimization runner: shut down"); - } - - @Override - public void onRunnerStatusChange(IOptimizationRunner runner) { - log.info("Optimization runner: status change"); - } - - @Override - public void onCandidateStatusChange(CandidateInfo candidateInfo, IOptimizationRunner runner, - OptimizationResult result) { - log.info("Candidate status change: {}", candidateInfo); - } - - @Override - public void onCandidateIteration(CandidateInfo candidateInfo, Object candidate, int iteration) { - log.info("Candidate iteration #{} - {}", iteration, candidate); - } -} diff --git a/contrib/attic/arbiter/arbiter-core/src/main/java/org/deeplearning4j/arbiter/optimize/serde/jackson/FixedValueDeserializer.java b/contrib/attic/arbiter/arbiter-core/src/main/java/org/deeplearning4j/arbiter/optimize/serde/jackson/FixedValueDeserializer.java deleted file mode 100644 index abb668cd4..000000000 --- a/contrib/attic/arbiter/arbiter-core/src/main/java/org/deeplearning4j/arbiter/optimize/serde/jackson/FixedValueDeserializer.java +++ /dev/null @@ -1,70 +0,0 @@ -/* - * ****************************************************************************** - * * Copyright 
(c) 2021 Deeplearning4j Contributors - * * - * * This program and the accompanying materials are made available under the - * * terms of the Apache License, Version 2.0 which is available at - * * https://www.apache.org/licenses/LICENSE-2.0. - * * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * * License for the specific language governing permissions and limitations - * * under the License. - * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ - -package org.deeplearning4j.arbiter.optimize.serde.jackson; - -import org.apache.commons.codec.binary.Base64; -import org.deeplearning4j.arbiter.optimize.parameter.FixedValue; -import org.nd4j.shade.jackson.core.JsonParser; -import org.nd4j.shade.jackson.core.JsonProcessingException; -import org.nd4j.shade.jackson.databind.DeserializationContext; -import org.nd4j.shade.jackson.databind.JsonDeserializer; -import org.nd4j.shade.jackson.databind.JsonNode; -import org.nd4j.shade.jackson.databind.ObjectMapper; - -import java.io.ByteArrayInputStream; -import java.io.IOException; -import java.io.ObjectInputStream; - -/** - * A custom deserializer to be used in conjunction with {@link FixedValueSerializer} - * @author Alex Black - */ -public class FixedValueDeserializer extends JsonDeserializer { - @Override - public FixedValue deserialize(JsonParser p, DeserializationContext deserializationContext) throws IOException, JsonProcessingException { - JsonNode node = p.getCodec().readTree(p); - String className = node.get("@valueclass").asText(); - Class c; - try { - c = Class.forName(className); - } catch (Exception e) { - throw new RuntimeException(e); - } - - if(node.has("value")){ - //Number, String, Enum - JsonNode valueNode = node.get("value"); - Object o = new 
ObjectMapper().treeToValue(valueNode, c); - return new FixedValue<>(o); - } else { - //Everything else - JsonNode valueNode = node.get("data"); - String data = valueNode.asText(); - - byte[] b = new Base64().decode(data); - ObjectInputStream ois = new ObjectInputStream(new ByteArrayInputStream(b)); - try { - Object o = ois.readObject(); - return new FixedValue<>(o); - } catch (Throwable t) { - throw new RuntimeException(t); - } - } - } -} diff --git a/contrib/attic/arbiter/arbiter-core/src/main/java/org/deeplearning4j/arbiter/optimize/serde/jackson/FixedValueSerializer.java b/contrib/attic/arbiter/arbiter-core/src/main/java/org/deeplearning4j/arbiter/optimize/serde/jackson/FixedValueSerializer.java deleted file mode 100644 index ffb5c3524..000000000 --- a/contrib/attic/arbiter/arbiter-core/src/main/java/org/deeplearning4j/arbiter/optimize/serde/jackson/FixedValueSerializer.java +++ /dev/null @@ -1,69 +0,0 @@ -/* - * ****************************************************************************** - * * Copyright (c) 2021 Deeplearning4j Contributors - * * - * * This program and the accompanying materials are made available under the - * * terms of the Apache License, Version 2.0 which is available at - * * https://www.apache.org/licenses/LICENSE-2.0. - * * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * * License for the specific language governing permissions and limitations - * * under the License. 
- * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ - -package org.deeplearning4j.arbiter.optimize.serde.jackson; - -import org.apache.commons.net.util.Base64; -import org.deeplearning4j.arbiter.optimize.parameter.FixedValue; -import org.nd4j.shade.jackson.core.JsonGenerator; -import org.nd4j.shade.jackson.core.type.WritableTypeId; -import org.nd4j.shade.jackson.databind.JsonSerializer; -import org.nd4j.shade.jackson.databind.SerializerProvider; -import org.nd4j.shade.jackson.databind.jsontype.TypeSerializer; - -import java.io.ByteArrayOutputStream; -import java.io.IOException; -import java.io.ObjectOutputStream; - -import static org.nd4j.shade.jackson.core.JsonToken.START_OBJECT; - -/** - * A custom serializer to handle arbitrary object types - * Uses standard JSON where safe (number, string, enumerations) or Java object serialization (bytes -> base64) - * The latter is not an ideal approach, but Jackson doesn't support serialization/deserialization of arbitrary - * objects very well - * - * @author Alex Black - */ -public class FixedValueSerializer extends JsonSerializer { - @Override - public void serialize(FixedValue fixedValue, JsonGenerator j, SerializerProvider serializerProvider) throws IOException { - Object o = fixedValue.getValue(); - - j.writeStringField("@valueclass", o.getClass().getName()); - if(o instanceof Number || o instanceof String || o instanceof Enum){ - j.writeObjectField("value", o); - } else { - ByteArrayOutputStream baos = new ByteArrayOutputStream(); - ObjectOutputStream oos = new ObjectOutputStream(baos); - oos.writeObject(o); - baos.close(); - byte[] b = baos.toByteArray(); - String base64 = new Base64().encodeToString(b); - j.writeStringField("data", base64); - } - } - - @Override - public void serializeWithType(FixedValue value, JsonGenerator gen, SerializerProvider serializers, TypeSerializer typeSer) throws IOException { - WritableTypeId typeId = 
typeSer.typeId(value, START_OBJECT); - typeSer.writeTypePrefix(gen, typeId); - serialize(value, gen, serializers); - typeSer.writeTypeSuffix(gen, typeId); - } -} diff --git a/contrib/attic/arbiter/arbiter-core/src/main/java/org/deeplearning4j/arbiter/optimize/serde/jackson/IntegerDistributionDeserializer.java b/contrib/attic/arbiter/arbiter-core/src/main/java/org/deeplearning4j/arbiter/optimize/serde/jackson/IntegerDistributionDeserializer.java deleted file mode 100644 index 2c946e845..000000000 --- a/contrib/attic/arbiter/arbiter-core/src/main/java/org/deeplearning4j/arbiter/optimize/serde/jackson/IntegerDistributionDeserializer.java +++ /dev/null @@ -1,61 +0,0 @@ -/* - * ****************************************************************************** - * * Copyright (c) 2021 Deeplearning4j Contributors - * * - * * This program and the accompanying materials are made available under the - * * terms of the Apache License, Version 2.0 which is available at - * * https://www.apache.org/licenses/LICENSE-2.0. - * * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * * License for the specific language governing permissions and limitations - * * under the License. 
- * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ - -package org.deeplearning4j.arbiter.optimize.serde.jackson; - -import org.apache.commons.math3.distribution.*; -import org.nd4j.shade.jackson.core.JsonParser; -import org.nd4j.shade.jackson.databind.DeserializationContext; -import org.nd4j.shade.jackson.databind.JsonDeserializer; -import org.nd4j.shade.jackson.databind.JsonNode; - -import java.io.IOException; - -/** - * Custom Jackson deserializer for integer distributions - * - * @author Alex Black - */ -public class IntegerDistributionDeserializer extends JsonDeserializer { - - @Override - public IntegerDistribution deserialize(JsonParser p, DeserializationContext ctxt) throws IOException { - JsonNode node = p.getCodec().readTree(p); - String simpleName = node.get("distribution").asText(); - - switch (simpleName) { - case "BinomialDistribution": - return new BinomialDistribution(node.get("trials").asInt(), node.get("p").asDouble()); - case "GeometricDistribution": - return new GeometricDistribution(node.get("p").asDouble()); - case "HypergeometricDistribution": - return new HypergeometricDistribution(node.get("populationSize").asInt(), - node.get("numberOfSuccesses").asInt(), node.get("sampleSize").asInt()); - case "PascalDistribution": - return new PascalDistribution(node.get("r").asInt(), node.get("p").asDouble()); - case "PoissonDistribution": - return new PoissonDistribution(node.get("p").asDouble()); - case "UniformIntegerDistribution": - return new UniformIntegerDistribution(node.get("lower").asInt(), node.get("upper").asInt()); - case "ZipfDistribution": - return new ZipfDistribution(node.get("numElements").asInt(), node.get("exponent").asDouble()); - default: - throw new RuntimeException("Unknown or not supported distribution: " + simpleName); - } - } -} diff --git 
a/contrib/attic/arbiter/arbiter-core/src/main/java/org/deeplearning4j/arbiter/optimize/serde/jackson/IntegerDistributionSerializer.java b/contrib/attic/arbiter/arbiter-core/src/main/java/org/deeplearning4j/arbiter/optimize/serde/jackson/IntegerDistributionSerializer.java deleted file mode 100644 index 6ffe98b9e..000000000 --- a/contrib/attic/arbiter/arbiter-core/src/main/java/org/deeplearning4j/arbiter/optimize/serde/jackson/IntegerDistributionSerializer.java +++ /dev/null @@ -1,76 +0,0 @@ -/* - * ****************************************************************************** - * * Copyright (c) 2021 Deeplearning4j Contributors - * * - * * This program and the accompanying materials are made available under the - * * terms of the Apache License, Version 2.0 which is available at - * * https://www.apache.org/licenses/LICENSE-2.0. - * * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * * License for the specific language governing permissions and limitations - * * under the License. 
- * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ - -package org.deeplearning4j.arbiter.optimize.serde.jackson; - -import org.apache.commons.math3.distribution.*; -import org.nd4j.shade.jackson.core.JsonGenerator; -import org.nd4j.shade.jackson.databind.JsonSerializer; -import org.nd4j.shade.jackson.databind.SerializerProvider; - -import java.io.IOException; - -/** - * Custom Jackson serializer for integer distributions - * - * @author Alex Black - */ -public class IntegerDistributionSerializer extends JsonSerializer { - @Override - public void serialize(IntegerDistribution d, JsonGenerator j, SerializerProvider serializerProvider) - throws IOException { - Class c = d.getClass(); - String s = c.getSimpleName(); - - j.writeStartObject(); - j.writeStringField("distribution", s); - - if (c == BinomialDistribution.class) { - BinomialDistribution bd = (BinomialDistribution) d; - j.writeNumberField("trials", bd.getNumberOfTrials()); - j.writeNumberField("p", bd.getProbabilityOfSuccess()); - } else if (c == GeometricDistribution.class) { - GeometricDistribution gd = (GeometricDistribution) d; - j.writeNumberField("p", gd.getProbabilityOfSuccess()); - } else if (c == HypergeometricDistribution.class) { - HypergeometricDistribution hd = (HypergeometricDistribution) d; - j.writeNumberField("populationSize", hd.getPopulationSize()); - j.writeNumberField("numberOfSuccesses", hd.getNumberOfSuccesses()); - j.writeNumberField("sampleSize", hd.getSampleSize()); - } else if (c == PascalDistribution.class) { - PascalDistribution pd = (PascalDistribution) d; - j.writeNumberField("r", pd.getNumberOfSuccesses()); - j.writeNumberField("p", pd.getProbabilityOfSuccess()); - } else if (c == PoissonDistribution.class) { - PoissonDistribution pd = (PoissonDistribution) d; - j.writeNumberField("p", pd.getMean()); - } else if (c == UniformIntegerDistribution.class) { - UniformIntegerDistribution ud = 
(UniformIntegerDistribution) d; - j.writeNumberField("lower", ud.getSupportLowerBound()); - j.writeNumberField("upper", ud.getSupportUpperBound()); - } else if (c == ZipfDistribution.class) { - ZipfDistribution zd = (ZipfDistribution) d; - j.writeNumberField("numElements", zd.getNumberOfElements()); - j.writeNumberField("exponent", zd.getExponent()); - } else { - throw new UnsupportedOperationException("Unknown or not supported IntegerDistribution: " + c); - } - - j.writeEndObject(); - } -} diff --git a/contrib/attic/arbiter/arbiter-core/src/main/java/org/deeplearning4j/arbiter/optimize/serde/jackson/JsonMapper.java b/contrib/attic/arbiter/arbiter-core/src/main/java/org/deeplearning4j/arbiter/optimize/serde/jackson/JsonMapper.java deleted file mode 100644 index cd62244dc..000000000 --- a/contrib/attic/arbiter/arbiter-core/src/main/java/org/deeplearning4j/arbiter/optimize/serde/jackson/JsonMapper.java +++ /dev/null @@ -1,76 +0,0 @@ -/* - * ****************************************************************************** - * * Copyright (c) 2021 Deeplearning4j Contributors - * * - * * This program and the accompanying materials are made available under the - * * terms of the Apache License, Version 2.0 which is available at - * * https://www.apache.org/licenses/LICENSE-2.0. - * * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * * License for the specific language governing permissions and limitations - * * under the License. 
- * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ - -package org.deeplearning4j.arbiter.optimize.serde.jackson; - -import org.nd4j.shade.jackson.annotation.JsonAutoDetect; -import org.nd4j.shade.jackson.annotation.PropertyAccessor; -import org.nd4j.shade.jackson.databind.DeserializationFeature; -import org.nd4j.shade.jackson.databind.ObjectMapper; -import org.nd4j.shade.jackson.databind.SerializationFeature; -import org.nd4j.shade.jackson.dataformat.yaml.YAMLFactory; -import org.nd4j.shade.jackson.datatype.joda.JodaModule; - -/** - * Created by Alex on 16/11/2016. - */ -public class JsonMapper { - - private static ObjectMapper mapper; - private static ObjectMapper yamlMapper; - - static { - mapper = new ObjectMapper(); - mapper.registerModule(new JodaModule()); - mapper.configure(DeserializationFeature.FAIL_ON_UNKNOWN_PROPERTIES, false); - mapper.configure(SerializationFeature.FAIL_ON_EMPTY_BEANS, false); - mapper.enable(SerializationFeature.INDENT_OUTPUT); - mapper.setVisibility(PropertyAccessor.ALL, JsonAutoDetect.Visibility.NONE); - mapper.setVisibility(PropertyAccessor.FIELD, JsonAutoDetect.Visibility.ANY); - mapper.setVisibility(PropertyAccessor.CREATOR, JsonAutoDetect.Visibility.ANY); - mapper.setVisibility(PropertyAccessor.SETTER, JsonAutoDetect.Visibility.ANY); - yamlMapper = new ObjectMapper(new YAMLFactory()); - yamlMapper.registerModule(new JodaModule()); - yamlMapper.configure(DeserializationFeature.FAIL_ON_UNKNOWN_PROPERTIES, false); - yamlMapper.configure(SerializationFeature.FAIL_ON_EMPTY_BEANS, false); - yamlMapper.enable(SerializationFeature.INDENT_OUTPUT); - yamlMapper.setVisibility(PropertyAccessor.ALL, JsonAutoDetect.Visibility.NONE); - yamlMapper.setVisibility(PropertyAccessor.FIELD, JsonAutoDetect.Visibility.ANY); - yamlMapper.setVisibility(PropertyAccessor.CREATOR, JsonAutoDetect.Visibility.ANY); - } - - private JsonMapper() {} - - - /** - * Return the yaml 
mapper - * @return - */ - public static ObjectMapper getYamlMapper() { - return yamlMapper; - } - - /** - * Return a json mapper - * @return - */ - public static ObjectMapper getMapper() { - return mapper; - } - -} diff --git a/contrib/attic/arbiter/arbiter-core/src/main/java/org/deeplearning4j/arbiter/optimize/serde/jackson/RealDistributionDeserializer.java b/contrib/attic/arbiter/arbiter-core/src/main/java/org/deeplearning4j/arbiter/optimize/serde/jackson/RealDistributionDeserializer.java deleted file mode 100644 index a69eb4eda..000000000 --- a/contrib/attic/arbiter/arbiter-core/src/main/java/org/deeplearning4j/arbiter/optimize/serde/jackson/RealDistributionDeserializer.java +++ /dev/null @@ -1,80 +0,0 @@ -/* - * ****************************************************************************** - * * Copyright (c) 2021 Deeplearning4j Contributors - * * - * * This program and the accompanying materials are made available under the - * * terms of the Apache License, Version 2.0 which is available at - * * https://www.apache.org/licenses/LICENSE-2.0. - * * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * * License for the specific language governing permissions and limitations - * * under the License. 
- * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ - -package org.deeplearning4j.arbiter.optimize.serde.jackson; - -import org.apache.commons.math3.distribution.*; -import org.deeplearning4j.arbiter.optimize.distribution.LogUniformDistribution; -import org.nd4j.shade.jackson.core.JsonParser; -import org.nd4j.shade.jackson.core.JsonProcessingException; -import org.nd4j.shade.jackson.databind.DeserializationContext; -import org.nd4j.shade.jackson.databind.JsonDeserializer; -import org.nd4j.shade.jackson.databind.JsonNode; - -import java.io.IOException; - -/** - * Created by Alex on 14/02/2017. - */ -public class RealDistributionDeserializer extends JsonDeserializer { - - @Override - public RealDistribution deserialize(JsonParser p, DeserializationContext ctxt) - throws IOException, JsonProcessingException { - JsonNode node = p.getCodec().readTree(p); - String simpleName = node.get("distribution").asText(); - - switch (simpleName) { - case "BetaDistribution": - return new BetaDistribution(node.get("alpha").asDouble(), node.get("beta").asDouble()); - case "CauchyDistribution": - return new CauchyDistribution(node.get("median").asDouble(), node.get("scale").asDouble()); - case "ChiSquaredDistribution": - return new ChiSquaredDistribution(node.get("dof").asDouble()); - case "ExponentialDistribution": - return new ExponentialDistribution(node.get("mean").asDouble()); - case "FDistribution": - return new FDistribution(node.get("numeratorDof").asDouble(), node.get("denominatorDof").asDouble()); - case "GammaDistribution": - return new GammaDistribution(node.get("shape").asDouble(), node.get("scale").asDouble()); - case "LevyDistribution": - return new LevyDistribution(node.get("mu").asDouble(), node.get("c").asDouble()); - case "LogNormalDistribution": - return new LogNormalDistribution(node.get("scale").asDouble(), node.get("shape").asDouble()); - case "NormalDistribution": - return new 
NormalDistribution(node.get("mean").asDouble(), node.get("stdev").asDouble()); - case "ParetoDistribution": - return new ParetoDistribution(node.get("scale").asDouble(), node.get("shape").asDouble()); - case "TDistribution": - return new TDistribution(node.get("dof").asDouble()); - case "TriangularDistribution": - return new TriangularDistribution(node.get("a").asDouble(), node.get("b").asDouble(), - node.get("c").asDouble()); - case "UniformRealDistribution": - return new UniformRealDistribution(node.get("lower").asDouble(), node.get("upper").asDouble()); - case "WeibullDistribution": - return new WeibullDistribution(node.get("alpha").asDouble(), node.get("beta").asDouble()); - case "LogUniformDistribution": - return new LogUniformDistribution(node.get("min").asDouble(), node.get("max").asDouble()); - default: - throw new RuntimeException("Unknown or not supported distribution: " + simpleName); - } - - - } -} diff --git a/contrib/attic/arbiter/arbiter-core/src/main/java/org/deeplearning4j/arbiter/optimize/serde/jackson/RealDistributionSerializer.java b/contrib/attic/arbiter/arbiter-core/src/main/java/org/deeplearning4j/arbiter/optimize/serde/jackson/RealDistributionSerializer.java deleted file mode 100644 index c57d1e388..000000000 --- a/contrib/attic/arbiter/arbiter-core/src/main/java/org/deeplearning4j/arbiter/optimize/serde/jackson/RealDistributionSerializer.java +++ /dev/null @@ -1,109 +0,0 @@ -/* - * ****************************************************************************** - * * Copyright (c) 2021 Deeplearning4j Contributors - * * - * * This program and the accompanying materials are made available under the - * * terms of the Apache License, Version 2.0 which is available at - * * https://www.apache.org/licenses/LICENSE-2.0. - * * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. 
See the
- * * License for the specific language governing permissions and limitations
- * * under the License.
- * *
- * * SPDX-License-Identifier: Apache-2.0
- * *****************************************************************************
- */
-
-package org.deeplearning4j.arbiter.optimize.serde.jackson;
-
-import org.apache.commons.math3.distribution.*;
-import org.deeplearning4j.arbiter.optimize.distribution.LogUniformDistribution;
-import org.nd4j.shade.jackson.core.JsonGenerator;
-import org.nd4j.shade.jackson.databind.JsonSerializer;
-import org.nd4j.shade.jackson.databind.SerializerProvider;
-
-import java.io.IOException;
-
-/**
- * Custom JSON serializer for Apache commons RealDistribution instances.
- * The custom serializer is set up to use the built-in c
- */
-public class RealDistributionSerializer extends JsonSerializer<RealDistribution> {
-
-    @Override
-    public void serialize(RealDistribution d, JsonGenerator j, SerializerProvider serializerProvider)
-                    throws IOException {
-        Class<?> c = d.getClass();
-        String s = c.getSimpleName();
-
-        j.writeStartObject();
-        j.writeStringField("distribution", s);
-
-
-        if (c == BetaDistribution.class) {
-            BetaDistribution bd = (BetaDistribution) d;
-            j.writeNumberField("alpha", bd.getAlpha());
-            j.writeNumberField("beta", bd.getBeta());
-        } else if (c == CauchyDistribution.class) {
-            CauchyDistribution cd = (CauchyDistribution) d;
-            j.writeNumberField("median", cd.getMedian());
-            j.writeNumberField("scale", cd.getScale());
-        } else if (c == ChiSquaredDistribution.class) {
-            ChiSquaredDistribution cd = (ChiSquaredDistribution) d;
-            j.writeNumberField("dof", cd.getDegreesOfFreedom());
-        } else if (c == ExponentialDistribution.class) {
-            ExponentialDistribution ed = (ExponentialDistribution) d;
-            j.writeNumberField("mean", ed.getMean());
-        } else if (c == FDistribution.class) {
-            FDistribution fd = (FDistribution) d;
-            j.writeNumberField("numeratorDof", fd.getNumeratorDegreesOfFreedom());
-            j.writeNumberField("denominatorDof",
fd.getDenominatorDegreesOfFreedom()); - } else if (c == GammaDistribution.class) { - GammaDistribution gd = (GammaDistribution) d; - j.writeNumberField("shape", gd.getShape()); - j.writeNumberField("scale", gd.getScale()); - } else if (c == LevyDistribution.class) { - LevyDistribution ld = (LevyDistribution) d; - j.writeNumberField("mu", ld.getLocation()); - j.writeNumberField("c", ld.getScale()); - } else if (c == LogNormalDistribution.class) { - LogNormalDistribution ln = (LogNormalDistribution) d; - j.writeNumberField("scale", ln.getScale()); - j.writeNumberField("shape", ln.getShape()); - } else if (c == NormalDistribution.class) { - NormalDistribution nd = (NormalDistribution) d; - j.writeNumberField("mean", nd.getMean()); - j.writeNumberField("stdev", nd.getStandardDeviation()); - } else if (c == ParetoDistribution.class) { - ParetoDistribution pd = (ParetoDistribution) d; - j.writeNumberField("scale", pd.getScale()); - j.writeNumberField("shape", pd.getShape()); - } else if (c == TDistribution.class) { - TDistribution td = (TDistribution) d; - j.writeNumberField("dof", td.getDegreesOfFreedom()); - } else if (c == TriangularDistribution.class) { - TriangularDistribution td = (TriangularDistribution) d; - j.writeNumberField("a", td.getSupportLowerBound()); - j.writeNumberField("b", td.getMode()); - j.writeNumberField("c", td.getSupportUpperBound()); - } else if (c == UniformRealDistribution.class) { - UniformRealDistribution u = (UniformRealDistribution) d; - j.writeNumberField("lower", u.getSupportLowerBound()); - j.writeNumberField("upper", u.getSupportUpperBound()); - } else if (c == WeibullDistribution.class) { - WeibullDistribution wb = (WeibullDistribution) d; - j.writeNumberField("alpha", wb.getShape()); - j.writeNumberField("beta", wb.getScale()); - } else if (c == LogUniformDistribution.class){ - LogUniformDistribution lud = (LogUniformDistribution) d; - j.writeNumberField("min", lud.getMin()); - j.writeNumberField("max", lud.getMax()); - } else { - 
throw new UnsupportedOperationException("Unknown or not supported RealDistribution: " + d.getClass()); - } - - j.writeEndObject(); - } -} diff --git a/contrib/attic/arbiter/arbiter-core/src/main/java/org/deeplearning4j/arbiter/optimize/serde/jackson/YamlMapper.java b/contrib/attic/arbiter/arbiter-core/src/main/java/org/deeplearning4j/arbiter/optimize/serde/jackson/YamlMapper.java deleted file mode 100644 index 435e97b59..000000000 --- a/contrib/attic/arbiter/arbiter-core/src/main/java/org/deeplearning4j/arbiter/optimize/serde/jackson/YamlMapper.java +++ /dev/null @@ -1,54 +0,0 @@ -/* - * ****************************************************************************** - * * Copyright (c) 2021 Deeplearning4j Contributors - * * - * * This program and the accompanying materials are made available under the - * * terms of the Apache License, Version 2.0 which is available at - * * https://www.apache.org/licenses/LICENSE-2.0. - * * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * * License for the specific language governing permissions and limitations - * * under the License. - * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ - -package org.deeplearning4j.arbiter.optimize.serde.jackson; - -import org.nd4j.shade.jackson.annotation.JsonAutoDetect; -import org.nd4j.shade.jackson.annotation.PropertyAccessor; -import org.nd4j.shade.jackson.databind.DeserializationFeature; -import org.nd4j.shade.jackson.databind.ObjectMapper; -import org.nd4j.shade.jackson.databind.SerializationFeature; -import org.nd4j.shade.jackson.dataformat.yaml.YAMLFactory; -import org.nd4j.shade.jackson.datatype.joda.JodaModule; - -/** - * Created by Alex on 16/11/2016. 
- */ -public class YamlMapper { - - private static final ObjectMapper mapper; - - static { - mapper = new ObjectMapper(new YAMLFactory()); - mapper.registerModule(new JodaModule()); - mapper.configure(DeserializationFeature.FAIL_ON_UNKNOWN_PROPERTIES, false); - mapper.configure(SerializationFeature.FAIL_ON_EMPTY_BEANS, false); - mapper.enable(SerializationFeature.INDENT_OUTPUT); - mapper.setVisibility(PropertyAccessor.ALL, JsonAutoDetect.Visibility.NONE); - mapper.setVisibility(PropertyAccessor.FIELD, JsonAutoDetect.Visibility.ANY); - mapper.setVisibility(PropertyAccessor.CREATOR, JsonAutoDetect.Visibility.ANY); - } - - - private YamlMapper() {} - - public static ObjectMapper getMapper() { - return mapper; - } - -} diff --git a/contrib/attic/arbiter/arbiter-core/src/main/java/org/deeplearning4j/arbiter/util/ClassPathResource.java b/contrib/attic/arbiter/arbiter-core/src/main/java/org/deeplearning4j/arbiter/util/ClassPathResource.java deleted file mode 100644 index 5a3cf863b..000000000 --- a/contrib/attic/arbiter/arbiter-core/src/main/java/org/deeplearning4j/arbiter/util/ClassPathResource.java +++ /dev/null @@ -1,235 +0,0 @@ -/* - * ****************************************************************************** - * * Copyright (c) 2021 Deeplearning4j Contributors - * * - * * This program and the accompanying materials are made available under the - * * terms of the Apache License, Version 2.0 which is available at - * * https://www.apache.org/licenses/LICENSE-2.0. - * * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * * License for the specific language governing permissions and limitations - * * under the License. 
- * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ - -package org.deeplearning4j.arbiter.util; - -import org.slf4j.Logger; -import org.slf4j.LoggerFactory; - -import java.io.*; -import java.net.MalformedURLException; -import java.net.URI; -import java.net.URISyntaxException; -import java.net.URL; -import java.util.zip.ZipEntry; -import java.util.zip.ZipFile; - -/** - * Simple utility class used to get access to files at the classpath, or packed into jar. - * Based on Spring ClassPathResource implementation + jar internals access implemented. - * - * - * @author raver119@gmail.com - */ -public class ClassPathResource { - - private String resourceName; - - private static Logger log = LoggerFactory.getLogger(ClassPathResource.class); - - /** - * Builds new ClassPathResource object - * - * @param resourceName String name of resource, to be retrieved - */ - public ClassPathResource(String resourceName) { - if (resourceName == null) - throw new IllegalStateException("Resource name can't be null"); - this.resourceName = resourceName; - } - - /** - * Returns URL of the requested resource - * - * @return URL of the resource, if it's available in current Jar - */ - private URL getUrl() { - ClassLoader loader = null; - try { - loader = Thread.currentThread().getContextClassLoader(); - } catch (Exception e) { - // do nothing - } - - if (loader == null) { - loader = ClassPathResource.class.getClassLoader(); - } - - URL url = loader.getResource(this.resourceName); - if (url == null) { - // try to check for mis-used starting slash - // TODO: see TODO below - if (this.resourceName.startsWith("/")) { - url = loader.getResource(this.resourceName.replaceFirst("[\\\\/]", "")); - if (url != null) - return url; - } else { - // try to add slash, to make clear it's not an issue - // TODO: change this mechanic to actual path purifier - url = loader.getResource("/" + this.resourceName); - if (url != null) - 
return url; - } - throw new IllegalStateException("Resource '" + this.resourceName + "' cannot be found."); - } - return url; - } - - /** - * Returns requested ClassPathResource as File object - * - * Please note: if this method called from compiled jar, temporary file will be created to provide File access - * - * @return File requested at constructor call - * @throws FileNotFoundException - */ - public File getFile() throws FileNotFoundException { - URL url = this.getUrl(); - - if (isJarURL(url)) { - /* - This is actually request for file, that's packed into jar. Probably the current one, but that doesn't matters. - */ - try { - url = extractActualUrl(url); - File file = File.createTempFile("canova_temp", "file"); - file.deleteOnExit(); - - ZipFile zipFile = new ZipFile(url.getFile()); - ZipEntry entry = zipFile.getEntry(this.resourceName); - if (entry == null) { - if (this.resourceName.startsWith("/")) { - entry = zipFile.getEntry(this.resourceName.replaceFirst("/", "")); - if (entry == null) { - throw new FileNotFoundException("Resource " + this.resourceName + " not found"); - } - } else - throw new FileNotFoundException("Resource " + this.resourceName + " not found"); - } - - long size = entry.getSize(); - - InputStream stream = zipFile.getInputStream(entry); - FileOutputStream outputStream = new FileOutputStream(file); - byte[] array = new byte[1024]; - int rd = 0; - long bytesRead = 0; - do { - rd = stream.read(array); - outputStream.write(array, 0, rd); - bytesRead += rd; - } while (bytesRead < size); - - outputStream.flush(); - outputStream.close(); - - stream.close(); - zipFile.close(); - - return file; - } catch (Exception e) { - throw new RuntimeException(e); - } - - } else { - /* - It's something in the actual underlying filesystem, so we can just go for it - */ - - try { - URI uri = new URI(url.toString().replaceAll(" ", "%20")); - return new File(uri.getSchemeSpecificPart()); - } catch (URISyntaxException e) { - return new File(url.getFile()); - } - 
} - } - - /** - * Checks, if proposed URL is packed into archive. - * - * @param url URL to be checked - * @return True, if URL is archive entry, False otherwise - */ - private boolean isJarURL(URL url) { - String protocol = url.getProtocol(); - return "jar".equals(protocol) || "zip".equals(protocol) || "wsjar".equals(protocol) - || "code-source".equals(protocol) && url.getPath().contains("!/"); - } - - /** - * Extracts parent Jar URL from original ClassPath entry URL. - * - * @param jarUrl Original URL of the resource - * @return URL of the Jar file, containing requested resource - * @throws MalformedURLException - */ - private URL extractActualUrl(URL jarUrl) throws MalformedURLException { - String urlFile = jarUrl.getFile(); - int separatorIndex = urlFile.indexOf("!/"); - if (separatorIndex != -1) { - String jarFile = urlFile.substring(0, separatorIndex); - - try { - return new URL(jarFile); - } catch (MalformedURLException var5) { - if (!jarFile.startsWith("/")) { - jarFile = "/" + jarFile; - } - - return new URL("file:" + jarFile); - } - } else { - return jarUrl; - } - } - - /** - * Returns requested ClassPathResource as InputStream object - * - * @return File requested at constructor call - * @throws FileNotFoundException - */ - public InputStream getInputStream() throws FileNotFoundException { - URL url = this.getUrl(); - if (isJarURL(url)) { - try { - url = extractActualUrl(url); - ZipFile zipFile = new ZipFile(url.getFile()); - ZipEntry entry = zipFile.getEntry(this.resourceName); - - if (entry == null) { - if (this.resourceName.startsWith("/")) { - entry = zipFile.getEntry(this.resourceName.replaceFirst("/", "")); - if (entry == null) { - throw new FileNotFoundException("Resource " + this.resourceName + " not found"); - } - } else - throw new FileNotFoundException("Resource " + this.resourceName + " not found"); - } - - return zipFile.getInputStream(entry); - } catch (Exception e) { - throw new RuntimeException(e); - } - } else { - File srcFile = 
this.getFile(); - return new FileInputStream(srcFile); - } - } -} diff --git a/contrib/attic/arbiter/arbiter-core/src/main/java/org/deeplearning4j/arbiter/util/CollectionUtils.java b/contrib/attic/arbiter/arbiter-core/src/main/java/org/deeplearning4j/arbiter/util/CollectionUtils.java deleted file mode 100644 index 754547b4a..000000000 --- a/contrib/attic/arbiter/arbiter-core/src/main/java/org/deeplearning4j/arbiter/util/CollectionUtils.java +++ /dev/null @@ -1,51 +0,0 @@ -/* - * ****************************************************************************** - * * Copyright (c) 2021 Deeplearning4j Contributors - * * - * * This program and the accompanying materials are made available under the - * * terms of the Apache License, Version 2.0 which is available at - * * https://www.apache.org/licenses/LICENSE-2.0. - * * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * * License for the specific language governing permissions and limitations - * * under the License. 
- * *
- * * SPDX-License-Identifier: Apache-2.0
- * *****************************************************************************
- */
-
-package org.deeplearning4j.arbiter.util;
-
-import java.util.ArrayList;
-import java.util.Collection;
-import java.util.HashSet;
-import java.util.List;
-
-public class CollectionUtils {
-
-    /**
-     * Count the number of unique values in a collection
-     */
-    public static <T> int countUnique(Collection<T> collection) {
-        HashSet<T> set = new HashSet<>(collection);
-        return set.size();
-    }
-
-    /**
-     * Returns a list containing only unique values in a collection
-     */
-    public static <T> List<T> getUnique(Collection<T> collection) {
-        HashSet<T> set = new HashSet<>();
-        List<T> out = new ArrayList<>();
-        for (T t : collection) {
-            if (!set.contains(t)) {
-                out.add(t);
-                set.add(t);
-            }
-        }
-        return out;
-    }
-
-}
diff --git a/contrib/attic/arbiter/arbiter-core/src/main/java/org/deeplearning4j/arbiter/util/LeafUtils.java b/contrib/attic/arbiter/arbiter-core/src/main/java/org/deeplearning4j/arbiter/util/LeafUtils.java
deleted file mode 100644
index 28155ff8d..000000000
--- a/contrib/attic/arbiter/arbiter-core/src/main/java/org/deeplearning4j/arbiter/util/LeafUtils.java
+++ /dev/null
@@ -1,76 +0,0 @@
-/*
- * ******************************************************************************
- * * Copyright (c) 2021 Deeplearning4j Contributors
- * *
- * * This program and the accompanying materials are made available under the
- * * terms of the Apache License, Version 2.0 which is available at
- * * https://www.apache.org/licenses/LICENSE-2.0.
- * *
- * * Unless required by applicable law or agreed to in writing, software
- * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
- * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
- * * License for the specific language governing permissions and limitations
- * * under the License.
- * *
- * * SPDX-License-Identifier: Apache-2.0
- * *****************************************************************************
- */
-
-package org.deeplearning4j.arbiter.util;
-
-import org.deeplearning4j.arbiter.optimize.api.ParameterSpace;
-
-import java.util.ArrayList;
-import java.util.List;
-
-/**
- * Created by Alex on 29/06/2017.
- */
-public class LeafUtils {
-
-    private LeafUtils() {}
-
-    /**
-     * Returns a list of unique objects, not using the .equals() method, but rather using ==
-     *
-     * @param allLeaves Leaf values to process
-     * @return A list of unique parameter space values
-     */
-    public static List<ParameterSpace> getUniqueObjects(List<ParameterSpace> allLeaves) {
-        List<ParameterSpace> unique = new ArrayList<>();
-        for (ParameterSpace p : allLeaves) {
-            //This isn't especially efficient, but small number of parameters in general means it's fine
-            boolean found = false;
-            for (ParameterSpace q : unique) {
-                if (p == q) {
-                    found = true;
-                    break;
-                }
-            }
-            if (!found) {
-                unique.add(p);
-            }
-        }
-
-        return unique;
-    }
-
-    /**
-     * Count the number of unique parameters in the specified leaf nodes
-     *
-     * @param allLeaves Leaf values to count the parameters for
-     * @return Number of parameters for all unique objects
-     */
-    public static int countUniqueParameters(List<ParameterSpace> allLeaves) {
-        List<ParameterSpace> unique = getUniqueObjects(allLeaves);
-        int count = 0;
-        for (ParameterSpace ps : unique) {
-            if (!ps.isLeaf()) {
-                throw new IllegalStateException("Method should only be used with leaf nodes");
-            }
-            count += ps.numParameters();
-        }
-        return count;
-    }
-
-}
diff --git a/contrib/attic/arbiter/arbiter-core/src/main/java/org/deeplearning4j/arbiter/util/ObjectUtils.java b/contrib/attic/arbiter/arbiter-core/src/main/java/org/deeplearning4j/arbiter/util/ObjectUtils.java
deleted file mode 100644
index a48a7900c..000000000
--- a/contrib/attic/arbiter/arbiter-core/src/main/java/org/deeplearning4j/arbiter/util/ObjectUtils.java
+++ /dev/null
@@ -1,63 +0,0 @@
-/*
- *
****************************************************************************** - * * Copyright (c) 2021 Deeplearning4j Contributors - * * - * * This program and the accompanying materials are made available under the - * * terms of the Apache License, Version 2.0 which is available at - * * https://www.apache.org/licenses/LICENSE-2.0. - * * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * * License for the specific language governing permissions and limitations - * * under the License. - * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ - -package org.deeplearning4j.arbiter.util; - -import java.util.Arrays; - -/** - * @author Alex Black - */ -public class ObjectUtils { - - private ObjectUtils() {} - - /** - * Get the string representation of the object. Arrays, including primitive arrays, are printed using - * Arrays.toString(...) methods. 
- *
- * @param v Value to convert to a string
- * @return String representation
- */
-    public static String valueToString(Object v) {
-        if (v.getClass().isArray()) {
-            if (v.getClass().getComponentType().isPrimitive()) {
-                Class<?> c = v.getClass().getComponentType();
-                if (c == int.class) {
-                    return Arrays.toString((int[]) v);
-                } else if (c == double.class) {
-                    return Arrays.toString((double[]) v);
-                } else if (c == float.class) {
-                    return Arrays.toString((float[]) v);
-                } else if (c == long.class) {
-                    return Arrays.toString((long[]) v);
-                } else if (c == byte.class) {
-                    return Arrays.toString((byte[]) v);
-                } else if (c == short.class) {
-                    return Arrays.toString((short[]) v);
-                } else {
-                    return v.toString();
-                }
-            } else {
-                return Arrays.toString((Object[]) v);
-            }
-        } else {
-            return v.toString();
-        }
-    }
-}
diff --git a/contrib/attic/arbiter/arbiter-core/src/test/java/org/deeplearning4j/arbiter/optimize/AssertTestsExtendBaseClass.java b/contrib/attic/arbiter/arbiter-core/src/test/java/org/deeplearning4j/arbiter/optimize/AssertTestsExtendBaseClass.java
deleted file mode 100644
index b65f58e71..000000000
--- a/contrib/attic/arbiter/arbiter-core/src/test/java/org/deeplearning4j/arbiter/optimize/AssertTestsExtendBaseClass.java
+++ /dev/null
@@ -1,51 +0,0 @@
-/*
- * ******************************************************************************
- * * Copyright (c) 2021 Deeplearning4j Contributors
- * *
- * * This program and the accompanying materials are made available under the
- * * terms of the Apache License, Version 2.0 which is available at
- * * https://www.apache.org/licenses/LICENSE-2.0.
- * *
- * * Unless required by applicable law or agreed to in writing, software
- * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
- * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
- * * License for the specific language governing permissions and limitations
- * * under the License.
- * *
- * * SPDX-License-Identifier: Apache-2.0
- * *****************************************************************************
- */
-package org.deeplearning4j.arbiter.optimize;
-
-import lombok.extern.slf4j.Slf4j;
-import org.deeplearning4j.BaseDL4JTest;
-import org.nd4j.common.tests.AbstractAssertTestsClass;
-import java.util.*;
-
-/**
- * This class checks that all test classes (i.e., anything with one or more methods annotated with @Test)
- * extends BaseDl4jTest - either directly or indirectly.
- * Other than a small set of exceptions, all tests must extend this
- *
- * @author Alex Black
- */
-
-@Slf4j
-public class AssertTestsExtendBaseClass extends AbstractAssertTestsClass {
-
-    @Override
-    protected Set<Class<?>> getExclusions() {
-        //Set of classes that are exclusions to the rule (either run manually or have their own logging + timeouts)
-        return new HashSet<>();
-    }
-
-    @Override
-    protected String getPackageName() {
-        return "org.deeplearning4j.arbiter.optimize";
-    }
-
-    @Override
-    protected Class<?> getBaseClass() {
-        return BaseDL4JTest.class;
-    }
-}
diff --git a/contrib/attic/arbiter/arbiter-core/src/test/java/org/deeplearning4j/arbiter/optimize/BraninFunction.java b/contrib/attic/arbiter/arbiter-core/src/test/java/org/deeplearning4j/arbiter/optimize/BraninFunction.java
deleted file mode 100644
index acabb2ab3..000000000
--- a/contrib/attic/arbiter/arbiter-core/src/test/java/org/deeplearning4j/arbiter/optimize/BraninFunction.java
+++ /dev/null
@@ -1,158 +0,0 @@
-/*
- * ******************************************************************************
- * * Copyright (c) 2021 Deeplearning4j Contributors
- * *
- * * This program and the accompanying materials are made available under the
- * * terms of the Apache License, Version 2.0 which is available at
- * * https://www.apache.org/licenses/LICENSE-2.0.
- * *
- * * Unless required by applicable law or agreed to in writing, software
- * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
- * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
- * * License for the specific language governing permissions and limitations
- * * under the License.
- * *
- * * SPDX-License-Identifier: Apache-2.0
- * *****************************************************************************
- */
-
-package org.deeplearning4j.arbiter.optimize;
-
-import lombok.AllArgsConstructor;
-import lombok.Data;
-import org.deeplearning4j.arbiter.optimize.api.*;
-import org.deeplearning4j.arbiter.optimize.api.data.DataProvider;
-import org.deeplearning4j.arbiter.optimize.api.data.DataSource;
-import org.deeplearning4j.arbiter.optimize.api.score.ScoreFunction;
-import org.deeplearning4j.arbiter.optimize.parameter.continuous.ContinuousParameterSpace;
-import org.deeplearning4j.arbiter.optimize.runner.CandidateInfo;
-import org.deeplearning4j.arbiter.optimize.runner.CandidateStatus;
-import org.deeplearning4j.arbiter.optimize.runner.IOptimizationRunner;
-import org.deeplearning4j.arbiter.optimize.runner.listener.StatusListener;
-
-import java.io.Serializable;
-import java.util.*;
-import java.util.concurrent.Callable;
-
-public class BraninFunction {
-    public static class BraninSpace extends AbstractParameterSpace<BraninConfig> {
-        private int[] indices;
-        private ParameterSpace<Double> first = new ContinuousParameterSpace(-5, 10);
-        private ParameterSpace<Double> second = new ContinuousParameterSpace(0, 15);
-
-        @Override
-        public BraninConfig getValue(double[] parameterValues) {
-            double f = first.getValue(parameterValues);
-            double s = second.getValue(parameterValues);
-            return new BraninConfig(f, s); //-5 to +10 and 0 to 15
-        }
-
-        @Override
-        public int numParameters() {
-            return 2;
-        }
-
-        @Override
-        public List<ParameterSpace> collectLeaves() {
-            List<ParameterSpace> list = new ArrayList<>();
-            list.addAll(first.collectLeaves());
-            list.addAll(second.collectLeaves());
-            return list;
-        }
-
-        @Override
-        public boolean isLeaf() {
-            return false;
-        }
-
-        @Override
-        public void setIndices(int... indices) {
-            throw new UnsupportedOperationException();
-        }
-    }
-
-    @AllArgsConstructor
-    @Data
-    public static class BraninConfig implements Serializable {
-        private double x1;
-        private double x2;
-    }
-
-    public static class BraninScoreFunction implements ScoreFunction {
-        private static final double a = 1.0;
-        private static final double b = 5.1 / (4.0 * Math.PI * Math.PI);
-        private static final double c = 5.0 / Math.PI;
-        private static final double r = 6.0;
-        private static final double s = 10.0;
-        private static final double t = 1.0 / (8.0 * Math.PI);
-
-        @Override
-        public double score(Object m, DataProvider data, Map<String, Object> dataParameters) {
-            BraninConfig model = (BraninConfig) m;
-            double x1 = model.getX1();
-            double x2 = model.getX2();
-
-            return a * Math.pow(x2 - b * x1 * x1 + c * x1 - r, 2.0) + s * (1 - t) * Math.cos(x1) + s;
-        }
-
-        @Override
-        public double score(Object model, Class<? extends DataSource> dataSource, Properties dataSourceProperties) {
-            throw new UnsupportedOperationException();
-        }
-
-        @Override
-        public boolean minimize() {
-            return true;
-        }
-
-        @Override
-        public List<Class<?>> getSupportedModelTypes() {
-            return Collections.<Class<?>>singletonList(BraninConfig.class);
-        }
-
-        @Override
-        public List<Class<?>> getSupportedDataTypes() {
-            return Collections.<Class<?>>singletonList(Object.class);
-        }
-    }
-
-    public static class BraninTaskCreator implements TaskCreator {
-        @Override
-        public Callable<OptimizationResult> create(final Candidate c, DataProvider dataProvider,
-                        final ScoreFunction scoreFunction, final List<StatusListener> statusListeners,
-                        IOptimizationRunner runner) {
-
-            return new Callable<OptimizationResult>() {
-                @Override
-                public OptimizationResult call() throws Exception {
-
-                    BraninConfig candidate = (BraninConfig) c.getValue();
-
-                    double score = scoreFunction.score(candidate, null, (Map) null);
-//                    System.out.println(candidate.getX1() + "\t" + candidate.getX2() + "\t" + score);
-
-                    Thread.sleep(20);
-
-                    if (statusListeners != null) {
-                        for (StatusListener sl : statusListeners) {
-                            sl.onCandidateIteration(null, null, 0);
-                        }
-                    }
-
-                    CandidateInfo ci = new CandidateInfo(-1, CandidateStatus.Complete, score,
-                                    System.currentTimeMillis(), null, null, null, null);
-
-                    return new OptimizationResult(c, score, c.getIndex(), null, ci, null);
-                }
-            };
-        }
-
-        @Override
-        public Callable<OptimizationResult> create(Candidate candidate, Class<? extends DataSource> dataSource,
-                        Properties dataSourceProperties, ScoreFunction scoreFunction,
-                        List<StatusListener> statusListeners, IOptimizationRunner runner) {
-            throw new UnsupportedOperationException();
-        }
-    }
-
-}
diff --git a/contrib/attic/arbiter/arbiter-core/src/test/java/org/deeplearning4j/arbiter/optimize/TestGeneticSearch.java b/contrib/attic/arbiter/arbiter-core/src/test/java/org/deeplearning4j/arbiter/optimize/TestGeneticSearch.java
deleted file mode 100644
index 2fb8f8a67..000000000
--- a/contrib/attic/arbiter/arbiter-core/src/test/java/org/deeplearning4j/arbiter/optimize/TestGeneticSearch.java
+++ /dev/null
@@ -1,120 +0,0 @@
-/*
- * ******************************************************************************
- * * Copyright (c) 2021 Deeplearning4j Contributors
- * *
- * * This program and the accompanying materials are made available under the
- * * terms of the Apache License, Version 2.0 which is available at
- * * https://www.apache.org/licenses/LICENSE-2.0.
- * *
- * * Unless required by applicable law or agreed to in writing, software
- * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
- * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
- * * License for the specific language governing permissions and limitations
- * * under the License.
- * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ - -package org.deeplearning4j.arbiter.optimize; - -import org.deeplearning4j.BaseDL4JTest; -import org.deeplearning4j.arbiter.optimize.api.CandidateGenerator; -import org.deeplearning4j.arbiter.optimize.api.score.ScoreFunction; -import org.deeplearning4j.arbiter.optimize.api.termination.MaxCandidatesCondition; -import org.deeplearning4j.arbiter.optimize.api.termination.TerminationCondition; -import org.deeplearning4j.arbiter.optimize.config.OptimizationConfiguration; -import org.deeplearning4j.arbiter.optimize.generator.GeneticSearchCandidateGenerator; -import org.deeplearning4j.arbiter.optimize.generator.genetic.exceptions.GeneticGenerationException; -import org.deeplearning4j.arbiter.optimize.generator.genetic.selection.SelectionOperator; -import org.deeplearning4j.arbiter.optimize.runner.CandidateInfo; -import org.deeplearning4j.arbiter.optimize.runner.CandidateStatus; -import org.deeplearning4j.arbiter.optimize.runner.IOptimizationRunner; -import org.deeplearning4j.arbiter.optimize.runner.LocalOptimizationRunner; -import org.deeplearning4j.arbiter.optimize.runner.listener.impl.LoggingStatusListener; -import org.junit.Assert; -import org.junit.Test; - -public class TestGeneticSearch extends BaseDL4JTest { - public class TestSelectionOperator extends SelectionOperator { - - @Override - public double[] buildNextGenes() { - throw new GeneticGenerationException("Forced exception to test exception handling."); - } - } - - public class TestTerminationCondition implements TerminationCondition { - - public boolean hasAFailedCandidate = false; - public int evalCount = 0; - - @Override - public void initialize(IOptimizationRunner optimizationRunner) {} - - @Override - public boolean terminate(IOptimizationRunner optimizationRunner) { - if (++evalCount == 50) { - // Generator did not handle GeneticGenerationException - return true; - } - - 
for (CandidateInfo candidateInfo : optimizationRunner.getCandidateStatus()) { - if (candidateInfo.getCandidateStatus() == CandidateStatus.Failed) { - hasAFailedCandidate = true; - return true; - } - } - - return false; - } - } - - @Test - public void GeneticSearchCandidateGenerator_getCandidate_ShouldGenerateCandidates() throws Exception { - - ScoreFunction scoreFunction = new BraninFunction.BraninScoreFunction(); - - //Define configuration: - CandidateGenerator candidateGenerator = - new GeneticSearchCandidateGenerator.Builder(new BraninFunction.BraninSpace(), scoreFunction) - .build(); - - TestTerminationCondition testTerminationCondition = new TestTerminationCondition(); - OptimizationConfiguration configuration = new OptimizationConfiguration.Builder() - .candidateGenerator(candidateGenerator).scoreFunction(scoreFunction) - .terminationConditions(new MaxCandidatesCondition(50), testTerminationCondition).build(); - - IOptimizationRunner runner = new LocalOptimizationRunner(configuration, new BraninFunction.BraninTaskCreator()); - - runner.addListeners(new LoggingStatusListener()); - runner.execute(); - - Assert.assertFalse(testTerminationCondition.hasAFailedCandidate); - } - - @Test - public void GeneticSearchCandidateGenerator_getCandidate_GeneticExceptionShouldMarkCandidateAsFailed() { - - ScoreFunction scoreFunction = new BraninFunction.BraninScoreFunction(); - - //Define configuration: - CandidateGenerator candidateGenerator = - new GeneticSearchCandidateGenerator.Builder(new BraninFunction.BraninSpace(), scoreFunction) - .selectionOperator(new TestSelectionOperator()).build(); - - TestTerminationCondition testTerminationCondition = new TestTerminationCondition(); - - OptimizationConfiguration configuration = new OptimizationConfiguration.Builder() - .candidateGenerator(candidateGenerator).scoreFunction(scoreFunction) - .terminationConditions(testTerminationCondition).build(); - - IOptimizationRunner runner = new LocalOptimizationRunner(configuration, new 
BraninFunction.BraninTaskCreator()); - - runner.addListeners(new LoggingStatusListener()); - runner.execute(); - - Assert.assertTrue(testTerminationCondition.hasAFailedCandidate); - } - -} diff --git a/contrib/attic/arbiter/arbiter-core/src/test/java/org/deeplearning4j/arbiter/optimize/TestGridSearch.java b/contrib/attic/arbiter/arbiter-core/src/test/java/org/deeplearning4j/arbiter/optimize/TestGridSearch.java deleted file mode 100644 index 1f7a2ad2b..000000000 --- a/contrib/attic/arbiter/arbiter-core/src/test/java/org/deeplearning4j/arbiter/optimize/TestGridSearch.java +++ /dev/null @@ -1,106 +0,0 @@ -/* - * ****************************************************************************** - * * Copyright (c) 2021 Deeplearning4j Contributors - * * - * * This program and the accompanying materials are made available under the - * * terms of the Apache License, Version 2.0 which is available at - * * https://www.apache.org/licenses/LICENSE-2.0. - * * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * * License for the specific language governing permissions and limitations - * * under the License. 
- * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ - -package org.deeplearning4j.arbiter.optimize; - -import org.deeplearning4j.BaseDL4JTest; -import org.deeplearning4j.arbiter.optimize.api.CandidateGenerator; -import org.deeplearning4j.arbiter.optimize.api.data.DataSetIteratorFactoryProvider; -import org.deeplearning4j.arbiter.optimize.generator.GridSearchCandidateGenerator; -import org.junit.Test; - -import java.util.HashMap; -import java.util.Map; - -import static org.junit.Assert.*; - -public class TestGridSearch extends BaseDL4JTest { - - @Test - public void testIndexing() { - int[] nValues = {2, 3}; - int prod = 2 * 3; - double[][] expVals = new double[][] {{0.0, 0.0}, {1.0, 0.0}, {0.0, 0.5}, {1.0, 0.5}, {0.0, 1.0}, {1.0, 1.0}}; - for (int i = 0; i < prod; i++) { - double[] out = GridSearchCandidateGenerator.indexToValues(nValues, i, prod); - double[] exp = expVals[i]; - assertArrayEquals(exp, out, 1e-4); - } - } - - @Test - public void testGeneration() throws Exception { - Map commands = new HashMap<>(); - commands.put(DataSetIteratorFactoryProvider.FACTORY_KEY, new HashMap<>()); - - //Define configuration: - CandidateGenerator candidateGenerator = new GridSearchCandidateGenerator(new BraninFunction.BraninSpace(), 4, - GridSearchCandidateGenerator.Mode.Sequential, commands); - - //Check sequential: - double[] expValuesFirst = {-5, 0, 5, 10}; //Range: -5 to +10, with 4 values - double[] expValuesSecond = {0, 5, 10, 15}; //Range: 0 to +15, with 4 values - for (int i = 0; i < 4 * 4; i++) { - BraninFunction.BraninConfig conf = (BraninFunction.BraninConfig) candidateGenerator.getCandidate().getValue(); - double expF = expValuesFirst[i % 4]; //Changes most rapidly - double expS = expValuesSecond[i / 4]; - - double actF = conf.getX1(); - double actS = conf.getX2(); - - assertEquals(expF, actF, 1e-4); - assertEquals(expS, actS, 1e-4); - } - - //Check random order. 
specifically: check that all values are generated, in some order - double[][] orderedOutput = new double[16][2]; - for (int i = 0; i < expValuesFirst.length; i++) { - for (int j = 0; j < expValuesSecond.length; j++) { - orderedOutput[4 * j + i][0] = expValuesFirst[i]; - orderedOutput[4 * j + i][1] = expValuesSecond[j]; - } - } - - - candidateGenerator = new GridSearchCandidateGenerator(new BraninFunction.BraninSpace(), 4, - GridSearchCandidateGenerator.Mode.RandomOrder, commands); - boolean[] seen = new boolean[16]; - int seenCount = 0; - for (int i = 0; i < 4 * 4; i++) { - assertTrue(candidateGenerator.hasMoreCandidates()); - BraninFunction.BraninConfig config = (BraninFunction.BraninConfig) candidateGenerator.getCandidate().getValue(); - double x1 = config.getX1(); - double x2 = config.getX2(); - //Work out which of the values this is... - boolean matched = false; - for (int j = 0; j < 16; j++) { - if (Math.abs(orderedOutput[j][0] - x1) < 1e-5 && Math.abs(orderedOutput[j][1] - x2) < 1e-5) { - matched = true; - if (seen[j]) - fail("Same candidate generated multiple times"); - seen[j] = true; - seenCount++; - break; - } - } - assertTrue("Candidate " + x1 + ", " + x2 + " not found; invalid?", matched); - } - assertFalse(candidateGenerator.hasMoreCandidates()); - assertEquals(16, seenCount); - } -} diff --git a/contrib/attic/arbiter/arbiter-core/src/test/java/org/deeplearning4j/arbiter/optimize/TestJson.java b/contrib/attic/arbiter/arbiter-core/src/test/java/org/deeplearning4j/arbiter/optimize/TestJson.java deleted file mode 100644 index 38eb09a41..000000000 --- a/contrib/attic/arbiter/arbiter-core/src/test/java/org/deeplearning4j/arbiter/optimize/TestJson.java +++ /dev/null @@ -1,124 +0,0 @@ -/* - * ****************************************************************************** - * * Copyright (c) 2021 Deeplearning4j Contributors - * * - * * This program and the accompanying materials are made available under the - * * terms of the Apache License, Version 2.0 which 
is available at - * * https://www.apache.org/licenses/LICENSE-2.0. - * * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * * License for the specific language governing permissions and limitations - * * under the License. - * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ - -package org.deeplearning4j.arbiter.optimize; - -import org.apache.commons.math3.distribution.LogNormalDistribution; -import org.apache.commons.math3.distribution.NormalDistribution; -import org.apache.commons.math3.distribution.UniformIntegerDistribution; -import org.deeplearning4j.BaseDL4JTest; -import org.deeplearning4j.arbiter.optimize.api.CandidateGenerator; -import org.deeplearning4j.arbiter.optimize.api.ParameterSpace; -import org.deeplearning4j.arbiter.optimize.api.data.DataSetIteratorFactoryProvider; -import org.deeplearning4j.arbiter.optimize.generator.GridSearchCandidateGenerator; -import org.deeplearning4j.arbiter.optimize.generator.RandomSearchGenerator; -import org.deeplearning4j.arbiter.optimize.parameter.BooleanSpace; -import org.deeplearning4j.arbiter.optimize.parameter.FixedValue; -import org.deeplearning4j.arbiter.optimize.parameter.continuous.ContinuousParameterSpace; -import org.deeplearning4j.arbiter.optimize.parameter.discrete.DiscreteParameterSpace; -import org.deeplearning4j.arbiter.optimize.parameter.integer.IntegerParameterSpace; -import org.junit.Test; -import org.nd4j.shade.jackson.annotation.JsonAutoDetect; -import org.nd4j.shade.jackson.annotation.PropertyAccessor; -import org.nd4j.shade.jackson.core.JsonFactory; -import org.nd4j.shade.jackson.databind.DeserializationFeature; -import org.nd4j.shade.jackson.databind.ObjectMapper; -import org.nd4j.shade.jackson.databind.SerializationFeature; -import 
org.nd4j.shade.jackson.dataformat.yaml.YAMLFactory; -import org.nd4j.shade.jackson.datatype.joda.JodaModule; - -import java.util.ArrayList; -import java.util.HashMap; -import java.util.List; -import java.util.Map; - -import static org.junit.Assert.assertEquals; - -/** - * Created by Alex on 02/02/2017. - */ -public class TestJson extends BaseDL4JTest { - - protected static ObjectMapper getObjectMapper(JsonFactory factory) { - ObjectMapper om = new ObjectMapper(factory); - om.registerModule(new JodaModule()); - om.configure(DeserializationFeature.FAIL_ON_UNKNOWN_PROPERTIES, false); - om.configure(SerializationFeature.FAIL_ON_EMPTY_BEANS, false); - om.enable(SerializationFeature.INDENT_OUTPUT); - om.setVisibility(PropertyAccessor.ALL, JsonAutoDetect.Visibility.NONE); - om.setVisibility(PropertyAccessor.FIELD, JsonAutoDetect.Visibility.ANY); - om.setVisibility(PropertyAccessor.CREATOR, JsonAutoDetect.Visibility.ANY); - return om; - } - - private static ObjectMapper jsonMapper = getObjectMapper(new JsonFactory()); - private static ObjectMapper yamlMapper = getObjectMapper(new YAMLFactory()); - - - @Test - public void testParameterSpaceJson() throws Exception { - - List> l = new ArrayList<>(); - l.add(new FixedValue<>(1.0)); - l.add(new FixedValue<>(1)); - l.add(new FixedValue<>("string")); - l.add(new ContinuousParameterSpace(-1, 1)); - l.add(new ContinuousParameterSpace(new LogNormalDistribution(1, 1))); - l.add(new ContinuousParameterSpace(new NormalDistribution(2, 0.01))); - l.add(new DiscreteParameterSpace<>(1, 5, 7)); - l.add(new DiscreteParameterSpace<>("first", "second", "third")); - l.add(new IntegerParameterSpace(0, 10)); - l.add(new IntegerParameterSpace(new UniformIntegerDistribution(0, 50))); - l.add(new BooleanSpace()); - - for (ParameterSpace ps : l) { - String strJson = jsonMapper.writeValueAsString(ps); - String strYaml = yamlMapper.writeValueAsString(ps); - - ParameterSpace fromJson = jsonMapper.readValue(strJson, ParameterSpace.class); - 
ParameterSpace fromYaml = yamlMapper.readValue(strYaml, ParameterSpace.class); - - assertEquals(ps, fromJson); - assertEquals(ps, fromYaml); - } - } - - @Test - public void testCandidateGeneratorJson() throws Exception { - Map commands = new HashMap<>(); - commands.put(DataSetIteratorFactoryProvider.FACTORY_KEY, new HashMap<>()); - - List l = new ArrayList<>(); - l.add(new GridSearchCandidateGenerator(new DiscreteParameterSpace<>(0, 1, 2, 3, 4, 5), 10, - GridSearchCandidateGenerator.Mode.Sequential, commands)); - l.add(new GridSearchCandidateGenerator(new DiscreteParameterSpace<>(0, 1, 2, 3, 4, 5), 10, - GridSearchCandidateGenerator.Mode.RandomOrder, commands)); - l.add(new RandomSearchGenerator(new DiscreteParameterSpace<>(0, 1, 2, 3, 4, 5), commands)); - - for (CandidateGenerator cg : l) { - String strJson = jsonMapper.writeValueAsString(cg); - String strYaml = yamlMapper.writeValueAsString(cg); - - CandidateGenerator fromJson = jsonMapper.readValue(strJson, CandidateGenerator.class); - CandidateGenerator fromYaml = yamlMapper.readValue(strYaml, CandidateGenerator.class); - - assertEquals(cg, fromJson); - assertEquals(cg, fromYaml); - } - } -} diff --git a/contrib/attic/arbiter/arbiter-core/src/test/java/org/deeplearning4j/arbiter/optimize/TestRandomSearch.java b/contrib/attic/arbiter/arbiter-core/src/test/java/org/deeplearning4j/arbiter/optimize/TestRandomSearch.java deleted file mode 100644 index 7cd79ccd7..000000000 --- a/contrib/attic/arbiter/arbiter-core/src/test/java/org/deeplearning4j/arbiter/optimize/TestRandomSearch.java +++ /dev/null @@ -1,63 +0,0 @@ -/* - * ****************************************************************************** - * * Copyright (c) 2021 Deeplearning4j Contributors - * * - * * This program and the accompanying materials are made available under the - * * terms of the Apache License, Version 2.0 which is available at - * * https://www.apache.org/licenses/LICENSE-2.0. 
- * * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * * License for the specific language governing permissions and limitations - * * under the License. - * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ - -package org.deeplearning4j.arbiter.optimize; - -import org.deeplearning4j.BaseDL4JTest; -import org.deeplearning4j.arbiter.optimize.api.CandidateGenerator; -import org.deeplearning4j.arbiter.optimize.api.data.DataSetIteratorFactoryProvider; -import org.deeplearning4j.arbiter.optimize.api.termination.MaxCandidatesCondition; -import org.deeplearning4j.arbiter.optimize.generator.RandomSearchGenerator; -import org.deeplearning4j.arbiter.optimize.config.OptimizationConfiguration; -import org.deeplearning4j.arbiter.optimize.runner.IOptimizationRunner; -import org.deeplearning4j.arbiter.optimize.runner.LocalOptimizationRunner; -import org.deeplearning4j.arbiter.optimize.runner.listener.impl.LoggingStatusListener; -import org.junit.Test; - -import java.util.HashMap; -import java.util.Map; - -/** - * - * Test random search on the Branin Function: - * http://www.sfu.ca/~ssurjano/branin.html - */ -public class TestRandomSearch extends BaseDL4JTest { - - @Test - public void test() throws Exception { - Map commands = new HashMap<>(); - commands.put(DataSetIteratorFactoryProvider.FACTORY_KEY, new HashMap<>()); - - //Define configuration: - CandidateGenerator candidateGenerator = new RandomSearchGenerator(new BraninFunction.BraninSpace(), commands); - OptimizationConfiguration configuration = new OptimizationConfiguration.Builder() - .candidateGenerator(candidateGenerator).scoreFunction(new BraninFunction.BraninScoreFunction()) - .terminationConditions(new MaxCandidatesCondition(50)).build(); - - 
IOptimizationRunner runner = new LocalOptimizationRunner(configuration, new BraninFunction.BraninTaskCreator()); - - runner.addListeners(new LoggingStatusListener()); - runner.execute(); - - -// System.out.println("----- Complete -----"); - } - - -} diff --git a/contrib/attic/arbiter/arbiter-core/src/test/java/org/deeplearning4j/arbiter/optimize/distribution/TestLogUniform.java b/contrib/attic/arbiter/arbiter-core/src/test/java/org/deeplearning4j/arbiter/optimize/distribution/TestLogUniform.java deleted file mode 100644 index 46c04f348..000000000 --- a/contrib/attic/arbiter/arbiter-core/src/test/java/org/deeplearning4j/arbiter/optimize/distribution/TestLogUniform.java +++ /dev/null @@ -1,72 +0,0 @@ -/* - * ****************************************************************************** - * * Copyright (c) 2021 Deeplearning4j Contributors - * * - * * This program and the accompanying materials are made available under the - * * terms of the Apache License, Version 2.0 which is available at - * * https://www.apache.org/licenses/LICENSE-2.0. - * * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * * License for the specific language governing permissions and limitations - * * under the License. 
- * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ - -package org.deeplearning4j.arbiter.optimize.distribution; - -import org.apache.commons.math3.distribution.RealDistribution; -import org.deeplearning4j.BaseDL4JTest; -import org.junit.Test; - -import static org.junit.Assert.assertEquals; -import static org.junit.Assert.assertTrue; - -public class TestLogUniform extends BaseDL4JTest { - - @Test - public void testSimple(){ - - double min = 0.5; - double max = 3; - - double logMin = Math.log(min); - double logMax = Math.log(max); - - RealDistribution rd = new LogUniformDistribution(min, max); - - for(double d = 0.1; d<= 3.5; d+= 0.1){ - double density = rd.density(d); - double cumulative = rd.cumulativeProbability(d); - double dExp; - double cumExp; - if(d < min){ - dExp = 0; - cumExp = 0; - } else if( d > max){ - dExp = 0; - cumExp = 1; - } else { - dExp = 1.0 / (d * (logMax-logMin)); - cumExp = (Math.log(d) - logMin) / (logMax - logMin); - } - - assertTrue(dExp >= 0); - assertTrue(cumExp >= 0); - assertTrue(cumExp <= 1.0); - assertEquals(dExp, density, 1e-5); - assertEquals(cumExp, cumulative, 1e-5); - } - - rd.reseedRandomGenerator(12345); - for( int i=0; i<100; i++ ){ - double d = rd.sample(); - assertTrue(d >= min); - assertTrue(d <= max); - } - } - -} diff --git a/contrib/attic/arbiter/arbiter-core/src/test/java/org/deeplearning4j/arbiter/optimize/genetic/TestCrossoverOperator.java b/contrib/attic/arbiter/arbiter-core/src/test/java/org/deeplearning4j/arbiter/optimize/genetic/TestCrossoverOperator.java deleted file mode 100644 index 972d80089..000000000 --- a/contrib/attic/arbiter/arbiter-core/src/test/java/org/deeplearning4j/arbiter/optimize/genetic/TestCrossoverOperator.java +++ /dev/null @@ -1,42 +0,0 @@ -/* - * ****************************************************************************** - * * Copyright (c) 2021 Deeplearning4j Contributors - * * - * * This program and the 
accompanying materials are made available under the - * * terms of the Apache License, Version 2.0 which is available at - * * https://www.apache.org/licenses/LICENSE-2.0. - * * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * * License for the specific language governing permissions and limitations - * * under the License. - * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ - -package org.deeplearning4j.arbiter.optimize.genetic; - -import org.deeplearning4j.arbiter.optimize.generator.genetic.crossover.CrossoverOperator; -import org.deeplearning4j.arbiter.optimize.generator.genetic.crossover.CrossoverResult; -import org.deeplearning4j.arbiter.optimize.generator.genetic.population.PopulationModel; - -public class TestCrossoverOperator extends CrossoverOperator { - - private final CrossoverResult[] results; - private int resultIdx = 0; - - public PopulationModel getPopulationModel() { - return populationModel; - } - - public TestCrossoverOperator(CrossoverResult[] results) { - this.results = results; - } - - @Override - public CrossoverResult crossover() { - return results[resultIdx++]; - } -} diff --git a/contrib/attic/arbiter/arbiter-core/src/test/java/org/deeplearning4j/arbiter/optimize/genetic/TestMutationOperator.java b/contrib/attic/arbiter/arbiter-core/src/test/java/org/deeplearning4j/arbiter/optimize/genetic/TestMutationOperator.java deleted file mode 100644 index c3f06201f..000000000 --- a/contrib/attic/arbiter/arbiter-core/src/test/java/org/deeplearning4j/arbiter/optimize/genetic/TestMutationOperator.java +++ /dev/null @@ -1,36 +0,0 @@ -/* - * ****************************************************************************** - * * Copyright (c) 2021 Deeplearning4j Contributors - * * - * * This 
program and the accompanying materials are made available under the - * * terms of the Apache License, Version 2.0 which is available at - * * https://www.apache.org/licenses/LICENSE-2.0. - * * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * * License for the specific language governing permissions and limitations - * * under the License. - * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ - -package org.deeplearning4j.arbiter.optimize.genetic; - -import org.deeplearning4j.arbiter.optimize.generator.genetic.mutation.MutationOperator; - -public class TestMutationOperator implements MutationOperator { - - private final boolean[] results; - private int resultIdx = 0; - - public TestMutationOperator(boolean[] results) { - this.results = results; - } - - @Override - public boolean mutate(double[] genes) { - return results[resultIdx++]; - } -} diff --git a/contrib/attic/arbiter/arbiter-core/src/test/java/org/deeplearning4j/arbiter/optimize/genetic/TestParentSelection.java b/contrib/attic/arbiter/arbiter-core/src/test/java/org/deeplearning4j/arbiter/optimize/genetic/TestParentSelection.java deleted file mode 100644 index e4b8de284..000000000 --- a/contrib/attic/arbiter/arbiter-core/src/test/java/org/deeplearning4j/arbiter/optimize/genetic/TestParentSelection.java +++ /dev/null @@ -1,54 +0,0 @@ -/* - * ****************************************************************************** - * * Copyright (c) 2021 Deeplearning4j Contributors - * * - * * This program and the accompanying materials are made available under the - * * terms of the Apache License, Version 2.0 which is available at - * * https://www.apache.org/licenses/LICENSE-2.0. 
- * * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * * License for the specific language governing permissions and limitations - * * under the License. - * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ - -package org.deeplearning4j.arbiter.optimize.genetic; - -import org.deeplearning4j.arbiter.optimize.generator.genetic.Chromosome; -import org.deeplearning4j.arbiter.optimize.generator.genetic.crossover.parentselection.TwoParentSelection; - -import java.util.List; - -public class TestParentSelection extends TwoParentSelection { - - public boolean hasBeenInitialized; - - private final double[][] parents; - - public TestParentSelection(double[][] parents) { - this.parents = parents; - } - - public TestParentSelection() { - this(null); - } - - @Override - public void initializeInstance(List population) { - super.initializeInstance(population); - hasBeenInitialized = true; - } - - @Override - public double[][] selectParents() { - return parents; - } - - public List getPopulation() { - return population; - } -} diff --git a/contrib/attic/arbiter/arbiter-core/src/test/java/org/deeplearning4j/arbiter/optimize/genetic/TestPopulationInitializer.java b/contrib/attic/arbiter/arbiter-core/src/test/java/org/deeplearning4j/arbiter/optimize/genetic/TestPopulationInitializer.java deleted file mode 100644 index 30ec31646..000000000 --- a/contrib/attic/arbiter/arbiter-core/src/test/java/org/deeplearning4j/arbiter/optimize/genetic/TestPopulationInitializer.java +++ /dev/null @@ -1,32 +0,0 @@ -/* - * ****************************************************************************** - * * Copyright (c) 2021 Deeplearning4j Contributors - * * - * * This program and the accompanying materials are made available under the - 
* * terms of the Apache License, Version 2.0 which is available at - * * https://www.apache.org/licenses/LICENSE-2.0. - * * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * * License for the specific language governing permissions and limitations - * * under the License. - * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ - -package org.deeplearning4j.arbiter.optimize.genetic; - -import org.deeplearning4j.arbiter.optimize.generator.genetic.Chromosome; -import org.deeplearning4j.arbiter.optimize.generator.genetic.population.PopulationInitializer; - -import java.util.ArrayList; -import java.util.List; - -public class TestPopulationInitializer implements PopulationInitializer { - @Override - public List getInitializedPopulation(int size) { - return new ArrayList<>(); - } -} diff --git a/contrib/attic/arbiter/arbiter-core/src/test/java/org/deeplearning4j/arbiter/optimize/genetic/TestRandomGenerator.java b/contrib/attic/arbiter/arbiter-core/src/test/java/org/deeplearning4j/arbiter/optimize/genetic/TestRandomGenerator.java deleted file mode 100644 index 11a8a4318..000000000 --- a/contrib/attic/arbiter/arbiter-core/src/test/java/org/deeplearning4j/arbiter/optimize/genetic/TestRandomGenerator.java +++ /dev/null @@ -1,90 +0,0 @@ -/* - * ****************************************************************************** - * * Copyright (c) 2021 Deeplearning4j Contributors - * * - * * This program and the accompanying materials are made available under the - * * terms of the Apache License, Version 2.0 which is available at - * * https://www.apache.org/licenses/LICENSE-2.0. 
- * * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * * License for the specific language governing permissions and limitations - * * under the License. - * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ - -package org.deeplearning4j.arbiter.optimize.genetic; - -import org.apache.commons.lang3.NotImplementedException; -import org.apache.commons.math3.random.RandomGenerator; - -public class TestRandomGenerator implements RandomGenerator { - private final int[] intRandomNumbers; - private int currentIntIdx = 0; - private final double[] doubleRandomNumbers; - private int currentDoubleIdx = 0; - - - public TestRandomGenerator(int[] intRandomNumbers, double[] doubleRandomNumbers) { - this.intRandomNumbers = intRandomNumbers; - this.doubleRandomNumbers = doubleRandomNumbers; - } - - @Override - public void setSeed(int i) { - - } - - @Override - public void setSeed(int[] ints) { - - } - - @Override - public void setSeed(long l) { - - } - - @Override - public void nextBytes(byte[] bytes) { - - } - - @Override - public int nextInt() { - return intRandomNumbers[currentIntIdx++]; - } - - @Override - public int nextInt(int i) { - return intRandomNumbers[currentIntIdx++]; - } - - @Override - public long nextLong() { - throw new NotImplementedException("Not implemented"); - } - - @Override - public boolean nextBoolean() { - throw new NotImplementedException("Not implemented"); - } - - @Override - public float nextFloat() { - throw new NotImplementedException("Not implemented"); - } - - @Override - public double nextDouble() { - return doubleRandomNumbers[currentDoubleIdx++]; - } - - @Override - public double nextGaussian() { - throw new NotImplementedException("Not implemented"); - } -} diff --git 
a/contrib/attic/arbiter/arbiter-core/src/test/java/org/deeplearning4j/arbiter/optimize/genetic/crossover/ArithmeticCrossoverTests.java b/contrib/attic/arbiter/arbiter-core/src/test/java/org/deeplearning4j/arbiter/optimize/genetic/crossover/ArithmeticCrossoverTests.java deleted file mode 100644 index d3b4a967c..000000000 --- a/contrib/attic/arbiter/arbiter-core/src/test/java/org/deeplearning4j/arbiter/optimize/genetic/crossover/ArithmeticCrossoverTests.java +++ /dev/null @@ -1,70 +0,0 @@ -/* - * ****************************************************************************** - * * Copyright (c) 2021 Deeplearning4j Contributors - * * - * * This program and the accompanying materials are made available under the - * * terms of the Apache License, Version 2.0 which is available at - * * https://www.apache.org/licenses/LICENSE-2.0. - * * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * * License for the specific language governing permissions and limitations - * * under the License. 
- * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ - -package org.deeplearning4j.arbiter.optimize.genetic.crossover; - -import org.apache.commons.math3.random.RandomGenerator; -import org.deeplearning4j.BaseDL4JTest; -import org.deeplearning4j.arbiter.optimize.generator.genetic.crossover.ArithmeticCrossover; -import org.deeplearning4j.arbiter.optimize.generator.genetic.crossover.CrossoverResult; -import org.deeplearning4j.arbiter.optimize.genetic.TestParentSelection; -import org.deeplearning4j.arbiter.optimize.genetic.TestRandomGenerator; -import org.junit.Assert; -import org.junit.Test; - -public class ArithmeticCrossoverTests extends BaseDL4JTest { - - @Test - public void ArithmeticCrossover_Crossover_OutsideCrossoverRate_ShouldReturnParent0() { - double[][] parents = new double[2][]; - parents[0] = new double[] {1.0}; - parents[1] = new double[] {2.0}; - - TestParentSelection parentSelection = new TestParentSelection(parents); - - RandomGenerator rng = new TestRandomGenerator(null, new double[] {1.0}); - - ArithmeticCrossover sut = - new ArithmeticCrossover.Builder().parentSelection(parentSelection).randomGenerator(rng).build(); - CrossoverResult result = sut.crossover(); - - Assert.assertFalse(result.isModified()); - Assert.assertEquals(1, result.getGenes().length); - Assert.assertEquals(1.0, result.getGenes()[0], 0.001); - } - - @Test - public void ArithmeticCrossover_Crossover_WithinCrossoverRate_ShouldReturnLinearCombination() { - double[][] parents = new double[2][]; - parents[0] = new double[] {1.0}; - parents[1] = new double[] {2.0}; - - TestParentSelection parentSelection = new TestParentSelection(parents); - - RandomGenerator rng = new TestRandomGenerator(null, new double[] {0.1, 0.1}); - - ArithmeticCrossover sut = - new ArithmeticCrossover.Builder().parentSelection(parentSelection).randomGenerator(rng).build(); - CrossoverResult result = sut.crossover(); - - 
Assert.assertTrue(result.isModified()); - Assert.assertEquals(1, result.getGenes().length); - Assert.assertEquals(1.9, result.getGenes()[0], 0.001); - } - -} diff --git a/contrib/attic/arbiter/arbiter-core/src/test/java/org/deeplearning4j/arbiter/optimize/genetic/crossover/CrossoverOperatorTests.java b/contrib/attic/arbiter/arbiter-core/src/test/java/org/deeplearning4j/arbiter/optimize/genetic/crossover/CrossoverOperatorTests.java deleted file mode 100644 index e04a311db..000000000 --- a/contrib/attic/arbiter/arbiter-core/src/test/java/org/deeplearning4j/arbiter/optimize/genetic/crossover/CrossoverOperatorTests.java +++ /dev/null @@ -1,45 +0,0 @@ -/* - * ****************************************************************************** - * * Copyright (c) 2021 Deeplearning4j Contributors - * * - * * This program and the accompanying materials are made available under the - * * terms of the Apache License, Version 2.0 which is available at - * * https://www.apache.org/licenses/LICENSE-2.0. - * * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * * License for the specific language governing permissions and limitations - * * under the License. 
- * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ - -package org.deeplearning4j.arbiter.optimize.genetic.crossover; - -import org.deeplearning4j.BaseDL4JTest; -import org.deeplearning4j.arbiter.optimize.generator.genetic.population.PopulationInitializer; -import org.deeplearning4j.arbiter.optimize.generator.genetic.population.PopulationModel; -import org.deeplearning4j.arbiter.optimize.genetic.TestCrossoverOperator; -import org.deeplearning4j.arbiter.optimize.genetic.TestPopulationInitializer; -import org.junit.Assert; -import org.junit.Test; - -public class CrossoverOperatorTests extends BaseDL4JTest { - - @Test - public void CrossoverOperator_initializeInstance_ShouldInitPopulationModel() throws IllegalAccessException { - TestCrossoverOperator sut = new TestCrossoverOperator(null); - - PopulationInitializer populationInitializer = new TestPopulationInitializer(); - - PopulationModel populationModel = - new PopulationModel.Builder().populationInitializer(populationInitializer).build(); - sut.initializeInstance(populationModel); - - Assert.assertSame(populationModel, sut.getPopulationModel()); - - - } -} diff --git a/contrib/attic/arbiter/arbiter-core/src/test/java/org/deeplearning4j/arbiter/optimize/genetic/crossover/CrossoverPointsGeneratorTests.java b/contrib/attic/arbiter/arbiter-core/src/test/java/org/deeplearning4j/arbiter/optimize/genetic/crossover/CrossoverPointsGeneratorTests.java deleted file mode 100644 index db6670fdd..000000000 --- a/contrib/attic/arbiter/arbiter-core/src/test/java/org/deeplearning4j/arbiter/optimize/genetic/crossover/CrossoverPointsGeneratorTests.java +++ /dev/null @@ -1,47 +0,0 @@ -/* - * ****************************************************************************** - * * Copyright (c) 2021 Deeplearning4j Contributors - * * - * * This program and the accompanying materials are made available under the - * * terms of the Apache License, Version 2.0 
which is available at - * * https://www.apache.org/licenses/LICENSE-2.0. - * * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * * License for the specific language governing permissions and limitations - * * under the License. - * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ - -package org.deeplearning4j.arbiter.optimize.genetic.crossover; - -import org.apache.commons.math3.random.RandomGenerator; -import org.deeplearning4j.BaseDL4JTest; -import org.deeplearning4j.arbiter.optimize.generator.genetic.crossover.utils.CrossoverPointsGenerator; -import org.deeplearning4j.arbiter.optimize.genetic.TestRandomGenerator; -import org.junit.Assert; -import org.junit.Test; - -import java.util.Deque; - -public class CrossoverPointsGeneratorTests extends BaseDL4JTest { - - @Test - public void CrossoverPointsGenerator_FixedNumberCrossovers() { - RandomGenerator rng = new TestRandomGenerator(new int[] {0}, null); - CrossoverPointsGenerator sut = new CrossoverPointsGenerator(10, 2, 2, rng); - - Deque result = sut.getCrossoverPoints(); - - Assert.assertEquals(3, result.size()); - int a = result.pop(); - int b = result.pop(); - int c = result.pop(); - Assert.assertTrue(a < b); - Assert.assertTrue(b < c); - Assert.assertEquals(Integer.MAX_VALUE, c); - } -} diff --git a/contrib/attic/arbiter/arbiter-core/src/test/java/org/deeplearning4j/arbiter/optimize/genetic/crossover/KPointCrossoverTests.java b/contrib/attic/arbiter/arbiter-core/src/test/java/org/deeplearning4j/arbiter/optimize/genetic/crossover/KPointCrossoverTests.java deleted file mode 100644 index 9eedaaf29..000000000 --- a/contrib/attic/arbiter/arbiter-core/src/test/java/org/deeplearning4j/arbiter/optimize/genetic/crossover/KPointCrossoverTests.java +++ 
/dev/null @@ -1,69 +0,0 @@ -/* - * ****************************************************************************** - * * Copyright (c) 2021 Deeplearning4j Contributors - * * - * * This program and the accompanying materials are made available under the - * * terms of the Apache License, Version 2.0 which is available at - * * https://www.apache.org/licenses/LICENSE-2.0. - * * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * * License for the specific language governing permissions and limitations - * * under the License. - * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ - -package org.deeplearning4j.arbiter.optimize.genetic.crossover; - -import org.apache.commons.math3.random.RandomGenerator; -import org.deeplearning4j.BaseDL4JTest; -import org.deeplearning4j.arbiter.optimize.generator.genetic.crossover.CrossoverResult; -import org.deeplearning4j.arbiter.optimize.generator.genetic.crossover.KPointCrossover; -import org.deeplearning4j.arbiter.optimize.generator.genetic.crossover.parentselection.TwoParentSelection; -import org.deeplearning4j.arbiter.optimize.genetic.TestParentSelection; -import org.deeplearning4j.arbiter.optimize.genetic.TestRandomGenerator; -import org.junit.Assert; -import org.junit.Test; - -public class KPointCrossoverTests extends BaseDL4JTest { - - @Test - public void KPointCrossover_BelowCrossoverRate_ShouldReturnParent0() { - RandomGenerator rng = new TestRandomGenerator(null, new double[] {1.0}); - - double[][] parents = new double[2][]; - parents[0] = new double[] {0.0}; - parents[1] = new double[] {1.0}; - TwoParentSelection parentSelection = new TestParentSelection(parents); - KPointCrossover sut = new KPointCrossover.Builder().randomGenerator(rng).crossoverRate(0.0) - 
.parentSelection(parentSelection).build(); - - CrossoverResult result = sut.crossover(); - - Assert.assertFalse(result.isModified()); - Assert.assertSame(parents[0], result.getGenes()); - } - - @Test - public void KPointCrossover_FixedNumberOfCrossovers() { - RandomGenerator rng = new TestRandomGenerator(new int[] {0, 1}, new double[] {0.0}); - - double[][] parents = new double[3][]; - parents[0] = new double[] {0.0, 0.0, 0.0, 0.0, 0.0}; - parents[1] = new double[] {1.0, 1.0, 1.0, 1.0, 1.0}; - parents[2] = new double[] {2.0, 2.0, 2.0, 2.0, 2.0}; - TwoParentSelection parentSelection = new TestParentSelection(parents); - KPointCrossover sut = new KPointCrossover.Builder().randomGenerator(rng).crossoverRate(1.0) - .parentSelection(parentSelection).numCrossovers(2).build(); - - CrossoverResult result = sut.crossover(); - - Assert.assertTrue(result.isModified()); - for (double x : result.getGenes()) { - Assert.assertTrue(x == 0.0 || x == 1.0); - } - } -} diff --git a/contrib/attic/arbiter/arbiter-core/src/test/java/org/deeplearning4j/arbiter/optimize/genetic/crossover/ParentSelectionTests.java b/contrib/attic/arbiter/arbiter-core/src/test/java/org/deeplearning4j/arbiter/optimize/genetic/crossover/ParentSelectionTests.java deleted file mode 100644 index 9083fa908..000000000 --- a/contrib/attic/arbiter/arbiter-core/src/test/java/org/deeplearning4j/arbiter/optimize/genetic/crossover/ParentSelectionTests.java +++ /dev/null @@ -1,41 +0,0 @@ -/* - * ****************************************************************************** - * * Copyright (c) 2021 Deeplearning4j Contributors - * * - * * This program and the accompanying materials are made available under the - * * terms of the Apache License, Version 2.0 which is available at - * * https://www.apache.org/licenses/LICENSE-2.0. 
- * * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * * License for the specific language governing permissions and limitations - * * under the License. - * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ - -package org.deeplearning4j.arbiter.optimize.genetic.crossover; - -import org.deeplearning4j.BaseDL4JTest; -import org.deeplearning4j.arbiter.optimize.generator.genetic.Chromosome; -import org.deeplearning4j.arbiter.optimize.genetic.TestParentSelection; -import org.junit.Assert; -import org.junit.Test; - -import java.util.ArrayList; -import java.util.List; - -public class ParentSelectionTests extends BaseDL4JTest { - - @Test - public void ParentSelection_InitializeInstance_ShouldInitPopulation() { - TestParentSelection sut = new TestParentSelection(); - - List population = new ArrayList<>(); - sut.initializeInstance(population); - - Assert.assertSame(population, sut.getPopulation()); - } -} diff --git a/contrib/attic/arbiter/arbiter-core/src/test/java/org/deeplearning4j/arbiter/optimize/genetic/crossover/RandomTwoParentSelectionTests.java b/contrib/attic/arbiter/arbiter-core/src/test/java/org/deeplearning4j/arbiter/optimize/genetic/crossover/RandomTwoParentSelectionTests.java deleted file mode 100644 index 04c40f606..000000000 --- a/contrib/attic/arbiter/arbiter-core/src/test/java/org/deeplearning4j/arbiter/optimize/genetic/crossover/RandomTwoParentSelectionTests.java +++ /dev/null @@ -1,49 +0,0 @@ -/* - * ****************************************************************************** - * * Copyright (c) 2021 Deeplearning4j Contributors - * * - * * This program and the accompanying materials are made available under the - * * terms of the Apache License, Version 2.0 which is available at - * * 
https://www.apache.org/licenses/LICENSE-2.0. - * * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * * License for the specific language governing permissions and limitations - * * under the License. - * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ - -package org.deeplearning4j.arbiter.optimize.genetic.crossover; - -import org.apache.commons.math3.random.RandomGenerator; -import org.deeplearning4j.BaseDL4JTest; -import org.deeplearning4j.arbiter.optimize.generator.genetic.Chromosome; -import org.deeplearning4j.arbiter.optimize.generator.genetic.crossover.parentselection.RandomTwoParentSelection; -import org.deeplearning4j.arbiter.optimize.genetic.TestRandomGenerator; -import org.junit.Assert; -import org.junit.Test; - -import java.util.ArrayList; -import java.util.List; - -public class RandomTwoParentSelectionTests extends BaseDL4JTest { - @Test - public void RandomTwoParentSelection_ShouldReturnTwoDifferentParents() { - RandomGenerator rng = new TestRandomGenerator(new int[] {1, 1, 1, 0}, null); - RandomTwoParentSelection sut = new RandomTwoParentSelection(rng); - - List population = new ArrayList<>(); - population.add(new Chromosome(new double[] {1, 1, 1}, 1.0)); - population.add(new Chromosome(new double[] {2, 2, 2}, 2.0)); - population.add(new Chromosome(new double[] {3, 3, 3}, 3.0)); - sut.initializeInstance(population); - - double[][] result = sut.selectParents(); - - Assert.assertSame(population.get(1).getGenes(), result[0]); - Assert.assertSame(population.get(0).getGenes(), result[1]); - } -} diff --git a/contrib/attic/arbiter/arbiter-core/src/test/java/org/deeplearning4j/arbiter/optimize/genetic/crossover/SinglePointCrossoverTests.java 
b/contrib/attic/arbiter/arbiter-core/src/test/java/org/deeplearning4j/arbiter/optimize/genetic/crossover/SinglePointCrossoverTests.java deleted file mode 100644 index b2d438c11..000000000 --- a/contrib/attic/arbiter/arbiter-core/src/test/java/org/deeplearning4j/arbiter/optimize/genetic/crossover/SinglePointCrossoverTests.java +++ /dev/null @@ -1,70 +0,0 @@ -/* - * ****************************************************************************** - * * Copyright (c) 2021 Deeplearning4j Contributors - * * - * * This program and the accompanying materials are made available under the - * * terms of the Apache License, Version 2.0 which is available at - * * https://www.apache.org/licenses/LICENSE-2.0. - * * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * * License for the specific language governing permissions and limitations - * * under the License. 
- * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ - -package org.deeplearning4j.arbiter.optimize.genetic.crossover; - -import org.apache.commons.math3.random.RandomGenerator; -import org.deeplearning4j.BaseDL4JTest; -import org.deeplearning4j.arbiter.optimize.generator.genetic.crossover.CrossoverResult; -import org.deeplearning4j.arbiter.optimize.generator.genetic.crossover.SinglePointCrossover; -import org.deeplearning4j.arbiter.optimize.genetic.TestParentSelection; -import org.deeplearning4j.arbiter.optimize.genetic.TestRandomGenerator; -import org.junit.Assert; -import org.junit.Test; - -public class SinglePointCrossoverTests extends BaseDL4JTest { - @Test - public void SinglePointCrossover_BelowCrossoverRate_ShouldReturnParent0() { - RandomGenerator rng = new TestRandomGenerator(null, new double[] {1.0}); - - double[][] parents = new double[2][]; - parents[0] = new double[] {1.0, 1.0, 1.0}; - parents[1] = new double[] {2.0, 2.0, 2.0}; - TestParentSelection parentSelection = new TestParentSelection(parents); - - SinglePointCrossover sut = new SinglePointCrossover.Builder().parentSelection(parentSelection) - .randomGenerator(rng).crossoverRate(0.0).build(); - - CrossoverResult result = sut.crossover(); - - Assert.assertFalse(result.isModified()); - Assert.assertSame(parents[0], result.getGenes()); - } - - @Test - public void SinglePointCrossover_ShouldReturnSingleSplit() { - RandomGenerator rng = new TestRandomGenerator(new int[] {2}, new double[] {0.1}); - - double[][] parents = new double[2][]; - parents[0] = new double[] {1.0, 1.0, 1.0}; - parents[1] = new double[] {2.0, 2.0, 2.0}; - TestParentSelection parentSelection = new TestParentSelection(parents); - - SinglePointCrossover sut = new SinglePointCrossover.Builder().parentSelection(parentSelection) - .randomGenerator(rng).crossoverRate(0.5).build(); - - CrossoverResult result = sut.crossover(); - - 
Assert.assertTrue(result.isModified()); - Assert.assertEquals(1.0, result.getGenes()[0], 0.0); - Assert.assertEquals(1.0, result.getGenes()[1], 0.0); - Assert.assertEquals(2.0, result.getGenes()[2], 0.0); - - } - -} diff --git a/contrib/attic/arbiter/arbiter-core/src/test/java/org/deeplearning4j/arbiter/optimize/genetic/crossover/TwoParentsCrossoverOperatorTests.java b/contrib/attic/arbiter/arbiter-core/src/test/java/org/deeplearning4j/arbiter/optimize/genetic/crossover/TwoParentsCrossoverOperatorTests.java deleted file mode 100644 index 6330a7329..000000000 --- a/contrib/attic/arbiter/arbiter-core/src/test/java/org/deeplearning4j/arbiter/optimize/genetic/crossover/TwoParentsCrossoverOperatorTests.java +++ /dev/null @@ -1,73 +0,0 @@ -/* - * ****************************************************************************** - * * Copyright (c) 2021 Deeplearning4j Contributors - * * - * * This program and the accompanying materials are made available under the - * * terms of the Apache License, Version 2.0 which is available at - * * https://www.apache.org/licenses/LICENSE-2.0. - * * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * * License for the specific language governing permissions and limitations - * * under the License. 
- * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ - -package org.deeplearning4j.arbiter.optimize.genetic.crossover; - -import org.apache.commons.lang3.NotImplementedException; -import org.deeplearning4j.BaseDL4JTest; -import org.deeplearning4j.arbiter.optimize.generator.genetic.crossover.CrossoverResult; -import org.deeplearning4j.arbiter.optimize.generator.genetic.crossover.TwoParentsCrossoverOperator; -import org.deeplearning4j.arbiter.optimize.generator.genetic.crossover.parentselection.TwoParentSelection; -import org.deeplearning4j.arbiter.optimize.generator.genetic.population.PopulationInitializer; -import org.deeplearning4j.arbiter.optimize.generator.genetic.population.PopulationModel; -import org.deeplearning4j.arbiter.optimize.genetic.TestParentSelection; -import org.deeplearning4j.arbiter.optimize.genetic.TestPopulationInitializer; -import org.junit.Assert; -import org.junit.Test; - -public class TwoParentsCrossoverOperatorTests extends BaseDL4JTest { - - class TestTwoParentsCrossoverOperator extends TwoParentsCrossoverOperator { - - public TestTwoParentsCrossoverOperator(TwoParentSelection parentSelection) { - super(parentSelection); - } - - public TwoParentSelection getParentSelection() { - return parentSelection; - } - - @Override - public CrossoverResult crossover() { - throw new NotImplementedException("Not implemented"); - } - } - - @Test - public void TwoParentsCrossoverOperator_ctor_ShouldInitParentSelection() { - TestParentSelection parentSelection = new TestParentSelection(); - TestTwoParentsCrossoverOperator sut = new TestTwoParentsCrossoverOperator(parentSelection); - - Assert.assertSame(parentSelection, sut.getParentSelection()); - } - - @Test - public void TwoParentsCrossoverOperator_initializeInstanceShouldInitializeParentSelection() { - TestParentSelection parentSelection = new TestParentSelection(); - TestTwoParentsCrossoverOperator sut = new 
TestTwoParentsCrossoverOperator(parentSelection); - - PopulationInitializer populationInitializer = new TestPopulationInitializer(); - PopulationModel populationModel = - new PopulationModel.Builder().populationInitializer(populationInitializer).build(); - - sut.initializeInstance(populationModel); - - Assert.assertTrue(parentSelection.hasBeenInitialized); - } - -} diff --git a/contrib/attic/arbiter/arbiter-core/src/test/java/org/deeplearning4j/arbiter/optimize/genetic/crossover/UniformCrossoverTests.java b/contrib/attic/arbiter/arbiter-core/src/test/java/org/deeplearning4j/arbiter/optimize/genetic/crossover/UniformCrossoverTests.java deleted file mode 100644 index b95416cfa..000000000 --- a/contrib/attic/arbiter/arbiter-core/src/test/java/org/deeplearning4j/arbiter/optimize/genetic/crossover/UniformCrossoverTests.java +++ /dev/null @@ -1,70 +0,0 @@ -/* - * ****************************************************************************** - * * Copyright (c) 2021 Deeplearning4j Contributors - * * - * * This program and the accompanying materials are made available under the - * * terms of the Apache License, Version 2.0 which is available at - * * https://www.apache.org/licenses/LICENSE-2.0. - * * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * * License for the specific language governing permissions and limitations - * * under the License. 
- * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ - -package org.deeplearning4j.arbiter.optimize.genetic.crossover; - -import org.apache.commons.math3.random.RandomGenerator; -import org.deeplearning4j.BaseDL4JTest; -import org.deeplearning4j.arbiter.optimize.generator.genetic.crossover.CrossoverResult; -import org.deeplearning4j.arbiter.optimize.generator.genetic.crossover.UniformCrossover; -import org.deeplearning4j.arbiter.optimize.genetic.TestParentSelection; -import org.deeplearning4j.arbiter.optimize.genetic.TestRandomGenerator; -import org.junit.Assert; -import org.junit.Test; - -public class UniformCrossoverTests extends BaseDL4JTest { - - @Test - public void UniformCrossover_BelowCrossoverRate_ShouldReturnParent0() { - RandomGenerator rng = new TestRandomGenerator(null, new double[] {1.0}); - - double[][] parents = new double[2][]; - parents[0] = new double[] {1.0, 1.0, 1.0}; - parents[1] = new double[] {2.0, 2.0, 2.0}; - TestParentSelection parentSelection = new TestParentSelection(parents); - - UniformCrossover sut = new UniformCrossover.Builder().parentSelection(parentSelection).randomGenerator(rng) - .crossoverRate(0.0).build(); - - CrossoverResult result = sut.crossover(); - - Assert.assertFalse(result.isModified()); - Assert.assertSame(parents[0], result.getGenes()); - } - - @Test - public void UniformCrossover_ShouldReturnMixedParents() { - RandomGenerator rng = new TestRandomGenerator(null, new double[] {0.1, 0.1, 0.3, 0.2}); - - double[][] parents = new double[2][]; - parents[0] = new double[] {1.0, 1.0, 1.0}; - parents[1] = new double[] {2.0, 2.0, 2.0}; - TestParentSelection parentSelection = new TestParentSelection(parents); - - UniformCrossover sut = new UniformCrossover.Builder().parentSelection(parentSelection).randomGenerator(rng) - .crossoverRate(0.5).parentBiasFactor(0.3).build(); - - CrossoverResult result = sut.crossover(); - - 
Assert.assertTrue(result.isModified()); - Assert.assertEquals(1.0, result.getGenes()[0], 0.0); - Assert.assertEquals(2.0, result.getGenes()[1], 0.0); - Assert.assertEquals(1.0, result.getGenes()[2], 0.0); - } - -} diff --git a/contrib/attic/arbiter/arbiter-core/src/test/java/org/deeplearning4j/arbiter/optimize/genetic/culling/LeastFitCullOperatorTests.java b/contrib/attic/arbiter/arbiter-core/src/test/java/org/deeplearning4j/arbiter/optimize/genetic/culling/LeastFitCullOperatorTests.java deleted file mode 100644 index b7d42dcbf..000000000 --- a/contrib/attic/arbiter/arbiter-core/src/test/java/org/deeplearning4j/arbiter/optimize/genetic/culling/LeastFitCullOperatorTests.java +++ /dev/null @@ -1,64 +0,0 @@ -/* - * ****************************************************************************** - * * Copyright (c) 2021 Deeplearning4j Contributors - * * - * * This program and the accompanying materials are made available under the - * * terms of the Apache License, Version 2.0 which is available at - * * https://www.apache.org/licenses/LICENSE-2.0. - * * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * * License for the specific language governing permissions and limitations - * * under the License. 
- * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ - -package org.deeplearning4j.arbiter.optimize.genetic.culling; - -import org.deeplearning4j.BaseDL4JTest; -import org.deeplearning4j.arbiter.optimize.generator.genetic.Chromosome; -import org.deeplearning4j.arbiter.optimize.generator.genetic.culling.LeastFitCullOperator; -import org.deeplearning4j.arbiter.optimize.generator.genetic.population.PopulationInitializer; -import org.deeplearning4j.arbiter.optimize.generator.genetic.population.PopulationModel; -import org.deeplearning4j.arbiter.optimize.genetic.TestPopulationInitializer; -import org.junit.Assert; -import org.junit.Test; - -import java.util.ArrayList; -import java.util.List; - -public class LeastFitCullOperatorTests extends BaseDL4JTest { - - @Test - public void LeastFitCullingOperation_ShouldCullLastElements() { - LeastFitCullOperator sut = new LeastFitCullOperator(0.50); - - PopulationInitializer populationInitializer = new TestPopulationInitializer(); - - PopulationModel populationModel = new PopulationModel.Builder().populationInitializer(populationInitializer) - .populationSize(10).build(); - sut.initializeInstance(populationModel); - - List originalChromosomes = new ArrayList<>(); - for (int i = 0; i < 10; ++i) { - originalChromosomes.add(new Chromosome(null, (double) i)); - } - - List chromosomes = populationModel.getPopulation(); - for (int i = 0; i < 10; ++i) { - chromosomes.add(originalChromosomes.get(i)); - } - - sut.cullPopulation(); - - Assert.assertEquals(5, chromosomes.size()); - for (int i = 0; i < 5; ++i) { - Assert.assertSame(originalChromosomes.get(i), chromosomes.get(i)); - } - } - - -} diff --git a/contrib/attic/arbiter/arbiter-core/src/test/java/org/deeplearning4j/arbiter/optimize/genetic/culling/RatioCullOperatorTests.java 
b/contrib/attic/arbiter/arbiter-core/src/test/java/org/deeplearning4j/arbiter/optimize/genetic/culling/RatioCullOperatorTests.java deleted file mode 100644 index 6f46df124..000000000 --- a/contrib/attic/arbiter/arbiter-core/src/test/java/org/deeplearning4j/arbiter/optimize/genetic/culling/RatioCullOperatorTests.java +++ /dev/null @@ -1,80 +0,0 @@ -/* - * ****************************************************************************** - * * Copyright (c) 2021 Deeplearning4j Contributors - * * - * * This program and the accompanying materials are made available under the - * * terms of the Apache License, Version 2.0 which is available at - * * https://www.apache.org/licenses/LICENSE-2.0. - * * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * * License for the specific language governing permissions and limitations - * * under the License. 
- * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ - -package org.deeplearning4j.arbiter.optimize.genetic.culling; - -import org.apache.commons.lang3.NotImplementedException; -import org.deeplearning4j.BaseDL4JTest; -import org.deeplearning4j.arbiter.optimize.generator.genetic.Chromosome; -import org.deeplearning4j.arbiter.optimize.generator.genetic.culling.RatioCullOperator; -import org.deeplearning4j.arbiter.optimize.generator.genetic.population.PopulationInitializer; -import org.deeplearning4j.arbiter.optimize.generator.genetic.population.PopulationModel; -import org.deeplearning4j.arbiter.optimize.genetic.TestPopulationInitializer; -import org.junit.Assert; -import org.junit.Test; - -import java.util.List; - -public class RatioCullOperatorTests extends BaseDL4JTest { - - class TestRatioCullOperator extends RatioCullOperator { - - public TestRatioCullOperator() { - super(); - } - - public TestRatioCullOperator(double ratio) { - super(ratio); - } - - public List getPopulation() { - return population; - } - - @Override - public void cullPopulation() { - throw new NotImplementedException("Not implemented"); - } - - public double getCullRatio() { - return cullRatio; - } - } - - @Test - public void RatioCullingOperation_ctorWithCullRatio_ShouldHaveParamRatio() { - TestRatioCullOperator sut = new TestRatioCullOperator(0.123); - - Assert.assertEquals(0.123, sut.getCullRatio(), 0.0); - } - - @Test - public void RatioCullingOperation_initialize_shouldSetCulledSizeAndPopulation() throws IllegalAccessException { - TestRatioCullOperator sut = new TestRatioCullOperator(0.50); - - PopulationInitializer populationInitializer = new TestPopulationInitializer(); - - PopulationModel populationModel = new PopulationModel.Builder().populationInitializer(populationInitializer) - .populationSize(10).build(); - sut.initializeInstance(populationModel); - - 
Assert.assertSame(populationModel.getPopulation(), sut.getPopulation()); - Assert.assertEquals(5, sut.getCulledSize()); - } - -} diff --git a/contrib/attic/arbiter/arbiter-core/src/test/java/org/deeplearning4j/arbiter/optimize/genetic/mutation/RandomMutationOperatorTests.java b/contrib/attic/arbiter/arbiter-core/src/test/java/org/deeplearning4j/arbiter/optimize/genetic/mutation/RandomMutationOperatorTests.java deleted file mode 100644 index c7150469c..000000000 --- a/contrib/attic/arbiter/arbiter-core/src/test/java/org/deeplearning4j/arbiter/optimize/genetic/mutation/RandomMutationOperatorTests.java +++ /dev/null @@ -1,74 +0,0 @@ -/* - * ****************************************************************************** - * * Copyright (c) 2021 Deeplearning4j Contributors - * * - * * This program and the accompanying materials are made available under the - * * terms of the Apache License, Version 2.0 which is available at - * * https://www.apache.org/licenses/LICENSE-2.0. - * * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * * License for the specific language governing permissions and limitations - * * under the License. 
- * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ - -package org.deeplearning4j.arbiter.optimize.genetic.mutation; - -import org.deeplearning4j.BaseDL4JTest; -import org.deeplearning4j.arbiter.optimize.generator.genetic.mutation.RandomMutationOperator; -import org.deeplearning4j.arbiter.optimize.genetic.TestRandomGenerator; -import org.junit.Assert; -import org.junit.Test; - -import java.lang.reflect.Field; - -public class RandomMutationOperatorTests extends BaseDL4JTest { - @Test - public void RandomMutationOperator_DefaultBuild_ShouldNotBeNull() { - RandomMutationOperator sut = new RandomMutationOperator.Builder().build(); - Assert.assertNotNull(sut); - } - - @Test - public void RandomMutationOperator_BuildWithMutationRate_ShouldUseSuppliedRate() throws Exception { - RandomMutationOperator sut = new RandomMutationOperator.Builder().mutationRate(0.123).build(); - - Field f = sut.getClass().getDeclaredField("mutationRate"); - f.setAccessible(true); - Double mutationRate = (Double) f.get(sut); - - Assert.assertEquals(0.123, mutationRate, 0.0); - } - - @Test - public void RandomMutationOperator_BelowMutationRate_ShouldNotMutate() { - double[] randomNumbers = new double[] {0.1, 1.0, 1.0}; - - RandomMutationOperator sut = new RandomMutationOperator.Builder().mutationRate(0.1) - .randomGenerator(new TestRandomGenerator(null, randomNumbers)).build(); - - double[] genes = new double[] {-1.0, -1.0, -1.0}; - boolean hasMutated = sut.mutate(genes); - - Assert.assertFalse(hasMutated); - Assert.assertArrayEquals(new double[]{-1.0, -1.0, -1.0}, genes, 0.0); - } - - @Test - public void RandomMutationOperator_AboveMutationRate_ShouldMutate() { - double[] randomNumbers = new double[] {0.099, 0.123, 1.0, 1.0}; - - RandomMutationOperator sut = new RandomMutationOperator.Builder().mutationRate(0.1) - .randomGenerator(new TestRandomGenerator(null, randomNumbers)).build(); - - double[] genes = new 
double[] {-1.0, -1.0, -1.0}; - boolean hasMutated = sut.mutate(genes); - - Assert.assertTrue(hasMutated); - Assert.assertArrayEquals(new double[]{0.123, -1.0, -1.0}, genes, 0.0); - } -} diff --git a/contrib/attic/arbiter/arbiter-core/src/test/java/org/deeplearning4j/arbiter/optimize/genetic/population/PopulationModelTests.java b/contrib/attic/arbiter/arbiter-core/src/test/java/org/deeplearning4j/arbiter/optimize/genetic/population/PopulationModelTests.java deleted file mode 100644 index 2c0c5d14c..000000000 --- a/contrib/attic/arbiter/arbiter-core/src/test/java/org/deeplearning4j/arbiter/optimize/genetic/population/PopulationModelTests.java +++ /dev/null @@ -1,197 +0,0 @@ -/* - * ****************************************************************************** - * * Copyright (c) 2021 Deeplearning4j Contributors - * * - * * This program and the accompanying materials are made available under the - * * terms of the Apache License, Version 2.0 which is available at - * * https://www.apache.org/licenses/LICENSE-2.0. - * * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * * License for the specific language governing permissions and limitations - * * under the License. 
- * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ - -package org.deeplearning4j.arbiter.optimize.genetic.population; - -import org.deeplearning4j.BaseDL4JTest; -import org.deeplearning4j.arbiter.optimize.generator.genetic.Chromosome; -import org.deeplearning4j.arbiter.optimize.generator.genetic.culling.CullOperator; -import org.deeplearning4j.arbiter.optimize.generator.genetic.population.PopulationInitializer; -import org.deeplearning4j.arbiter.optimize.generator.genetic.population.PopulationListener; -import org.deeplearning4j.arbiter.optimize.generator.genetic.population.PopulationModel; -import org.deeplearning4j.arbiter.optimize.genetic.TestPopulationInitializer; -import org.junit.Assert; -import org.junit.Test; - -import java.util.List; - -public class PopulationModelTests extends BaseDL4JTest { - - private class TestCullOperator implements CullOperator { - - private final int culledSize; - public boolean hasCulled = false; - - public TestCullOperator(int culledSize) { - this.culledSize = culledSize; - } - - @Override - public void initializeInstance(PopulationModel populationModel) { - - } - - @Override - public void cullPopulation() { - hasCulled = true; - } - - @Override - public int getCulledSize() { - return culledSize; - } - } - - private class TestPopulationListener implements PopulationListener { - - public List population; - - @Override - public void onChanged(List population) { - this.population = population; - } - } - - @Test - public void PopulationModel_IsReadyToBreed_NotReadyToBreed_ShouldReturnFalse() { - PopulationInitializer populationInitializer = new TestPopulationInitializer(); - - PopulationModel sut = new PopulationModel.Builder().populationInitializer(populationInitializer) - .populationSize(5).cullOperator(new TestCullOperator(2)).build(); - - boolean result = sut.isReadyToBreed(); - - Assert.assertFalse(result); - } - - @Test - public void 
PopulationModel_IsReadyToBreed_ReadyToBreed_ShouldReturnTrue() { - PopulationInitializer populationInitializer = new TestPopulationInitializer(); - - PopulationModel sut = new PopulationModel.Builder().populationInitializer(populationInitializer) - .populationSize(5).cullOperator(new TestCullOperator(1)).build(); - - sut.getPopulation().add(null); - - boolean result = sut.isReadyToBreed(); - - Assert.assertTrue(result); - } - - @Test - public void PopulationModel_Add_MaximizeScore_ShouldOrderDescendingPopulation() { - PopulationInitializer populationInitializer = new TestPopulationInitializer(); - - PopulationModel sut = new PopulationModel.Builder().populationInitializer(populationInitializer) - .populationSize(5).cullOperator(new TestCullOperator(2)).build(); - - sut.initializeInstance(false); - - Chromosome[] chromosomes = new Chromosome[3]; - chromosomes[0] = new Chromosome(new double[0], 1.0); - chromosomes[1] = new Chromosome(new double[0], 100.0); - chromosomes[2] = new Chromosome(new double[0], 10.0); - sut.add(chromosomes[0]); - sut.add(chromosomes[1]); - sut.add(chromosomes[2]); - - Assert.assertSame(chromosomes[1], sut.getPopulation().get(0)); - Assert.assertSame(chromosomes[2], sut.getPopulation().get(1)); - Assert.assertSame(chromosomes[0], sut.getPopulation().get(2)); - } - - @Test - public void PopulationModel_Add_MinimizeScore_ShouldOrderAscendingPopulation() { - PopulationInitializer populationInitializer = new TestPopulationInitializer(); - - PopulationModel sut = new PopulationModel.Builder().populationInitializer(populationInitializer) - .populationSize(5).cullOperator(new TestCullOperator(2)).build(); - - sut.initializeInstance(true); - - Chromosome[] chromosomes = new Chromosome[3]; - chromosomes[0] = new Chromosome(new double[0], 100.0); - chromosomes[1] = new Chromosome(new double[0], 1.0); - chromosomes[2] = new Chromosome(new double[0], 10.0); - sut.add(chromosomes[0]); - sut.add(chromosomes[1]); - sut.add(chromosomes[2]); - - 
Assert.assertSame(chromosomes[1], sut.getPopulation().get(0)); - Assert.assertSame(chromosomes[2], sut.getPopulation().get(1)); - Assert.assertSame(chromosomes[0], sut.getPopulation().get(2)); - } - - @Test - public void PopulationModel_Add_ShouldTriggerPopulationListeners() { - PopulationInitializer populationInitializer = new TestPopulationInitializer(); - - PopulationModel sut = new PopulationModel.Builder().populationInitializer(populationInitializer) - .populationSize(5).cullOperator(new TestCullOperator(2)).build(); - - sut.initializeInstance(true); - - TestPopulationListener populationListener = new TestPopulationListener(); - sut.addListener(populationListener); - - sut.add(new Chromosome(new double[0], 100.0)); - - Assert.assertSame(sut.getPopulation(), populationListener.population); - } - - @Test - public void PopulationModel_Add_BelowPopulationSize_ShouldNotCull() { - PopulationInitializer populationInitializer = new TestPopulationInitializer(); - - TestCullOperator cullOperator = new TestCullOperator(3); - - PopulationModel sut = new PopulationModel.Builder().populationInitializer(populationInitializer) - .populationSize(5).cullOperator(cullOperator).build(); - - sut.initializeInstance(true); - - sut.add(new Chromosome(new double[0], 1.0)); - sut.add(new Chromosome(new double[0], 2.0)); - sut.add(new Chromosome(new double[0], 3.0)); - sut.add(new Chromosome(new double[0], 4.0)); - sut.add(new Chromosome(new double[0], 5.0)); - - Assert.assertFalse(cullOperator.hasCulled); - } - - @Test - public void PopulationModel_Add_AbovePopulationSize_ShouldCull() { - PopulationInitializer populationInitializer = new TestPopulationInitializer(); - - TestCullOperator cullOperator = new TestCullOperator(3); - - PopulationModel sut = new PopulationModel.Builder().populationInitializer(populationInitializer) - .populationSize(5).cullOperator(cullOperator).build(); - - sut.initializeInstance(true); - - sut.add(new Chromosome(new double[0], 1.0)); - sut.add(new 
Chromosome(new double[0], 2.0)); - sut.add(new Chromosome(new double[0], 3.0)); - sut.add(new Chromosome(new double[0], 4.0)); - sut.add(new Chromosome(new double[0], 5.0)); - sut.add(new Chromosome(new double[0], 6.0)); - - Assert.assertTrue(cullOperator.hasCulled); - } -} diff --git a/contrib/attic/arbiter/arbiter-core/src/test/java/org/deeplearning4j/arbiter/optimize/genetic/selection/GeneticSelectionOperatorTests.java b/contrib/attic/arbiter/arbiter-core/src/test/java/org/deeplearning4j/arbiter/optimize/genetic/selection/GeneticSelectionOperatorTests.java deleted file mode 100644 index 51d92496c..000000000 --- a/contrib/attic/arbiter/arbiter-core/src/test/java/org/deeplearning4j/arbiter/optimize/genetic/selection/GeneticSelectionOperatorTests.java +++ /dev/null @@ -1,252 +0,0 @@ -/* - * ****************************************************************************** - * * Copyright (c) 2021 Deeplearning4j Contributors - * * - * * This program and the accompanying materials are made available under the - * * terms of the Apache License, Version 2.0 which is available at - * * https://www.apache.org/licenses/LICENSE-2.0. - * * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * * License for the specific language governing permissions and limitations - * * under the License. 
- * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ - -package org.deeplearning4j.arbiter.optimize.genetic.selection; - -import org.apache.commons.lang3.NotImplementedException; -import org.apache.commons.math3.random.RandomGenerator; -import org.deeplearning4j.BaseDL4JTest; -import org.deeplearning4j.arbiter.optimize.generator.genetic.ChromosomeFactory; -import org.deeplearning4j.arbiter.optimize.generator.genetic.crossover.CrossoverOperator; -import org.deeplearning4j.arbiter.optimize.generator.genetic.crossover.CrossoverResult; -import org.deeplearning4j.arbiter.optimize.generator.genetic.culling.CullOperator; -import org.deeplearning4j.arbiter.optimize.generator.genetic.exceptions.GeneticGenerationException; -import org.deeplearning4j.arbiter.optimize.generator.genetic.mutation.MutationOperator; -import org.deeplearning4j.arbiter.optimize.generator.genetic.population.PopulationInitializer; -import org.deeplearning4j.arbiter.optimize.generator.genetic.population.PopulationModel; -import org.deeplearning4j.arbiter.optimize.generator.genetic.selection.GeneticSelectionOperator; -import org.deeplearning4j.arbiter.optimize.genetic.TestCrossoverOperator; -import org.deeplearning4j.arbiter.optimize.genetic.TestMutationOperator; -import org.deeplearning4j.arbiter.optimize.genetic.TestPopulationInitializer; -import org.deeplearning4j.arbiter.optimize.genetic.TestRandomGenerator; -import org.junit.Assert; -import org.junit.Test; - -import static org.junit.Assert.assertArrayEquals; - -public class GeneticSelectionOperatorTests extends BaseDL4JTest { - - private class TestCullOperator implements CullOperator { - - private final int culledSize; - - public TestCullOperator(int culledSize) { - - this.culledSize = culledSize; - } - - @Override - public void initializeInstance(PopulationModel populationModel) { - - } - - @Override - public void cullPopulation() { - throw new 
NotImplementedException("Not implemented"); - } - - @Override - public int getCulledSize() { - return culledSize; - } - } - - private class GeneticSelectionOperatorTestsMutationOperator implements MutationOperator { - - private boolean mutateResult; - - public GeneticSelectionOperatorTestsMutationOperator(boolean mutateResult) { - - this.mutateResult = mutateResult; - } - - @Override - public boolean mutate(double[] genes) { - return mutateResult; - } - } - - private class GeneticSelectionOperatorTestsCrossoverOperator extends CrossoverOperator { - - private CrossoverResult result; - - public GeneticSelectionOperatorTestsCrossoverOperator(CrossoverResult result) { - - this.result = result; - } - - @Override - public CrossoverResult crossover() { - return result; - } - } - - @Test - public void GeneticSelectionOperator_PopulationNotReadyToBreed_ShouldReturnRandomGenes() { - RandomGenerator rng = new TestRandomGenerator(null, new double[] {123.0}); - - PopulationInitializer populationInitializer = new TestPopulationInitializer(); - - TestCullOperator cullOperator = new TestCullOperator(1000); - PopulationModel populationModel = new PopulationModel.Builder().populationInitializer(populationInitializer) - .cullOperator(cullOperator).build(); - ChromosomeFactory chromosomeFactory = new ChromosomeFactory(); - chromosomeFactory.initializeInstance(1); - GeneticSelectionOperator sut = new GeneticSelectionOperator.Builder().randomGenerator(rng).build(); - sut.initializeInstance(populationModel, chromosomeFactory); - - double[] newGenes = sut.buildNextGenes(); - - Assert.assertEquals(1, newGenes.length); - Assert.assertEquals(123.0, newGenes[0], 0.0); - } - - @Test - public void GeneticSelectionOperator_NoModificationOnFirstTry() { - RandomGenerator rng = new TestRandomGenerator(null, new double[] {123.0}); - - PopulationInitializer populationInitializer = new TestPopulationInitializer(); - - TestCullOperator cullOperator = new TestCullOperator(-1); - - PopulationModel 
populationModel = new PopulationModel.Builder().populationInitializer(populationInitializer) - .cullOperator(cullOperator).build(); - - ChromosomeFactory chromosomeFactory = new ChromosomeFactory(); - chromosomeFactory.initializeInstance(1); - - CrossoverResult[] crossoverResults = new CrossoverResult[2]; - crossoverResults[0] = new CrossoverResult(false, new double[0]); - crossoverResults[1] = new CrossoverResult(true, new double[0]); - TestCrossoverOperator crossoverOperator = new TestCrossoverOperator(crossoverResults); - - boolean[] mutationResults = new boolean[] {false, false}; - TestMutationOperator mutationOperator = new TestMutationOperator(mutationResults); - - GeneticSelectionOperator sut = new GeneticSelectionOperator.Builder().randomGenerator(rng) - .crossoverOperator(crossoverOperator).mutationOperator(mutationOperator).build(); - sut.initializeInstance(populationModel, chromosomeFactory); - - double[] newGenes = sut.buildNextGenes(); - - Assert.assertSame(crossoverResults[1].getGenes(), newGenes); - } - - @Test - public void GeneticSelectionOperator_MutationNoModificationOnFirstTry() { - RandomGenerator rng = new TestRandomGenerator(null, new double[] {123.0}); - - PopulationInitializer populationInitializer = new TestPopulationInitializer(); - - TestCullOperator cullOperator = new TestCullOperator(-1); - - PopulationModel populationModel = new PopulationModel.Builder().populationInitializer(populationInitializer) - .cullOperator(cullOperator).build(); - - ChromosomeFactory chromosomeFactory = new ChromosomeFactory(); - chromosomeFactory.initializeInstance(1); - - CrossoverResult[] crossoverResults = new CrossoverResult[3]; - crossoverResults[0] = new CrossoverResult(false, new double[0]); - crossoverResults[1] = new CrossoverResult(false, new double[0]); - crossoverResults[2] = new CrossoverResult(true, new double[0]); - TestCrossoverOperator crossoverOperator = new TestCrossoverOperator(crossoverResults); - - boolean[] mutationResults = new 
boolean[] {false, false, true}; - TestMutationOperator mutationOperator = new TestMutationOperator(mutationResults); - - GeneticSelectionOperator sut = new GeneticSelectionOperator.Builder().randomGenerator(rng) - .crossoverOperator(crossoverOperator).mutationOperator(mutationOperator).build(); - sut.initializeInstance(populationModel, chromosomeFactory); - - double[] newGenes = sut.buildNextGenes(); - - Assert.assertSame(crossoverResults[2].getGenes(), newGenes); - } - - @Test - public void GeneticSelectionOperator_ShouldNotBuildDuplicates() { - RandomGenerator rng = new TestRandomGenerator(null, new double[] {123.0}); - - PopulationInitializer populationInitializer = new TestPopulationInitializer(); - - TestCullOperator cullOperator = new TestCullOperator(-1); - - PopulationModel populationModel = new PopulationModel.Builder().populationInitializer(populationInitializer) - .cullOperator(cullOperator).build(); - - ChromosomeFactory chromosomeFactory = new ChromosomeFactory(); - chromosomeFactory.initializeInstance(1); - - CrossoverResult[] crossoverResults = new CrossoverResult[3]; - crossoverResults[0] = new CrossoverResult(true, new double[] {1.0}); - crossoverResults[1] = new CrossoverResult(true, new double[] {1.0}); - crossoverResults[2] = new CrossoverResult(true, new double[] {2.0}); - TestCrossoverOperator crossoverOperator = new TestCrossoverOperator(crossoverResults); - - boolean[] mutationResults = new boolean[] {false, false, false}; - TestMutationOperator mutationOperator = new TestMutationOperator(mutationResults); - - GeneticSelectionOperator sut = new GeneticSelectionOperator.Builder().randomGenerator(rng) - .crossoverOperator(crossoverOperator).mutationOperator(mutationOperator).build(); - sut.initializeInstance(populationModel, chromosomeFactory); - - double[] newGenes = sut.buildNextGenes(); - assertArrayEquals(crossoverResults[0].getGenes(), newGenes, 1e-6); - - newGenes = sut.buildNextGenes(); - 
assertArrayEquals(crossoverResults[2].getGenes(), newGenes, 1e-6); - } - - @Test(expected = GeneticGenerationException.class) - public void GeneticSelectionOperator_CrossoverAndMutationCantGenerateNew_ShouldThrow() { - TestCullOperator cullOperator = new TestCullOperator(-1); - - PopulationModel populationModel = new PopulationModel.Builder().cullOperator(cullOperator).build(); - - MutationOperator mutationOperator = new GeneticSelectionOperatorTestsMutationOperator(false); - CrossoverOperator crossoverOperator = - new GeneticSelectionOperatorTestsCrossoverOperator(new CrossoverResult(false, null)); - - GeneticSelectionOperator sut = new GeneticSelectionOperator.Builder().crossoverOperator(crossoverOperator) - .mutationOperator(mutationOperator).build(); - sut.initializeInstance(populationModel, null); - - sut.buildNextGenes(); - } - - @Test(expected = GeneticGenerationException.class) - public void GeneticSelectionOperator_CrossoverAndMutationAlwaysGenerateSame_ShouldThrow() { - TestCullOperator cullOperator = new TestCullOperator(-1); - - PopulationModel populationModel = new PopulationModel.Builder().cullOperator(cullOperator).build(); - - MutationOperator mutationOperator = new GeneticSelectionOperatorTestsMutationOperator(false); - CrossoverOperator crossoverOperator = new GeneticSelectionOperatorTestsCrossoverOperator( - new CrossoverResult(true, new double[] {1.0})); - - GeneticSelectionOperator sut = new GeneticSelectionOperator.Builder().crossoverOperator(crossoverOperator) - .mutationOperator(mutationOperator).build(); - sut.initializeInstance(populationModel, null); - - // This call is used to add the genes to the previousGenes collection - sut.buildNextGenes(); - - sut.buildNextGenes(); - } -} diff --git a/contrib/attic/arbiter/arbiter-core/src/test/java/org/deeplearning4j/arbiter/optimize/genetic/selection/SelectionOperatorTests.java 
b/contrib/attic/arbiter/arbiter-core/src/test/java/org/deeplearning4j/arbiter/optimize/genetic/selection/SelectionOperatorTests.java deleted file mode 100644 index daeb85e61..000000000 --- a/contrib/attic/arbiter/arbiter-core/src/test/java/org/deeplearning4j/arbiter/optimize/genetic/selection/SelectionOperatorTests.java +++ /dev/null @@ -1,62 +0,0 @@ -/* - * ****************************************************************************** - * * Copyright (c) 2021 Deeplearning4j Contributors - * * - * * This program and the accompanying materials are made available under the - * * terms of the Apache License, Version 2.0 which is available at - * * https://www.apache.org/licenses/LICENSE-2.0. - * * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * * License for the specific language governing permissions and limitations - * * under the License. 
- * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ - -package org.deeplearning4j.arbiter.optimize.genetic.selection; - -import org.apache.commons.lang3.NotImplementedException; -import org.deeplearning4j.BaseDL4JTest; -import org.deeplearning4j.arbiter.optimize.generator.genetic.ChromosomeFactory; -import org.deeplearning4j.arbiter.optimize.generator.genetic.population.PopulationInitializer; -import org.deeplearning4j.arbiter.optimize.generator.genetic.population.PopulationModel; -import org.deeplearning4j.arbiter.optimize.generator.genetic.selection.SelectionOperator; -import org.deeplearning4j.arbiter.optimize.genetic.TestPopulationInitializer; -import org.junit.Assert; -import org.junit.Test; - -public class SelectionOperatorTests extends BaseDL4JTest { - private class TestSelectionOperator extends SelectionOperator { - - public PopulationModel getPopulationModel() { - return populationModel; - } - - public ChromosomeFactory getChromosomeFactory() { - return chromosomeFactory; - } - - @Override - public double[] buildNextGenes() { - throw new NotImplementedException("Not implemented"); - } - } - - @Test - public void SelectionOperator_InitializeInstance_ShouldInitializeFields() { - TestSelectionOperator sut = new TestSelectionOperator(); - - PopulationInitializer populationInitializer = new TestPopulationInitializer(); - - PopulationModel populationModel = - new PopulationModel.Builder().populationInitializer(populationInitializer).build(); - ChromosomeFactory chromosomeFactory = new ChromosomeFactory(); - sut.initializeInstance(populationModel, chromosomeFactory); - - Assert.assertSame(populationModel, sut.getPopulationModel()); - Assert.assertSame(chromosomeFactory, sut.getChromosomeFactory()); - } -} diff --git a/contrib/attic/arbiter/arbiter-core/src/test/java/org/deeplearning4j/arbiter/optimize/parameter/TestParameterSpaces.java 
b/contrib/attic/arbiter/arbiter-core/src/test/java/org/deeplearning4j/arbiter/optimize/parameter/TestParameterSpaces.java deleted file mode 100644 index c10fbcdc3..000000000 --- a/contrib/attic/arbiter/arbiter-core/src/test/java/org/deeplearning4j/arbiter/optimize/parameter/TestParameterSpaces.java +++ /dev/null @@ -1,105 +0,0 @@ -/* - * ****************************************************************************** - * * Copyright (c) 2021 Deeplearning4j Contributors - * * - * * This program and the accompanying materials are made available under the - * * terms of the Apache License, Version 2.0 which is available at - * * https://www.apache.org/licenses/LICENSE-2.0. - * * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * * License for the specific language governing permissions and limitations - * * under the License. 
- * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ - -package org.deeplearning4j.arbiter.optimize.parameter; - -import org.apache.commons.math3.distribution.NormalDistribution; -import org.deeplearning4j.BaseDL4JTest; -import org.deeplearning4j.arbiter.optimize.api.ParameterSpace; -import org.deeplearning4j.arbiter.optimize.parameter.continuous.ContinuousParameterSpace; -import org.deeplearning4j.arbiter.optimize.parameter.discrete.DiscreteParameterSpace; -import org.deeplearning4j.arbiter.optimize.parameter.integer.IntegerParameterSpace; -import org.junit.Test; - -import static org.junit.Assert.*; - -public class TestParameterSpaces extends BaseDL4JTest { - - - @Test - public void testContinuousParameterSpace() { - - ContinuousParameterSpace cps = new ContinuousParameterSpace(0, 1); - cps.setIndices(0); - - for (int i = 0; i < 10; i++) { - double d = i / 10.0; - assertEquals(d, cps.getValue(new double[]{d}), 0.0); - } - - cps = new ContinuousParameterSpace(10, 20); - cps.setIndices(0); - - for (int i = 0; i < 10; i++) { - double d = i / 10.0; - double exp = d * 10 + 10; - assertEquals(exp, cps.getValue(new double[]{d}), 0.0); - } - - - cps = new ContinuousParameterSpace(new NormalDistribution(0, 1)); - NormalDistribution nd = new NormalDistribution(0, 1); - cps.setIndices(0); - for (int i = 0; i < 11; i++) { - double d = i / 10.0; - assertEquals(nd.inverseCumulativeProbability(d), cps.getValue(new double[]{d}), 1e-4); - } - } - - @Test - public void testDiscreteParameterSpace() { - ParameterSpace dps = new DiscreteParameterSpace<>(0, 1, 2, 3, 4); - dps.setIndices(0); - - for (int i = 0; i < 5; i++) { - double d = i / 5.0 + 0.1; //Center - double dEdgeLower = i / 5.0 + 1e-8; //Edge case: just above split threshold - double dEdgeUpper = (i + 1) / 5.0 - 1e-8; //Edge case: just below split threshold - assertEquals(i, (int) dps.getValue(new double[]{d})); - assertEquals(i, (int) 
dps.getValue(new double[]{dEdgeLower})); - assertEquals(i, (int) dps.getValue(new double[]{dEdgeUpper})); - } - } - - @Test - public void testIntegerParameterSpace() { - ParameterSpace ips = new IntegerParameterSpace(0, 4); - ips.setIndices(0); - - for (int i = 0; i < 5; i++) { - double d = i / 5.0 + 0.1; //Center - double dEdgeLower = i / 5.0 + 1e-8; //Edge case: just above split threshold - double dEdgeUpper = (i + 1) / 5.0 - 1e-8; //Edge case: just below split threshold - assertEquals(i, (int) ips.getValue(new double[]{d})); - assertEquals(i, (int) ips.getValue(new double[]{dEdgeLower})); - assertEquals(i, (int) ips.getValue(new double[]{dEdgeUpper})); - } - } - - @Test - public void testBooleanSpace() { - ParameterSpace bSpace = new BooleanSpace(); - bSpace.setIndices(1); //randomly setting to non zero - - assertTrue(bSpace.getValue(new double[]{0.0, 0.0})); - assertTrue(bSpace.getValue(new double[]{0.1, 0.5})); - assertFalse(bSpace.getValue(new double[]{0.2, 0.7})); - assertFalse(bSpace.getValue(new double[]{0.3, 1.0})); - } - -} diff --git a/contrib/attic/arbiter/arbiter-core/src/test/resources/logback.xml b/contrib/attic/arbiter/arbiter-core/src/test/resources/logback.xml deleted file mode 100644 index bc7ffbbb5..000000000 --- a/contrib/attic/arbiter/arbiter-core/src/test/resources/logback.xml +++ /dev/null @@ -1,55 +0,0 @@ - - - - - - logs/application.log - - %date - [%level] - from %logger in %thread - %n%message%n%xException%n - - - - - - %logger{15} - %message%n%xException{5} - - - - - - - - - - - - - - - - - - - - - - \ No newline at end of file diff --git a/contrib/attic/arbiter/arbiter-deeplearning4j/pom.xml b/contrib/attic/arbiter/arbiter-deeplearning4j/pom.xml deleted file mode 100644 index 520cb92a3..000000000 --- a/contrib/attic/arbiter/arbiter-deeplearning4j/pom.xml +++ /dev/null @@ -1,66 +0,0 @@ - - - - - - 4.0.0 - - - org.deeplearning4j - arbiter - 1.0.0-SNAPSHOT - - - arbiter-deeplearning4j - - arbiter-deeplearning4j - - - - org.deeplearning4j 
- arbiter-core - ${project.version} - - - org.deeplearning4j - deeplearning4j-core - - - org.nd4j - jackson - - - com.google.code.gson - gson - - - - - - test-nd4j-native - - - test-nd4j-cuda-11.0 - - - diff --git a/contrib/attic/arbiter/arbiter-deeplearning4j/src/main/java/org/deeplearning4j/arbiter/BaseNetworkSpace.java b/contrib/attic/arbiter/arbiter-deeplearning4j/src/main/java/org/deeplearning4j/arbiter/BaseNetworkSpace.java deleted file mode 100644 index a1e70bd67..000000000 --- a/contrib/attic/arbiter/arbiter-deeplearning4j/src/main/java/org/deeplearning4j/arbiter/BaseNetworkSpace.java +++ /dev/null @@ -1,625 +0,0 @@ -/* - * ****************************************************************************** - * * Copyright (c) 2021 Deeplearning4j Contributors - * * - * * This program and the accompanying materials are made available under the - * * terms of the Apache License, Version 2.0 which is available at - * * https://www.apache.org/licenses/LICENSE-2.0. - * * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * * License for the specific language governing permissions and limitations - * * under the License. 
- * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ - -package org.deeplearning4j.arbiter; - -import lombok.AllArgsConstructor; -import lombok.Data; -import lombok.EqualsAndHashCode; -import lombok.NoArgsConstructor; -import org.deeplearning4j.arbiter.adapter.ActivationParameterSpaceAdapter; -import org.deeplearning4j.arbiter.conf.dropout.DropoutSpace; -import org.deeplearning4j.arbiter.layers.LayerSpace; -import org.deeplearning4j.arbiter.optimize.api.AbstractParameterSpace; -import org.deeplearning4j.arbiter.optimize.api.ParameterSpace; -import org.deeplearning4j.arbiter.optimize.parameter.FixedValue; -import org.deeplearning4j.arbiter.optimize.serde.jackson.JsonMapper; -import org.deeplearning4j.arbiter.optimize.serde.jackson.YamlMapper; -import org.deeplearning4j.earlystopping.EarlyStoppingConfiguration; -import org.deeplearning4j.nn.api.OptimizationAlgorithm; -import org.deeplearning4j.nn.api.layers.LayerConstraint; -import org.deeplearning4j.nn.conf.BackpropType; -import org.deeplearning4j.nn.conf.ConvolutionMode; -import org.deeplearning4j.nn.conf.GradientNormalization; -import org.deeplearning4j.nn.conf.InputPreProcessor; -import org.deeplearning4j.nn.conf.NeuralNetConfiguration; -import org.deeplearning4j.nn.conf.distribution.Distribution; -import org.deeplearning4j.nn.conf.dropout.Dropout; -import org.deeplearning4j.nn.conf.dropout.IDropout; -import org.deeplearning4j.nn.conf.stepfunctions.StepFunction; -import org.deeplearning4j.nn.conf.weightnoise.IWeightNoise; -import org.deeplearning4j.nn.weights.WeightInit; -import org.nd4j.linalg.activations.Activation; -import org.nd4j.linalg.activations.IActivation; -import org.nd4j.linalg.learning.config.IUpdater; -import org.nd4j.shade.jackson.annotation.JsonTypeInfo; -import org.nd4j.shade.jackson.core.JsonProcessingException; - -import java.util.ArrayList; -import java.util.Arrays; -import java.util.LinkedList; -import 
java.util.List; -import java.util.Map; - -/** - * This is an abstract ParameterSpace for both MultiLayerNetworks (MultiLayerSpace) and ComputationGraph (ComputationGraphSpace) - *
- * Functionality here should match {@link org.deeplearning4j.nn.conf.NeuralNetConfiguration.Builder} - * - * @param Type of network (MultiLayerNetwork or ComputationGraph) - * @author Alex Black - */ -@EqualsAndHashCode(callSuper = false) -@JsonTypeInfo(use = JsonTypeInfo.Id.CLASS, include = JsonTypeInfo.As.PROPERTY, property = "@class") -@Data -public abstract class BaseNetworkSpace extends AbstractParameterSpace { - - protected Long seed; - protected ParameterSpace optimizationAlgo; - protected ParameterSpace activationFunction; - protected ParameterSpace biasInit; - protected ParameterSpace weightInit; - protected ParameterSpace dist; - protected ParameterSpace maxNumLineSearchIterations; - protected ParameterSpace miniBatch; - protected ParameterSpace minimize; - protected ParameterSpace stepFunction; - protected ParameterSpace l1; - protected ParameterSpace l2; - protected ParameterSpace l1Bias; - protected ParameterSpace l2Bias; - protected ParameterSpace updater; - protected ParameterSpace biasUpdater; - protected ParameterSpace weightNoise; - private ParameterSpace dropout; - protected ParameterSpace gradientNormalization; - protected ParameterSpace gradientNormalizationThreshold; - protected ParameterSpace convolutionMode; - - protected List layerSpaces = new ArrayList<>(); - - //NeuralNetConfiguration.ListBuilder/MultiLayerConfiguration.Builder options: - protected ParameterSpace backpropType; - protected ParameterSpace tbpttFwdLength; - protected ParameterSpace tbpttBwdLength; - - protected ParameterSpace> allParamConstraints; - protected ParameterSpace> weightConstraints; - protected ParameterSpace> biasConstraints; - - protected int numEpochs = 1; - - - static { - JsonMapper.getMapper().registerSubtypes(ComputationGraphSpace.class, MultiLayerSpace.class); - YamlMapper.getMapper().registerSubtypes(ComputationGraphSpace.class, MultiLayerSpace.class); - } - - @SuppressWarnings("unchecked") - protected BaseNetworkSpace(Builder builder) { - this.seed = 
builder.seed; - this.optimizationAlgo = builder.optimizationAlgo; - this.activationFunction = builder.activationFunction; - this.biasInit = builder.biasInit; - this.weightInit = builder.weightInit; - this.dist = builder.dist; - this.maxNumLineSearchIterations = builder.maxNumLineSearchIterations; - this.miniBatch = builder.miniBatch; - this.minimize = builder.minimize; - this.stepFunction = builder.stepFunction; - this.l1 = builder.l1; - this.l2 = builder.l2; - this.l1Bias = builder.l1Bias; - this.l2Bias = builder.l2Bias; - this.updater = builder.updater; - this.biasUpdater = builder.biasUpdater; - this.weightNoise = builder.weightNoise; - this.dropout = builder.dropout; - this.gradientNormalization = builder.gradientNormalization; - this.gradientNormalizationThreshold = builder.gradientNormalizationThreshold; - this.convolutionMode = builder.convolutionMode; - this.allParamConstraints = builder.allParamConstraints; - this.weightConstraints = builder.weightConstraints; - this.biasConstraints = builder.biasConstraints; - - this.backpropType = builder.backpropType; - this.tbpttFwdLength = builder.tbpttFwdLength; - this.tbpttBwdLength = builder.tbpttBwdLength; - - this.numEpochs = builder.numEpochs; - } - - protected BaseNetworkSpace() { - //Default constructor for Jackson json/yaml serialization - } - - - protected NeuralNetConfiguration.Builder randomGlobalConf(double[] values) { - //Create MultiLayerConfiguration... 
- NeuralNetConfiguration.Builder builder = new NeuralNetConfiguration.Builder(); - if (seed != null) - builder.seed(seed); - if (optimizationAlgo != null) - builder.optimizationAlgo(optimizationAlgo.getValue(values)); - if (activationFunction != null) - builder.activation(activationFunction.getValue(values)); - if (biasInit != null) - builder.biasInit(biasInit.getValue(values)); - if (weightInit != null) - builder.weightInit(weightInit.getValue(values)); - if (dist != null) - builder.dist(dist.getValue(values)); - if (maxNumLineSearchIterations != null) - builder.maxNumLineSearchIterations(maxNumLineSearchIterations.getValue(values)); - if (miniBatch != null) - builder.miniBatch(miniBatch.getValue(values)); - if (minimize != null) - builder.minimize(minimize.getValue(values)); - if (stepFunction != null) - builder.stepFunction(stepFunction.getValue(values)); - if (l1 != null) - builder.l1(l1.getValue(values)); - if (l2 != null) - builder.l2(l2.getValue(values)); - if (l1Bias != null) - builder.l1Bias(l1Bias.getValue(values)); - if (l2Bias != null) - builder.l2Bias(l2Bias.getValue(values)); - if (updater != null) - builder.updater(updater.getValue(values)); - if (biasUpdater != null) - builder.biasUpdater(biasUpdater.getValue(values)); - if (weightNoise != null) - builder.weightNoise(weightNoise.getValue(values)); - if (dropout != null) - builder.dropOut(dropout.getValue(values)); - if (gradientNormalization != null) - builder.gradientNormalization(gradientNormalization.getValue(values)); - if (gradientNormalizationThreshold != null) - builder.gradientNormalizationThreshold(gradientNormalizationThreshold.getValue(values)); - if (convolutionMode != null) - builder.convolutionMode(convolutionMode.getValue(values)); - if (allParamConstraints != null){ - List c = allParamConstraints.getValue(values); - if(c != null){ - builder.constrainAllParameters(c.toArray(new LayerConstraint[0])); - } - } - if (weightConstraints != null){ - List c = 
weightConstraints.getValue(values); - if(c != null){ - builder.constrainWeights(c.toArray(new LayerConstraint[0])); - } - } - if (biasConstraints != null){ - List<LayerConstraint> c = biasConstraints.getValue(values); - if(c != null){ - builder.constrainBias(c.toArray(new LayerConstraint[0])); - } - } - - return builder; - } - - @Override - public List<ParameterSpace> collectLeaves() { - Map<String, ParameterSpace> global = getNestedSpaces(); - //Note: the result on the previous line does NOT include the LayerSpaces, therefore we need to add these manually... - //This is because the type is a list, not a ParameterSpace - LinkedList<ParameterSpace> stack = new LinkedList<>(); - stack.add(this); - - for (LayerConf layerConf : layerSpaces) { - LayerSpace<?> ls = layerConf.getLayerSpace(); - stack.addAll(ls.collectLeaves()); - } - - List<ParameterSpace> out = new ArrayList<>(); - while (!stack.isEmpty()) { - ParameterSpace next = stack.removeLast(); - if (next.isLeaf()) { - out.add(next); - } else { - Map<String, ParameterSpace> m = next.getNestedSpaces(); - ParameterSpace[] arr = m.values().toArray(new ParameterSpace[0]); - for (int i = arr.length - 1; i >= 0; i--) { - stack.add(arr[i]); - } - } - } - return out; - } - - - @Override - public boolean isLeaf() { - return false; - } - - @Override - public void setIndices(int... 
indices) { - throw new UnsupportedOperationException("Cannot set indices for non leaf"); - } - - @Override - public String toString() { - StringBuilder sb = new StringBuilder(); - - for (Map.Entry e : getNestedSpaces().entrySet()) { - sb.append(e.getKey()).append(": ").append(e.getValue()).append("\n"); - } - - int i = 0; - for (LayerConf conf : layerSpaces) { - - sb.append("Layer config ").append(i++).append(": (Number layers:").append(conf.numLayers) - .append(", duplicate: ").append(conf.duplicateConfig).append("), ") - .append(conf.layerSpace.toString()).append("\n"); - } - - - return sb.toString(); - } - - @AllArgsConstructor - @Data - @NoArgsConstructor - public static class LayerConf { - protected LayerSpace layerSpace; - protected String layerName; - protected String[] inputs; - protected ParameterSpace numLayers; - protected boolean duplicateConfig; - protected InputPreProcessor preProcessor; - } - - @SuppressWarnings("unchecked") - protected abstract static class Builder> { - private Long seed; - private ParameterSpace optimizationAlgo; - private ParameterSpace activationFunction; - private ParameterSpace biasInit; - private ParameterSpace weightInit; - private ParameterSpace dist; - private ParameterSpace maxNumLineSearchIterations; - private ParameterSpace miniBatch; - private ParameterSpace minimize; - private ParameterSpace stepFunction; - private ParameterSpace l1; - private ParameterSpace l2; - private ParameterSpace l1Bias; - private ParameterSpace l2Bias; - private ParameterSpace updater; - private ParameterSpace biasUpdater; - private ParameterSpace weightNoise; - private ParameterSpace dropout; - private ParameterSpace gradientNormalization; - private ParameterSpace gradientNormalizationThreshold; - private ParameterSpace convolutionMode; - - private ParameterSpace> allParamConstraints; - private ParameterSpace> weightConstraints; - private ParameterSpace> biasConstraints; - - //NeuralNetConfiguration.ListBuilder/MultiLayerConfiguration.Builder 
options: - private ParameterSpace backpropType; - private ParameterSpace tbpttFwdLength; - private ParameterSpace tbpttBwdLength; - - //Early stopping configuration / (fixed) number of epochs: - private EarlyStoppingConfiguration earlyStoppingConfiguration; - private int numEpochs = 1; - - protected boolean validateOutputLayerConfig = true; - - public T seed(long seed) { - this.seed = seed; - return (T) this; - } - - public T optimizationAlgo(OptimizationAlgorithm optimizationAlgorithm) { - return optimizationAlgo(new FixedValue<>(optimizationAlgorithm)); - } - - public T optimizationAlgo(ParameterSpace parameterSpace) { - this.optimizationAlgo = parameterSpace; - return (T) this; - } - - - public T activation(Activation activationFunction) { - return activation(new FixedValue<>(activationFunction)); - } - - public T activation(ParameterSpace activationFunction) { - return activationFn(new ActivationParameterSpaceAdapter(activationFunction)); - } - - public T activationFn(ParameterSpace activationFunction) { - this.activationFunction = activationFunction; - return (T) this; - } - - public T biasInit(double biasInit){ - return biasInit(new FixedValue<>(biasInit)); - } - - public T biasInit(ParameterSpace biasInit){ - this.biasInit = biasInit; - return (T) this; - } - - public T weightInit(WeightInit weightInit) { - return weightInit(new FixedValue<>(weightInit)); - } - - public T weightInit(ParameterSpace weightInit) { - this.weightInit = weightInit; - return (T) this; - } - - public T dist(Distribution dist) { - return dist(new FixedValue<>(dist)); - } - - public T dist(ParameterSpace dist) { - this.dist = dist; - return (T) this; - } - - public T maxNumLineSearchIterations(int maxNumLineSearchIterations) { - return maxNumLineSearchIterations(new FixedValue<>(maxNumLineSearchIterations)); - } - - public T maxNumLineSearchIterations(ParameterSpace maxNumLineSearchIterations) { - this.maxNumLineSearchIterations = maxNumLineSearchIterations; - return (T) this; - } - - 
public T miniBatch(boolean minibatch) { - return miniBatch(new FixedValue<>(minibatch)); - } - - public T miniBatch(ParameterSpace miniBatch) { - this.miniBatch = miniBatch; - return (T) this; - } - - public T minimize(boolean minimize) { - return minimize(new FixedValue<>(minimize)); - } - - public T minimize(ParameterSpace minimize) { - this.minimize = minimize; - return (T) this; - } - - public T stepFunction(StepFunction stepFunction) { - return stepFunction(new FixedValue<>(stepFunction)); - } - - public T stepFunction(ParameterSpace stepFunction) { - this.stepFunction = stepFunction; - return (T) this; - } - - public T l1(double l1) { - return l1(new FixedValue<>(l1)); - } - - public T l1(ParameterSpace l1) { - this.l1 = l1; - return (T) this; - } - - public T l2(double l2) { - return l2(new FixedValue<>(l2)); - } - - public T l2(ParameterSpace l2) { - this.l2 = l2; - return (T) this; - } - public T l1Bias(double l1Bias) { - return l1Bias(new FixedValue<>(l1Bias)); - } - - public T l1Bias(ParameterSpace l1Bias) { - this.l1Bias = l1Bias; - return (T) this; - } - - public T l2Bias(double l2Bias) { - return l2Bias(new FixedValue<>(l2Bias)); - } - - public T l2Bias(ParameterSpace l2Bias) { - this.l2Bias = l2Bias; - return (T) this; - } - - public T updater(IUpdater updater){ - return updater(new FixedValue<>(updater)); - } - - public T updater(ParameterSpace updater) { - this.updater = updater; - return (T) this; - } - - public T biasUpdater(IUpdater biasUpdater){ - return biasUpdater(new FixedValue<>(biasUpdater)); - } - - public T biasUpdater(ParameterSpace biasUpdater){ - this.biasUpdater = biasUpdater; - return (T)this; - } - - public T weightNoise(IWeightNoise weightNoise){ - return weightNoise(new FixedValue<>(weightNoise)); - } - - public T weightNoise(ParameterSpace weightNoise){ - this.weightNoise = weightNoise; - return (T) this; - } - - public T dropOut(double dropout){ - return idropOut(new Dropout(dropout)); - } - - public T dropOut(ParameterSpace 
dropOut){ - return idropOut(new DropoutSpace(dropOut)); - } - - public T idropOut(IDropout idropOut){ - return idropOut(new FixedValue<>(idropOut)); - } - - public T idropOut(ParameterSpace idropOut){ - this.dropout = idropOut; - return (T) this; - } - - public T gradientNormalization(GradientNormalization gradientNormalization) { - return gradientNormalization(new FixedValue<>(gradientNormalization)); - } - - public T gradientNormalization(ParameterSpace gradientNormalization) { - this.gradientNormalization = gradientNormalization; - return (T) this; - } - - public T gradientNormalizationThreshold(double threshold) { - return gradientNormalizationThreshold(new FixedValue<>(threshold)); - } - - public T gradientNormalizationThreshold(ParameterSpace gradientNormalizationThreshold) { - this.gradientNormalizationThreshold = gradientNormalizationThreshold; - return (T) this; - } - - public T convolutionMode(ConvolutionMode convolutionMode) { - return convolutionMode(new FixedValue(convolutionMode)); - } - - public T convolutionMode(ParameterSpace convolutionMode) { - this.convolutionMode = convolutionMode; - return (T) this; - } - - public T backpropType(BackpropType backpropType) { - return backpropType(new FixedValue<>(backpropType)); - } - - public T backpropType(ParameterSpace backpropType) { - this.backpropType = backpropType; - return (T) this; - } - - public T tbpttFwdLength(int tbpttFwdLength) { - return tbpttFwdLength(new FixedValue<>(tbpttFwdLength)); - } - - public T tbpttFwdLength(ParameterSpace tbpttFwdLength) { - this.tbpttFwdLength = tbpttFwdLength; - return (T) this; - } - - public T tbpttBwdLength(int tbpttBwdLength) { - return tbpttBwdLength(new FixedValue<>(tbpttBwdLength)); - } - - public T tbpttBwdLength(ParameterSpace tbpttBwdLength) { - this.tbpttBwdLength = tbpttBwdLength; - return (T) this; - } - - public T constrainWeights(LayerConstraint... 
constraints){ - return constrainWeights(new FixedValue>(Arrays.asList(constraints))); - } - - public T constrainWeights(ParameterSpace> constraints){ - this.weightConstraints = constraints; - return (T) this; - } - - public T constrainBias(LayerConstraint... constraints){ - return constrainBias(new FixedValue>(Arrays.asList(constraints))); - } - - public T constrainBias(ParameterSpace> constraints){ - this.biasConstraints = constraints; - return (T) this; - } - - public T constrainAllParams(LayerConstraint... constraints){ - return constrainAllParams(new FixedValue>(Arrays.asList(constraints))); - } - - public T constrainAllParams(ParameterSpace> constraints){ - this.allParamConstraints = constraints; - return (T) this; - } - - public T validateOutputLayerConfig(boolean validate){ - this.validateOutputLayerConfig = validate; - return (T) this; - } - - /** - * Fixed number of training epochs. Default: 1 - * Note if both EarlyStoppingConfiguration and number of epochs is present, early stopping will be used in preference. - */ - public T numEpochs(int numEpochs) { - this.numEpochs = numEpochs; - return (T) this; - } - - - public abstract E build(); - } - - /** - * Return a json configuration of this configuration space. - * - * @return - */ - public String toJson() { - try { - return JsonMapper.getMapper().writeValueAsString(this); - } catch (JsonProcessingException e) { - throw new RuntimeException(e); - } - } - - /** - * Return a yaml configuration of this configuration space. 
- * - * @return - */ - public String toYaml() { - try { - return YamlMapper.getMapper().writeValueAsString(this); - } catch (JsonProcessingException e) { - throw new RuntimeException(e); - } - } - -} diff --git a/contrib/attic/arbiter/arbiter-deeplearning4j/src/main/java/org/deeplearning4j/arbiter/ComputationGraphSpace.java b/contrib/attic/arbiter/arbiter-deeplearning4j/src/main/java/org/deeplearning4j/arbiter/ComputationGraphSpace.java deleted file mode 100644 index e15984910..000000000 --- a/contrib/attic/arbiter/arbiter-deeplearning4j/src/main/java/org/deeplearning4j/arbiter/ComputationGraphSpace.java +++ /dev/null @@ -1,318 +0,0 @@ -/* - * ****************************************************************************** - * * Copyright (c) 2021 Deeplearning4j Contributors - * * - * * This program and the accompanying materials are made available under the - * * terms of the Apache License, Version 2.0 which is available at - * * https://www.apache.org/licenses/LICENSE-2.0. - * * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * * License for the specific language governing permissions and limitations - * * under the License. 
- * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ - -package org.deeplearning4j.arbiter; - -import lombok.*; -import org.deeplearning4j.arbiter.layers.LayerSpace; -import org.deeplearning4j.arbiter.layers.fixed.FixedLayerSpace; -import org.deeplearning4j.arbiter.optimize.api.ParameterSpace; -import org.deeplearning4j.arbiter.optimize.api.TaskCreatorProvider; -import org.deeplearning4j.arbiter.optimize.parameter.FixedValue; -import org.deeplearning4j.arbiter.optimize.serde.jackson.JsonMapper; -import org.deeplearning4j.arbiter.optimize.serde.jackson.YamlMapper; -import org.deeplearning4j.arbiter.task.ComputationGraphTaskCreator; -import org.deeplearning4j.arbiter.util.LeafUtils; -import org.deeplearning4j.earlystopping.EarlyStoppingConfiguration; -import org.deeplearning4j.nn.conf.ComputationGraphConfiguration; -import org.deeplearning4j.nn.conf.InputPreProcessor; -import org.deeplearning4j.nn.conf.NeuralNetConfiguration; -import org.deeplearning4j.nn.conf.WorkspaceMode; -import org.deeplearning4j.nn.conf.graph.GraphVertex; -import org.deeplearning4j.nn.conf.inputs.InputType; -import org.deeplearning4j.nn.conf.layers.Layer; -import org.deeplearning4j.nn.graph.ComputationGraph; -import org.nd4j.shade.jackson.annotation.JsonProperty; -import org.nd4j.shade.jackson.annotation.JsonTypeInfo; -import org.nd4j.shade.jackson.annotation.JsonTypeName; - -import java.io.IOException; -import java.util.ArrayList; -import java.util.Arrays; -import java.util.List; - -/** - * ComputationGraphSpace: Defines the space of valid hyperparameters for a ComputationGraph. 
- * Note that this is for fixed graph structures only - * - * @author Alex Black - */ -@NoArgsConstructor(access = AccessLevel.PRIVATE) //For Jackson JSON ser/de -@Data -@EqualsAndHashCode(callSuper = true) -@JsonTypeInfo(use = JsonTypeInfo.Id.NAME, include = JsonTypeInfo.As.PROPERTY, property = "@class") -@JsonTypeName("ComputationGraphSpace") -public class ComputationGraphSpace extends BaseNetworkSpace<GraphConfiguration> { - static { - TaskCreatorProvider.registerDefaultTaskCreatorClass(ComputationGraphSpace.class, ComputationGraphTaskCreator.class); - } - - @JsonProperty - protected List<LayerConf> layerSpaces = new ArrayList<>(); - @JsonProperty - protected List<VertexConf> vertices = new ArrayList<>(); - @JsonProperty - protected String[] networkInputs; - @JsonProperty - protected String[] networkOutputs; - @JsonProperty - protected ParameterSpace<InputType[]> inputTypes; - @JsonProperty - protected int numParameters; - @JsonProperty - protected WorkspaceMode trainingWorkspaceMode; - @JsonProperty - protected WorkspaceMode inferenceWorkspaceMode; - @JsonProperty - protected boolean validateOutputLayerConfig = true; - - //Early stopping configuration / (fixed) number of epochs: - protected EarlyStoppingConfiguration earlyStoppingConfiguration; - - protected ComputationGraphSpace(Builder builder) { - super(builder); - - this.earlyStoppingConfiguration = builder.earlyStoppingConfiguration; - this.layerSpaces = builder.layerList; - this.vertices = builder.vertexList; - - this.networkInputs = builder.networkInputs; - this.networkOutputs = builder.networkOutputs; - this.inputTypes = builder.inputTypes; - this.trainingWorkspaceMode = builder.trainingWorkspaceMode; - this.inferenceWorkspaceMode = builder.inferenceWorkspaceMode; - this.validateOutputLayerConfig = builder.validateOutputLayerConfig; - - //Determine total number of parameters: - List<ParameterSpace> list = LeafUtils.getUniqueObjects(collectLeaves()); - for (ParameterSpace ps : list) - numParameters += ps.numParameters(); - } - - - @Override - public GraphConfiguration 
getValue(double[] values) { - //Create ComputationGraphConfiguration... - NeuralNetConfiguration.Builder builder = randomGlobalConf(values); - - ComputationGraphConfiguration.GraphBuilder graphBuilder = builder.graphBuilder(); - graphBuilder.addInputs(this.networkInputs); - graphBuilder.setOutputs(this.networkOutputs); - if (inputTypes != null) - graphBuilder.setInputTypes(inputTypes.getValue(values)); - - //Build/add our layers and vertices: - for (LayerConf c : layerSpaces) { - org.deeplearning4j.nn.conf.layers.Layer l = c.layerSpace.getValue(values); - graphBuilder.addLayer(c.getLayerName(), l, c.getPreProcessor(), c.getInputs()); - } - for (VertexConf gv : vertices) { - graphBuilder.addVertex(gv.getVertexName(), gv.getGraphVertex(), gv.getInputs()); - } - - if (backpropType != null) - graphBuilder.backpropType(backpropType.getValue(values)); - if (tbpttFwdLength != null) - graphBuilder.tBPTTForwardLength(tbpttFwdLength.getValue(values)); - if (tbpttBwdLength != null) - graphBuilder.tBPTTBackwardLength(tbpttBwdLength.getValue(values)); - graphBuilder.validateOutputLayerConfig(validateOutputLayerConfig); - - ComputationGraphConfiguration configuration = graphBuilder.build(); - - if (trainingWorkspaceMode != null) - configuration.setTrainingWorkspaceMode(trainingWorkspaceMode); - if (inferenceWorkspaceMode != null) - configuration.setInferenceWorkspaceMode(inferenceWorkspaceMode); - - return new GraphConfiguration(configuration, earlyStoppingConfiguration, numEpochs); - } - - @Override - public int numParameters() { - return numParameters; - } - - @Override - public List collectLeaves() { - List list = super.collectLeaves(); - for (LayerConf lc : layerSpaces) { - list.addAll(lc.layerSpace.collectLeaves()); - } - if (inputTypes != null) - list.add(inputTypes); - return list; - } - - - @Override - public String toString() { - StringBuilder sb = new StringBuilder(super.toString()); - - for (LayerConf conf : layerSpaces) { - sb.append("Layer config: 
\"").append(conf.layerName).append("\", ").append(conf.layerSpace) - .append(", inputs: ").append(conf.inputs == null ? "[]" : Arrays.toString(conf.inputs)) - .append("\n"); - } - - for (VertexConf conf : vertices) { - sb.append("GraphVertex: \"").append(conf.vertexName).append("\", ").append(conf.graphVertex) - .append(", inputs: ").append(conf.inputs == null ? "[]" : Arrays.toString(conf.inputs)) - .append("\n"); - } - - if (earlyStoppingConfiguration != null) { - sb.append("Early stopping configuration:").append(earlyStoppingConfiguration.toString()).append("\n"); - } else { - sb.append("Training # epochs:").append(numEpochs).append("\n"); - } - - if (inputTypes != null) { - sb.append("Input types: ").append(inputTypes).append("\n"); - } - - return sb.toString(); - } - - @AllArgsConstructor - @Data - @NoArgsConstructor //For Jackson JSON - protected static class VertexConf { - protected GraphVertex graphVertex; - protected String vertexName; - protected String[] inputs; - } - - public static class Builder extends BaseNetworkSpace.Builder { - - protected List layerList = new ArrayList<>(); - protected List vertexList = new ArrayList<>(); - protected EarlyStoppingConfiguration earlyStoppingConfiguration; - protected String[] networkInputs; - protected String[] networkOutputs; - protected ParameterSpace inputTypes; - protected WorkspaceMode trainingWorkspaceMode; - protected WorkspaceMode inferenceWorkspaceMode; - - //Need: input types - //Early stopping configuration - //Graph nodes - - /** - * Early stopping configuration (optional). Note if both EarlyStoppingConfiguration and number of epochs is - * present, early stopping will be used in preference. - */ - public Builder earlyStoppingConfiguration( - EarlyStoppingConfiguration earlyStoppingConfiguration) { - this.earlyStoppingConfiguration = earlyStoppingConfiguration; - return this; - } - - public Builder layer(String layerName, LayerSpace layerSpace, String... 
layerInputs){ - return addLayer(layerName, layerSpace, layerInputs); - } - - public Builder layer(String layerName, LayerSpace layerSpace, InputPreProcessor preProcessor, - String... layerInputs) { - return addLayer(layerName, layerSpace, preProcessor, layerInputs); - } - - public Builder layer(String layerName, Layer layer, String... layerInputs){ - return layer(layerName, new FixedLayerSpace<>(layer), layerInputs); - } - - public Builder addLayer(String layerName, LayerSpace layerSpace, String... layerInputs) { - layerList.add(new LayerConf(layerSpace, layerName, layerInputs, new FixedValue<>(1), false, null)); - return this; - } - - public Builder addLayer(String layerName, LayerSpace layerSpace, InputPreProcessor preProcessor, - String... layerInputs){ - layerList.add(new LayerConf(layerSpace, layerName, layerInputs, new FixedValue<>(1), false, preProcessor)); - return this; - } - - public Builder addVertex(String vertexName, GraphVertex vertex, String... vertexInputs) { - vertexList.add(new VertexConf(vertex, vertexName, vertexInputs)); - return this; - } - - public Builder addInputs(String... networkInputs) { - this.networkInputs = networkInputs; - return this; - } - - public Builder setOutputs(String... networkOutputs) { - this.networkOutputs = networkOutputs; - return this; - } - - public Builder setInputTypes(InputType... 
inputTypes) { - return setInputTypes(new FixedValue(inputTypes)); - } - - public Builder setInputTypes(ParameterSpace inputTypes) { - this.inputTypes = inputTypes; - return this; - } - - public Builder trainingWorkspaceMode(WorkspaceMode workspaceMode){ - this.trainingWorkspaceMode = workspaceMode; - return this; - } - - public Builder inferenceWorkspaceMode(WorkspaceMode workspaceMode){ - this.inferenceWorkspaceMode = workspaceMode; - return this; - } - - @SuppressWarnings("unchecked") - public ComputationGraphSpace build() { - return new ComputationGraphSpace(this); - } - } - - - /** - * Instantiate a computation graph space from - * a raw json string - * @param json - * @return - */ - public static ComputationGraphSpace fromJson(String json) { - try { - return JsonMapper.getMapper().readValue(json, ComputationGraphSpace.class); - } catch (IOException e) { - throw new RuntimeException(e); - } - } - - /** - * Instantiate a computation graph space - * from a raw yaml string - * @param yaml - * @return - */ - public static ComputationGraphSpace fromYaml(String yaml) { - try { - return YamlMapper.getMapper().readValue(yaml, ComputationGraphSpace.class); - } catch (IOException e) { - throw new RuntimeException(e); - } - } -} diff --git a/contrib/attic/arbiter/arbiter-deeplearning4j/src/main/java/org/deeplearning4j/arbiter/DL4JConfiguration.java b/contrib/attic/arbiter/arbiter-deeplearning4j/src/main/java/org/deeplearning4j/arbiter/DL4JConfiguration.java deleted file mode 100644 index 3ec05e52b..000000000 --- a/contrib/attic/arbiter/arbiter-deeplearning4j/src/main/java/org/deeplearning4j/arbiter/DL4JConfiguration.java +++ /dev/null @@ -1,75 +0,0 @@ -/* - * ****************************************************************************** - * * Copyright (c) 2021 Deeplearning4j Contributors - * * - * * This program and the accompanying materials are made available under the - * * terms of the Apache License, Version 2.0 which is available at - * * 
https://www.apache.org/licenses/LICENSE-2.0. - * * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * * License for the specific language governing permissions and limitations - * * under the License. - * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ - -package org.deeplearning4j.arbiter; - -import lombok.AllArgsConstructor; -import lombok.Data; -import org.deeplearning4j.arbiter.optimize.serde.jackson.JsonMapper; -import org.deeplearning4j.arbiter.optimize.serde.jackson.YamlMapper; -import org.deeplearning4j.earlystopping.EarlyStoppingConfiguration; -import org.deeplearning4j.nn.conf.MultiLayerConfiguration; -import org.nd4j.shade.jackson.core.JsonProcessingException; -import org.nd4j.shade.jackson.databind.annotation.JsonSerialize; - -import java.io.Serializable; - -/** - * DL4JConfiguration: simple configuration method that contains the following:
- * - MultiLayerConfiguration<br>
- * - Early stopping settings, OR number of epochs<br>
- * Note: if early stopping configuration is absent, a fixed number of epochs (default: 1) will be used. - * If both early stopping and number of epochs are present, early stopping will be used. - */ -@AllArgsConstructor -@Data -public class DL4JConfiguration implements Serializable { - @JsonSerialize - private MultiLayerConfiguration multiLayerConfiguration; - @JsonSerialize - private EarlyStoppingConfiguration earlyStoppingConfiguration; - @JsonSerialize - private Integer numEpochs; - - - /** - * Yaml mapping - * @return - */ - public String toYaml() { - try { - return YamlMapper.getMapper().writeValueAsString(this); - } catch (JsonProcessingException e) { - throw new RuntimeException(e); - } - } - - /** - * Json mapping - * @return - */ - public String toJson() { - try { - return JsonMapper.getMapper().writeValueAsString(this); - } catch (JsonProcessingException e) { - throw new RuntimeException(e); - } - } - - -} diff --git a/contrib/attic/arbiter/arbiter-deeplearning4j/src/main/java/org/deeplearning4j/arbiter/GraphConfiguration.java b/contrib/attic/arbiter/arbiter-deeplearning4j/src/main/java/org/deeplearning4j/arbiter/GraphConfiguration.java deleted file mode 100644 index 1f43884e5..000000000 --- a/contrib/attic/arbiter/arbiter-deeplearning4j/src/main/java/org/deeplearning4j/arbiter/GraphConfiguration.java +++ /dev/null @@ -1,69 +0,0 @@ -/* - * ****************************************************************************** - * * Copyright (c) 2021 Deeplearning4j Contributors - * * - * * This program and the accompanying materials are made available under the - * * terms of the Apache License, Version 2.0 which is available at - * * https://www.apache.org/licenses/LICENSE-2.0. - * * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. 
See the - * * License for the specific language governing permissions and limitations - * * under the License. - * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ - -package org.deeplearning4j.arbiter; - - -import lombok.AllArgsConstructor; -import lombok.Data; -import org.deeplearning4j.arbiter.optimize.serde.jackson.JsonMapper; -import org.deeplearning4j.arbiter.optimize.serde.jackson.YamlMapper; -import org.deeplearning4j.earlystopping.EarlyStoppingConfiguration; -import org.deeplearning4j.nn.conf.ComputationGraphConfiguration; -import org.deeplearning4j.nn.graph.ComputationGraph; -import org.nd4j.shade.jackson.core.JsonProcessingException; - -import java.io.Serializable; - -/** - * Analogous to {@link DL4JConfiguration}, GraphConfiguration includes a configuration for ComputationGraphs, as well - * as early stopping (or, optionally numEpochs) fields. - */ -@AllArgsConstructor -@Data -public class GraphConfiguration implements Serializable { - private ComputationGraphConfiguration configuration; - private EarlyStoppingConfiguration earlyStoppingConfiguration; - private Integer numEpochs; - - - - /** - * Yaml mapping - * @return - */ - public String toYaml() { - try { - return YamlMapper.getMapper().writeValueAsString(this); - } catch (JsonProcessingException e) { - throw new RuntimeException(e); - } - } - - /** - * Json mapping - * @return - */ - public String toJson() { - try { - return JsonMapper.getMapper().writeValueAsString(this); - } catch (JsonProcessingException e) { - throw new RuntimeException(e); - } - } -} diff --git a/contrib/attic/arbiter/arbiter-deeplearning4j/src/main/java/org/deeplearning4j/arbiter/MultiLayerSpace.java b/contrib/attic/arbiter/arbiter-deeplearning4j/src/main/java/org/deeplearning4j/arbiter/MultiLayerSpace.java deleted file mode 100644 index af1f919f7..000000000 --- 
a/contrib/attic/arbiter/arbiter-deeplearning4j/src/main/java/org/deeplearning4j/arbiter/MultiLayerSpace.java +++ /dev/null @@ -1,322 +0,0 @@ -/* - * ****************************************************************************** - * * Copyright (c) 2021 Deeplearning4j Contributors - * * - * * This program and the accompanying materials are made available under the - * * terms of the Apache License, Version 2.0 which is available at - * * https://www.apache.org/licenses/LICENSE-2.0. - * * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * * License for the specific language governing permissions and limitations - * * under the License. - * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ - -package org.deeplearning4j.arbiter; - -import lombok.Data; -import lombok.EqualsAndHashCode; -import org.deeplearning4j.arbiter.layers.LayerSpace; -import org.deeplearning4j.arbiter.layers.fixed.FixedLayerSpace; -import org.deeplearning4j.arbiter.optimize.api.ParameterSpace; -import org.deeplearning4j.arbiter.optimize.api.TaskCreatorProvider; -import org.deeplearning4j.arbiter.optimize.parameter.FixedValue; -import org.deeplearning4j.arbiter.optimize.serde.jackson.JsonMapper; -import org.deeplearning4j.arbiter.optimize.serde.jackson.YamlMapper; -import org.deeplearning4j.arbiter.task.MultiLayerNetworkTaskCreator; -import org.deeplearning4j.arbiter.util.LeafUtils; -import org.deeplearning4j.earlystopping.EarlyStoppingConfiguration; -import org.deeplearning4j.nn.conf.InputPreProcessor; -import org.deeplearning4j.nn.conf.MultiLayerConfiguration; -import org.deeplearning4j.nn.conf.NeuralNetConfiguration; -import org.deeplearning4j.nn.conf.WorkspaceMode; -import org.deeplearning4j.nn.conf.inputs.InputType; -import 
org.deeplearning4j.nn.conf.layers.Layer; -import org.deeplearning4j.nn.multilayer.MultiLayerNetwork; -import org.nd4j.shade.jackson.annotation.JsonProperty; - -import java.io.IOException; -import java.util.ArrayList; -import java.util.List; -import java.util.Map; - -@Data -@EqualsAndHashCode(callSuper = true) -public class MultiLayerSpace extends BaseNetworkSpace { - - static { - TaskCreatorProvider.registerDefaultTaskCreatorClass(MultiLayerSpace.class, MultiLayerNetworkTaskCreator.class); - } - - @JsonProperty - protected ParameterSpace inputType; - @JsonProperty - protected ParameterSpace> inputPreProcessors; - - //Early stopping configuration / (fixed) number of epochs: - @JsonProperty - protected EarlyStoppingConfiguration earlyStoppingConfiguration; - @JsonProperty - protected int numParameters; - @JsonProperty - protected WorkspaceMode trainingWorkspaceMode; - @JsonProperty - protected WorkspaceMode inferenceWorkspaceMode; - @JsonProperty - protected boolean validateOutputLayerConfig = true; - - - protected MultiLayerSpace(Builder builder) { - super(builder); - this.inputType = builder.inputType; - this.inputPreProcessors = builder.inputPreProcessors; - - this.earlyStoppingConfiguration = builder.earlyStoppingConfiguration; - - this.layerSpaces = builder.layerSpaces; - - //Determine total number of parameters: - //Collect the leaves, and make sure they are unique. 
- //Note that the *object instances* must be unique - and consequently we don't want to use .equals(), as - // this would incorrectly filter out equal range parameter spaces - List<ParameterSpace> allLeaves = collectLeaves(); - List<ParameterSpace> list = LeafUtils.getUniqueObjects(allLeaves); - - for (ParameterSpace ps : list) { - numParameters += ps.numParameters(); - } - - this.trainingWorkspaceMode = builder.trainingWorkspaceMode; - this.inferenceWorkspaceMode = builder.inferenceWorkspaceMode; - this.validateOutputLayerConfig = builder.validateOutputLayerConfig; - } - - protected MultiLayerSpace() { - //Default constructor for Jackson json/yaml serialization - } - - @Override - public DL4JConfiguration getValue(double[] values) { - //First: create layer configs - List<org.deeplearning4j.nn.conf.layers.Layer> layers = new ArrayList<>(); - for (LayerConf c : layerSpaces) { - int n = c.numLayers.getValue(values); - if (c.duplicateConfig) { - //Generate N identical configs - org.deeplearning4j.nn.conf.layers.Layer l = c.layerSpace.getValue(values); - for (int i = 0; i < n; i++) { - layers.add(l.clone()); - } - } else { - throw new UnsupportedOperationException("Not yet implemented"); - } - } - - //Create MultiLayerConfiguration... 
- NeuralNetConfiguration.Builder builder = randomGlobalConf(values); - - NeuralNetConfiguration.ListBuilder listBuilder = builder.list(); - for (int i = 0; i < layers.size(); i++) { - listBuilder.layer(i, layers.get(i)); - } - - if (backpropType != null) - listBuilder.backpropType(backpropType.getValue(values)); - if (tbpttFwdLength != null) - listBuilder.tBPTTForwardLength(tbpttFwdLength.getValue(values)); - if (tbpttBwdLength != null) - listBuilder.tBPTTBackwardLength(tbpttBwdLength.getValue(values)); - if (inputType != null) - listBuilder.setInputType(inputType.getValue(values)); - if (inputPreProcessors != null) - listBuilder.setInputPreProcessors(inputPreProcessors.getValue(values)); - listBuilder.validateOutputLayerConfig(validateOutputLayerConfig); - - MultiLayerConfiguration configuration = listBuilder.build(); - - if (trainingWorkspaceMode != null) - configuration.setTrainingWorkspaceMode(trainingWorkspaceMode); - if (inferenceWorkspaceMode != null) - configuration.setInferenceWorkspaceMode(inferenceWorkspaceMode); - - - return new DL4JConfiguration(configuration, earlyStoppingConfiguration, numEpochs); - } - - @Override - public int numParameters() { - return numParameters; - } - - @Override - public List collectLeaves() { - List list = super.collectLeaves(); - for (LayerConf lc : layerSpaces) { - list.addAll(lc.numLayers.collectLeaves()); - list.addAll(lc.layerSpace.collectLeaves()); - } - if (inputType != null) - list.addAll(inputType.collectLeaves()); - if (inputPreProcessors != null) - list.addAll(inputPreProcessors.collectLeaves()); - return list; - } - - - @Override - public String toString() { - StringBuilder sb = new StringBuilder(super.toString()); - - int i = 0; - for (LayerConf conf : layerSpaces) { - - sb.append("Layer config ").append(i++).append(": (Number layers:").append(conf.numLayers) - .append(", duplicate: ").append(conf.duplicateConfig).append("), ") - .append(conf.layerSpace.toString()).append("\n"); - } - - if (inputType != null) - 
sb.append("inputType: ").append(inputType).append("\n"); - if (inputPreProcessors != null) - sb.append("inputPreProcessors: ").append(inputPreProcessors).append("\n"); - - if (earlyStoppingConfiguration != null) { - sb.append("Early stopping configuration:").append(earlyStoppingConfiguration.toString()).append("\n"); - } else { - sb.append("Training # epochs:").append(numEpochs).append("\n"); - } - - return sb.toString(); - } - - public LayerSpace<?> getLayerSpace(int layerNumber) { - return layerSpaces.get(layerNumber).getLayerSpace(); - } - - public static class Builder extends BaseNetworkSpace.Builder<Builder> { - protected List<LayerConf> layerSpaces = new ArrayList<>(); - protected ParameterSpace<InputType> inputType; - protected ParameterSpace<Map<Integer, InputPreProcessor>> inputPreProcessors; - protected WorkspaceMode trainingWorkspaceMode; - protected WorkspaceMode inferenceWorkspaceMode; - - //Early stopping configuration - protected EarlyStoppingConfiguration<MultiLayerNetwork> earlyStoppingConfiguration; - - - public Builder setInputType(InputType inputType) { - return setInputType(new FixedValue<>(inputType)); - } - - public Builder setInputType(ParameterSpace<InputType> inputType) { - this.inputType = inputType; - return this; - } - - public Builder layer(Layer layer){ - return layer(new FixedLayerSpace<>(layer)); - } - - public Builder layer(LayerSpace<?> layerSpace) { - return layer(layerSpace, new FixedValue<>(1)); - } - - public Builder layer(LayerSpace<?> layerSpace, ParameterSpace<Integer> numLayersDistribution) { - return addLayer(layerSpace, numLayersDistribution); - } - - - public Builder addLayer(LayerSpace<?> layerSpace) { - return addLayer(layerSpace, new FixedValue<>(1)); - } - - /** - * duplicateConfig is not supported: it will always be true - * @param layerSpace Layer space for the layer(s) to add - * @param numLayersDistribution Distribution for the number of layers to generate - * @param duplicateConfig Ignored; layer duplication is always enabled - * @return This builder - */ - @Deprecated - public Builder addLayer(LayerSpace<?> layerSpace, ParameterSpace<Integer> numLayersDistribution, boolean duplicateConfig) { - if (!duplicateConfig) throw new IllegalArgumentException("duplicateConfig = false is not supported"); - String layerName = "layer_" + layerSpaces.size(); - duplicateConfig = true; //hard coded to always duplicate layers - layerSpaces.add(new LayerConf(layerSpace, layerName, null, numLayersDistribution, duplicateConfig, null)); - return this; - } - - /** - * @param layerSpace Layer space for the layer(s) to add - * @param numLayersDistribution Distribution for the number of layers to generate - */ - public Builder addLayer(LayerSpace<?> layerSpace, ParameterSpace<Integer> numLayersDistribution) { - String layerName = "layer_" + layerSpaces.size(); - boolean duplicateConfig = true; //hard coded to always duplicate layers - layerSpaces.add(new LayerConf(layerSpace, layerName, null, numLayersDistribution, duplicateConfig, null)); - return this; - } - - /** - * Early stopping configuration (optional). Note that if both an EarlyStoppingConfiguration and a number of epochs are - * present, early stopping will be used in preference. 
- */ - public Builder earlyStoppingConfiguration( - EarlyStoppingConfiguration<MultiLayerNetwork> earlyStoppingConfiguration) { - this.earlyStoppingConfiguration = earlyStoppingConfiguration; - return this; - } - - /** - * @param inputPreProcessors Input preprocessors to set for the model - */ - public Builder setInputPreProcessors(Map<Integer, InputPreProcessor> inputPreProcessors) { - return setInputPreProcessors(new FixedValue<>(inputPreProcessors)); - } - - /** - * @param inputPreProcessors Input preprocessors to set for the model - */ - public Builder setInputPreProcessors(ParameterSpace<Map<Integer, InputPreProcessor>> inputPreProcessors) { - this.inputPreProcessors = inputPreProcessors; - return this; - } - - public Builder trainingWorkspaceMode(WorkspaceMode workspaceMode){ - this.trainingWorkspaceMode = workspaceMode; - return this; - } - - public Builder inferenceWorkspaceMode(WorkspaceMode workspaceMode){ - this.inferenceWorkspaceMode = workspaceMode; - return this; - } - - @SuppressWarnings("unchecked") - public MultiLayerSpace build() { - return new MultiLayerSpace(this); - } - } - - public static MultiLayerSpace fromJson(String json) { - try { - return JsonMapper.getMapper().readValue(json, MultiLayerSpace.class); - } catch (IOException e) { - throw new RuntimeException(e); - } - } - - public static MultiLayerSpace fromYaml(String yaml) { - try { - return YamlMapper.getMapper().readValue(yaml, MultiLayerSpace.class); - } catch (IOException e) { - throw new RuntimeException(e); - } - } -} diff --git a/contrib/attic/arbiter/arbiter-deeplearning4j/src/main/java/org/deeplearning4j/arbiter/adapter/ActivationParameterSpaceAdapter.java b/contrib/attic/arbiter/arbiter-deeplearning4j/src/main/java/org/deeplearning4j/arbiter/adapter/ActivationParameterSpaceAdapter.java deleted file mode 100644 index 0c59c3b44..000000000 --- a/contrib/attic/arbiter/arbiter-deeplearning4j/src/main/java/org/deeplearning4j/arbiter/adapter/ActivationParameterSpaceAdapter.java +++ /dev/null @@ -1,60 +0,0 @@ -/* - * 
****************************************************************************** - * * Copyright (c) 2021 Deeplearning4j Contributors - * * - * * This program and the accompanying materials are made available under the - * * terms of the Apache License, Version 2.0 which is available at - * * https://www.apache.org/licenses/LICENSE-2.0. - * * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * * License for the specific language governing permissions and limitations - * * under the License. - * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ - -package org.deeplearning4j.arbiter.adapter; - -import lombok.Data; -import lombok.EqualsAndHashCode; -import lombok.NoArgsConstructor; -import org.deeplearning4j.arbiter.optimize.api.ParameterSpace; -import org.deeplearning4j.arbiter.optimize.api.adapter.ParameterSpaceAdapter; -import org.nd4j.linalg.activations.Activation; -import org.nd4j.linalg.activations.IActivation; -import org.nd4j.shade.jackson.annotation.JsonProperty; - -/** - * A simple class to adapt a {@link Activation} parameter space to a {@link IActivation} parameter space - * - * @author Alex Black - */ -@Data -@NoArgsConstructor -@EqualsAndHashCode(callSuper = false) -public class ActivationParameterSpaceAdapter extends ParameterSpaceAdapter { - - private ParameterSpace activation; - - public ActivationParameterSpaceAdapter(@JsonProperty("activation") ParameterSpace activation) { - this.activation = activation; - } - - @Override - public IActivation convertValue(Activation from) { - return from.getActivationFunction(); - } - - @Override - protected ParameterSpace underlying() { - return activation; - } - - @Override - protected String underlyingName() { - return "activation"; - } -} diff --git 
a/contrib/attic/arbiter/arbiter-deeplearning4j/src/main/java/org/deeplearning4j/arbiter/adapter/LossFunctionParameterSpaceAdapter.java b/contrib/attic/arbiter/arbiter-deeplearning4j/src/main/java/org/deeplearning4j/arbiter/adapter/LossFunctionParameterSpaceAdapter.java deleted file mode 100644 index 40cb1d59c..000000000 --- a/contrib/attic/arbiter/arbiter-deeplearning4j/src/main/java/org/deeplearning4j/arbiter/adapter/LossFunctionParameterSpaceAdapter.java +++ /dev/null @@ -1,62 +0,0 @@ -/* - * ****************************************************************************** - * * Copyright (c) 2021 Deeplearning4j Contributors - * * - * * This program and the accompanying materials are made available under the - * * terms of the Apache License, Version 2.0 which is available at - * * https://www.apache.org/licenses/LICENSE-2.0. - * * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * * License for the specific language governing permissions and limitations - * * under the License. 
- * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ - -package org.deeplearning4j.arbiter.adapter; - -import lombok.Data; -import lombok.EqualsAndHashCode; -import lombok.NoArgsConstructor; -import org.deeplearning4j.arbiter.optimize.api.ParameterSpace; -import org.deeplearning4j.arbiter.optimize.api.adapter.ParameterSpaceAdapter; -import org.nd4j.linalg.lossfunctions.ILossFunction; -import org.nd4j.linalg.lossfunctions.LossFunctions; -import org.nd4j.shade.jackson.annotation.JsonProperty; - -/** - * A simple class to adapt a {@link LossFunctions.LossFunction} parameter space to a {@link ILossFunction} parameter space - * - * @author Alex Black - */ -@Data -@NoArgsConstructor -@EqualsAndHashCode(callSuper = false) -public class LossFunctionParameterSpaceAdapter - extends ParameterSpaceAdapter { - - private ParameterSpace lossFunction; - - public LossFunctionParameterSpaceAdapter( - @JsonProperty("lossFunction") ParameterSpace lossFunction) { - this.lossFunction = lossFunction; - } - - @Override - protected ILossFunction convertValue(LossFunctions.LossFunction from) { - return from.getILossFunction(); - } - - @Override - protected ParameterSpace underlying() { - return lossFunction; - } - - @Override - protected String underlyingName() { - return "lossFunction"; - } -} diff --git a/contrib/attic/arbiter/arbiter-deeplearning4j/src/main/java/org/deeplearning4j/arbiter/conf/dropout/DropoutSpace.java b/contrib/attic/arbiter/arbiter-deeplearning4j/src/main/java/org/deeplearning4j/arbiter/conf/dropout/DropoutSpace.java deleted file mode 100644 index fbd418fbc..000000000 --- a/contrib/attic/arbiter/arbiter-deeplearning4j/src/main/java/org/deeplearning4j/arbiter/conf/dropout/DropoutSpace.java +++ /dev/null @@ -1,65 +0,0 @@ -/* - * ****************************************************************************** - * * Copyright (c) 2021 Deeplearning4j Contributors - * * - * * This program and the 
accompanying materials are made available under the - * * terms of the Apache License, Version 2.0 which is available at - * * https://www.apache.org/licenses/LICENSE-2.0. - * * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * * License for the specific language governing permissions and limitations - * * under the License. - * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ - -package org.deeplearning4j.arbiter.conf.dropout; - -import lombok.AllArgsConstructor; -import lombok.NoArgsConstructor; -import org.deeplearning4j.arbiter.optimize.api.AbstractParameterSpace; -import org.deeplearning4j.arbiter.optimize.api.ParameterSpace; -import org.deeplearning4j.nn.conf.dropout.Dropout; -import org.deeplearning4j.nn.conf.dropout.IDropout; - -import java.util.List; - -@AllArgsConstructor -@NoArgsConstructor -public class DropoutSpace extends AbstractParameterSpace { - - private ParameterSpace dropout; - - @Override - public Dropout getValue(double[] parameterValues) { - double p = dropout.getValue(parameterValues); - if(p == 0){ - //Special case: 0 dropout = "disabled" in DL4J. But Dropout class doesn't support this - return null; - } - return new Dropout(p); - } - - @Override - public int numParameters() { - return dropout.numParameters(); - } - - @Override - public List collectLeaves() { - return dropout.collectLeaves(); - } - - @Override - public boolean isLeaf() { - return false; - } - - @Override - public void setIndices(int... 
indices) { - dropout.setIndices(indices); - } -} diff --git a/contrib/attic/arbiter/arbiter-deeplearning4j/src/main/java/org/deeplearning4j/arbiter/conf/updater/AdaGradSpace.java b/contrib/attic/arbiter/arbiter-deeplearning4j/src/main/java/org/deeplearning4j/arbiter/conf/updater/AdaGradSpace.java deleted file mode 100644 index 8d1c0f7b1..000000000 --- a/contrib/attic/arbiter/arbiter-deeplearning4j/src/main/java/org/deeplearning4j/arbiter/conf/updater/AdaGradSpace.java +++ /dev/null @@ -1,62 +0,0 @@ -/* - * ****************************************************************************** - * * Copyright (c) 2021 Deeplearning4j Contributors - * * - * * This program and the accompanying materials are made available under the - * * terms of the Apache License, Version 2.0 which is available at - * * https://www.apache.org/licenses/LICENSE-2.0. - * * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * * License for the specific language governing permissions and limitations - * * under the License. 
- * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ - -package org.deeplearning4j.arbiter.conf.updater; - -import lombok.Data; -import lombok.EqualsAndHashCode; -import org.deeplearning4j.arbiter.optimize.api.ParameterSpace; -import org.nd4j.linalg.learning.config.AdaGrad; -import org.nd4j.linalg.learning.config.IUpdater; -import org.nd4j.linalg.schedule.ISchedule; -import org.nd4j.shade.jackson.annotation.JsonProperty; - -@Data -@EqualsAndHashCode(callSuper = false) -public class AdaGradSpace extends BaseUpdaterSpace { - - private ParameterSpace learningRate; - private ParameterSpace lrSchedule; - - public AdaGradSpace(ParameterSpace learningRate) { - this(learningRate, null); - } - - public static AdaGradSpace withLR(ParameterSpace lr){ - return new AdaGradSpace(lr, null); - } - - public static AdaGradSpace withLRSchedule(ParameterSpace lrSchedule){ - return new AdaGradSpace(null, lrSchedule); - } - - protected AdaGradSpace(@JsonProperty("learningRate") ParameterSpace learningRate, - @JsonProperty("lrSchedule") ParameterSpace lrSchedule){ - this.learningRate = learningRate; - this.lrSchedule = lrSchedule; - } - - @Override - public IUpdater getValue(double[] parameterValues) { - if(lrSchedule != null){ - return new AdaGrad(lrSchedule.getValue(parameterValues)); - } else { - return new AdaGrad(learningRate.getValue(parameterValues)); - } - } -} diff --git a/contrib/attic/arbiter/arbiter-deeplearning4j/src/main/java/org/deeplearning4j/arbiter/conf/updater/AdaMaxSpace.java b/contrib/attic/arbiter/arbiter-deeplearning4j/src/main/java/org/deeplearning4j/arbiter/conf/updater/AdaMaxSpace.java deleted file mode 100644 index def1e96a5..000000000 --- a/contrib/attic/arbiter/arbiter-deeplearning4j/src/main/java/org/deeplearning4j/arbiter/conf/updater/AdaMaxSpace.java +++ /dev/null @@ -1,85 +0,0 @@ -/* - * ****************************************************************************** - * * 
Copyright (c) 2021 Deeplearning4j Contributors - * * - * * This program and the accompanying materials are made available under the - * * terms of the Apache License, Version 2.0 which is available at - * * https://www.apache.org/licenses/LICENSE-2.0. - * * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * * License for the specific language governing permissions and limitations - * * under the License. - * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ - -package org.deeplearning4j.arbiter.conf.updater; - -import lombok.Data; -import lombok.EqualsAndHashCode; -import org.deeplearning4j.arbiter.optimize.api.ParameterSpace; -import org.nd4j.linalg.learning.config.AdaMax; -import org.nd4j.linalg.learning.config.IUpdater; -import org.nd4j.linalg.schedule.ISchedule; -import org.nd4j.shade.jackson.annotation.JsonProperty; - -@Data -@EqualsAndHashCode(callSuper = false) -public class AdaMaxSpace extends BaseUpdaterSpace { - - private ParameterSpace learningRate; - private ParameterSpace learningRateSchedule; - private ParameterSpace beta1; - private ParameterSpace beta2; - private ParameterSpace epsilon; - - public AdaMaxSpace(ParameterSpace learningRate) { - this(learningRate, null, null, null); - } - - public AdaMaxSpace(ParameterSpace learningRate, ParameterSpace beta1, - ParameterSpace beta2, ParameterSpace epsilon) { - this(learningRate, null, beta1, beta2, epsilon); - } - - public AdaMaxSpace(@JsonProperty("learningRate") ParameterSpace learningRate, - @JsonProperty("learningRateSchedule") ParameterSpace learningRateSchedule, - @JsonProperty("beta1") ParameterSpace beta1, - @JsonProperty("beta2") ParameterSpace beta2, - @JsonProperty("epsilon") ParameterSpace epsilon){ - this.learningRate = 
learningRate; - this.learningRateSchedule = learningRateSchedule; - this.beta1 = beta1; - this.beta2 = beta2; - this.epsilon = epsilon; - } - - public static AdaMaxSpace withLR(ParameterSpace<Double> lr){ - return new AdaMaxSpace(lr, null, null, null, null); - } - - public static AdaMaxSpace withLRSchedule(ParameterSpace<ISchedule> lrSchedule){ - return new AdaMaxSpace(null, lrSchedule, null, null, null); - } - - @Override - public IUpdater getValue(double[] parameterValues) { - double lr = learningRate == null ? AdaMax.DEFAULT_ADAMAX_LEARNING_RATE : learningRate.getValue(parameterValues); - ISchedule lrS = learningRateSchedule == null ? null : learningRateSchedule.getValue(parameterValues); - //Fall back to the AdaMax beta/epsilon defaults, not the learning rate default - double b1 = beta1 == null ? AdaMax.DEFAULT_ADAMAX_BETA1_MEAN_DECAY : beta1.getValue(parameterValues); - double b2 = beta2 == null ? AdaMax.DEFAULT_ADAMAX_BETA2_VAR_DECAY : beta2.getValue(parameterValues); - double eps = epsilon == null ? AdaMax.DEFAULT_ADAMAX_EPSILON : epsilon.getValue(parameterValues); - if(lrS == null){ - return new AdaMax(lr, b1, b2, eps); - } else { - AdaMax a = new AdaMax(lrS); - a.setBeta1(b1); - a.setBeta2(b2); - a.setEpsilon(eps); - return a; - } - } -} diff --git a/contrib/attic/arbiter/arbiter-deeplearning4j/src/main/java/org/deeplearning4j/arbiter/conf/updater/AdamSpace.java b/contrib/attic/arbiter/arbiter-deeplearning4j/src/main/java/org/deeplearning4j/arbiter/conf/updater/AdamSpace.java deleted file mode 100644 index e502fd478..000000000 --- a/contrib/attic/arbiter/arbiter-deeplearning4j/src/main/java/org/deeplearning4j/arbiter/conf/updater/AdamSpace.java +++ /dev/null @@ -1,85 +0,0 @@ -/* - * ****************************************************************************** - * * Copyright (c) 2021 Deeplearning4j Contributors - * * - * * This program and the accompanying materials are made available under the - * * terms of the Apache License, Version 2.0 which is available at - * * https://www.apache.org/licenses/LICENSE-2.0. 
- * * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * * License for the specific language governing permissions and limitations - * * under the License. - * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ - -package org.deeplearning4j.arbiter.conf.updater; - -import lombok.Data; -import lombok.EqualsAndHashCode; -import org.deeplearning4j.arbiter.optimize.api.ParameterSpace; -import org.nd4j.linalg.learning.config.Adam; -import org.nd4j.linalg.learning.config.IUpdater; -import org.nd4j.linalg.schedule.ISchedule; -import org.nd4j.shade.jackson.annotation.JsonProperty; - -@Data -@EqualsAndHashCode(callSuper = false) -public class AdamSpace extends BaseUpdaterSpace { - - private ParameterSpace learningRate; - private ParameterSpace learningRateSchedule; - private ParameterSpace beta1; - private ParameterSpace beta2; - private ParameterSpace epsilon; - - public AdamSpace(ParameterSpace learningRate) { - this(learningRate, null, null, null); - } - - public AdamSpace(ParameterSpace learningRate, ParameterSpace beta1, - ParameterSpace beta2, ParameterSpace epsilon) { - this(learningRate, null, beta1, beta2, epsilon); - } - - public static AdamSpace withLR(ParameterSpace lr){ - return new AdamSpace(lr, null, null, null, null); - } - - public static AdamSpace withLRSchedule(ParameterSpace lrSchedule){ - return new AdamSpace(null, lrSchedule, null, null, null); - } - - protected AdamSpace(@JsonProperty("learningRate") ParameterSpace learningRate, - @JsonProperty("learningRateSchedule") ParameterSpace learningRateSchedule, - @JsonProperty("beta1") ParameterSpace beta1, - @JsonProperty("beta2") ParameterSpace beta2, - @JsonProperty("epsilon") ParameterSpace epsilon){ - this.learningRate = learningRate; - 
this.learningRateSchedule = learningRateSchedule; - this.beta1 = beta1; - this.beta2 = beta2; - this.epsilon = epsilon; - } - - @Override - public IUpdater getValue(double[] parameterValues) { - double lr = learningRate == null ? Adam.DEFAULT_ADAM_LEARNING_RATE : learningRate.getValue(parameterValues); - ISchedule lrS = learningRateSchedule == null ? null : learningRateSchedule.getValue(parameterValues); - //Fall back to the Adam beta/epsilon defaults, not the learning rate default - double b1 = beta1 == null ? Adam.DEFAULT_ADAM_BETA1_MEAN_DECAY : beta1.getValue(parameterValues); - double b2 = beta2 == null ? Adam.DEFAULT_ADAM_BETA2_VAR_DECAY : beta2.getValue(parameterValues); - double eps = epsilon == null ? Adam.DEFAULT_ADAM_EPSILON : epsilon.getValue(parameterValues); - if(lrS == null){ - return new Adam(lr, b1, b2, eps); - } else { - Adam a = new Adam(lrS); - a.setBeta1(b1); - a.setBeta2(b2); - a.setEpsilon(eps); - return a; - } - } -} diff --git a/contrib/attic/arbiter/arbiter-deeplearning4j/src/main/java/org/deeplearning4j/arbiter/conf/updater/BaseUpdaterSpace.java b/contrib/attic/arbiter/arbiter-deeplearning4j/src/main/java/org/deeplearning4j/arbiter/conf/updater/BaseUpdaterSpace.java deleted file mode 100644 index 4c7e506d3..000000000 --- a/contrib/attic/arbiter/arbiter-deeplearning4j/src/main/java/org/deeplearning4j/arbiter/conf/updater/BaseUpdaterSpace.java +++ /dev/null @@ -1,70 +0,0 @@ -/* - * ****************************************************************************** - * * Copyright (c) 2021 Deeplearning4j Contributors - * * - * * This program and the accompanying materials are made available under the - * * terms of the Apache License, Version 2.0 which is available at - * * https://www.apache.org/licenses/LICENSE-2.0. - * * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. 
See the - * * License for the specific language governing permissions and limitations - * * under the License. - * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ - -package org.deeplearning4j.arbiter.conf.updater; - -import lombok.Data; -import org.deeplearning4j.arbiter.optimize.api.AbstractParameterSpace; -import org.deeplearning4j.arbiter.optimize.api.ParameterSpace; -import org.nd4j.linalg.learning.config.IUpdater; - -import java.util.ArrayList; -import java.util.Arrays; -import java.util.List; -import java.util.Map; - -@Data -public abstract class BaseUpdaterSpace extends AbstractParameterSpace { - - @Override - public int numParameters() { - int count = 0; - for(ParameterSpace p : collectLeaves()){ - count += p.numParameters(); - } - return count; - } - - @Override - public List collectLeaves() { - Map nested = getNestedSpaces(); - List out = new ArrayList<>(); - for(ParameterSpace p : nested.values()){ - out.addAll(p.collectLeaves()); - } - return out; - } - - @Override - public boolean isLeaf() { - return false; - } - - @Override - public void setIndices(int... 
indices){ - int soFar = 0; - for(ParameterSpace p : collectLeaves()){ - int numParams = p.numParameters(); - if(numParams <= 0){ - continue; - } - int[] subset = Arrays.copyOfRange(indices, soFar, soFar + numParams); - p.setIndices(subset); - soFar += numParams; //Advance past the indices consumed by this subspace - } - } -} diff --git a/contrib/attic/arbiter/arbiter-deeplearning4j/src/main/java/org/deeplearning4j/arbiter/conf/updater/NadamSpace.java b/contrib/attic/arbiter/arbiter-deeplearning4j/src/main/java/org/deeplearning4j/arbiter/conf/updater/NadamSpace.java deleted file mode 100644 index 27877f4e0..000000000 --- a/contrib/attic/arbiter/arbiter-deeplearning4j/src/main/java/org/deeplearning4j/arbiter/conf/updater/NadamSpace.java +++ /dev/null @@ -1,85 +0,0 @@ -/* - * ****************************************************************************** - * * Copyright (c) 2021 Deeplearning4j Contributors - * * - * * This program and the accompanying materials are made available under the - * * terms of the Apache License, Version 2.0 which is available at - * * https://www.apache.org/licenses/LICENSE-2.0. - * * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * * License for the specific language governing permissions and limitations - * * under the License. 
- * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ - -package org.deeplearning4j.arbiter.conf.updater; - -import lombok.Data; -import lombok.EqualsAndHashCode; -import org.deeplearning4j.arbiter.optimize.api.ParameterSpace; -import org.nd4j.linalg.learning.config.Nadam; -import org.nd4j.linalg.learning.config.IUpdater; -import org.nd4j.linalg.schedule.ISchedule; -import org.nd4j.shade.jackson.annotation.JsonProperty; - -@Data -@EqualsAndHashCode(callSuper = false) -public class NadamSpace extends BaseUpdaterSpace { - - private ParameterSpace learningRate; - private ParameterSpace learningRateSchedule; - private ParameterSpace beta1; - private ParameterSpace beta2; - private ParameterSpace epsilon; - - public NadamSpace(ParameterSpace learningRate) { - this(learningRate, null, null, null); - } - - public NadamSpace(ParameterSpace learningRate, ParameterSpace beta1, - ParameterSpace beta2, ParameterSpace epsilon) { - this(learningRate, null, beta1, beta2, epsilon); - } - - public NadamSpace(@JsonProperty("learningRate") ParameterSpace learningRate, - @JsonProperty("learningRateSchedule") ParameterSpace learningRateSchedule, - @JsonProperty("beta1") ParameterSpace beta1, - @JsonProperty("beta2") ParameterSpace beta2, - @JsonProperty("epsilon") ParameterSpace epsilon){ - this.learningRate = learningRate; - this.learningRateSchedule = learningRateSchedule; - this.beta1 = beta1; - this.beta2 = beta2; - this.epsilon = epsilon; - } - - public static NadamSpace withLR(ParameterSpace lr){ - return new NadamSpace(lr, null, null, null, null); - } - - public static NadamSpace withLRSchedule(ParameterSpace lrSchedule){ - return new NadamSpace(null, lrSchedule, null, null, null); - } - - @Override - public IUpdater getValue(double[] parameterValues) { - double lr = learningRate == null ? 
Nadam.DEFAULT_NADAM_LEARNING_RATE : learningRate.getValue(parameterValues); - ISchedule lrS = learningRateSchedule == null ? null : learningRateSchedule.getValue(parameterValues); - //Fall back to the Nadam beta/epsilon defaults, not the learning rate default - double b1 = beta1 == null ? Nadam.DEFAULT_NADAM_BETA1_MEAN_DECAY : beta1.getValue(parameterValues); - double b2 = beta2 == null ? Nadam.DEFAULT_NADAM_BETA2_VAR_DECAY : beta2.getValue(parameterValues); - double eps = epsilon == null ? Nadam.DEFAULT_NADAM_EPSILON : epsilon.getValue(parameterValues); - if(lrS == null){ - return new Nadam(lr, b1, b2, eps); - } else { - Nadam a = new Nadam(lrS); - a.setBeta1(b1); - a.setBeta2(b2); - a.setEpsilon(eps); - return a; - } - } -} diff --git a/contrib/attic/arbiter/arbiter-deeplearning4j/src/main/java/org/deeplearning4j/arbiter/conf/updater/NesterovsSpace.java b/contrib/attic/arbiter/arbiter-deeplearning4j/src/main/java/org/deeplearning4j/arbiter/conf/updater/NesterovsSpace.java deleted file mode 100644 index 9580e8b18..000000000 --- a/contrib/attic/arbiter/arbiter-deeplearning4j/src/main/java/org/deeplearning4j/arbiter/conf/updater/NesterovsSpace.java +++ /dev/null @@ -1,102 +0,0 @@ -/* - * ****************************************************************************** - * * Copyright (c) 2021 Deeplearning4j Contributors - * * - * * This program and the accompanying materials are made available under the - * * terms of the Apache License, Version 2.0 which is available at - * * https://www.apache.org/licenses/LICENSE-2.0. - * * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * * License for the specific language governing permissions and limitations - * * under the License. 
- * *
- * * SPDX-License-Identifier: Apache-2.0
- * *****************************************************************************
- */
-
-package org.deeplearning4j.arbiter.conf.updater;
-
-import lombok.Data;
-import lombok.EqualsAndHashCode;
-import org.deeplearning4j.arbiter.optimize.api.ParameterSpace;
-import org.deeplearning4j.arbiter.optimize.parameter.FixedValue;
-import org.nd4j.linalg.learning.config.IUpdater;
-import org.nd4j.linalg.learning.config.Nesterovs;
-import org.nd4j.linalg.schedule.ISchedule;
-import org.nd4j.shade.jackson.annotation.JsonProperty;
-
-@Data
-@EqualsAndHashCode(callSuper = false)
-public class NesterovsSpace extends BaseUpdaterSpace {
-
-    protected ParameterSpace<Double> learningRate;
-    protected ParameterSpace<ISchedule> learningRateSchedule;
-    protected ParameterSpace<Double> momentum;
-    protected ParameterSpace<ISchedule> momentumSchedule;
-
-    public NesterovsSpace(ParameterSpace<Double> learningRate) {
-        this(learningRate, null);
-    }
-
-    public NesterovsSpace(ParameterSpace<Double> learningRate, ParameterSpace<Double> momentum) {
-        this(learningRate, null, momentum, null);
-    }
-
-    public NesterovsSpace(@JsonProperty("learningRate") ParameterSpace<Double> learningRate,
-                          @JsonProperty("learningRateSchedule") ParameterSpace<ISchedule> learningRateSchedule,
-                          @JsonProperty("momentum") ParameterSpace<Double> momentum,
-                          @JsonProperty("momentumSchedule") ParameterSpace<ISchedule> momentumSchedule) {
-        this.learningRate = learningRate;
-        this.learningRateSchedule = learningRateSchedule;
-        this.momentum = momentum;
-        this.momentumSchedule = momentumSchedule;
-    }
-
-    public static NesterovsSpace withLR(ParameterSpace<Double> lr){
-        return new NesterovsSpace(lr, null, null, null);
-    }
-
-    public static NesterovsSpace withLR(ParameterSpace<Double> lr, double momentum){
-        return new NesterovsSpace(lr, null, new FixedValue<>(momentum), null);
-    }
-
-    public static NesterovsSpace withLR(ParameterSpace<Double> lr, ParameterSpace<Double> momentum){
-        return new NesterovsSpace(lr, null, momentum, null);
-    }
-
-    public static NesterovsSpace withLRSchedule(ParameterSpace<ISchedule> lrSchedule){
-        return new NesterovsSpace(null, lrSchedule, null, null);
-    }
-
-    public static NesterovsSpace withLRSchedule(ParameterSpace<ISchedule> lrSchedule, double momentum){
-        return new NesterovsSpace(null, lrSchedule, new FixedValue<>(momentum), null);
-    }
-
-    public static NesterovsSpace withLRSchedule(ParameterSpace<ISchedule> lrSchedule, ParameterSpace<Double> momentum){
-        return new NesterovsSpace(null, lrSchedule, momentum, null);
-    }
-
-
-    @Override
-    public IUpdater getValue(double[] parameterValues) {
-        double lr = learningRate == null ? Nesterovs.DEFAULT_NESTEROV_LEARNING_RATE : learningRate.getValue(parameterValues);
-        ISchedule lrS = learningRateSchedule == null ? null : learningRateSchedule.getValue(parameterValues);
-        double m = momentum == null ? Nesterovs.DEFAULT_NESTEROV_MOMENTUM : momentum.getValue(parameterValues);
-        ISchedule mS = momentumSchedule == null ? null : momentumSchedule.getValue(parameterValues);
-        if(lrS == null){
-            if(momentumSchedule == null){
-                return new Nesterovs(lr, m);
-            } else {
-                return new Nesterovs(lr, mS);
-            }
-        } else {
-            if(momentumSchedule == null){
-                return new Nesterovs(lrS, m);
-            } else {
-                return new Nesterovs(lrS, mS);
-            }
-        }
-    }
-}
diff --git a/contrib/attic/arbiter/arbiter-deeplearning4j/src/main/java/org/deeplearning4j/arbiter/conf/updater/RmsPropSpace.java b/contrib/attic/arbiter/arbiter-deeplearning4j/src/main/java/org/deeplearning4j/arbiter/conf/updater/RmsPropSpace.java
deleted file mode 100644
index f36a97edb..000000000
--- a/contrib/attic/arbiter/arbiter-deeplearning4j/src/main/java/org/deeplearning4j/arbiter/conf/updater/RmsPropSpace.java
+++ /dev/null
@@ -1,56 +0,0 @@
-/*
- * ******************************************************************************
- * * Copyright (c) 2021 Deeplearning4j Contributors
- * *
- * * This program and the accompanying materials are made available under the
- * * terms of the Apache License, Version 2.0 which is available at
- * * https://www.apache.org/licenses/LICENSE-2.0.
- * *
- * * Unless required by applicable law or agreed to in writing, software
- * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
- * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
- * * License for the specific language governing permissions and limitations
- * * under the License.
- * *
- * * SPDX-License-Identifier: Apache-2.0
- * *****************************************************************************
- */
-
-package org.deeplearning4j.arbiter.conf.updater;
-
-import lombok.Data;
-import lombok.EqualsAndHashCode;
-import org.deeplearning4j.arbiter.optimize.api.ParameterSpace;
-import org.nd4j.linalg.learning.config.IUpdater;
-import org.nd4j.linalg.learning.config.RmsProp;
-import org.nd4j.linalg.schedule.ISchedule;
-import org.nd4j.shade.jackson.annotation.JsonProperty;
-
-@Data
-@EqualsAndHashCode(callSuper = false)
-public class RmsPropSpace extends BaseUpdaterSpace {
-
-    protected ParameterSpace<Double> learningRate;
-    protected ParameterSpace<ISchedule> learningRateSchedule;
-
-    public RmsPropSpace(ParameterSpace<Double> learningRate) {
-        this(learningRate, null);
-    }
-
-    public RmsPropSpace(@JsonProperty("learningRate") ParameterSpace<Double> learningRate,
-                        @JsonProperty("learningRateSchedule") ParameterSpace<ISchedule> learningRateSchedule){
-        this.learningRate = learningRate;
-        this.learningRateSchedule = learningRateSchedule;
-    }
-
-    @Override
-    public IUpdater getValue(double[] parameterValues) {
-        double lr = learningRate == null ? RmsProp.DEFAULT_RMSPROP_LEARNING_RATE : learningRate.getValue(parameterValues);
-        ISchedule lrS = learningRateSchedule == null ? null : learningRateSchedule.getValue(parameterValues);
-        if(lrS == null){
-            return new RmsProp(lr);
-        } else {
-            return new RmsProp(lrS);
-        }
-    }
-}
diff --git a/contrib/attic/arbiter/arbiter-deeplearning4j/src/main/java/org/deeplearning4j/arbiter/conf/updater/SgdSpace.java b/contrib/attic/arbiter/arbiter-deeplearning4j/src/main/java/org/deeplearning4j/arbiter/conf/updater/SgdSpace.java
deleted file mode 100644
index 3ea4ff63c..000000000
--- a/contrib/attic/arbiter/arbiter-deeplearning4j/src/main/java/org/deeplearning4j/arbiter/conf/updater/SgdSpace.java
+++ /dev/null
@@ -1,56 +0,0 @@
-/*
- * ******************************************************************************
- * * Copyright (c) 2021 Deeplearning4j Contributors
- * *
- * * This program and the accompanying materials are made available under the
- * * terms of the Apache License, Version 2.0 which is available at
- * * https://www.apache.org/licenses/LICENSE-2.0.
- * *
- * * Unless required by applicable law or agreed to in writing, software
- * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
- * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
- * * License for the specific language governing permissions and limitations
- * * under the License.
- * *
- * * SPDX-License-Identifier: Apache-2.0
- * *****************************************************************************
- */
-
-package org.deeplearning4j.arbiter.conf.updater;
-
-import lombok.Data;
-import lombok.EqualsAndHashCode;
-import org.deeplearning4j.arbiter.optimize.api.ParameterSpace;
-import org.nd4j.linalg.learning.config.IUpdater;
-import org.nd4j.linalg.learning.config.Sgd;
-import org.nd4j.linalg.schedule.ISchedule;
-import org.nd4j.shade.jackson.annotation.JsonProperty;
-
-@Data
-@EqualsAndHashCode(callSuper = false)
-public class SgdSpace extends BaseUpdaterSpace {
-
-    protected ParameterSpace<Double> learningRate;
-    protected ParameterSpace<ISchedule> learningRateSchedule;
-
-    public SgdSpace(ParameterSpace<Double> learningRate) {
-        this(learningRate, null);
-    }
-
-    public SgdSpace(@JsonProperty("learningRate") ParameterSpace<Double> learningRate,
-                    @JsonProperty("learningRateSchedule") ParameterSpace<ISchedule> learningRateSchedule){
-        this.learningRate = learningRate;
-        this.learningRateSchedule = learningRateSchedule;
-    }
-
-    @Override
-    public IUpdater getValue(double[] parameterValues) {
-        double lr = learningRate == null ? Sgd.DEFAULT_SGD_LR : learningRate.getValue(parameterValues);
-        ISchedule lrS = learningRateSchedule == null ? null : learningRateSchedule.getValue(parameterValues);
-        if(lrS == null){
-            return new Sgd(lr);
-        } else {
-            return new Sgd(lrS);
-        }
-    }
-}
diff --git a/contrib/attic/arbiter/arbiter-deeplearning4j/src/main/java/org/deeplearning4j/arbiter/conf/updater/schedule/ExponentialScheduleSpace.java b/contrib/attic/arbiter/arbiter-deeplearning4j/src/main/java/org/deeplearning4j/arbiter/conf/updater/schedule/ExponentialScheduleSpace.java
deleted file mode 100644
index 418ed6dfd..000000000
--- a/contrib/attic/arbiter/arbiter-deeplearning4j/src/main/java/org/deeplearning4j/arbiter/conf/updater/schedule/ExponentialScheduleSpace.java
+++ /dev/null
@@ -1,94 +0,0 @@
-/*
- * ******************************************************************************
- * * Copyright (c) 2021 Deeplearning4j Contributors
- * *
- * * This program and the accompanying materials are made available under the
- * * terms of the Apache License, Version 2.0 which is available at
- * * https://www.apache.org/licenses/LICENSE-2.0.
- * *
- * * Unless required by applicable law or agreed to in writing, software
- * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
- * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
- * * License for the specific language governing permissions and limitations
- * * under the License.
- * *
- * * SPDX-License-Identifier: Apache-2.0
- * *****************************************************************************
- */
-
-package org.deeplearning4j.arbiter.conf.updater.schedule;
-
-import lombok.Data;
-import lombok.NoArgsConstructor;
-import lombok.NonNull;
-import org.deeplearning4j.arbiter.optimize.api.ParameterSpace;
-import org.deeplearning4j.arbiter.optimize.parameter.FixedValue;
-import org.nd4j.linalg.schedule.ExponentialSchedule;
-import org.nd4j.linalg.schedule.ISchedule;
-import org.nd4j.linalg.schedule.ScheduleType;
-import org.nd4j.shade.jackson.annotation.JsonProperty;
-
-import java.util.*;
-
-@NoArgsConstructor //JSON
-@Data
-public class ExponentialScheduleSpace implements ParameterSpace<ISchedule> {
-
-    private ScheduleType scheduleType;
-    private ParameterSpace<Double> initialValue;
-    private ParameterSpace<Double> gamma;
-
-    public ExponentialScheduleSpace(@NonNull ScheduleType scheduleType,
-                                    @NonNull ParameterSpace<Double> initialValue, double gamma){
-        this(scheduleType, initialValue, new FixedValue<>(gamma));
-    }
-
-    public ExponentialScheduleSpace(@NonNull @JsonProperty("scheduleType") ScheduleType scheduleType,
-                                    @NonNull @JsonProperty("initialValue") ParameterSpace<Double> initialValue,
-                                    @NonNull @JsonProperty("gamma") ParameterSpace<Double> gamma){
-        this.scheduleType = scheduleType;
-        this.initialValue = initialValue;
-        this.gamma = gamma;
-    }
-
-    @Override
-    public ISchedule getValue(double[] parameterValues) {
-        return new ExponentialSchedule(scheduleType, initialValue.getValue(parameterValues), gamma.getValue(parameterValues));
-    }
-
-    @Override
-    public int numParameters() {
-        return initialValue.numParameters() + gamma.numParameters();
-    }
-
-    @Override
-    public List<ParameterSpace> collectLeaves() {
-        return Arrays.asList(initialValue, gamma);
-    }
-
-    @Override
-    public Map<String, ParameterSpace> getNestedSpaces() {
-        Map<String, ParameterSpace> out = new LinkedHashMap<>();
-        out.put("initialValue", initialValue);
-        out.put("gamma", gamma);
-        return out;
-    }
-
-    @Override
-    public boolean isLeaf() {
-        return false;
-    }
-
-    @Override
-    public void setIndices(int... indices) {
-        if(initialValue.numParameters() > 0){
-            int[] sub = Arrays.copyOfRange(indices, 0, initialValue.numParameters());
-            initialValue.setIndices(sub);
-        }
-        if(gamma.numParameters() > 0){
-            int inp = initialValue.numParameters();
-            int[] sub = Arrays.copyOfRange(indices, inp, inp + gamma.numParameters());
-            gamma.setIndices(sub);
-        }
-    }
-}
diff --git a/contrib/attic/arbiter/arbiter-deeplearning4j/src/main/java/org/deeplearning4j/arbiter/conf/updater/schedule/InverseScheduleSpace.java b/contrib/attic/arbiter/arbiter-deeplearning4j/src/main/java/org/deeplearning4j/arbiter/conf/updater/schedule/InverseScheduleSpace.java
deleted file mode 100644
index 077208eee..000000000
--- a/contrib/attic/arbiter/arbiter-deeplearning4j/src/main/java/org/deeplearning4j/arbiter/conf/updater/schedule/InverseScheduleSpace.java
+++ /dev/null
@@ -1,107 +0,0 @@
-/*
- * ******************************************************************************
- * * Copyright (c) 2021 Deeplearning4j Contributors
- * *
- * * This program and the accompanying materials are made available under the
- * * terms of the Apache License, Version 2.0 which is available at
- * * https://www.apache.org/licenses/LICENSE-2.0.
- * *
- * * Unless required by applicable law or agreed to in writing, software
- * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
- * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
- * * License for the specific language governing permissions and limitations
- * * under the License.
- * *
- * * SPDX-License-Identifier: Apache-2.0
- * *****************************************************************************
- */
-
-package org.deeplearning4j.arbiter.conf.updater.schedule;
-
-import lombok.Data;
-import lombok.NoArgsConstructor;
-import lombok.NonNull;
-import org.deeplearning4j.arbiter.optimize.api.ParameterSpace;
-import org.deeplearning4j.arbiter.optimize.parameter.FixedValue;
-import org.nd4j.linalg.schedule.ISchedule;
-import org.nd4j.linalg.schedule.InverseSchedule;
-import org.nd4j.linalg.schedule.ScheduleType;
-import org.nd4j.shade.jackson.annotation.JsonProperty;
-
-import java.util.Arrays;
-import java.util.LinkedHashMap;
-import java.util.List;
-import java.util.Map;
-
-@NoArgsConstructor //JSON
-@Data
-public class InverseScheduleSpace implements ParameterSpace<ISchedule> {
-
-    private ScheduleType scheduleType;
-    private ParameterSpace<Double> initialValue;
-    private ParameterSpace<Double> gamma;
-    private ParameterSpace<Double> power;
-
-    public InverseScheduleSpace(@NonNull ScheduleType scheduleType, @NonNull ParameterSpace<Double> initialValue,
-                                double gamma, double power){
-        this(scheduleType, initialValue, new FixedValue<>(gamma), new FixedValue<>(power));
-    }
-
-    public InverseScheduleSpace(@NonNull @JsonProperty("scheduleType") ScheduleType scheduleType,
-                                @NonNull @JsonProperty("initialValue") ParameterSpace<Double> initialValue,
-                                @NonNull @JsonProperty("gamma") ParameterSpace<Double> gamma,
-                                @NonNull @JsonProperty("power") ParameterSpace<Double> power){
-        this.scheduleType = scheduleType;
-        this.initialValue = initialValue;
-        this.gamma = gamma;
-        this.power = power;
-    }
-
-    @Override
-    public ISchedule getValue(double[] parameterValues) {
-        return new InverseSchedule(scheduleType, initialValue.getValue(parameterValues),
-                        gamma.getValue(parameterValues), power.getValue(parameterValues));
-    }
-
-    @Override
-    public int numParameters() {
-        return initialValue.numParameters() + gamma.numParameters() + power.numParameters();
-    }
-
-    @Override
-    public List<ParameterSpace> collectLeaves() {
-        return Arrays.asList(initialValue, gamma, power);
-    }
-
-    @Override
-    public Map<String, ParameterSpace> getNestedSpaces() {
-        Map<String, ParameterSpace> out = new LinkedHashMap<>();
-        out.put("initialValue", initialValue);
-        out.put("gamma", gamma);
-        out.put("power", power);
-        return out;
-    }
-
-    @Override
-    public boolean isLeaf() {
-        return false;
-    }
-
-    @Override
-    public void setIndices(int... indices) {
-        if(initialValue.numParameters() > 0){
-            int[] sub = Arrays.copyOfRange(indices, 0, initialValue.numParameters());
-            initialValue.setIndices(sub);
-        }
-        if(gamma.numParameters() > 0){
-            int inp = initialValue.numParameters();
-            int[] sub = Arrays.copyOfRange(indices, inp, inp + gamma.numParameters());
-            gamma.setIndices(sub);
-        }
-        if(power.numParameters() > 0){
-            int np = initialValue.numParameters() + gamma.numParameters();
-            int[] sub = Arrays.copyOfRange(indices, np, np + power.numParameters());
-            power.setIndices(sub);
-        }
-    }
-}
diff --git a/contrib/attic/arbiter/arbiter-deeplearning4j/src/main/java/org/deeplearning4j/arbiter/conf/updater/schedule/PolyScheduleSpace.java b/contrib/attic/arbiter/arbiter-deeplearning4j/src/main/java/org/deeplearning4j/arbiter/conf/updater/schedule/PolyScheduleSpace.java
deleted file mode 100644
index 886ff60fa..000000000
--- a/contrib/attic/arbiter/arbiter-deeplearning4j/src/main/java/org/deeplearning4j/arbiter/conf/updater/schedule/PolyScheduleSpace.java
+++ /dev/null
@@ -1,107 +0,0 @@
-/*
- * ******************************************************************************
- * * Copyright (c) 2021 Deeplearning4j Contributors
- * *
- * * This program and the accompanying materials are made available under the
- * * terms of the Apache License, Version 2.0 which is available at
- * * https://www.apache.org/licenses/LICENSE-2.0.
- * *
- * * Unless required by applicable law or agreed to in writing, software
- * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
- * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
- * * License for the specific language governing permissions and limitations
- * * under the License.
- * *
- * * SPDX-License-Identifier: Apache-2.0
- * *****************************************************************************
- */
-
-package org.deeplearning4j.arbiter.conf.updater.schedule;
-
-import lombok.Data;
-import lombok.NoArgsConstructor;
-import lombok.NonNull;
-import org.deeplearning4j.arbiter.optimize.api.ParameterSpace;
-import org.deeplearning4j.arbiter.optimize.parameter.FixedValue;
-import org.nd4j.linalg.schedule.ISchedule;
-import org.nd4j.linalg.schedule.PolySchedule;
-import org.nd4j.linalg.schedule.ScheduleType;
-import org.nd4j.shade.jackson.annotation.JsonProperty;
-
-import java.util.Arrays;
-import java.util.LinkedHashMap;
-import java.util.List;
-import java.util.Map;
-
-@NoArgsConstructor //JSON
-@Data
-public class PolyScheduleSpace implements ParameterSpace<ISchedule> {
-
-    private ScheduleType scheduleType;
-    private ParameterSpace<Double> initialValue;
-    private ParameterSpace<Double> power;
-    private ParameterSpace<Integer> maxIter;
-
-    public PolyScheduleSpace(@NonNull ScheduleType scheduleType, @NonNull ParameterSpace<Double> initialValue,
-                             double power, int maxIter){
-        this(scheduleType, initialValue, new FixedValue<>(power), new FixedValue<>(maxIter));
-    }
-
-    public PolyScheduleSpace(@NonNull @JsonProperty("scheduleType") ScheduleType scheduleType,
-                             @NonNull @JsonProperty("initialValue") ParameterSpace<Double> initialValue,
-                             @NonNull @JsonProperty("power") ParameterSpace<Double> power,
-                             @NonNull @JsonProperty("maxIter") ParameterSpace<Integer> maxIter){
-        this.scheduleType = scheduleType;
-        this.initialValue = initialValue;
-        this.power = power;
-        this.maxIter = maxIter;
-    }
-
-    @Override
-    public ISchedule getValue(double[] parameterValues) {
-        return new PolySchedule(scheduleType, initialValue.getValue(parameterValues),
-                        power.getValue(parameterValues), maxIter.getValue(parameterValues));
-    }
-
-    @Override
-    public int numParameters() {
-        return initialValue.numParameters() + power.numParameters() + maxIter.numParameters();
-    }
-
-    @Override
-    public List<ParameterSpace> collectLeaves() {
-        return Arrays.asList(initialValue, power, maxIter);
-    }
-
-    @Override
-    public Map<String, ParameterSpace> getNestedSpaces() {
-        Map<String, ParameterSpace> out = new LinkedHashMap<>();
-        out.put("initialValue", initialValue);
-        out.put("power", power);
-        out.put("maxIter", maxIter);
-        return out;
-    }
-
-    @Override
-    public boolean isLeaf() {
-        return false;
-    }
-
-    @Override
-    public void setIndices(int... indices) {
-        if(initialValue.numParameters() > 0){
-            int[] sub = Arrays.copyOfRange(indices, 0, initialValue.numParameters());
-            initialValue.setIndices(sub);
-        }
-        if(power.numParameters() > 0){
-            int np = initialValue.numParameters();
-            int[] sub = Arrays.copyOfRange(indices, np, np + power.numParameters());
-            power.setIndices(sub);
-        }
-        if(maxIter.numParameters() > 0){
-            int np = initialValue.numParameters() + power.numParameters();
-            int[] sub = Arrays.copyOfRange(indices, np, np + maxIter.numParameters());
-            maxIter.setIndices(sub);
-        }
-    }
-}
diff --git a/contrib/attic/arbiter/arbiter-deeplearning4j/src/main/java/org/deeplearning4j/arbiter/conf/updater/schedule/SigmoidScheduleSpace.java b/contrib/attic/arbiter/arbiter-deeplearning4j/src/main/java/org/deeplearning4j/arbiter/conf/updater/schedule/SigmoidScheduleSpace.java
deleted file mode 100644
index 9476125eb..000000000
--- a/contrib/attic/arbiter/arbiter-deeplearning4j/src/main/java/org/deeplearning4j/arbiter/conf/updater/schedule/SigmoidScheduleSpace.java
+++ /dev/null
@@ -1,107 +0,0 @@
-/*
- * ******************************************************************************
- * * Copyright (c) 2021 Deeplearning4j Contributors
- * *
- * * This program and the accompanying materials are made available under the
- * * terms of the Apache License, Version 2.0 which is available at
- * * https://www.apache.org/licenses/LICENSE-2.0.
- * *
- * * Unless required by applicable law or agreed to in writing, software
- * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
- * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
- * * License for the specific language governing permissions and limitations
- * * under the License.
- * *
- * * SPDX-License-Identifier: Apache-2.0
- * *****************************************************************************
- */
-
-package org.deeplearning4j.arbiter.conf.updater.schedule;
-
-import lombok.Data;
-import lombok.NoArgsConstructor;
-import lombok.NonNull;
-import org.deeplearning4j.arbiter.optimize.api.ParameterSpace;
-import org.deeplearning4j.arbiter.optimize.parameter.FixedValue;
-import org.nd4j.linalg.schedule.ISchedule;
-import org.nd4j.linalg.schedule.ScheduleType;
-import org.nd4j.linalg.schedule.SigmoidSchedule;
-import org.nd4j.shade.jackson.annotation.JsonProperty;
-
-import java.util.Arrays;
-import java.util.LinkedHashMap;
-import java.util.List;
-import java.util.Map;
-
-@NoArgsConstructor //JSON
-@Data
-public class SigmoidScheduleSpace implements ParameterSpace<ISchedule> {
-
-    private ScheduleType scheduleType;
-    private ParameterSpace<Double> initialValue;
-    private ParameterSpace<Double> gamma;
-    private ParameterSpace<Integer> stepSize;
-
-    public SigmoidScheduleSpace(@NonNull ScheduleType scheduleType, @NonNull ParameterSpace<Double> initialValue,
-                                double gamma, int stepSize){
-        this(scheduleType, initialValue, new FixedValue<>(gamma), new FixedValue<>(stepSize));
-    }
-
-    public SigmoidScheduleSpace(@NonNull @JsonProperty("scheduleType") ScheduleType scheduleType,
-                                @NonNull @JsonProperty("initialValue") ParameterSpace<Double> initialValue,
-                                @NonNull @JsonProperty("gamma") ParameterSpace<Double> gamma,
-                                @NonNull @JsonProperty("stepSize") ParameterSpace<Integer> stepSize){
-        this.scheduleType = scheduleType;
-        this.initialValue = initialValue;
-        this.gamma = gamma;
-        this.stepSize = stepSize;
-    }
-
-    @Override
-    public ISchedule getValue(double[] parameterValues) {
-        return new SigmoidSchedule(scheduleType, initialValue.getValue(parameterValues),
-                        gamma.getValue(parameterValues), stepSize.getValue(parameterValues));
-    }
-
-    @Override
-    public int numParameters() {
-        return initialValue.numParameters() + gamma.numParameters() + stepSize.numParameters();
-    }
-
-    @Override
-    public List<ParameterSpace> collectLeaves() {
-        return Arrays.asList(initialValue, gamma, stepSize);
-    }
-
-    @Override
-    public Map<String, ParameterSpace> getNestedSpaces() {
-        Map<String, ParameterSpace> out = new LinkedHashMap<>();
-        out.put("initialValue", initialValue);
-        out.put("gamma", gamma);
-        out.put("stepSize", stepSize);
-        return out;
-    }
-
-    @Override
-    public boolean isLeaf() {
-        return false;
-    }
-
-    @Override
-    public void setIndices(int... indices) {
-        if(initialValue.numParameters() > 0){
-            int[] sub = Arrays.copyOfRange(indices, 0, initialValue.numParameters());
-            initialValue.setIndices(sub);
-        }
-        if(gamma.numParameters() > 0){
-            int np = initialValue.numParameters();
-            int[] sub = Arrays.copyOfRange(indices, np, np + gamma.numParameters());
-            gamma.setIndices(sub);
-        }
-        if(stepSize.numParameters() > 0){
-            int np = initialValue.numParameters() + gamma.numParameters();
-            int[] sub = Arrays.copyOfRange(indices, np, np + stepSize.numParameters());
-            stepSize.setIndices(sub);
-        }
-    }
-}
diff --git a/contrib/attic/arbiter/arbiter-deeplearning4j/src/main/java/org/deeplearning4j/arbiter/conf/updater/schedule/StepScheduleSpace.java b/contrib/attic/arbiter/arbiter-deeplearning4j/src/main/java/org/deeplearning4j/arbiter/conf/updater/schedule/StepScheduleSpace.java
deleted file mode 100644
index 925d1fc5e..000000000
--- a/contrib/attic/arbiter/arbiter-deeplearning4j/src/main/java/org/deeplearning4j/arbiter/conf/updater/schedule/StepScheduleSpace.java
+++ /dev/null
@@ -1,107 +0,0 @@
-/*
- * ******************************************************************************
- * * Copyright (c) 2021 Deeplearning4j Contributors
- * *
- * * This program and the accompanying materials are made available under the
- * * terms of the Apache License, Version 2.0 which is available at
- * * https://www.apache.org/licenses/LICENSE-2.0.
- * *
- * * Unless required by applicable law or agreed to in writing, software
- * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
- * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
- * * License for the specific language governing permissions and limitations
- * * under the License.
- * *
- * * SPDX-License-Identifier: Apache-2.0
- * *****************************************************************************
- */
-
-package org.deeplearning4j.arbiter.conf.updater.schedule;
-
-import lombok.Data;
-import lombok.NoArgsConstructor;
-import lombok.NonNull;
-import org.deeplearning4j.arbiter.optimize.api.ParameterSpace;
-import org.deeplearning4j.arbiter.optimize.parameter.FixedValue;
-import org.nd4j.linalg.schedule.ISchedule;
-import org.nd4j.linalg.schedule.ScheduleType;
-import org.nd4j.linalg.schedule.StepSchedule;
-import org.nd4j.shade.jackson.annotation.JsonProperty;
-
-import java.util.Arrays;
-import java.util.LinkedHashMap;
-import java.util.List;
-import java.util.Map;
-
-@NoArgsConstructor //JSON
-@Data
-public class StepScheduleSpace implements ParameterSpace<ISchedule> {
-
-    private ScheduleType scheduleType;
-    private ParameterSpace<Double> initialValue;
-    private ParameterSpace<Double> decayRate;
-    private ParameterSpace<Double> step;
-
-    public StepScheduleSpace(@NonNull ScheduleType scheduleType, @NonNull ParameterSpace<Double> initialValue,
-                             double decayRate, double step){
-        this(scheduleType, initialValue, new FixedValue<>(decayRate), new FixedValue<>(step));
-    }
-
-    public StepScheduleSpace(@NonNull @JsonProperty("scheduleType") ScheduleType scheduleType,
-                             @NonNull @JsonProperty("initialValue") ParameterSpace<Double> initialValue,
-                             @NonNull @JsonProperty("decayRate") ParameterSpace<Double> decayRate,
-                             @NonNull @JsonProperty("step") ParameterSpace<Double> step){
-        this.scheduleType = scheduleType;
-        this.initialValue = initialValue;
-        this.decayRate = decayRate;
-        this.step = step;
-    }
-
-    @Override
-    public ISchedule getValue(double[] parameterValues) {
-        return new StepSchedule(scheduleType, initialValue.getValue(parameterValues),
-                        decayRate.getValue(parameterValues), step.getValue(parameterValues));
-    }
-
-    @Override
-    public int numParameters() {
-        return initialValue.numParameters() + decayRate.numParameters() + step.numParameters();
-    }
-
-    @Override
-    public List<ParameterSpace> collectLeaves() {
-        return Arrays.asList(initialValue, decayRate, step);
-    }
-
-    @Override
-    public Map<String, ParameterSpace> getNestedSpaces() {
-        Map<String, ParameterSpace> out = new LinkedHashMap<>();
-        out.put("initialValue", initialValue);
-        out.put("decayRate", decayRate);
-        out.put("step", step);
-        return out;
-    }
-
-    @Override
-    public boolean isLeaf() {
-        return false;
-    }
-
-    @Override
-    public void setIndices(int... indices) {
-        if(initialValue.numParameters() > 0){
-            int[] sub = Arrays.copyOfRange(indices, 0, initialValue.numParameters());
-            initialValue.setIndices(sub);
-        }
-        if(decayRate.numParameters() > 0){
-            int inp = initialValue.numParameters();
-            int[] sub = Arrays.copyOfRange(indices, inp, inp + decayRate.numParameters());
-            decayRate.setIndices(sub);
-        }
-        if(step.numParameters() > 0){
-            int np = initialValue.numParameters() + decayRate.numParameters();
-            int[] sub = Arrays.copyOfRange(indices, np, np + step.numParameters());
-            step.setIndices(sub);
-        }
-    }
-}
diff --git a/contrib/attic/arbiter/arbiter-deeplearning4j/src/main/java/org/deeplearning4j/arbiter/data/DataSetIteratorFactoryProvider.java b/contrib/attic/arbiter/arbiter-deeplearning4j/src/main/java/org/deeplearning4j/arbiter/data/DataSetIteratorFactoryProvider.java
deleted file mode 100644
index 6868eb022..000000000
--- a/contrib/attic/arbiter/arbiter-deeplearning4j/src/main/java/org/deeplearning4j/arbiter/data/DataSetIteratorFactoryProvider.java
+++ /dev/null
@@ -1,87 +0,0 @@
-/*
- * ******************************************************************************
- * * Copyright (c) 2021 Deeplearning4j Contributors
- * *
- * * This program and the accompanying materials are made available under the
- * * terms of the Apache License, Version 2.0 which is available at
- * * https://www.apache.org/licenses/LICENSE-2.0.
- * *
- * * Unless required by applicable law or agreed to in writing, software
- * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
- * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
- * * License for the specific language governing permissions and limitations
- * * under the License.
- * *
- * * SPDX-License-Identifier: Apache-2.0
- * *****************************************************************************
- */
-
-package org.deeplearning4j.arbiter.data;
-
-import org.deeplearning4j.arbiter.optimize.api.data.DataProvider;
-import org.nd4j.linalg.dataset.api.iterator.DataSetIteratorFactory;
-
-import java.util.Map;
-
-/**
- * This is a {@link DataProvider} for
- * a {@link DataSetIteratorFactory} which,
- * based on the key {@link DataSetIteratorFactoryProvider#FACTORY_KEY},
- * will create a {@link org.nd4j.linalg.dataset.api.iterator.DataSetIterator}
- * for use with Arbiter.
- *
- * This {@link DataProvider} is mainly meant for use in command line driven
- * applications.
- *
- * @author Adam Gibson
- */
-public class DataSetIteratorFactoryProvider implements DataProvider {
-
-    public final static String FACTORY_KEY = "org.deeplearning4j.arbiter.data.data.factory";
-
-    /**
-     * Get training data given some parameters for the data.
-     * The data parameters map is used to specify things like batch
-     * size and data preprocessing
-     *
-     * @param dataParameters Parameters for data. May be null or empty for default data
-     * @return training data
-     */
-    @Override
-    public DataSetIteratorFactory trainData(Map<String, Object> dataParameters) {
-        return create(dataParameters);
-    }
-
-    /**
-     * Get test data given some parameters for the data. The data parameters map
-     * is used to specify things like batch
-     * size and data preprocessing
-     *
-     * @param dataParameters Parameters for data. May be null or empty for default data
-     * @return test data
-     */
-    @Override
-    public DataSetIteratorFactory testData(Map<String, Object> dataParameters) {
-        return create(dataParameters);
-    }
-
-    @Override
-    public Class<?> getDataType() {
-        return DataSetIteratorFactory.class;
-    }
-
-    private DataSetIteratorFactory create(Map<String, Object> dataParameters) {
-        if (!dataParameters.containsKey(FACTORY_KEY))
-            throw new IllegalArgumentException(
-                            "No data set iterator factory class found. Please specify a class name with key "
-                                            + FACTORY_KEY);
-        String value = dataParameters.get(FACTORY_KEY).toString();
-        try {
-            Class<? extends DataSetIteratorFactory> clazz =
-                            (Class<? extends DataSetIteratorFactory>) Class.forName(value);
-            return clazz.newInstance();
-        } catch (Exception e) {
-            throw new RuntimeException("Could not create DataSetIteratorFactory instance - missing no-arg constructor?", e);
-        }
-    }
-}
diff --git a/contrib/attic/arbiter/arbiter-deeplearning4j/src/main/java/org/deeplearning4j/arbiter/data/MnistDataProvider.java b/contrib/attic/arbiter/arbiter-deeplearning4j/src/main/java/org/deeplearning4j/arbiter/data/MnistDataProvider.java
deleted file mode 100644
index c7cb54602..000000000
--- a/contrib/attic/arbiter/arbiter-deeplearning4j/src/main/java/org/deeplearning4j/arbiter/data/MnistDataProvider.java
+++ /dev/null
@@ -1,82 +0,0 @@
-/*
- * ******************************************************************************
- * * Copyright (c) 2021 Deeplearning4j Contributors
- * *
- * * This program and the accompanying materials are made available under the
- * * terms of the Apache License, Version 2.0 which is available at
- * * https://www.apache.org/licenses/LICENSE-2.0.
- * * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * * License for the specific language governing permissions and limitations - * * under the License. - * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ - -package org.deeplearning4j.arbiter.data; - -import lombok.Data; -import lombok.NoArgsConstructor; -import org.deeplearning4j.arbiter.optimize.api.data.DataProvider; -import org.deeplearning4j.datasets.iterator.impl.MnistDataSetIterator; -import org.nd4j.linalg.dataset.api.iterator.DataSetIterator; -import org.nd4j.linalg.dataset.api.iterator.MultipleEpochsIterator; -import org.nd4j.shade.jackson.annotation.JsonProperty; - -import java.io.IOException; -import java.util.Map; -import java.util.Random; - -/** - * - * MnistDataProvider - a DataProvider for the MNIST data set, with configurable number of epochs, batch size - * and RNG seed - * - * @author Alex Black - */ -@Data -@NoArgsConstructor -public class MnistDataProvider implements DataProvider{ - - private int numEpochs; - private int batchSize; - private int rngSeed; - - public MnistDataProvider(int numEpochs, int batchSize){ - this(numEpochs, batchSize, new Random().nextInt()); - } - - public MnistDataProvider(@JsonProperty("numEpochs") int numEpochs, @JsonProperty("batchSize") int batchSize, - @JsonProperty("rngSeed") int rngSeed) { - this.numEpochs = numEpochs; - this.batchSize = batchSize; - this.rngSeed = rngSeed; - } - - - @Override - public Object trainData(Map dataParameters) { - try { - return new MultipleEpochsIterator(numEpochs, new MnistDataSetIterator(batchSize, true, rngSeed)); - } catch (IOException e) { - throw new RuntimeException(e); - } - } - - @Override - public Object testData(Map dataParameters) { - try { - return new 
MnistDataSetIterator(batchSize, false, 12345); - } catch (IOException e) { - throw new RuntimeException(e); - } - } - - @Override - public Class<?> getDataType() { - return DataSetIterator.class; - } -} diff --git a/contrib/attic/arbiter/arbiter-deeplearning4j/src/main/java/org/deeplearning4j/arbiter/dropout/AlphaDropoutSpace.java b/contrib/attic/arbiter/arbiter-deeplearning4j/src/main/java/org/deeplearning4j/arbiter/dropout/AlphaDropoutSpace.java deleted file mode 100644 index 7acd8dd66..000000000 --- a/contrib/attic/arbiter/arbiter-deeplearning4j/src/main/java/org/deeplearning4j/arbiter/dropout/AlphaDropoutSpace.java +++ /dev/null @@ -1,69 +0,0 @@ -/* - * ****************************************************************************** - * * Copyright (c) 2021 Deeplearning4j Contributors - * * - * * This program and the accompanying materials are made available under the - * * terms of the Apache License, Version 2.0 which is available at - * * https://www.apache.org/licenses/LICENSE-2.0. - * * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * * License for the specific language governing permissions and limitations - * * under the License.
- * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ - -package org.deeplearning4j.arbiter.dropout; - -import lombok.AllArgsConstructor; -import org.deeplearning4j.arbiter.optimize.api.ParameterSpace; -import org.deeplearning4j.arbiter.optimize.parameter.FixedValue; -import org.deeplearning4j.nn.conf.dropout.AlphaDropout; -import org.deeplearning4j.nn.conf.dropout.IDropout; - -import java.util.Collections; -import java.util.List; -import java.util.Map; - -@AllArgsConstructor -public class AlphaDropoutSpace implements ParameterSpace<IDropout> { - - private ParameterSpace<Double> dropout; - - public AlphaDropoutSpace(double activationRetainProbability){ - this(new FixedValue<>(activationRetainProbability)); - } - - @Override - public IDropout getValue(double[] parameterValues) { - return new AlphaDropout(dropout.getValue(parameterValues)); - } - - @Override - public int numParameters() { - return dropout.numParameters(); - } - - @Override - public List<ParameterSpace> collectLeaves() { - return Collections.singletonList(dropout); - } - - @Override - public Map<String, ParameterSpace> getNestedSpaces() { - return Collections.singletonMap("dropout", dropout); - } - - @Override - public boolean isLeaf() { - return false; - } - - @Override - public void setIndices(int...
indices) { - dropout.setIndices(indices); - } -} diff --git a/contrib/attic/arbiter/arbiter-deeplearning4j/src/main/java/org/deeplearning4j/arbiter/dropout/DropoutSpace.java b/contrib/attic/arbiter/arbiter-deeplearning4j/src/main/java/org/deeplearning4j/arbiter/dropout/DropoutSpace.java deleted file mode 100644 index 57f859a6c..000000000 --- a/contrib/attic/arbiter/arbiter-deeplearning4j/src/main/java/org/deeplearning4j/arbiter/dropout/DropoutSpace.java +++ /dev/null @@ -1,69 +0,0 @@ -/* - * ****************************************************************************** - * * Copyright (c) 2021 Deeplearning4j Contributors - * * - * * This program and the accompanying materials are made available under the - * * terms of the Apache License, Version 2.0 which is available at - * * https://www.apache.org/licenses/LICENSE-2.0. - * * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * * License for the specific language governing permissions and limitations - * * under the License. 
- * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ - -package org.deeplearning4j.arbiter.dropout; - -import lombok.AllArgsConstructor; -import org.deeplearning4j.arbiter.optimize.api.ParameterSpace; -import org.deeplearning4j.arbiter.optimize.parameter.FixedValue; -import org.deeplearning4j.nn.conf.dropout.Dropout; -import org.deeplearning4j.nn.conf.dropout.IDropout; - -import java.util.Collections; -import java.util.List; -import java.util.Map; - -@AllArgsConstructor -public class DropoutSpace implements ParameterSpace<IDropout> { - - private ParameterSpace<Double> dropout; - - public DropoutSpace(double activationRetainProbability){ - this(new FixedValue<>(activationRetainProbability)); - } - - @Override - public IDropout getValue(double[] parameterValues) { - return new Dropout(dropout.getValue(parameterValues)); - } - - @Override - public int numParameters() { - return dropout.numParameters(); - } - - @Override - public List<ParameterSpace> collectLeaves() { - return Collections.singletonList(dropout); - } - - @Override - public Map<String, ParameterSpace> getNestedSpaces() { - return Collections.singletonMap("dropout", dropout); - } - - @Override - public boolean isLeaf() { - return false; - } - - @Override - public void setIndices(int...
indices) { - dropout.setIndices(indices); - } -} diff --git a/contrib/attic/arbiter/arbiter-deeplearning4j/src/main/java/org/deeplearning4j/arbiter/dropout/GaussianDropoutSpace.java b/contrib/attic/arbiter/arbiter-deeplearning4j/src/main/java/org/deeplearning4j/arbiter/dropout/GaussianDropoutSpace.java deleted file mode 100644 index 1f7e490c3..000000000 --- a/contrib/attic/arbiter/arbiter-deeplearning4j/src/main/java/org/deeplearning4j/arbiter/dropout/GaussianDropoutSpace.java +++ /dev/null @@ -1,69 +0,0 @@ -/* - * ****************************************************************************** - * * Copyright (c) 2021 Deeplearning4j Contributors - * * - * * This program and the accompanying materials are made available under the - * * terms of the Apache License, Version 2.0 which is available at - * * https://www.apache.org/licenses/LICENSE-2.0. - * * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * * License for the specific language governing permissions and limitations - * * under the License. 
- * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ - -package org.deeplearning4j.arbiter.dropout; - -import lombok.AllArgsConstructor; -import org.deeplearning4j.arbiter.optimize.api.ParameterSpace; -import org.deeplearning4j.arbiter.optimize.parameter.FixedValue; -import org.deeplearning4j.nn.conf.dropout.GaussianDropout; -import org.deeplearning4j.nn.conf.dropout.IDropout; - -import java.util.Collections; -import java.util.List; -import java.util.Map; - -@AllArgsConstructor -public class GaussianDropoutSpace implements ParameterSpace<IDropout> { - - private ParameterSpace<Double> rate; - - public GaussianDropoutSpace(double rate){ - this(new FixedValue<>(rate)); - } - - @Override - public IDropout getValue(double[] parameterValues) { - return new GaussianDropout(rate.getValue(parameterValues)); - } - - @Override - public int numParameters() { - return rate.numParameters(); - } - - @Override - public List<ParameterSpace> collectLeaves() { - return Collections.singletonList(rate); - } - - @Override - public Map<String, ParameterSpace> getNestedSpaces() { - return Collections.singletonMap("rate", rate); - } - - @Override - public boolean isLeaf() { - return false; - } - - @Override - public void setIndices(int...
indices) { - rate.setIndices(indices); - } -} diff --git a/contrib/attic/arbiter/arbiter-deeplearning4j/src/main/java/org/deeplearning4j/arbiter/dropout/GaussianNoiseSpace.java b/contrib/attic/arbiter/arbiter-deeplearning4j/src/main/java/org/deeplearning4j/arbiter/dropout/GaussianNoiseSpace.java deleted file mode 100644 index 11eae8dbe..000000000 --- a/contrib/attic/arbiter/arbiter-deeplearning4j/src/main/java/org/deeplearning4j/arbiter/dropout/GaussianNoiseSpace.java +++ /dev/null @@ -1,69 +0,0 @@ -/* - * ****************************************************************************** - * * Copyright (c) 2021 Deeplearning4j Contributors - * * - * * This program and the accompanying materials are made available under the - * * terms of the Apache License, Version 2.0 which is available at - * * https://www.apache.org/licenses/LICENSE-2.0. - * * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * * License for the specific language governing permissions and limitations - * * under the License. 
- * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ - -package org.deeplearning4j.arbiter.dropout; - -import lombok.AllArgsConstructor; -import org.deeplearning4j.arbiter.optimize.api.ParameterSpace; -import org.deeplearning4j.arbiter.optimize.parameter.FixedValue; -import org.deeplearning4j.nn.conf.dropout.GaussianNoise; -import org.deeplearning4j.nn.conf.dropout.IDropout; - -import java.util.Collections; -import java.util.List; -import java.util.Map; - -@AllArgsConstructor -public class GaussianNoiseSpace implements ParameterSpace<IDropout> { - - private ParameterSpace<Double> stddev; - - public GaussianNoiseSpace(double stddev){ - this(new FixedValue<>(stddev)); - } - - @Override - public IDropout getValue(double[] parameterValues) { - return new GaussianNoise(stddev.getValue(parameterValues)); - } - - @Override - public int numParameters() { - return stddev.numParameters(); - } - - @Override - public List<ParameterSpace> collectLeaves() { - return Collections.singletonList(stddev); - } - - @Override - public Map<String, ParameterSpace> getNestedSpaces() { - return Collections.singletonMap("stddev", stddev); - } - - @Override - public boolean isLeaf() { - return false; - } - - @Override - public void setIndices(int...
indices) { - stddev.setIndices(indices); - } -} diff --git a/contrib/attic/arbiter/arbiter-deeplearning4j/src/main/java/org/deeplearning4j/arbiter/evaluator/multilayer/ClassificationEvaluator.java b/contrib/attic/arbiter/arbiter-deeplearning4j/src/main/java/org/deeplearning4j/arbiter/evaluator/multilayer/ClassificationEvaluator.java deleted file mode 100644 index 63608a7b8..000000000 --- a/contrib/attic/arbiter/arbiter-deeplearning4j/src/main/java/org/deeplearning4j/arbiter/evaluator/multilayer/ClassificationEvaluator.java +++ /dev/null @@ -1,70 +0,0 @@ -/* - * ****************************************************************************** - * * Copyright (c) 2021 Deeplearning4j Contributors - * * - * * This program and the accompanying materials are made available under the - * * terms of the Apache License, Version 2.0 which is available at - * * https://www.apache.org/licenses/LICENSE-2.0. - * * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * * License for the specific language governing permissions and limitations - * * under the License. 
- * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ - -package org.deeplearning4j.arbiter.evaluator.multilayer; - -import lombok.AllArgsConstructor; -import lombok.NoArgsConstructor; -import org.deeplearning4j.arbiter.optimize.api.data.DataProvider; -import org.deeplearning4j.arbiter.optimize.api.evaluation.ModelEvaluator; -import org.deeplearning4j.arbiter.scoring.util.ScoreUtil; -import org.deeplearning4j.eval.Evaluation; -import org.deeplearning4j.nn.graph.ComputationGraph; -import org.deeplearning4j.nn.multilayer.MultiLayerNetwork; -import org.nd4j.linalg.dataset.api.iterator.DataSetIterator; -import org.nd4j.linalg.dataset.api.iterator.MultiDataSetIterator; - -import java.util.Arrays; -import java.util.List; -import java.util.Map; - -/** - * A model evaluator for doing additional - * evaluation (classification evaluation) - * for a {@link MultiLayerNetwork} given a {@link DataSetIterator} - * - * @author Alex Black - */ -@NoArgsConstructor -@AllArgsConstructor -public class ClassificationEvaluator implements ModelEvaluator { - private Map<String, Object> params = null; - - - @Override - public Evaluation evaluateModel(Object model, DataProvider dataProvider) { - - if (model instanceof MultiLayerNetwork) { - DataSetIterator iterator = ScoreUtil.getIterator(dataProvider.testData(params)); - return ScoreUtil.getEvaluation((MultiLayerNetwork) model, iterator); - } else { - DataSetIterator iterator = ScoreUtil.getIterator(dataProvider.testData(params)); - return ScoreUtil.getEvaluation((ComputationGraph) model, iterator); - } - } - - @Override - public List<Class<?>> getSupportedModelTypes() { - return Arrays.<Class<?>>asList(MultiLayerNetwork.class, ComputationGraph.class); - } - - @Override - public List<Class<?>> getSupportedDataTypes() { - return Arrays.<Class<?>>asList(DataSetIterator.class, MultiDataSetIterator.class); - } -} diff --git
a/contrib/attic/arbiter/arbiter-deeplearning4j/src/main/java/org/deeplearning4j/arbiter/evaluator/multilayer/RegressionDataEvaluator.java b/contrib/attic/arbiter/arbiter-deeplearning4j/src/main/java/org/deeplearning4j/arbiter/evaluator/multilayer/RegressionDataEvaluator.java deleted file mode 100644 index c35976683..000000000 --- a/contrib/attic/arbiter/arbiter-deeplearning4j/src/main/java/org/deeplearning4j/arbiter/evaluator/multilayer/RegressionDataEvaluator.java +++ /dev/null @@ -1,64 +0,0 @@ -/* - * ****************************************************************************** - * * Copyright (c) 2021 Deeplearning4j Contributors - * * - * * This program and the accompanying materials are made available under the - * * terms of the Apache License, Version 2.0 which is available at - * * https://www.apache.org/licenses/LICENSE-2.0. - * * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * * License for the specific language governing permissions and limitations - * * under the License. 
- * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ - -package org.deeplearning4j.arbiter.evaluator.multilayer; - -import lombok.AllArgsConstructor; -import org.deeplearning4j.arbiter.optimize.api.data.DataProvider; -import org.deeplearning4j.arbiter.optimize.api.evaluation.ModelEvaluator; -import org.deeplearning4j.arbiter.scoring.RegressionValue; -import org.deeplearning4j.arbiter.scoring.util.ScoreUtil; -import org.deeplearning4j.nn.graph.ComputationGraph; -import org.deeplearning4j.nn.multilayer.MultiLayerNetwork; -import org.nd4j.linalg.dataset.api.iterator.DataSetIterator; -import org.nd4j.linalg.dataset.api.iterator.MultiDataSetIterator; - -import java.util.Arrays; -import java.util.List; -import java.util.Map; - -/** - * Created by agibsonccc on 3/12/17. - */ -@AllArgsConstructor -public class RegressionDataEvaluator implements ModelEvaluator { - private RegressionValue regressionValue; - private Map<String, Object> params = null; - - @Override - public Double evaluateModel(Object model, DataProvider dataProvider) { - - if (model instanceof MultiLayerNetwork) { - DataSetIterator iterator = ScoreUtil.getIterator(dataProvider.testData(params)); - return ScoreUtil.score((MultiLayerNetwork) model, iterator, regressionValue); - } else { - DataSetIterator iterator = ScoreUtil.getIterator(dataProvider.testData(params)); - return ScoreUtil.score((ComputationGraph) model, iterator, regressionValue); - } - } - - @Override - public List<Class<?>> getSupportedModelTypes() { - return Arrays.<Class<?>>asList(MultiLayerNetwork.class, ComputationGraph.class); - } - - @Override - public List<Class<?>> getSupportedDataTypes() { - return Arrays.<Class<?>>asList(DataSetIterator.class, MultiDataSetIterator.class); - } -} diff --git a/contrib/attic/arbiter/arbiter-deeplearning4j/src/main/java/org/deeplearning4j/arbiter/layers/AbstractLSTMLayerSpace.java
b/contrib/attic/arbiter/arbiter-deeplearning4j/src/main/java/org/deeplearning4j/arbiter/layers/AbstractLSTMLayerSpace.java deleted file mode 100644 index cddc7c85c..000000000 --- a/contrib/attic/arbiter/arbiter-deeplearning4j/src/main/java/org/deeplearning4j/arbiter/layers/AbstractLSTMLayerSpace.java +++ /dev/null @@ -1,109 +0,0 @@ -/* - * ****************************************************************************** - * * Copyright (c) 2021 Deeplearning4j Contributors - * * - * * This program and the accompanying materials are made available under the - * * terms of the Apache License, Version 2.0 which is available at - * * https://www.apache.org/licenses/LICENSE-2.0. - * * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * * License for the specific language governing permissions and limitations - * * under the License. 
- * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ - -package org.deeplearning4j.arbiter.layers; - -import lombok.AccessLevel; -import lombok.Data; -import lombok.EqualsAndHashCode; -import lombok.NoArgsConstructor; -import org.deeplearning4j.arbiter.adapter.ActivationParameterSpaceAdapter; -import org.deeplearning4j.arbiter.optimize.api.ParameterSpace; -import org.deeplearning4j.arbiter.optimize.parameter.FixedValue; -import org.deeplearning4j.arbiter.util.LeafUtils; -import org.deeplearning4j.nn.conf.layers.AbstractLSTM; -import org.nd4j.linalg.activations.Activation; -import org.nd4j.linalg.activations.IActivation; - -/** - * Layer space for LSTM layers - * - * @author Alex Black - */ -@Data -@EqualsAndHashCode(callSuper = true) -@NoArgsConstructor(access = AccessLevel.PROTECTED) //For Jackson JSON/YAML deserialization -public abstract class AbstractLSTMLayerSpace<T extends AbstractLSTM> extends FeedForwardLayerSpace<T> { - - protected ParameterSpace<Double> forgetGateBiasInit; - protected ParameterSpace<IActivation> gateActivationFn; - - protected AbstractLSTMLayerSpace(Builder builder) { - super(builder); - this.forgetGateBiasInit = builder.forgetGateBiasInit; - this.gateActivationFn = builder.gateActivationFn; - - this.numParameters = LeafUtils.countUniqueParameters(collectLeaves()); - } - - protected void setLayerOptionsBuilder(AbstractLSTM.Builder builder, double[] values) { - super.setLayerOptionsBuilder(builder, values); - if (forgetGateBiasInit != null) - builder.forgetGateBiasInit(forgetGateBiasInit.getValue(values)); - if(gateActivationFn != null) - builder.gateActivationFunction(gateActivationFn.getValue(values)); - } - - @Override - public String toString() { - return toString(", "); - } - - @Override - public String toString(String delim) { - StringBuilder sb = new StringBuilder(); //"AbstractLSTMLayerSpace("); - if (forgetGateBiasInit != null) - sb.append("forgetGateBiasInit:
").append(forgetGateBiasInit).append(delim); - if (gateActivationFn != null) - sb.append("gateActivationFn: ").append(gateActivationFn).append(delim); - sb.append(super.toString(delim)); - return sb.toString(); - } - - public static abstract class Builder<T> extends FeedForwardLayerSpace.Builder<T> { - - private ParameterSpace<Double> forgetGateBiasInit; - private ParameterSpace<IActivation> gateActivationFn; - - public T forgetGateBiasInit(double forgetGateBiasInit) { - return forgetGateBiasInit(new FixedValue<>(forgetGateBiasInit)); - } - - public T forgetGateBiasInit(ParameterSpace<Double> forgetGateBiasInit) { - this.forgetGateBiasInit = forgetGateBiasInit; - return (T)this; - } - - public T gateActivationFn(Activation activation){ - return gateActivationFn(activation.getActivationFunction()); - } - - public T gateActivation(ParameterSpace<Activation> gateActivationFn){ - return gateActivationFn(new ActivationParameterSpaceAdapter(gateActivationFn)); - } - - public T gateActivationFn(IActivation gateActivationFn){ - return gateActivationFn(new FixedValue<>(gateActivationFn)); - } - - public T gateActivationFn(ParameterSpace<IActivation> gateActivationFn){ - this.gateActivationFn = gateActivationFn; - return (T)this; - } - } -} diff --git a/contrib/attic/arbiter/arbiter-deeplearning4j/src/main/java/org/deeplearning4j/arbiter/layers/ActivationLayerSpace.java b/contrib/attic/arbiter/arbiter-deeplearning4j/src/main/java/org/deeplearning4j/arbiter/layers/ActivationLayerSpace.java deleted file mode 100644 index 3d72090c3..000000000 --- a/contrib/attic/arbiter/arbiter-deeplearning4j/src/main/java/org/deeplearning4j/arbiter/layers/ActivationLayerSpace.java +++ /dev/null @@ -1,96 +0,0 @@ -/* - * ****************************************************************************** - * * Copyright (c) 2021 Deeplearning4j Contributors - * * - * * This program and the accompanying materials are made available under the - * * terms of the Apache License, Version 2.0 which is available at - * * https://www.apache.org/licenses/LICENSE-2.0.
- * * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * * License for the specific language governing permissions and limitations - * * under the License. - * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ - -package org.deeplearning4j.arbiter.layers; - -import lombok.AccessLevel; -import lombok.Data; -import lombok.EqualsAndHashCode; -import lombok.NoArgsConstructor; -import org.deeplearning4j.arbiter.adapter.ActivationParameterSpaceAdapter; -import org.deeplearning4j.arbiter.optimize.api.ParameterSpace; -import org.deeplearning4j.arbiter.optimize.parameter.FixedValue; -import org.deeplearning4j.arbiter.util.LeafUtils; -import org.deeplearning4j.nn.conf.layers.ActivationLayer; -import org.nd4j.linalg.activations.Activation; -import org.nd4j.linalg.activations.IActivation; - -/** - * Layer space for {@link ActivationLayer} - * - * @author Alex Black - */ -@Data -@EqualsAndHashCode(callSuper = true) -@NoArgsConstructor(access = AccessLevel.PRIVATE) //For Jackson JSON/YAML deserialization -public class ActivationLayerSpace extends LayerSpace<ActivationLayer> { - - private ParameterSpace<IActivation> activationFunction; - - protected ActivationLayerSpace(Builder builder) { - super(builder); - this.activationFunction = builder.activationFunction; - this.numParameters = LeafUtils.countUniqueParameters(collectLeaves()); - } - - - @Override - public ActivationLayer getValue(double[] parameterValues) { - ActivationLayer.Builder b = new ActivationLayer.Builder(); - super.setLayerOptionsBuilder(b, parameterValues); - b.activation(activationFunction.getValue(parameterValues)); - return b.build(); - } - - public static class Builder extends LayerSpace.Builder<Builder> { - - private ParameterSpace<IActivation> activationFunction; - - public Builder
activation(Activation activation) { - return activation(new FixedValue<>(activation)); - } - - public Builder activation(IActivation iActivation) { - return activationFn(new FixedValue<>(iActivation)); - } - - public Builder activation(ParameterSpace<Activation> activationFunction) { - return activationFn(new ActivationParameterSpaceAdapter(activationFunction)); - } - - public Builder activationFn(ParameterSpace<IActivation> activationFunction) { - this.activationFunction = activationFunction; - return this; - } - - @SuppressWarnings("unchecked") - public ActivationLayerSpace build() { - return new ActivationLayerSpace(this); - } - } - - @Override - public String toString() { - return toString(", "); - } - - @Override - public String toString(String delim) { - return "ActivationLayerSpace(" + super.toString(delim) + ")"; - } -} diff --git a/contrib/attic/arbiter/arbiter-deeplearning4j/src/main/java/org/deeplearning4j/arbiter/layers/AutoEncoderLayerSpace.java b/contrib/attic/arbiter/arbiter-deeplearning4j/src/main/java/org/deeplearning4j/arbiter/layers/AutoEncoderLayerSpace.java deleted file mode 100644 index 6682e6bc9..000000000 --- a/contrib/attic/arbiter/arbiter-deeplearning4j/src/main/java/org/deeplearning4j/arbiter/layers/AutoEncoderLayerSpace.java +++ /dev/null @@ -1,109 +0,0 @@ -/* - * ****************************************************************************** - * * Copyright (c) 2021 Deeplearning4j Contributors - * * - * * This program and the accompanying materials are made available under the - * * terms of the Apache License, Version 2.0 which is available at - * * https://www.apache.org/licenses/LICENSE-2.0. - * * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * * License for the specific language governing permissions and limitations - * * under the License.
- * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ - -package org.deeplearning4j.arbiter.layers; - -import lombok.AccessLevel; -import lombok.Data; -import lombok.EqualsAndHashCode; -import lombok.NoArgsConstructor; -import org.deeplearning4j.arbiter.optimize.api.ParameterSpace; -import org.deeplearning4j.arbiter.optimize.parameter.FixedValue; -import org.deeplearning4j.arbiter.util.LeafUtils; -import org.deeplearning4j.nn.conf.layers.AutoEncoder; -import org.nd4j.shade.jackson.annotation.JsonProperty; - -/** - * Layer space for autoencoder layers - */ -@Data -@EqualsAndHashCode(callSuper = true) -@NoArgsConstructor(access = AccessLevel.PRIVATE) //For Jackson JSON/YAML deserialization -public class AutoEncoderLayerSpace extends BasePretrainNetworkLayerSpace<AutoEncoder> { - @JsonProperty - private ParameterSpace<Double> corruptionLevel; - @JsonProperty - private ParameterSpace<Double> sparsity; - - private AutoEncoderLayerSpace(Builder builder) { - super(builder); - this.corruptionLevel = builder.corruptionLevel; - this.sparsity = builder.sparsity; - this.numParameters = LeafUtils.countUniqueParameters(collectLeaves()); - } - - @Override - public AutoEncoder getValue(double[] values) { - AutoEncoder.Builder b = new AutoEncoder.Builder(); - setLayerOptionsBuilder(b, values); - return b.build(); - } - - protected void setLayerOptionsBuilder(AutoEncoder.Builder builder, double[] values) { - super.setLayerOptionsBuilder(builder, values); - if (corruptionLevel != null) - builder.corruptionLevel(corruptionLevel.getValue(values)); - if (sparsity != null) - builder.sparsity(sparsity.getValue(values)); - } - - @Override - public String toString() { - return toString(", "); - } - - @Override - public String toString(String delim) { - StringBuilder sb = new StringBuilder("AutoEncoderLayerSpace("); - if (corruptionLevel != null) - sb.append("corruptionLevel: ").append(corruptionLevel).append(delim); - if (sparsity != null)
- sb.append("sparsity: ").append(sparsity).append(delim); - sb.append(super.toString(delim)).append(")"); - return sb.toString(); - } - - public static class Builder extends BasePretrainNetworkLayerSpace.Builder<Builder> { - - private ParameterSpace<Double> corruptionLevel; - private ParameterSpace<Double> sparsity; - - public Builder corruptionLevel(double corruptionLevel) { - return corruptionLevel(new FixedValue<>(corruptionLevel)); - } - - public Builder corruptionLevel(ParameterSpace<Double> corruptionLevel) { - this.corruptionLevel = corruptionLevel; - return this; - } - - public Builder sparsity(double sparsity) { - return sparsity(new FixedValue<>(sparsity)); - } - - public Builder sparsity(ParameterSpace<Double> sparsity) { - this.sparsity = sparsity; - return this; - } - - public AutoEncoderLayerSpace build() { - return new AutoEncoderLayerSpace(this); - } - - } -} diff --git a/contrib/attic/arbiter/arbiter-deeplearning4j/src/main/java/org/deeplearning4j/arbiter/layers/BaseConvolutionLayerSpace.java b/contrib/attic/arbiter/arbiter-deeplearning4j/src/main/java/org/deeplearning4j/arbiter/layers/BaseConvolutionLayerSpace.java deleted file mode 100644 index e8f23fbd3..000000000 --- a/contrib/attic/arbiter/arbiter-deeplearning4j/src/main/java/org/deeplearning4j/arbiter/layers/BaseConvolutionLayerSpace.java +++ /dev/null @@ -1,164 +0,0 @@ -/* - * ****************************************************************************** - * * Copyright (c) 2021 Deeplearning4j Contributors - * * - * * This program and the accompanying materials are made available under the - * * terms of the Apache License, Version 2.0 which is available at - * * https://www.apache.org/licenses/LICENSE-2.0. - * * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the - * * License for the specific language governing permissions and limitations - * * under the License. - * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ - -package org.deeplearning4j.arbiter.layers; - -import lombok.AccessLevel; -import lombok.Data; -import lombok.EqualsAndHashCode; -import lombok.NoArgsConstructor; -import org.deeplearning4j.arbiter.optimize.api.ParameterSpace; -import org.deeplearning4j.arbiter.optimize.parameter.FixedValue; -import org.deeplearning4j.arbiter.util.LeafUtils; -import org.deeplearning4j.nn.conf.ConvolutionMode; -import org.deeplearning4j.nn.conf.layers.ConvolutionLayer; -import org.deeplearning4j.nn.conf.layers.FeedForwardLayer; - -/** - * Layer space for convolutional layers - * - * @author Alex Black - */ -@Data -@EqualsAndHashCode(callSuper = true) -@NoArgsConstructor(access = AccessLevel.PROTECTED) //For Jackson JSON/YAML deserialization -public abstract class BaseConvolutionLayerSpace<T extends FeedForwardLayer> extends FeedForwardLayerSpace<T> { - protected ParameterSpace<int[]> dilation; - protected ParameterSpace<int[]> kernelSize; - protected ParameterSpace<int[]> stride; - protected ParameterSpace<int[]> padding; - protected ParameterSpace<ConvolutionMode> convolutionMode; - protected ParameterSpace<Boolean> hasBias; - - protected BaseConvolutionLayerSpace(Builder builder) { - super(builder); - this.dilation = builder.dilation; - this.kernelSize = builder.kernelSize; - this.stride = builder.stride; - this.padding = builder.padding; - this.convolutionMode = builder.convolutionMode; - this.hasBias = builder.hasBias; - - this.numParameters = LeafUtils.countUniqueParameters(collectLeaves()); - } - - protected void setLayerOptionsBuilder(ConvolutionLayer.BaseConvBuilder builder, double[] values) { - super.setLayerOptionsBuilder(builder, values); - if (dilation != null) - builder.dilation(dilation.getValue(values)); - if (kernelSize != null) - builder.kernelSize(kernelSize.getValue(values)); - if (stride != null) -
builder.stride(stride.getValue(values)); - if (padding != null) - builder.padding(padding.getValue(values)); - if (convolutionMode != null) - builder.convolutionMode(convolutionMode.getValue(values)); - if (hasBias != null) - builder.hasBias(hasBias.getValue(values)); - } - - @Override - public String toString() { - return toString(", "); - } - - @Override - public String toString(String delim) { - StringBuilder sb = new StringBuilder(); - if (dilation != null) - sb.append("dilation: ").append(dilation).append(delim); - if (kernelSize != null) - sb.append("kernelSize: ").append(kernelSize).append(delim); - if (stride != null) - sb.append("stride: ").append(stride).append(delim); - if (padding != null) - sb.append("padding: ").append(padding).append(delim); - if (convolutionMode != null) - sb.append("convolutionMode: ").append(convolutionMode).append(delim); - if (hasBias != null) - sb.append("hasBias: ").append(hasBias).append(delim); - sb.append(super.toString(delim)); - return sb.toString(); - } - - - public static abstract class Builder<T> extends FeedForwardLayerSpace.Builder<T> { - protected ParameterSpace<int[]> dilation; - protected ParameterSpace<int[]> kernelSize; - protected ParameterSpace<int[]> stride; - protected ParameterSpace<int[]> padding; - protected ParameterSpace<ConvolutionMode> convolutionMode; - protected ParameterSpace<Boolean> hasBias; - - public T dilation(int... dilation) { - return dilation(new FixedValue<>(dilation)); - } - - public T dilation(ParameterSpace<int[]> dilation) { - this.dilation = dilation; - return (T) this; - } - public T kernelSize(int... kernelSize) { - return kernelSize(new FixedValue<>(kernelSize)); - } - - public T kernelSize(ParameterSpace<int[]> kernelSize) { - this.kernelSize = kernelSize; - return (T)this; - } - - public T stride(int... stride) { - return stride(new FixedValue<>(stride)); - } - - public T stride(ParameterSpace<int[]> stride) { - this.stride = stride; - return (T)this; - } - - public T padding(int...
padding) { - return padding(new FixedValue<>(padding)); - } - - public T padding(ParameterSpace padding) { - this.padding = padding; - return (T)this; - } - - public T convolutionMode(ConvolutionMode convolutionMode) { - return convolutionMode(new FixedValue<>(convolutionMode)); - } - - public T convolutionMode(ParameterSpace convolutionMode) { - this.convolutionMode = convolutionMode; - return (T)this; - } - - public T hasBias(boolean hasBias){ - return hasBias(new FixedValue<>(hasBias)); - } - - public T hasBias(ParameterSpace hasBias){ - this.hasBias = hasBias; - return (T)this; - } - - } - -} diff --git a/contrib/attic/arbiter/arbiter-deeplearning4j/src/main/java/org/deeplearning4j/arbiter/layers/BaseLayerSpace.java b/contrib/attic/arbiter/arbiter-deeplearning4j/src/main/java/org/deeplearning4j/arbiter/layers/BaseLayerSpace.java deleted file mode 100644 index ad4bc2c32..000000000 --- a/contrib/attic/arbiter/arbiter-deeplearning4j/src/main/java/org/deeplearning4j/arbiter/layers/BaseLayerSpace.java +++ /dev/null @@ -1,293 +0,0 @@ -/* - * ****************************************************************************** - * * Copyright (c) 2021 Deeplearning4j Contributors - * * - * * This program and the accompanying materials are made available under the - * * terms of the Apache License, Version 2.0 which is available at - * * https://www.apache.org/licenses/LICENSE-2.0. - * * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * * License for the specific language governing permissions and limitations - * * under the License. 
- * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ - -package org.deeplearning4j.arbiter.layers; - -import org.nd4j.shade.guava.base.Preconditions; -import lombok.AccessLevel; -import lombok.Data; -import lombok.EqualsAndHashCode; -import lombok.NoArgsConstructor; -import org.deeplearning4j.arbiter.adapter.ActivationParameterSpaceAdapter; -import org.deeplearning4j.arbiter.optimize.api.ParameterSpace; -import org.deeplearning4j.arbiter.optimize.parameter.FixedValue; -import org.deeplearning4j.arbiter.optimize.parameter.discrete.DiscreteParameterSpace; -import org.deeplearning4j.nn.conf.GradientNormalization; -import org.deeplearning4j.nn.conf.distribution.Distribution; -import org.deeplearning4j.nn.conf.layers.BaseLayer; -import org.deeplearning4j.nn.conf.weightnoise.IWeightNoise; -import org.deeplearning4j.nn.weights.WeightInit; -import org.nd4j.linalg.activations.Activation; -import org.nd4j.linalg.activations.IActivation; -import org.nd4j.linalg.learning.config.IUpdater; -import org.nd4j.shade.jackson.annotation.JsonInclude; - -import java.util.Map; - -/** - * BaseLayerSpace contains the common Layer hyperparameters; should match {@link BaseLayer} in terms of features - * - * @author Alex Black - */ -@JsonInclude(JsonInclude.Include.NON_NULL) - -@Data -@EqualsAndHashCode(callSuper = true) -@NoArgsConstructor(access = AccessLevel.PROTECTED) //For Jackson JSON/YAML deserialization -public abstract class BaseLayerSpace extends LayerSpace { - protected ParameterSpace activationFunction; - protected ParameterSpace weightInit; - protected ParameterSpace biasInit; - protected ParameterSpace dist; - protected ParameterSpace l1; - protected ParameterSpace l2; - protected ParameterSpace l1Bias; - protected ParameterSpace l2Bias; - protected ParameterSpace updater; - protected ParameterSpace biasUpdater; - protected ParameterSpace weightNoise; - protected ParameterSpace 
gradientNormalization; - protected ParameterSpace gradientNormalizationThreshold; - protected int numParameters; - - @SuppressWarnings("unchecked") - protected BaseLayerSpace(Builder builder) { - super(builder); - this.activationFunction = builder.activationFunction; - this.weightInit = builder.weightInit; - this.biasInit = builder.biasInit; - this.dist = builder.dist; - this.l1 = builder.l1; - this.l2 = builder.l2; - this.l1Bias = builder.l1Bias; - this.l2Bias = builder.l2Bias; - this.updater = builder.updater; - this.biasUpdater = builder.biasUpdater; - this.weightNoise = builder.weightNoise; - this.gradientNormalization = builder.gradientNormalization; - this.gradientNormalizationThreshold = builder.gradientNormalizationThreshold; - } - - @Override - public int numParameters() { - return numParameters; - } - - @Override - public boolean isLeaf() { - return false; - } - - @Override - public void setIndices(int... indices) { - throw new UnsupportedOperationException("Cannot set indices for non-leaf parameter space"); - } - - - protected void setLayerOptionsBuilder(BaseLayer.Builder builder, double[] values) { - super.setLayerOptionsBuilder(builder, values); - if (activationFunction != null) - builder.activation(activationFunction.getValue(values)); - if (biasInit != null) - builder.biasInit(biasInit.getValue(values)); - if (weightInit != null) - builder.weightInit(weightInit.getValue(values)); - if (dist != null) - builder.dist(dist.getValue(values)); - if (l1 != null) - builder.l1(l1.getValue(values)); - if (l2 != null) - builder.l2(l2.getValue(values)); - if (l1Bias != null) - builder.l1Bias(l1Bias.getValue(values)); - if (l2Bias != null) - builder.l2Bias(l2Bias.getValue(values)); - if (updater != null) - builder.updater(updater.getValue(values)); - if (biasUpdater != null) - builder.biasUpdater(biasUpdater.getValue(values)); - if (weightNoise != null) - builder.weightNoise(weightNoise.getValue(values)); - if (gradientNormalization != null) - 
builder.gradientNormalization(gradientNormalization.getValue(values)); - if (gradientNormalizationThreshold != null) - builder.gradientNormalizationThreshold(gradientNormalizationThreshold.getValue(values)); - } - - - @Override - public String toString() { - return toString(", "); - } - - protected String toString(String delim) { - StringBuilder sb = new StringBuilder(); - - for (Map.Entry e : getNestedSpaces().entrySet()) { - sb.append(e.getKey()).append(": ").append(e.getValue()).append("\n"); - } - return sb.toString(); - } - - @SuppressWarnings("unchecked") - public abstract static class Builder extends LayerSpace.Builder { - protected ParameterSpace activationFunction; - protected ParameterSpace weightInit; - protected ParameterSpace biasInit; - protected ParameterSpace dist; - protected ParameterSpace l1; - protected ParameterSpace l2; - protected ParameterSpace l1Bias; - protected ParameterSpace l2Bias; - protected ParameterSpace updater; - protected ParameterSpace biasUpdater; - protected ParameterSpace weightNoise; - protected ParameterSpace gradientNormalization; - protected ParameterSpace gradientNormalizationThreshold; - - public T activation(Activation... 
activations){ - Preconditions.checkArgument(activations.length > 0, "Activations length must be 1 or more"); - if(activations.length == 1){ - return activation(activations[0]); - } - return activation(new DiscreteParameterSpace<>(activations)); - } - - public T activation(Activation activation) { - return activation(new FixedValue<>(activation)); - } - - public T activation(IActivation iActivation) { - return activationFn(new FixedValue<>(iActivation)); - } - - public T activation(ParameterSpace activationFunction) { - return activationFn(new ActivationParameterSpaceAdapter(activationFunction)); - } - - public T activationFn(ParameterSpace activationFunction) { - this.activationFunction = activationFunction; - return (T) this; - } - - public T weightInit(WeightInit weightInit) { - return (T) weightInit(new FixedValue(weightInit)); - } - - public T weightInit(ParameterSpace weightInit) { - this.weightInit = weightInit; - return (T) this; - } - - public T weightInit(Distribution distribution){ - weightInit(WeightInit.DISTRIBUTION); - return dist(distribution); - } - - public T biasInit(double biasInit){ - return biasInit(new FixedValue<>(biasInit)); - } - - public T biasInit(ParameterSpace biasInit){ - this.biasInit = biasInit; - return (T) this; - } - - public T dist(Distribution dist) { - return dist(new FixedValue<>(dist)); - } - - public T dist(ParameterSpace dist) { - this.dist = dist; - return (T) this; - } - - public T l1(double l1) { - return l1(new FixedValue(l1)); - } - - public T l1(ParameterSpace l1) { - this.l1 = l1; - return (T) this; - } - - public T l2(double l2) { - return l2(new FixedValue(l2)); - } - - public T l2(ParameterSpace l2) { - this.l2 = l2; - return (T) this; - } - - public T l1Bias(double l1Bias) { - return l1Bias(new FixedValue(l1Bias)); - } - - public T l1Bias(ParameterSpace l1Bias) { - this.l1Bias = l1Bias; - return (T) this; - } - - public T l2Bias(double l2Bias) { - return l2Bias(new FixedValue<>(l2Bias)); - } - - public T 
l2Bias(ParameterSpace l2Bias) { - this.l2Bias = l2Bias; - return (T) this; - } - - public T updater(IUpdater updater) { - return updater(new FixedValue<>(updater)); - } - - public T updater(ParameterSpace updater) { - this.updater = updater; - return (T) this; - } - - public T biasUpdater(IUpdater biasUpdater) { - return biasUpdater(new FixedValue<>(biasUpdater)); - } - - public T biasUpdater(ParameterSpace biasUpdater) { - this.biasUpdater = biasUpdater; - return (T) this; - } - - public T gradientNormalization(GradientNormalization gradientNormalization) { - return gradientNormalization(new FixedValue(gradientNormalization)); - } - - public T gradientNormalization(ParameterSpace gradientNormalization) { - this.gradientNormalization = gradientNormalization; - return (T) this; - } - - public T gradientNormalizationThreshold(double threshold) { - return gradientNormalizationThreshold(new FixedValue<>(threshold)); - } - - public T gradientNormalizationThreshold(ParameterSpace gradientNormalizationThreshold) { - this.gradientNormalizationThreshold = gradientNormalizationThreshold; - return (T) this; - } - } - -} diff --git a/contrib/attic/arbiter/arbiter-deeplearning4j/src/main/java/org/deeplearning4j/arbiter/layers/BaseOutputLayerSpace.java b/contrib/attic/arbiter/arbiter-deeplearning4j/src/main/java/org/deeplearning4j/arbiter/layers/BaseOutputLayerSpace.java deleted file mode 100644 index b63669115..000000000 --- a/contrib/attic/arbiter/arbiter-deeplearning4j/src/main/java/org/deeplearning4j/arbiter/layers/BaseOutputLayerSpace.java +++ /dev/null @@ -1,89 +0,0 @@ -/* - * ****************************************************************************** - * * Copyright (c) 2021 Deeplearning4j Contributors - * * - * * This program and the accompanying materials are made available under the - * * terms of the Apache License, Version 2.0 which is available at - * * https://www.apache.org/licenses/LICENSE-2.0. 
- * * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * * License for the specific language governing permissions and limitations - * * under the License. - * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ - -package org.deeplearning4j.arbiter.layers; - -import lombok.AccessLevel; -import lombok.Data; -import lombok.EqualsAndHashCode; -import lombok.NoArgsConstructor; -import org.deeplearning4j.arbiter.adapter.LossFunctionParameterSpaceAdapter; -import org.deeplearning4j.arbiter.optimize.api.ParameterSpace; -import org.deeplearning4j.arbiter.optimize.parameter.FixedValue; -import org.deeplearning4j.nn.conf.layers.BaseOutputLayer; -import org.nd4j.linalg.lossfunctions.ILossFunction; -import org.nd4j.linalg.lossfunctions.LossFunctions.LossFunction; - -/** - * @param Type of the (concrete) output layer - */ -@Data -@EqualsAndHashCode(callSuper = true) -@NoArgsConstructor(access = AccessLevel.PUBLIC) //For Jackson JSON/YAML deserialization -public abstract class BaseOutputLayerSpace extends FeedForwardLayerSpace { - - protected ParameterSpace lossFunction; - protected ParameterSpace hasBias; - - protected BaseOutputLayerSpace(Builder builder) { - super(builder); - this.lossFunction = builder.lossFunction; - this.hasBias = builder.hasBias; - } - - protected void setLayerOptionsBuilder(BaseOutputLayer.Builder builder, double[] values) { - super.setLayerOptionsBuilder(builder, values); - if (lossFunction != null) - builder.lossFunction(lossFunction.getValue(values)); - if (hasBias != null) - builder.hasBias(hasBias.getValue(values)); - } - - @SuppressWarnings("unchecked") - public static abstract class Builder extends FeedForwardLayerSpace.Builder { - - protected ParameterSpace lossFunction; - protected 
ParameterSpace hasBias; - - public T lossFunction(LossFunction lossFunction) { - return lossFunction(new FixedValue<>(lossFunction)); - } - - public T lossFunction(ParameterSpace lossFunction) { - return iLossFunction(new LossFunctionParameterSpaceAdapter(lossFunction)); - } - - public T iLossFunction(ILossFunction lossFunction) { - return iLossFunction(new FixedValue<>(lossFunction)); - } - - public T iLossFunction(ParameterSpace lossFunction) { - this.lossFunction = lossFunction; - return (T) this; - } - - public T hasBias(boolean hasBias){ - return hasBias(new FixedValue<>(hasBias)); - } - - public T hasBias(ParameterSpace hasBias){ - this.hasBias = hasBias; - return (T)this; - } - } -} diff --git a/contrib/attic/arbiter/arbiter-deeplearning4j/src/main/java/org/deeplearning4j/arbiter/layers/BasePretrainNetworkLayerSpace.java b/contrib/attic/arbiter/arbiter-deeplearning4j/src/main/java/org/deeplearning4j/arbiter/layers/BasePretrainNetworkLayerSpace.java deleted file mode 100644 index ea183e33c..000000000 --- a/contrib/attic/arbiter/arbiter-deeplearning4j/src/main/java/org/deeplearning4j/arbiter/layers/BasePretrainNetworkLayerSpace.java +++ /dev/null @@ -1,59 +0,0 @@ -/* - * ****************************************************************************** - * * Copyright (c) 2021 Deeplearning4j Contributors - * * - * * This program and the accompanying materials are made available under the - * * terms of the Apache License, Version 2.0 which is available at - * * https://www.apache.org/licenses/LICENSE-2.0. - * * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * * License for the specific language governing permissions and limitations - * * under the License. 
- * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ - -package org.deeplearning4j.arbiter.layers; - -import lombok.AccessLevel; -import lombok.Data; -import lombok.EqualsAndHashCode; -import lombok.NoArgsConstructor; -import org.deeplearning4j.arbiter.optimize.api.ParameterSpace; -import org.deeplearning4j.arbiter.optimize.parameter.FixedValue; -import org.deeplearning4j.nn.conf.layers.BasePretrainNetwork; -import org.nd4j.linalg.lossfunctions.LossFunctions.LossFunction; -import org.nd4j.shade.jackson.annotation.JsonProperty; - - -@Data -@EqualsAndHashCode(callSuper = true) -@NoArgsConstructor(access = AccessLevel.PROTECTED) //For Jackson JSON/YAML deserialization -public abstract class BasePretrainNetworkLayerSpace extends FeedForwardLayerSpace { - @JsonProperty - protected ParameterSpace lossFunction; - - protected BasePretrainNetworkLayerSpace(Builder builder) { - super(builder); - this.lossFunction = builder.lossFunction; - } - - - public static abstract class Builder extends FeedForwardLayerSpace.Builder { - protected ParameterSpace lossFunction; - - public T lossFunction(LossFunction lossFunction) { - return lossFunction(new FixedValue(lossFunction)); - } - - public T lossFunction(ParameterSpace lossFunction) { - this.lossFunction = lossFunction; - return (T) this; - } - - } - -} diff --git a/contrib/attic/arbiter/arbiter-deeplearning4j/src/main/java/org/deeplearning4j/arbiter/layers/BatchNormalizationSpace.java b/contrib/attic/arbiter/arbiter-deeplearning4j/src/main/java/org/deeplearning4j/arbiter/layers/BatchNormalizationSpace.java deleted file mode 100644 index 88bd10a59..000000000 --- a/contrib/attic/arbiter/arbiter-deeplearning4j/src/main/java/org/deeplearning4j/arbiter/layers/BatchNormalizationSpace.java +++ /dev/null @@ -1,216 +0,0 @@ -/* - * ****************************************************************************** - * * Copyright (c) 2021 Deeplearning4j 
Contributors - * * - * * This program and the accompanying materials are made available under the - * * terms of the Apache License, Version 2.0 which is available at - * * https://www.apache.org/licenses/LICENSE-2.0. - * * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * * License for the specific language governing permissions and limitations - * * under the License. - * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ - -package org.deeplearning4j.arbiter.layers; - -import lombok.AccessLevel; -import lombok.Data; -import lombok.EqualsAndHashCode; -import lombok.NoArgsConstructor; -import org.deeplearning4j.arbiter.optimize.api.ParameterSpace; -import org.deeplearning4j.arbiter.optimize.parameter.FixedValue; -import org.deeplearning4j.arbiter.util.LeafUtils; -import org.deeplearning4j.nn.api.layers.LayerConstraint; -import org.deeplearning4j.nn.conf.layers.BatchNormalization; - -import java.util.Arrays; -import java.util.List; - -/** - * LayerSpace for batch normalization layers - * - * @author Alex Black - */ -@Data -@EqualsAndHashCode(callSuper = true) -@NoArgsConstructor(access = AccessLevel.PRIVATE) //For Jackson JSON/YAML deserialization -public class BatchNormalizationSpace extends FeedForwardLayerSpace { - - protected ParameterSpace decay; - protected ParameterSpace eps; - protected ParameterSpace isMinibatch; - protected ParameterSpace lockGammaBeta; - protected ParameterSpace gamma; - protected ParameterSpace beta; - protected ParameterSpace> constrainBeta; - protected ParameterSpace> constrainGamma; - - private BatchNormalizationSpace(Builder builder) { - super(builder); - this.decay = builder.decay; - this.eps = builder.eps; - this.isMinibatch = builder.isMinibatch; - this.lockGammaBeta 
= builder.lockGammaBeta; - this.gamma = builder.gamma; - this.beta = builder.beta; - this.constrainBeta = builder.betaConstraints; - this.constrainGamma = builder.gammaConstraints; - - this.numParameters = LeafUtils.countUniqueParameters(collectLeaves()); - } - - @Override - public BatchNormalization getValue(double[] parameterValues) { - BatchNormalization.Builder b = new BatchNormalization.Builder(); - setLayerOptionsBuilder(b, parameterValues); - return b.build(); - } - - protected void setLayerOptionsBuilder(BatchNormalization.Builder builder, double[] values) { - super.setLayerOptionsBuilder(builder, values); - if (decay != null) - builder.decay(decay.getValue(values)); - if (eps != null) - builder.eps(eps.getValue(values)); - if (isMinibatch != null) - builder.minibatch(isMinibatch.getValue(values)); - if (lockGammaBeta != null) - builder.lockGammaBeta(lockGammaBeta.getValue(values)); - if (gamma != null) - builder.gamma(gamma.getValue(values)); - if (beta != null) - builder.beta(beta.getValue(values)); - if (constrainBeta != null){ - List c = constrainBeta.getValue(values); - if(c != null){ - builder.constrainBeta(c.toArray(new LayerConstraint[0])); - } - } - if (constrainGamma != null){ - List c = constrainGamma.getValue(values); - if(c != null){ - builder.constrainGamma(c.toArray(new LayerConstraint[0])); - } - } - } - - @Override - public String toString() { - return toString(", "); - } - - @Override - public String toString(String delim) { - StringBuilder sb = new StringBuilder(); - sb.append("BatchNormalizationSpace(").append(super.toString(delim)); - if (decay != null) - sb.append("decay: ").append(decay).append(delim); - if (eps != null) - sb.append("eps: ").append(eps).append(delim); - if (isMinibatch != null) - sb.append("isMinibatch: ").append(isMinibatch).append(delim); - if (lockGammaBeta != null) - sb.append("lockGammaBeta: ").append(lockGammaBeta).append(delim); - if (gamma != null) - sb.append("gamma: ").append(gamma).append(delim); - if (beta 
!= null) - sb.append("beta: ").append(beta).append(delim); - sb.append(")"); - return sb.toString(); - } - - public static class Builder extends FeedForwardLayerSpace.Builder { - - protected ParameterSpace decay; - protected ParameterSpace eps; - protected ParameterSpace isMinibatch; - protected ParameterSpace lockGammaBeta; - protected ParameterSpace gamma; - protected ParameterSpace beta; - protected ParameterSpace> betaConstraints; - protected ParameterSpace> gammaConstraints; - - public Builder minibatch(boolean minibatch) { - return minibatch(new FixedValue<>(minibatch)); - } - - public Builder minibatch(ParameterSpace minibatch) { - this.isMinibatch = minibatch; - return this; - } - - public Builder gamma(double gamma) { - return gamma(new FixedValue<>(gamma)); - } - - public Builder gamma(ParameterSpace gamma) { - this.gamma = gamma; - return this; - } - - public Builder beta(double beta) { - return beta(new FixedValue<>(beta)); - } - - public Builder beta(ParameterSpace beta) { - this.beta = beta; - return this; - } - - public Builder eps(double eps) { - return eps(new FixedValue<>(eps)); - } - - public Builder eps(ParameterSpace eps) { - this.eps = eps; - return this; - } - - public Builder decay(double decay) { - return decay(new FixedValue(decay)); - } - - public Builder decay(ParameterSpace decay) { - this.decay = decay; - return this; - } - - public Builder lockGammaBeta(boolean lockGammaBeta) { - return lockGammaBeta(new FixedValue<>(lockGammaBeta)); - } - - public Builder lockGammaBeta(ParameterSpace lockGammaBeta) { - this.lockGammaBeta = lockGammaBeta; - return this; - } - - public Builder constrainBeta(LayerConstraint... constraints) { - return constrainBeta(new FixedValue<>(Arrays.asList(constraints))); - } - - public Builder constrainBeta(ParameterSpace> constraints) { - this.betaConstraints = constraints; - return this; - } - - public Builder constrainGamma(LayerConstraint... 
constraints) { - return constrainGamma(new FixedValue<>(Arrays.asList(constraints))); - } - - public Builder constrainGamma(ParameterSpace> constraints) { - this.gammaConstraints = constraints; - return this; - } - - - @Override - public BatchNormalizationSpace build() { - return new BatchNormalizationSpace(this); - } - } -} diff --git a/contrib/attic/arbiter/arbiter-deeplearning4j/src/main/java/org/deeplearning4j/arbiter/layers/Bidirectional.java b/contrib/attic/arbiter/arbiter-deeplearning4j/src/main/java/org/deeplearning4j/arbiter/layers/Bidirectional.java deleted file mode 100644 index 04ab1e6fb..000000000 --- a/contrib/attic/arbiter/arbiter-deeplearning4j/src/main/java/org/deeplearning4j/arbiter/layers/Bidirectional.java +++ /dev/null @@ -1,69 +0,0 @@ -/* - * ****************************************************************************** - * * Copyright (c) 2021 Deeplearning4j Contributors - * * - * * This program and the accompanying materials are made available under the - * * terms of the Apache License, Version 2.0 which is available at - * * https://www.apache.org/licenses/LICENSE-2.0. - * * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * * License for the specific language governing permissions and limitations - * * under the License. - * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ - -package org.deeplearning4j.arbiter.layers; - -import lombok.Data; -import lombok.NoArgsConstructor; -import org.deeplearning4j.arbiter.optimize.api.ParameterSpace; -import org.deeplearning4j.nn.conf.layers.Layer; - -import java.util.List; - -/** - * Bidirectional layer wrapper. 
Can be used to wrap an existing layer space, in the same way that
- * {@link org.deeplearning4j.nn.conf.layers.recurrent.Bidirectional} wraps a DL4J layer
- *
- * @author Alex Black
- */
-@NoArgsConstructor //JSON
-@Data
-public class Bidirectional extends LayerSpace<Layer> {
-
-    protected LayerSpace<?> layerSpace;
-
-    public Bidirectional(LayerSpace<?> layerSpace){
-        this.layerSpace = layerSpace;
-    }
-
-    @Override
-    public Layer getValue(double[] parameterValues) {
-        Layer underlying = layerSpace.getValue(parameterValues);
-        return new org.deeplearning4j.nn.conf.layers.recurrent.Bidirectional(underlying);
-    }
-
-    @Override
-    public int numParameters() {
-        return layerSpace.numParameters();
-    }
-
-    @Override
-    public List<ParameterSpace> collectLeaves() {
-        return layerSpace.collectLeaves();
-    }
-
-    @Override
-    public boolean isLeaf() {
-        return layerSpace.isLeaf();
-    }
-
-    @Override
-    public void setIndices(int... indices) {
-        layerSpace.setIndices(indices);
-    }
-}
diff --git a/contrib/attic/arbiter/arbiter-deeplearning4j/src/main/java/org/deeplearning4j/arbiter/layers/CenterLossOutputLayerSpace.java b/contrib/attic/arbiter/arbiter-deeplearning4j/src/main/java/org/deeplearning4j/arbiter/layers/CenterLossOutputLayerSpace.java
deleted file mode 100644
index fa1c07058..000000000
--- a/contrib/attic/arbiter/arbiter-deeplearning4j/src/main/java/org/deeplearning4j/arbiter/layers/CenterLossOutputLayerSpace.java
+++ /dev/null
@@ -1,89 +0,0 @@
-/*
- * ******************************************************************************
- * * Copyright (c) 2021 Deeplearning4j Contributors
- * *
- * * This program and the accompanying materials are made available under the
- * * terms of the Apache License, Version 2.0 which is available at
- * * https://www.apache.org/licenses/LICENSE-2.0.
- * * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * * License for the specific language governing permissions and limitations - * * under the License. - * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ - -package org.deeplearning4j.arbiter.layers; - -import lombok.AccessLevel; -import lombok.Data; -import lombok.EqualsAndHashCode; -import lombok.NoArgsConstructor; -import org.deeplearning4j.arbiter.optimize.api.ParameterSpace; -import org.deeplearning4j.arbiter.optimize.parameter.FixedValue; -import org.deeplearning4j.arbiter.util.LeafUtils; -import org.deeplearning4j.nn.conf.layers.CenterLossOutputLayer; - -@Data -@EqualsAndHashCode(callSuper = true) -@NoArgsConstructor(access = AccessLevel.PROTECTED) //For Jackson JSON/YAML deserialization -public class CenterLossOutputLayerSpace extends BaseOutputLayerSpace { - - ParameterSpace alpha; - ParameterSpace lambda; - - protected CenterLossOutputLayerSpace(Builder builder){ - super(builder); - this.alpha = builder.alpha; - this.lambda = builder.lambda; - - this.numParameters = LeafUtils.countUniqueParameters(collectLeaves()); - } - - @Override - public CenterLossOutputLayer getValue(double[] parameterValues) { - CenterLossOutputLayer.Builder b = new CenterLossOutputLayer.Builder(); - setLayerOptionsBuilder(b, parameterValues); - return b.build(); - } - - protected void setLayerBuilderOptions(CenterLossOutputLayer.Builder builder, double[] values){ - super.setLayerOptionsBuilder(builder, values); - if(alpha != null) - builder.alpha(alpha.getValue(values)); - if(lambda != null) - builder.lambda(lambda.getValue(values)); - } - - public static class Builder extends BaseOutputLayerSpace.Builder { - - ParameterSpace alpha; - ParameterSpace lambda; - - 
public Builder alpha(double alpha){ - return alpha(new FixedValue<>(alpha)); - } - - public Builder alpha(ParameterSpace alpha){ - this.alpha = alpha; - return this; - } - - public Builder lambda(double lambda){ - return lambda(new FixedValue<>(lambda)); - } - - public Builder lambda(ParameterSpace lambda){ - this.lambda = lambda; - return this; - } - - @Override - public CenterLossOutputLayerSpace build() { - return new CenterLossOutputLayerSpace(this); - } - } -} diff --git a/contrib/attic/arbiter/arbiter-deeplearning4j/src/main/java/org/deeplearning4j/arbiter/layers/ConvolutionLayerSpace.java b/contrib/attic/arbiter/arbiter-deeplearning4j/src/main/java/org/deeplearning4j/arbiter/layers/ConvolutionLayerSpace.java deleted file mode 100644 index 40b34e746..000000000 --- a/contrib/attic/arbiter/arbiter-deeplearning4j/src/main/java/org/deeplearning4j/arbiter/layers/ConvolutionLayerSpace.java +++ /dev/null @@ -1,174 +0,0 @@ -/* - * ****************************************************************************** - * * Copyright (c) 2021 Deeplearning4j Contributors - * * - * * This program and the accompanying materials are made available under the - * * terms of the Apache License, Version 2.0 which is available at - * * https://www.apache.org/licenses/LICENSE-2.0. - * * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * * License for the specific language governing permissions and limitations - * * under the License. 
- * *
- * * SPDX-License-Identifier: Apache-2.0
- * *****************************************************************************
- */
-
-package org.deeplearning4j.arbiter.layers;
-
-import lombok.AccessLevel;
-import lombok.Data;
-import lombok.EqualsAndHashCode;
-import lombok.NoArgsConstructor;
-import org.deeplearning4j.arbiter.optimize.api.ParameterSpace;
-import org.deeplearning4j.arbiter.optimize.parameter.FixedValue;
-import org.deeplearning4j.arbiter.util.LeafUtils;
-import org.deeplearning4j.nn.conf.ConvolutionMode;
-import org.deeplearning4j.nn.conf.layers.ConvolutionLayer;
-
-/**
- * Layer space for convolutional layers
- *
- * @author Alex Black
- */
-@Data
-@EqualsAndHashCode(callSuper = true)
-@NoArgsConstructor(access = AccessLevel.PRIVATE) //For Jackson JSON/YAML deserialization
-public class ConvolutionLayerSpace extends FeedForwardLayerSpace<ConvolutionLayer> {
-    protected ParameterSpace<int[]> dilation;
-    protected ParameterSpace<int[]> kernelSize;
-    protected ParameterSpace<int[]> stride;
-    protected ParameterSpace<int[]> padding;
-    protected ParameterSpace<ConvolutionMode> convolutionMode;
-    protected ParameterSpace<Boolean> hasBias;
-
-    private ConvolutionLayerSpace(Builder builder) {
-        super(builder);
-        this.dilation = builder.dilation;
-        this.kernelSize = builder.kernelSize;
-        this.stride = builder.stride;
-        this.padding = builder.padding;
-        this.convolutionMode = builder.convolutionMode;
-        this.hasBias = builder.hasBias;
-
-        this.numParameters = LeafUtils.countUniqueParameters(collectLeaves());
-    }
-
-    @Override
-    public ConvolutionLayer getValue(double[] values) {
-        ConvolutionLayer.Builder b = new ConvolutionLayer.Builder();
-        setLayerOptionsBuilder(b, values);
-        return b.build();
-    }
-
-    protected void setLayerOptionsBuilder(ConvolutionLayer.Builder builder, double[] values) {
-        super.setLayerOptionsBuilder(builder, values);
-        if (dilation != null)
-            builder.dilation(dilation.getValue(values));
-        if (kernelSize != null)
-            builder.kernelSize(kernelSize.getValue(values));
-        if (stride != null)
-            builder.stride(stride.getValue(values));
-        if (padding != null)
-            builder.padding(padding.getValue(values));
-        if (convolutionMode != null)
-            builder.convolutionMode(convolutionMode.getValue(values));
-        if (hasBias != null)
-            builder.hasBias(hasBias.getValue(values));
-    }
-
-    @Override
-    public String toString() {
-        return toString(", ");
-    }
-
-    @Override
-    public String toString(String delim) {
-        StringBuilder sb = new StringBuilder("ConvolutionLayerSpace(");
-        if (dilation != null)
-            sb.append("dilation: ").append(dilation).append(delim);
-        if (kernelSize != null)
-            sb.append("kernelSize: ").append(kernelSize).append(delim);
-        if (stride != null)
-            sb.append("stride: ").append(stride).append(delim);
-        if (padding != null)
-            sb.append("padding: ").append(padding).append(delim);
-        if (convolutionMode != null)
-            sb.append("convolutionMode: ").append(convolutionMode).append(delim);
-        if (hasBias != null)
-            sb.append("hasBias: ").append(hasBias).append(delim);
-        sb.append(super.toString(delim)).append(")");
-        return sb.toString();
-    }
-
-
-    public static class Builder extends FeedForwardLayerSpace.Builder<Builder> {
-        protected ParameterSpace<int[]> dilation;
-        protected ParameterSpace<int[]> kernelSize;
-        protected ParameterSpace<int[]> stride;
-        protected ParameterSpace<int[]> padding;
-        protected ParameterSpace<ConvolutionMode> convolutionMode;
-        protected ParameterSpace<Boolean> hasBias;
-
-        public Builder dilation(int... dilation) {
-            return dilation(new FixedValue<>(dilation));
-        }
-
-        public Builder dilation(ParameterSpace<int[]> dilation) {
-            this.dilation = dilation;
-            return this;
-        }
-        public Builder kernelSize(int... kernelSize) {
-            return kernelSize(new FixedValue<>(kernelSize));
-        }
-
-        public Builder kernelSize(ParameterSpace<int[]> kernelSize) {
-            this.kernelSize = kernelSize;
-            return this;
-        }
-
-        public Builder stride(int... stride) {
-            return stride(new FixedValue<>(stride));
-        }
-
-        public Builder stride(ParameterSpace<int[]> stride) {
-            this.stride = stride;
-            return this;
-        }
-
-        public Builder padding(int... padding) {
-            return padding(new FixedValue<>(padding));
-        }
-
-        public Builder padding(ParameterSpace<int[]> padding) {
-            this.padding = padding;
-            return this;
-        }
-
-        public Builder convolutionMode(ConvolutionMode convolutionMode) {
-            return convolutionMode(new FixedValue<>(convolutionMode));
-        }
-
-        public Builder convolutionMode(ParameterSpace<ConvolutionMode> convolutionMode) {
-            this.convolutionMode = convolutionMode;
-            return this;
-        }
-
-        public Builder hasBias(boolean hasBias){
-            return hasBias(new FixedValue<>(hasBias));
-        }
-
-        public Builder hasBias(ParameterSpace<Boolean> hasBias){
-            this.hasBias = hasBias;
-            return this;
-        }
-
-        public ConvolutionLayerSpace build() {
-            return new ConvolutionLayerSpace(this);
-        }
-
-    }
-
-}
diff --git a/contrib/attic/arbiter/arbiter-deeplearning4j/src/main/java/org/deeplearning4j/arbiter/layers/Deconvolution2DLayerSpace.java b/contrib/attic/arbiter/arbiter-deeplearning4j/src/main/java/org/deeplearning4j/arbiter/layers/Deconvolution2DLayerSpace.java
deleted file mode 100644
index 2fd219de3..000000000
--- a/contrib/attic/arbiter/arbiter-deeplearning4j/src/main/java/org/deeplearning4j/arbiter/layers/Deconvolution2DLayerSpace.java
+++ /dev/null
@@ -1,54 +0,0 @@
-/*
- * ******************************************************************************
- * * Copyright (c) 2021 Deeplearning4j Contributors
- * *
- * * This program and the accompanying materials are made available under the
- * * terms of the Apache License, Version 2.0 which is available at
- * * https://www.apache.org/licenses/LICENSE-2.0.
- * *
- * * Unless required by applicable law or agreed to in writing, software
- * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
- * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
- * * License for the specific language governing permissions and limitations
- * * under the License.
- * *
- * * SPDX-License-Identifier: Apache-2.0
- * *****************************************************************************
- */
-
-package org.deeplearning4j.arbiter.layers;
-
-
-import lombok.AccessLevel;
-import lombok.Data;
-import lombok.EqualsAndHashCode;
-import lombok.NoArgsConstructor;
-import org.deeplearning4j.nn.conf.layers.Deconvolution2D;
-
-@Data
-@EqualsAndHashCode(callSuper = true)
-@NoArgsConstructor(access = AccessLevel.PROTECTED) //For Jackson JSON/YAML deserialization
-public class Deconvolution2DLayerSpace extends BaseConvolutionLayerSpace<Deconvolution2D> {
-
-    protected Deconvolution2DLayerSpace(Builder builder) {
-        super(builder);
-    }
-
-    @Override
-    public Deconvolution2D getValue(double[] parameterValues) {
-        Deconvolution2D.Builder b = new Deconvolution2D.Builder();
-        setLayerOptionsBuilder(b, parameterValues);
-        return b.build();
-    }
-
-    protected void setLayerOptionsBuilder(Deconvolution2D.Builder builder, double[] values) {
-        super.setLayerOptionsBuilder(builder, values);
-    }
-
-    public static class Builder extends BaseConvolutionLayerSpace.Builder<Builder> {
-        @Override
-        public Deconvolution2DLayerSpace build() {
-            return new Deconvolution2DLayerSpace(this);
-        }
-    }
-}
diff --git a/contrib/attic/arbiter/arbiter-deeplearning4j/src/main/java/org/deeplearning4j/arbiter/layers/DenseLayerSpace.java b/contrib/attic/arbiter/arbiter-deeplearning4j/src/main/java/org/deeplearning4j/arbiter/layers/DenseLayerSpace.java
deleted file mode 100644
index ef88a7110..000000000
--- a/contrib/attic/arbiter/arbiter-deeplearning4j/src/main/java/org/deeplearning4j/arbiter/layers/DenseLayerSpace.java
+++ /dev/null
@@ -1,92 +0,0 @@
-/*
- * ******************************************************************************
- * * Copyright (c) 2021 Deeplearning4j Contributors
- * *
- * * This program and the accompanying materials are made available under the
- * * terms of the Apache License, Version 2.0 which is available at
- * * https://www.apache.org/licenses/LICENSE-2.0.
- * *
- * * Unless required by applicable law or agreed to in writing, software
- * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
- * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
- * * License for the specific language governing permissions and limitations
- * * under the License.
- * *
- * * SPDX-License-Identifier: Apache-2.0
- * *****************************************************************************
- */
-
-package org.deeplearning4j.arbiter.layers;
-
-import lombok.Data;
-import lombok.EqualsAndHashCode;
-import lombok.NoArgsConstructor;
-import org.deeplearning4j.arbiter.optimize.api.ParameterSpace;
-import org.deeplearning4j.arbiter.optimize.parameter.FixedValue;
-import org.deeplearning4j.arbiter.util.LeafUtils;
-import org.deeplearning4j.nn.conf.layers.DenseLayer;
-
-/**
- * layer hyperparameter configuration space for dense layers (i.e., multi-layer perceptron layers)
- *
- * @author Alex Black
- */
-@Data
-@EqualsAndHashCode(callSuper = true)
-@NoArgsConstructor //For Jackson JSON/YAML deserialization
-public class DenseLayerSpace extends FeedForwardLayerSpace<DenseLayer> {
-
-    protected ParameterSpace<Boolean> hasBias;
-
-    private DenseLayerSpace(Builder builder) {
-        super(builder);
-
-        this.hasBias = builder.hasBias;
-        this.numParameters = LeafUtils.countUniqueParameters(collectLeaves());
-    }
-
-    @Override
-    public DenseLayer getValue(double[] values) {
-        //Using the builder here, to get default options
-        DenseLayer.Builder b = new DenseLayer.Builder();
-        setLayerOptionsBuilder(b, values);
-        return b.build();
-    }
-
-    protected void setLayerOptionsBuilder(DenseLayer.Builder builder, double[] values) {
-        super.setLayerOptionsBuilder(builder, values);
-        if(hasBias != null)
-            builder.hasBias(hasBias.getValue(values));
-    }
-
-    public static class Builder extends FeedForwardLayerSpace.Builder<Builder> {
-
-        protected ParameterSpace<Boolean> hasBias;
-
-        public Builder hasBias(boolean hasBias){
-            return hasBias(new FixedValue<>(hasBias));
-        }
-
-        public Builder hasBias(ParameterSpace<Boolean> hasBias){
-            this.hasBias = hasBias;
-            return this;
-        }
-
-        @Override
-        @SuppressWarnings("unchecked")
-        public DenseLayerSpace build() {
-            return new DenseLayerSpace(this);
-        }
-    }
-
-    @Override
-    public String toString() {
-        return toString(", ");
-    }
-
-    @Override
-    public String toString(String delim) {
-        return "DenseLayerSpace(" + super.toString(delim) + ")";
-    }
-
-}
diff --git a/contrib/attic/arbiter/arbiter-deeplearning4j/src/main/java/org/deeplearning4j/arbiter/layers/DropoutLayerSpace.java b/contrib/attic/arbiter/arbiter-deeplearning4j/src/main/java/org/deeplearning4j/arbiter/layers/DropoutLayerSpace.java
deleted file mode 100644
index 1208700ee..000000000
--- a/contrib/attic/arbiter/arbiter-deeplearning4j/src/main/java/org/deeplearning4j/arbiter/layers/DropoutLayerSpace.java
+++ /dev/null
@@ -1,89 +0,0 @@
-/*
- * ******************************************************************************
- * * Copyright (c) 2021 Deeplearning4j Contributors
- * *
- * * This program and the accompanying materials are made available under the
- * * terms of the Apache License, Version 2.0 which is available at
- * * https://www.apache.org/licenses/LICENSE-2.0.
- * *
- * * Unless required by applicable law or agreed to in writing, software
- * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
- * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
- * * License for the specific language governing permissions and limitations
- * * under the License.
- * *
- * * SPDX-License-Identifier: Apache-2.0
- * *****************************************************************************
- */
-
-package org.deeplearning4j.arbiter.layers;
-
-import lombok.*;
-import org.deeplearning4j.arbiter.dropout.DropoutSpace;
-import org.deeplearning4j.arbiter.optimize.api.ParameterSpace;
-import org.deeplearning4j.arbiter.optimize.parameter.FixedValue;
-import org.deeplearning4j.nn.conf.dropout.IDropout;
-import org.deeplearning4j.nn.conf.layers.DropoutLayer;
-
-import java.util.List;
-
-@Data
-@EqualsAndHashCode(callSuper = true)
-@NoArgsConstructor(access = AccessLevel.PROTECTED) //For Jackson JSON/YAML deserialization
-public class DropoutLayerSpace extends LayerSpace<DropoutLayer> {
-
-    public DropoutLayerSpace(@NonNull ParameterSpace<IDropout> dropout){
-        this.dropOut = dropout;
-    }
-
-    protected DropoutLayerSpace(Builder builder){
-        super(builder);
-    }
-
-    @Override
-    public DropoutLayer getValue(double[] parameterValues) {
-        return new DropoutLayer.Builder().dropOut(dropOut.getValue(parameterValues)).build();
-    }
-
-    @Override
-    public int numParameters() {
-        return dropOut.numParameters();
-    }
-
-    @Override
-    public List<ParameterSpace> collectLeaves() {
-        return dropOut.collectLeaves();
-    }
-
-
-
-    @Override
-    public boolean isLeaf() {
-        return false;
-    }
-
-    @Override
-    public void setIndices(int... indices) {
-        dropOut.setIndices(indices);
-    }
-
-    public static class Builder extends LayerSpace.Builder<Builder> {
-
-        public Builder dropOut(double d){
-            return iDropOut(new DropoutSpace(new FixedValue<>(d)));
-        }
-
-        public Builder dropOut(ParameterSpace<Double> dropOut){
-            return iDropOut(new DropoutSpace(dropOut));
-        }
-
-        public Builder iDropOut(ParameterSpace<IDropout> dropout){
-            this.dropOut = dropout;
-            return this;
-        }
-
-        public DropoutLayerSpace build(){
-            return new DropoutLayerSpace(this);
-        }
-    }
-}
diff --git a/contrib/attic/arbiter/arbiter-deeplearning4j/src/main/java/org/deeplearning4j/arbiter/layers/EmbeddingLayerSpace.java b/contrib/attic/arbiter/arbiter-deeplearning4j/src/main/java/org/deeplearning4j/arbiter/layers/EmbeddingLayerSpace.java
deleted file mode 100644
index e5ca4b391..000000000
--- a/contrib/attic/arbiter/arbiter-deeplearning4j/src/main/java/org/deeplearning4j/arbiter/layers/EmbeddingLayerSpace.java
+++ /dev/null
@@ -1,90 +0,0 @@
-/*
- * ******************************************************************************
- * * Copyright (c) 2021 Deeplearning4j Contributors
- * *
- * * This program and the accompanying materials are made available under the
- * * terms of the Apache License, Version 2.0 which is available at
- * * https://www.apache.org/licenses/LICENSE-2.0.
- * *
- * * Unless required by applicable law or agreed to in writing, software
- * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
- * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
- * * License for the specific language governing permissions and limitations
- * * under the License.
- * *
- * * SPDX-License-Identifier: Apache-2.0
- * *****************************************************************************
- */
-
-package org.deeplearning4j.arbiter.layers;
-
-import lombok.AccessLevel;
-import lombok.Data;
-import lombok.EqualsAndHashCode;
-import lombok.NoArgsConstructor;
-import org.deeplearning4j.arbiter.optimize.api.ParameterSpace;
-import org.deeplearning4j.arbiter.optimize.parameter.FixedValue;
-import org.deeplearning4j.arbiter.util.LeafUtils;
-import org.deeplearning4j.nn.conf.layers.EmbeddingLayer;
-
-/**
- * Layer hyperparameter configuration space for {@link org.deeplearning4j.nn.conf.layers.EmbeddingLayer}
- *
- * @author Alex Black
- */
-@Data
-@EqualsAndHashCode(callSuper = true)
-@NoArgsConstructor(access = AccessLevel.PRIVATE) //For Jackson JSON/YAML deserialization
-public class EmbeddingLayerSpace extends FeedForwardLayerSpace<EmbeddingLayer> {
-    private ParameterSpace<Boolean> hasBias;
-
-    private EmbeddingLayerSpace(Builder builder) {
-        super(builder);
-        this.hasBias = builder.hasBias;
-
-        this.numParameters = LeafUtils.countUniqueParameters(collectLeaves());
-    }
-
-    @Override
-    public EmbeddingLayer getValue(double[] values) {
-        //Using the builder here, to get default options
-        EmbeddingLayer.Builder b = new EmbeddingLayer.Builder();
-        setLayerOptionsBuilder(b, values);
-        return b.build();
-    }
-
-    protected void setLayerOptionsBuilder(EmbeddingLayer.Builder builder, double[] values) {
-        super.setLayerOptionsBuilder(builder, values);
-        if(hasBias != null)
-            builder.hasBias(hasBias.getValue(values));
-    }
-
-    public static class Builder extends FeedForwardLayerSpace.Builder<Builder> {
-        protected ParameterSpace<Boolean> hasBias;
-
-        public Builder hasBias(boolean hasBias){
-            return hasBias(new FixedValue<>(hasBias));
-        }
-
-        public Builder hasBias(ParameterSpace<Boolean> hasBias){
-            this.hasBias = hasBias;
-            return this;
-        }
-
-        @Override
-        @SuppressWarnings("unchecked")
-        public EmbeddingLayerSpace build() {
-            return new EmbeddingLayerSpace(this);
-        }
-    }
-
-    @Override
-    public String toString() {
-        return toString(", ");
-    }
-
-    @Override
-    public String toString(String delim) {
-        return "EmbeddingLayerSpace(" + super.toString(delim) + ")";
-    }
-}
diff --git a/contrib/attic/arbiter/arbiter-deeplearning4j/src/main/java/org/deeplearning4j/arbiter/layers/FeedForwardLayerSpace.java b/contrib/attic/arbiter/arbiter-deeplearning4j/src/main/java/org/deeplearning4j/arbiter/layers/FeedForwardLayerSpace.java
deleted file mode 100644
index 7966cc6b2..000000000
--- a/contrib/attic/arbiter/arbiter-deeplearning4j/src/main/java/org/deeplearning4j/arbiter/layers/FeedForwardLayerSpace.java
+++ /dev/null
@@ -1,156 +0,0 @@
-/*
- * ******************************************************************************
- * * Copyright (c) 2021 Deeplearning4j Contributors
- * *
- * * This program and the accompanying materials are made available under the
- * * terms of the Apache License, Version 2.0 which is available at
- * * https://www.apache.org/licenses/LICENSE-2.0.
- * *
- * * Unless required by applicable law or agreed to in writing, software
- * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
- * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
- * * License for the specific language governing permissions and limitations
- * * under the License.
- * *
- * * SPDX-License-Identifier: Apache-2.0
- * *****************************************************************************
- */
-
-package org.deeplearning4j.arbiter.layers;
-
-import lombok.Data;
-import lombok.EqualsAndHashCode;
-import lombok.NoArgsConstructor;
-import org.deeplearning4j.arbiter.optimize.api.ParameterSpace;
-import org.deeplearning4j.arbiter.optimize.parameter.FixedValue;
-import org.deeplearning4j.nn.api.layers.LayerConstraint;
-import org.deeplearning4j.nn.conf.layers.FeedForwardLayer;
-
-import java.util.Arrays;
-import java.util.List;
-
-@Data
-@EqualsAndHashCode(callSuper = true)
-@NoArgsConstructor //For Jackson JSON/YAML deserialization
-public abstract class FeedForwardLayerSpace<L extends FeedForwardLayer> extends BaseLayerSpace<L> {
-    protected ParameterSpace<Integer> nIn;
-    protected ParameterSpace<Integer> nOut;
-    protected ParameterSpace<List<LayerConstraint>> constrainWeights;
-    protected ParameterSpace<List<LayerConstraint>> constrainBias;
-    protected ParameterSpace<List<LayerConstraint>> constrainAll;
-
-
-    protected FeedForwardLayerSpace(Builder builder) {
-        super(builder);
-        nIn = builder.nIn;
-        nOut = builder.nOut;
-        constrainWeights = builder.constrainWeights;
-        constrainBias = builder.constrainBias;
-        constrainAll = builder.constrainAll;
-    }
-
-    protected void setLayerOptionsBuilder(FeedForwardLayer.Builder builder, double[] values) {
-        super.setLayerOptionsBuilder(builder, values);
-        if (nIn != null)
-            builder.nIn(nIn.getValue(values));
-        if (nOut != null)
-            builder.nOut(nOut.getValue(values));
-        if (constrainWeights != null){
-            List<LayerConstraint> c = constrainWeights.getValue(values);
-            if(c != null){
-                builder.constrainWeights(c.toArray(new LayerConstraint[0]));
-            }
-        }
-        if (constrainBias != null){
-            List<LayerConstraint> c = constrainBias.getValue(values);
-            if(c != null){
-                builder.constrainBias(c.toArray(new LayerConstraint[0]));
-            }
-        }
-        if (constrainAll != null){
-            List<LayerConstraint> c = constrainAll.getValue(values);
-            if(c != null){
-                builder.constrainAllParameters(c.toArray(new LayerConstraint[0]));
-            }
-        }
-
-    }
-
-
-    public abstract static class Builder<T> extends BaseLayerSpace.Builder<T> {
-
-        protected ParameterSpace<Integer> nIn;
-        protected ParameterSpace<Integer> nOut;
-        protected ParameterSpace<List<LayerConstraint>> constrainWeights;
-        protected ParameterSpace<List<LayerConstraint>> constrainBias;
-        protected ParameterSpace<List<LayerConstraint>> constrainAll;
-
-        public T nIn(int nIn) {
-            return nIn(new FixedValue<>(nIn));
-        }
-
-        public T nIn(ParameterSpace<Integer> nIn) {
-            this.nIn = nIn;
-            return (T) this;
-        }
-
-        public T nOut(int nOut) {
-            return nOut(new FixedValue<>(nOut));
-        }
-
-        public T nOut(ParameterSpace<Integer> nOut) {
-            this.nOut = nOut;
-            return (T) this;
-        }
-
-        public T constrainWeights(LayerConstraint... constraints){
-            return constrainWeights(new FixedValue<List<LayerConstraint>>(Arrays.asList(constraints)));
-        }
-
-        public T constrainWeights(ParameterSpace<List<LayerConstraint>> constraints){
-            this.constrainWeights = constraints;
-            return (T) this;
-        }
-
-        public T constrainBias(LayerConstraint... constraints){
-            return constrainBias(new FixedValue<List<LayerConstraint>>(Arrays.asList(constraints)));
-        }
-
-        public T constrainBias(ParameterSpace<List<LayerConstraint>> constraints){
-            this.constrainBias = constraints;
-            return (T) this;
-        }
-
-        public T constrainAllParams(LayerConstraint... constraints){
-            return constrainAllParams(new FixedValue<List<LayerConstraint>>(Arrays.asList(constraints)));
-        }
-
-        public T constrainAllParams(ParameterSpace<List<LayerConstraint>> constraints){
-            this.constrainAll = constraints;
-            return (T) this;
-        }
-    }
-
-    @Override
-    public String toString() {
-        return toString(", ");
-    }
-
-    @Override
-    protected String toString(String delim) {
-        StringBuilder sb = new StringBuilder();
-        if (nIn != null)
-            sb.append("nIn: ").append(nIn).append(delim);
-        if (nOut != null)
-            sb.append("nOut: ").append(nOut).append(delim);
-        if (constrainWeights != null)
-            sb.append("constrainWeights: ").append(constrainWeights).append(delim);
-        if (constrainBias != null)
-            sb.append("constrainBias: ").append(constrainBias).append(delim);
-        if (constrainAll != null)
-            sb.append("constrainAllParams: ").append(constrainAll).append(delim);
-        sb.append(super.toString(delim));
-        return sb.toString();
-    }
-
-}
diff --git a/contrib/attic/arbiter/arbiter-deeplearning4j/src/main/java/org/deeplearning4j/arbiter/layers/GlobalPoolingLayerSpace.java b/contrib/attic/arbiter/arbiter-deeplearning4j/src/main/java/org/deeplearning4j/arbiter/layers/GlobalPoolingLayerSpace.java
deleted file mode 100644
index d557b9596..000000000
--- a/contrib/attic/arbiter/arbiter-deeplearning4j/src/main/java/org/deeplearning4j/arbiter/layers/GlobalPoolingLayerSpace.java
+++ /dev/null
@@ -1,137 +0,0 @@
-/*
- * ******************************************************************************
- * * Copyright (c) 2021 Deeplearning4j Contributors
- * *
- * * This program and the accompanying materials are made available under the
- * * terms of the Apache License, Version 2.0 which is available at
- * * https://www.apache.org/licenses/LICENSE-2.0.
- * *
- * * Unless required by applicable law or agreed to in writing, software
- * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
- * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the
- * * License for the specific language governing permissions and limitations
- * * under the License.
- * *
- * * SPDX-License-Identifier: Apache-2.0
- * *****************************************************************************
- */
-
-package org.deeplearning4j.arbiter.layers;
-
-import lombok.AccessLevel;
-import lombok.Data;
-import lombok.EqualsAndHashCode;
-import lombok.NoArgsConstructor;
-import org.deeplearning4j.arbiter.optimize.api.ParameterSpace;
-import org.deeplearning4j.arbiter.optimize.parameter.FixedValue;
-import org.deeplearning4j.arbiter.util.LeafUtils;
-import org.deeplearning4j.nn.conf.layers.GlobalPoolingLayer;
-import org.deeplearning4j.nn.conf.layers.PoolingType;
-
-/**
- * Layer space for a {@link GlobalPoolingLayer}
- *
- * @author Alex Black
- */
-@Data
-@EqualsAndHashCode(callSuper = true)
-@NoArgsConstructor(access = AccessLevel.PRIVATE) //For Jackson JSON/YAML deserialization
-public class GlobalPoolingLayerSpace extends LayerSpace<GlobalPoolingLayer> {
-
-    protected ParameterSpace<int[]> poolingDimensions;
-    protected ParameterSpace<Boolean> collapseDimensions;
-    protected ParameterSpace<PoolingType> poolingType;
-    protected ParameterSpace<Integer> pNorm;
-
-    private int numParameters;
-
-    private GlobalPoolingLayerSpace(Builder builder) {
-        super(builder);
-        this.poolingDimensions = builder.poolingDimensions;
-        this.collapseDimensions = builder.collapseDimensions;
-        this.poolingType = builder.poolingType;
-        this.pNorm = builder.pNorm;
-
-        this.numParameters = LeafUtils.countUniqueParameters(collectLeaves());
-    }
-
-    @Override
-    public GlobalPoolingLayer getValue(double[] parameterValues) {
-        GlobalPoolingLayer.Builder builder = new GlobalPoolingLayer.Builder();
-        super.setLayerOptionsBuilder(builder, parameterValues);
-        if (poolingDimensions != null)
-            builder.poolingDimensions(poolingDimensions.getValue(parameterValues));
-        if (collapseDimensions != null)
-            builder.collapseDimensions(collapseDimensions.getValue(parameterValues));
-        if (poolingType != null)
-            builder.poolingType(poolingType.getValue(parameterValues));
-        if (pNorm != null)
-            builder.pnorm(pNorm.getValue(parameterValues));
-        return builder.build();
-    }
-
-    @Override
-    public int numParameters() {
-        return numParameters;
-    }
-
-    @Override
-    public boolean isLeaf() {
-        return false;
-    }
-
-    @Override
-    public void setIndices(int... indices) {
-        throw new UnsupportedOperationException("Cannot set indices for non-leaf parameter space");
-    }
-
-
-
-    public static class Builder extends LayerSpace.Builder<Builder> {
-
-        protected ParameterSpace<int[]> poolingDimensions;
-        protected ParameterSpace<Boolean> collapseDimensions;
-        protected ParameterSpace<PoolingType> poolingType;
-        protected ParameterSpace<Integer> pNorm;
-
-        public Builder poolingDimensions(int... poolingDimensions) {
-            return poolingDimensions(new FixedValue<>(poolingDimensions));
-        }
-
-        public Builder poolingDimensions(ParameterSpace<int[]> poolingDimensions) {
-            this.poolingDimensions = poolingDimensions;
-            return this;
-        }
-
-        public Builder collapseDimensions(boolean collapseDimensions) {
-            return collapseDimensions(new FixedValue<>(collapseDimensions));
-        }
-
-        public Builder collapseDimensions(ParameterSpace<Boolean> collapseDimensions) {
-            this.collapseDimensions = collapseDimensions;
-            return this;
-        }
-
-        public Builder poolingType(PoolingType poolingType) {
-            return poolingType(new FixedValue<>(poolingType));
-        }
-
-        public Builder poolingType(ParameterSpace<PoolingType> poolingType) {
-            this.poolingType = poolingType;
-            return this;
-        }
-
-        public Builder pNorm(int pNorm) {
-            return pNorm(new FixedValue<>(pNorm));
-        }
-
-        public Builder pNorm(ParameterSpace<Integer> pNorm) {
-            this.pNorm = pNorm;
-            return this;
-        }
-
-        public GlobalPoolingLayerSpace build() {
-            return new GlobalPoolingLayerSpace(this);
-        }
-    }
-}
diff --git a/contrib/attic/arbiter/arbiter-deeplearning4j/src/main/java/org/deeplearning4j/arbiter/layers/GravesBidirectionalLSTMLayerSpace.java b/contrib/attic/arbiter/arbiter-deeplearning4j/src/main/java/org/deeplearning4j/arbiter/layers/GravesBidirectionalLSTMLayerSpace.java
deleted file mode 100644
index 8f334585e..000000000
--- a/contrib/attic/arbiter/arbiter-deeplearning4j/src/main/java/org/deeplearning4j/arbiter/layers/GravesBidirectionalLSTMLayerSpace.java
+++ /dev/null
@@ -1,99 +0,0 @@
-/*
- * ******************************************************************************
- * * Copyright (c) 2021 Deeplearning4j Contributors
- * *
- * * This program and the accompanying materials are made available under the
- * * terms of the Apache License, Version 2.0 which is available at
- * * https://www.apache.org/licenses/LICENSE-2.0.
- * *
- * * Unless required by applicable law or agreed to in writing, software
- * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
- * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
- * * License for the specific language governing permissions and limitations
- * * under the License.
- * *
- * * SPDX-License-Identifier: Apache-2.0
- * *****************************************************************************
- */
-
-package org.deeplearning4j.arbiter.layers;
-
-import lombok.AccessLevel;
-import lombok.Data;
-import lombok.EqualsAndHashCode;
-import lombok.NoArgsConstructor;
-import org.deeplearning4j.arbiter.optimize.api.ParameterSpace;
-import org.deeplearning4j.arbiter.optimize.parameter.FixedValue;
-import org.deeplearning4j.arbiter.util.LeafUtils;
-import org.deeplearning4j.nn.conf.layers.GravesBidirectionalLSTM;
-
-import java.util.List;
-
-/**
- * Layer space for Bidirectional LSTM layers
- *
- * @author Alex Black
- */
-@Data
-@EqualsAndHashCode(callSuper = true)
-@NoArgsConstructor(access = AccessLevel.PRIVATE) //For Jackson JSON/YAML deserialization
-public class GravesBidirectionalLSTMLayerSpace extends FeedForwardLayerSpace<GravesBidirectionalLSTM> {
-
-    private ParameterSpace<Double> forgetGateBiasInit;
-
-    private GravesBidirectionalLSTMLayerSpace(Builder builder) {
-        super(builder);
-        this.forgetGateBiasInit = builder.forgetGateBiasInit;
-
-        List<ParameterSpace> l = collectLeaves();
-        this.numParameters = LeafUtils.countUniqueParameters(collectLeaves());
-    }
-
-
-    @Override
-    public GravesBidirectionalLSTM getValue(double[] values) {
-        GravesBidirectionalLSTM.Builder b = new GravesBidirectionalLSTM.Builder();
-        setLayerOptionsBuilder(b, values);
-        return b.build();
-    }
-
-    protected void setLayerOptionsBuilder(GravesBidirectionalLSTM.Builder builder, double[] values) {
-        super.setLayerOptionsBuilder(builder, values);
-        if (forgetGateBiasInit != null)
-            builder.forgetGateBiasInit(forgetGateBiasInit.getValue(values));
-    }
-
-    @Override
-    public String toString() {
-        return toString(", ");
-    }
-
-    @Override
-    public String toString(String delim) {
-        StringBuilder sb = new StringBuilder("GravesBidirectionalLSTMLayerSpace(");
-        if (forgetGateBiasInit != null)
-            sb.append("forgetGateBiasInit: ").append(forgetGateBiasInit).append(delim);
-        sb.append(super.toString(delim)).append(")");
-        return sb.toString();
-    }
-
-    public static class Builder extends FeedForwardLayerSpace.Builder<Builder> {
-
-        private ParameterSpace<Double> forgetGateBiasInit;
-
-        public Builder forgetGateBiasInit(double forgetGateBiasInit) {
-            return forgetGateBiasInit(new FixedValue<>(forgetGateBiasInit));
-        }
-
-        public Builder forgetGateBiasInit(ParameterSpace<Double> forgetGateBiasInit) {
-            this.forgetGateBiasInit = forgetGateBiasInit;
-            return this;
-        }
-
-        @Override
-        @SuppressWarnings("unchecked")
-        public GravesBidirectionalLSTMLayerSpace build() {
-            return new GravesBidirectionalLSTMLayerSpace(this);
-        }
-    }
-}
diff --git a/contrib/attic/arbiter/arbiter-deeplearning4j/src/main/java/org/deeplearning4j/arbiter/layers/GravesLSTMLayerSpace.java b/contrib/attic/arbiter/arbiter-deeplearning4j/src/main/java/org/deeplearning4j/arbiter/layers/GravesLSTMLayerSpace.java
deleted file mode 100644
index a68311747..000000000
--- a/contrib/attic/arbiter/arbiter-deeplearning4j/src/main/java/org/deeplearning4j/arbiter/layers/GravesLSTMLayerSpace.java
+++ /dev/null
@@ -1,74 +0,0 @@
-/*
- * ******************************************************************************
- * * Copyright (c) 2021 Deeplearning4j Contributors
- * *
- * * This program and the accompanying materials are made available under the
- * * terms of the Apache License, Version 2.0 which is available at
- * * https://www.apache.org/licenses/LICENSE-2.0.
- * *
- * * Unless required by applicable law or agreed to in writing, software
- * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
- * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
- * * License for the specific language governing permissions and limitations
- * * under the License.
- * *
- * * SPDX-License-Identifier: Apache-2.0
- * *****************************************************************************
- */
-
-package org.deeplearning4j.arbiter.layers;
-
-import lombok.AccessLevel;
-import lombok.Data;
-import lombok.EqualsAndHashCode;
-import lombok.NoArgsConstructor;
-import org.deeplearning4j.arbiter.util.LeafUtils;
-import org.deeplearning4j.nn.conf.layers.GravesLSTM;
-
-/**
- * Layer space for LSTM layers
- *
- * @author Alex Black
- */
-@Data
-@EqualsAndHashCode(callSuper = true)
-@NoArgsConstructor(access = AccessLevel.PRIVATE) //For Jackson JSON/YAML deserialization
-public class GravesLSTMLayerSpace extends AbstractLSTMLayerSpace<GravesLSTM> {

-    private GravesLSTMLayerSpace(Builder builder) {
-        super(builder);
-
-        this.numParameters = LeafUtils.countUniqueParameters(collectLeaves());
-    }
-
-
-    @Override
-    public GravesLSTM getValue(double[] values) {
-        GravesLSTM.Builder b = new GravesLSTM.Builder();
-        setLayerOptionsBuilder(b, values);
-        return b.build();
-    }
-
-    protected void setLayerOptionsBuilder(GravesLSTM.Builder builder, double[] values) {
-        super.setLayerOptionsBuilder(builder, values);
-    }
-
-    @Override
-    public String toString() {
-        return toString(", ");
-    }
-
-    @Override
-    public String toString(String delim) {
-        return "GravesLSTMLayerSpace(" + super.toString(delim) + ")";
-    }
-
-    public static class Builder extends AbstractLSTMLayerSpace.Builder<Builder> {
-
-        @Override
-        @SuppressWarnings("unchecked")
-        public GravesLSTMLayerSpace build() {
-            return new GravesLSTMLayerSpace(this);
-        }
-    }
-}
diff --git a/contrib/attic/arbiter/arbiter-deeplearning4j/src/main/java/org/deeplearning4j/arbiter/layers/LSTMLayerSpace.java b/contrib/attic/arbiter/arbiter-deeplearning4j/src/main/java/org/deeplearning4j/arbiter/layers/LSTMLayerSpace.java
deleted file mode 100644
index e71fe0a9a..000000000
--- a/contrib/attic/arbiter/arbiter-deeplearning4j/src/main/java/org/deeplearning4j/arbiter/layers/LSTMLayerSpace.java
+++ /dev/null
@@ -1,74 +0,0 @@
-/*
- * ******************************************************************************
- * * Copyright (c) 2021 Deeplearning4j Contributors
- * *
- * * This program and the accompanying materials are made available under the
- * * terms of the Apache License, Version 2.0 which is available at
- * * https://www.apache.org/licenses/LICENSE-2.0.
- * *
- * * Unless required by applicable law or agreed to in writing, software
- * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
- * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
- * * License for the specific language governing permissions and limitations
- * * under the License.
- * *
- * * SPDX-License-Identifier: Apache-2.0
- * *****************************************************************************
- */
-
-package org.deeplearning4j.arbiter.layers;
-
-import lombok.AccessLevel;
-import lombok.Data;
-import lombok.EqualsAndHashCode;
-import lombok.NoArgsConstructor;
-import org.deeplearning4j.arbiter.util.LeafUtils;
-import org.deeplearning4j.nn.conf.layers.LSTM;
-
-/**
- * Layer space for LSTM layers
- *
- * @author Alex Black
- */
-@Data
-@EqualsAndHashCode(callSuper = true)
-@NoArgsConstructor(access = AccessLevel.PRIVATE) //For Jackson JSON/YAML deserialization
-public class LSTMLayerSpace extends AbstractLSTMLayerSpace<LSTM> {
-
-    private LSTMLayerSpace(Builder builder) {
-        super(builder);
-
-        this.numParameters = LeafUtils.countUniqueParameters(collectLeaves());
-    }
-
-
-    @Override
-    public LSTM getValue(double[] values) {
-        LSTM.Builder b = new LSTM.Builder();
-        setLayerOptionsBuilder(b, values);
-        return b.build();
-    }
-
-    protected void setLayerOptionsBuilder(LSTM.Builder builder, double[] values) {
-        super.setLayerOptionsBuilder(builder, values);
-    }
-
-    @Override
-    public String toString() {
-        return toString(", ");
-    }
-
-    @Override
-    public String toString(String delim) {
-        return "LSTMLayerSpace(" + super.toString(delim) + ")";
-    }
-
-    public static class Builder extends AbstractLSTMLayerSpace.Builder<Builder> {
-
-        @Override
-        @SuppressWarnings("unchecked")
-        public LSTMLayerSpace build() {
-            return new LSTMLayerSpace(this);
-        }
-    }
-}
diff --git a/contrib/attic/arbiter/arbiter-deeplearning4j/src/main/java/org/deeplearning4j/arbiter/layers/LayerSpace.java b/contrib/attic/arbiter/arbiter-deeplearning4j/src/main/java/org/deeplearning4j/arbiter/layers/LayerSpace.java
deleted file mode 100644
index 2f79bbfc1..000000000
--- a/contrib/attic/arbiter/arbiter-deeplearning4j/src/main/java/org/deeplearning4j/arbiter/layers/LayerSpace.java
+++ /dev/null
@@ -1,140 +0,0 @@
-/*
- * ******************************************************************************
- * * Copyright (c) 2021 Deeplearning4j Contributors
- * *
- * * This program and the accompanying materials are made available under the
- * * terms of the Apache License, Version 2.0 which is available at
- * * https://www.apache.org/licenses/LICENSE-2.0.
- * *
- * * Unless required by applicable law or agreed to in writing, software
- * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
- * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
- * * License for the specific language governing permissions and limitations
- * * under the License.
- * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ - -package org.deeplearning4j.arbiter.layers; - -import lombok.AccessLevel; -import lombok.Data; -import lombok.EqualsAndHashCode; -import lombok.NoArgsConstructor; -import org.deeplearning4j.arbiter.dropout.DropoutSpace; -import org.deeplearning4j.arbiter.optimize.api.AbstractParameterSpace; -import org.deeplearning4j.arbiter.optimize.api.ParameterSpace; -import org.deeplearning4j.arbiter.optimize.parameter.FixedValue; -import org.deeplearning4j.nn.conf.dropout.IDropout; -import org.deeplearning4j.nn.conf.layers.Layer; -import org.nd4j.shade.jackson.annotation.JsonInclude; - -import java.util.ArrayList; -import java.util.LinkedList; -import java.util.List; -import java.util.Map; - -/** - * LayerSpace contains common Layer hyperparameters; should match {@link Layer} in terms of features - * - * @author Alex Black - */ -@JsonInclude(JsonInclude.Include.NON_NULL) -@Data -@EqualsAndHashCode(callSuper = false) -@NoArgsConstructor(access = AccessLevel.PROTECTED) //For Jackson JSON/YAML deserialization -public abstract class LayerSpace extends AbstractParameterSpace { - protected ParameterSpace dropOut; - protected int numParameters; - - protected LayerSpace(Builder builder) { - this.dropOut = builder.dropOut; - } - - @Override - public List collectLeaves() { - //To avoid manually coding EVERY parameter, in every layer: - // Do a depth-first search of nested spaces - LinkedList stack = new LinkedList<>(); - stack.add(this); - - List out = new ArrayList<>(); - while (!stack.isEmpty()) { - ParameterSpace next = stack.removeLast(); - if (next.isLeaf()) { - out.add(next); - } else { - Map m = next.getNestedSpaces(); - ParameterSpace[] arr = m.values().toArray(new ParameterSpace[0]); - for (int i = arr.length - 1; i >= 0; i--) { - stack.add(arr[i]); - } - } - } - - return out; - } - - @Override - public int numParameters() { - return 
numParameters; - } - - @Override - public boolean isLeaf() { - return false; - } - - @Override - public void setIndices(int... indices) { - throw new UnsupportedOperationException("Cannot set indices for non-leaf parameter space"); - } - - - protected void setLayerOptionsBuilder(Layer.Builder builder, double[] values) { - if (dropOut != null) - builder.dropOut(dropOut.getValue(values)); - } - - - @Override - public String toString() { - return toString(", "); - } - - protected String toString(String delim) { - StringBuilder sb = new StringBuilder(); - if (dropOut != null) - sb.append("dropOut: ").append(dropOut).append(delim); - String s = sb.toString(); - - if (s.endsWith(delim)) { - //Remove final delimiter - int last = s.lastIndexOf(delim); - return s.substring(0, last); - } else - return s; - } - - @SuppressWarnings("unchecked") - public abstract static class Builder { - protected ParameterSpace dropOut; - - public T dropOut(double dropOut) { - return dropOut(new FixedValue<>(dropOut)); - } - - public T dropOut(ParameterSpace dropOut) { - return iDropOut(new DropoutSpace(dropOut)); - } - - public T iDropOut(ParameterSpace dropOut){ - this.dropOut = dropOut; - return (T) this; - } - - public abstract E build(); - } - -} diff --git a/contrib/attic/arbiter/arbiter-deeplearning4j/src/main/java/org/deeplearning4j/arbiter/layers/LocalResponseNormalizationLayerSpace.java b/contrib/attic/arbiter/arbiter-deeplearning4j/src/main/java/org/deeplearning4j/arbiter/layers/LocalResponseNormalizationLayerSpace.java deleted file mode 100644 index bb8144257..000000000 --- a/contrib/attic/arbiter/arbiter-deeplearning4j/src/main/java/org/deeplearning4j/arbiter/layers/LocalResponseNormalizationLayerSpace.java +++ /dev/null @@ -1,121 +0,0 @@ -/* - * ****************************************************************************** - * * Copyright (c) 2021 Deeplearning4j Contributors - * * - * * This program and the accompanying materials are made available under the - * * terms of the 
Apache License, Version 2.0 which is available at - * * https://www.apache.org/licenses/LICENSE-2.0. - * * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * * License for the specific language governing permissions and limitations - * * under the License. - * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ - -package org.deeplearning4j.arbiter.layers; - -import lombok.AccessLevel; -import lombok.Data; -import lombok.EqualsAndHashCode; -import lombok.NoArgsConstructor; -import org.deeplearning4j.arbiter.optimize.api.ParameterSpace; -import org.deeplearning4j.arbiter.optimize.parameter.FixedValue; -import org.deeplearning4j.arbiter.util.LeafUtils; -import org.deeplearning4j.nn.conf.layers.LocalResponseNormalization; - -@Data -@EqualsAndHashCode(callSuper = true) -@NoArgsConstructor(access = AccessLevel.PRIVATE) //For Jackson JSON/YAML deserialization -public class LocalResponseNormalizationLayerSpace extends LayerSpace { - - private ParameterSpace n; - private ParameterSpace k; - private ParameterSpace alpha; - private ParameterSpace beta; - - - private LocalResponseNormalizationLayerSpace(Builder builder) { - super(builder); - this.n = builder.n; - this.k = builder.k; - this.alpha = builder.alpha; - this.beta = builder.beta; - - this.numParameters = LeafUtils.countUniqueParameters(collectLeaves()); - } - - @Override - public LocalResponseNormalization getValue(double[] values) { - LocalResponseNormalization.Builder b = new LocalResponseNormalization.Builder(); - setLayerOptionsBuilder(b, values); - return b.build(); - } - - protected void setLayerOptionsBuilder(LocalResponseNormalization.Builder builder, double[] values) { - super.setLayerOptionsBuilder(builder, values); - if (n != null) - 
builder.n(n.getValue(values)); - if (k != null) - builder.k(k.getValue(values)); - if (alpha != null) - builder.alpha(alpha.getValue(values)); - if (beta != null) - builder.beta(beta.getValue(values)); - } - - - public static class Builder extends LayerSpace.Builder { - - private ParameterSpace n; - private ParameterSpace k; - private ParameterSpace alpha; - private ParameterSpace beta; - - - public Builder n(double n) { - return n(new FixedValue<>(n)); - } - - public Builder n(ParameterSpace n) { - this.n = n; - return this; - } - - public Builder k(double k) { - return k(new FixedValue<>(k)); - } - - public Builder k(ParameterSpace k) { - this.k = k; - return this; - } - - public Builder alpha(double alpha) { - return alpha(new FixedValue<>(alpha)); - } - - public Builder alpha(ParameterSpace alpha) { - this.alpha = alpha; - return this; - } - - public Builder beta(double beta) { - return beta(new FixedValue<>(beta)); - } - - public Builder beta(ParameterSpace beta) { - this.beta = beta; - return this; - } - - public LocalResponseNormalizationLayerSpace build() { - return new LocalResponseNormalizationLayerSpace(this); - } - - } - -} diff --git a/contrib/attic/arbiter/arbiter-deeplearning4j/src/main/java/org/deeplearning4j/arbiter/layers/LossLayerSpace.java b/contrib/attic/arbiter/arbiter-deeplearning4j/src/main/java/org/deeplearning4j/arbiter/layers/LossLayerSpace.java deleted file mode 100644 index 5e06d0209..000000000 --- a/contrib/attic/arbiter/arbiter-deeplearning4j/src/main/java/org/deeplearning4j/arbiter/layers/LossLayerSpace.java +++ /dev/null @@ -1,107 +0,0 @@ -/* - * ****************************************************************************** - * * Copyright (c) 2021 Deeplearning4j Contributors - * * - * * This program and the accompanying materials are made available under the - * * terms of the Apache License, Version 2.0 which is available at - * * https://www.apache.org/licenses/LICENSE-2.0. 
- * * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * * License for the specific language governing permissions and limitations - * * under the License. - * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ - -package org.deeplearning4j.arbiter.layers; - -import lombok.AccessLevel; -import lombok.Data; -import lombok.EqualsAndHashCode; -import lombok.NoArgsConstructor; -import org.deeplearning4j.arbiter.adapter.ActivationParameterSpaceAdapter; -import org.deeplearning4j.arbiter.adapter.LossFunctionParameterSpaceAdapter; -import org.deeplearning4j.arbiter.optimize.api.ParameterSpace; -import org.deeplearning4j.arbiter.optimize.parameter.FixedValue; -import org.deeplearning4j.arbiter.util.LeafUtils; -import org.deeplearning4j.nn.conf.layers.LossLayer; -import org.nd4j.linalg.activations.Activation; -import org.nd4j.linalg.activations.IActivation; -import org.nd4j.linalg.lossfunctions.ILossFunction; -import org.nd4j.linalg.lossfunctions.LossFunctions; - -@Data -@EqualsAndHashCode(callSuper = true) -@NoArgsConstructor(access = AccessLevel.PROTECTED) //For Jackson JSON/YAML deserialization -public class LossLayerSpace extends LayerSpace { - - private ParameterSpace activationFunction; - protected ParameterSpace lossFunction; - - public LossLayerSpace(Builder builder){ - super(builder); - this.activationFunction = builder.activationFunction; - this.lossFunction = builder.lossFunction; - - this.numParameters = LeafUtils.countUniqueParameters(collectLeaves()); - } - - @Override - public LossLayer getValue(double[] parameterValues) { - LossLayer.Builder b = new LossLayer.Builder(); - if(activationFunction != null) - b.activation(activationFunction.getValue(parameterValues)); - if(lossFunction != null) - 
b.lossFunction(lossFunction.getValue(parameterValues)); - return b.build(); - } - - - public static class Builder extends LayerSpace.Builder{ - - private ParameterSpace activationFunction; - protected ParameterSpace lossFunction; - - public Builder lossFunction(LossFunctions.LossFunction lossFunction) { - return lossFunction(new FixedValue<>(lossFunction)); - } - - public Builder lossFunction(ParameterSpace lossFunction) { - return iLossFunction(new LossFunctionParameterSpaceAdapter(lossFunction)); - } - - public Builder iLossFunction(ILossFunction lossFunction) { - return iLossFunction(new FixedValue<>(lossFunction)); - } - - public Builder iLossFunction(ParameterSpace lossFunction) { - this.lossFunction = lossFunction; - return this; - } - - public Builder activation(Activation activation) { - return activation(new FixedValue<>(activation)); - } - - public Builder activation(IActivation iActivation) { - return activationFn(new FixedValue<>(iActivation)); - } - - public Builder activation(ParameterSpace activationFunction) { - return activationFn(new ActivationParameterSpaceAdapter(activationFunction)); - } - - public Builder activationFn(ParameterSpace activationFunction) { - this.activationFunction = activationFunction; - return this; - } - - @Override - public LossLayerSpace build() { - return new LossLayerSpace(this); - } - } -} diff --git a/contrib/attic/arbiter/arbiter-deeplearning4j/src/main/java/org/deeplearning4j/arbiter/layers/OCNNLayerSpace.java b/contrib/attic/arbiter/arbiter-deeplearning4j/src/main/java/org/deeplearning4j/arbiter/layers/OCNNLayerSpace.java deleted file mode 100644 index e7c94b084..000000000 --- a/contrib/attic/arbiter/arbiter-deeplearning4j/src/main/java/org/deeplearning4j/arbiter/layers/OCNNLayerSpace.java +++ /dev/null @@ -1,155 +0,0 @@ -/* - * ****************************************************************************** - * * Copyright (c) 2021 Deeplearning4j Contributors - * * - * * This program and the accompanying materials are 
made available under the - * * terms of the Apache License, Version 2.0 which is available at - * * https://www.apache.org/licenses/LICENSE-2.0. - * * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * * License for the specific language governing permissions and limitations - * * under the License. - * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ - -package org.deeplearning4j.arbiter.layers; - -import lombok.AccessLevel; -import lombok.Data; -import lombok.EqualsAndHashCode; -import lombok.NoArgsConstructor; -import org.deeplearning4j.arbiter.optimize.api.ParameterSpace; -import org.deeplearning4j.arbiter.optimize.parameter.FixedValue; -import org.deeplearning4j.arbiter.util.LeafUtils; -import org.deeplearning4j.nn.conf.ocnn.OCNNOutputLayer; - - -@Data -@EqualsAndHashCode(callSuper = true) -@NoArgsConstructor(access = AccessLevel.PRIVATE) //For Jackson JSON/YAML deserialization -public class OCNNLayerSpace extends BaseOutputLayerSpace { - - - protected ParameterSpace nuSpace; - protected ParameterSpace initialRValue; - protected ParameterSpace hiddenLayerSize; - protected ParameterSpace windowSize; - protected ParameterSpace configureR; - - private OCNNLayerSpace(Builder builder) { - super(builder); - - this.numParameters = LeafUtils.countUniqueParameters(collectLeaves()); - this.nuSpace = builder.nuSpace; - this.initialRValue = builder.initialRValue; - this.hiddenLayerSize = builder.hiddenLayerSize; - this.configureR = builder.configureR; - } - - - @Override - public OCNNOutputLayer getValue(double[] parameterValues) { - OCNNOutputLayer.Builder o = new OCNNOutputLayer.Builder(); - setLayerOptionsBuilder(o, parameterValues); - return o.build(); - } - - protected void 
setLayerOptionsBuilder(OCNNOutputLayer.Builder builder, double[] values) { - super.setLayerOptionsBuilder(builder, values); - builder.nu(nuSpace.getValue(values)); - builder.hiddenLayerSize(hiddenLayerSize.getValue(values)); - builder.initialRValue(initialRValue.getValue(values)); - builder.configureR(configureR.getValue(values)); - builder.windowSize(windowSize.getValue(values)); - } - - - public static class Builder extends BaseOutputLayerSpace.Builder { - protected ParameterSpace nuSpace; - protected ParameterSpace initialRValue; - protected ParameterSpace hiddenLayerSize; - protected ParameterSpace windowSize; - protected ParameterSpace configureR; - - public Builder nu(ParameterSpace nuSpace) { - this.nuSpace = nuSpace; - return this; - } - - /** - * Use hiddenLayerSize instead - * @param numHiddenSpace - * @return - */ - @Deprecated - public Builder numHidden(ParameterSpace numHiddenSpace) { - return hiddenLayerSize(numHiddenSpace); - } - - /** - * Use hiddenLayerSize instead - * @param numHidden - * @return - */ - @Deprecated - public Builder numHidden(int numHidden) { - return hiddenLayerSize(numHidden); - } - - public Builder hiddenLayerSize(ParameterSpace hiddenLayerSize) { - this.hiddenLayerSize = hiddenLayerSize; - return this; - } - - public Builder hiddenLayerSize(int hiddenLayerSize) { - this.hiddenLayerSize = new FixedValue<>(hiddenLayerSize); - return this; - } - - public Builder nu(double nu) { - this.nuSpace = new FixedValue<>(nu); - return this; - } - - public Builder initialRValue(double initialRValue) { - this.initialRValue = new FixedValue<>(initialRValue); - return this; - } - - public Builder initialRValue(ParameterSpace initialRValue) { - this.initialRValue = initialRValue; - return this; - } - - public Builder windowSize(int windowSize) { - this.windowSize = new FixedValue<>(windowSize); - return this; - } - - public Builder windowSize(ParameterSpace windowSize) { - this.windowSize = windowSize; - return this; - } - - public Builder 
configureR(boolean configureR) { - this.configureR = new FixedValue<>(configureR); - return this; - } - - public Builder configureR(ParameterSpace configureR) { - this.configureR = configureR; - return this; - } - - - @Override - @SuppressWarnings("unchecked") - public OCNNLayerSpace build() { - return new OCNNLayerSpace(this); - } - } -} diff --git a/contrib/attic/arbiter/arbiter-deeplearning4j/src/main/java/org/deeplearning4j/arbiter/layers/OutputLayerSpace.java b/contrib/attic/arbiter/arbiter-deeplearning4j/src/main/java/org/deeplearning4j/arbiter/layers/OutputLayerSpace.java deleted file mode 100644 index dc245ab14..000000000 --- a/contrib/attic/arbiter/arbiter-deeplearning4j/src/main/java/org/deeplearning4j/arbiter/layers/OutputLayerSpace.java +++ /dev/null @@ -1,73 +0,0 @@ -/* - * ****************************************************************************** - * * Copyright (c) 2021 Deeplearning4j Contributors - * * - * * This program and the accompanying materials are made available under the - * * terms of the Apache License, Version 2.0 which is available at - * * https://www.apache.org/licenses/LICENSE-2.0. - * * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * * License for the specific language governing permissions and limitations - * * under the License. 
- * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ - -package org.deeplearning4j.arbiter.layers; - -import lombok.AccessLevel; -import lombok.Data; -import lombok.EqualsAndHashCode; -import lombok.NoArgsConstructor; -import org.deeplearning4j.arbiter.util.LeafUtils; -import org.deeplearning4j.nn.conf.layers.OutputLayer; - -/** - * Layer hyperparameter configuration space for output layers - * - * @author Alex Black - */ -@Data -@EqualsAndHashCode(callSuper = true) -@NoArgsConstructor(access = AccessLevel.PRIVATE) //For Jackson JSON/YAML deserialization -public class OutputLayerSpace extends BaseOutputLayerSpace { - - private OutputLayerSpace(Builder builder) { - super(builder); - - this.numParameters = LeafUtils.countUniqueParameters(collectLeaves()); - } - - @Override - public OutputLayer getValue(double[] values) { - OutputLayer.Builder o = new OutputLayer.Builder(); - setLayerOptionsBuilder(o, values); - return o.build(); - } - - protected void setLayerOptionsBuilder(OutputLayer.Builder builder, double[] values) { - super.setLayerOptionsBuilder(builder, values); - } - - public static class Builder extends BaseOutputLayerSpace.Builder { - - @Override - @SuppressWarnings("unchecked") - public OutputLayerSpace build() { - return new OutputLayerSpace(this); - } - } - - @Override - public String toString() { - return toString(", "); - } - - @Override - public String toString(String delim) { - return "OutputLayerSpace(" + super.toString(delim) + ")"; - } -} diff --git a/contrib/attic/arbiter/arbiter-deeplearning4j/src/main/java/org/deeplearning4j/arbiter/layers/RnnOutputLayerSpace.java b/contrib/attic/arbiter/arbiter-deeplearning4j/src/main/java/org/deeplearning4j/arbiter/layers/RnnOutputLayerSpace.java deleted file mode 100644 index c2323c79c..000000000 --- a/contrib/attic/arbiter/arbiter-deeplearning4j/src/main/java/org/deeplearning4j/arbiter/layers/RnnOutputLayerSpace.java +++ 
/dev/null @@ -1,73 +0,0 @@ -/* - * ****************************************************************************** - * * Copyright (c) 2021 Deeplearning4j Contributors - * * - * * This program and the accompanying materials are made available under the - * * terms of the Apache License, Version 2.0 which is available at - * * https://www.apache.org/licenses/LICENSE-2.0. - * * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * * License for the specific language governing permissions and limitations - * * under the License. - * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ - -package org.deeplearning4j.arbiter.layers; - -import lombok.AccessLevel; -import lombok.Data; -import lombok.EqualsAndHashCode; -import lombok.NoArgsConstructor; -import org.deeplearning4j.arbiter.util.LeafUtils; -import org.deeplearning4j.nn.conf.layers.RnnOutputLayer; - -/** - * Layer hyperparameter configuration space for RnnOutputLayer - */ -@Data -@EqualsAndHashCode(callSuper = true) -@NoArgsConstructor(access = AccessLevel.PRIVATE) //For Jackson JSON/YAML deserialization -public class RnnOutputLayerSpace extends BaseOutputLayerSpace { - - private RnnOutputLayerSpace(Builder builder) { - super(builder); - - this.numParameters = LeafUtils.countUniqueParameters(collectLeaves()); - } - - @Override - public RnnOutputLayer getValue(double[] values) { - RnnOutputLayer.Builder b = new RnnOutputLayer.Builder(); - setLayerOptionsBuilder(b, values); - return b.build(); - } - - protected void setLayerOptionsBuilder(RnnOutputLayer.Builder builder, double[] values) { - super.setLayerOptionsBuilder(builder, values); - } - - @Override - public String toString() { - return toString(", "); - } - - @Override - public String toString(String 
delim) { - return "RnnOutputLayerSpace(" + super.toString(delim) + ")"; - } - - public static class Builder extends BaseOutputLayerSpace.Builder { - - @Override - @SuppressWarnings("unchecked") - public RnnOutputLayerSpace build() { - return new RnnOutputLayerSpace(this); - } - } - - -} diff --git a/contrib/attic/arbiter/arbiter-deeplearning4j/src/main/java/org/deeplearning4j/arbiter/layers/SeparableConvolution2DLayerSpace.java b/contrib/attic/arbiter/arbiter-deeplearning4j/src/main/java/org/deeplearning4j/arbiter/layers/SeparableConvolution2DLayerSpace.java deleted file mode 100644 index e837b92fe..000000000 --- a/contrib/attic/arbiter/arbiter-deeplearning4j/src/main/java/org/deeplearning4j/arbiter/layers/SeparableConvolution2DLayerSpace.java +++ /dev/null @@ -1,103 +0,0 @@ -/* - * ****************************************************************************** - * * Copyright (c) 2021 Deeplearning4j Contributors - * * - * * This program and the accompanying materials are made available under the - * * terms of the Apache License, Version 2.0 which is available at - * * https://www.apache.org/licenses/LICENSE-2.0. - * * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * * License for the specific language governing permissions and limitations - * * under the License. 
- * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ - -package org.deeplearning4j.arbiter.layers; - -import lombok.AccessLevel; -import lombok.Data; -import lombok.EqualsAndHashCode; -import lombok.NoArgsConstructor; -import org.deeplearning4j.arbiter.optimize.api.ParameterSpace; -import org.deeplearning4j.arbiter.optimize.parameter.FixedValue; -import org.deeplearning4j.nn.api.layers.LayerConstraint; -import org.deeplearning4j.nn.conf.layers.SeparableConvolution2D; - -import java.util.Arrays; -import java.util.List; - -@Data -@EqualsAndHashCode(callSuper = true) -@NoArgsConstructor(access = AccessLevel.PROTECTED) //For Jackson JSON/YAML deserialization -public class SeparableConvolution2DLayerSpace extends BaseConvolutionLayerSpace { - - private ParameterSpace depthMultiplier; - protected ParameterSpace> pointWiseConstraints; - - protected SeparableConvolution2DLayerSpace(Builder builder){ - super(builder); - this.depthMultiplier = builder.depthMultiplier; - this.pointWiseConstraints = builder.pointWiseConstraints; - } - - @Override - public SeparableConvolution2D getValue(double[] parameterValues) { - SeparableConvolution2D.Builder b = new SeparableConvolution2D.Builder(); - setLayerOptionsBuilder(b, parameterValues); - return b.build(); - } - - protected void setLayerOptionsBuilder(SeparableConvolution2D.Builder builder, double[] values){ - super.setLayerOptionsBuilder(builder, values); - if (kernelSize != null) - builder.kernelSize(kernelSize.getValue(values)); - if (stride != null) - builder.stride(stride.getValue(values)); - if (padding != null) - builder.padding(padding.getValue(values)); - if (convolutionMode != null) - builder.convolutionMode(convolutionMode.getValue(values)); - if (hasBias != null) - builder.hasBias(hasBias.getValue(values)); - if (depthMultiplier != null) - builder.depthMultiplier(depthMultiplier.getValue(values)); - if (pointWiseConstraints != null){ - 
List c = pointWiseConstraints.getValue(values); - if(c != null){ - builder.constrainPointWise(c.toArray(new LayerConstraint[0])); - } - } - } - - - public static class Builder extends BaseConvolutionLayerSpace.Builder{ - private ParameterSpace depthMultiplier; - protected ParameterSpace> pointWiseConstraints; - - public Builder constrainPointWise(LayerConstraint... constraints){ - return constrainPointWise(new FixedValue>(Arrays.asList(constraints))); - } - - public Builder constrainPointWise(ParameterSpace> constraints){ - this.pointWiseConstraints = constraints; - return this; - } - - public Builder depthMultiplier(int depthMultiplier){ - return depthMultiplier(new FixedValue<>(depthMultiplier)); - } - - public Builder depthMultiplier(ParameterSpace depthMultiplier){ - this.depthMultiplier = depthMultiplier; - return this; - } - - public SeparableConvolution2DLayerSpace build(){ - return new SeparableConvolution2DLayerSpace(this); - } - } -} diff --git a/contrib/attic/arbiter/arbiter-deeplearning4j/src/main/java/org/deeplearning4j/arbiter/layers/SubsamplingLayerSpace.java b/contrib/attic/arbiter/arbiter-deeplearning4j/src/main/java/org/deeplearning4j/arbiter/layers/SubsamplingLayerSpace.java deleted file mode 100644 index a2ec4b1f6..000000000 --- a/contrib/attic/arbiter/arbiter-deeplearning4j/src/main/java/org/deeplearning4j/arbiter/layers/SubsamplingLayerSpace.java +++ /dev/null @@ -1,210 +0,0 @@ -/* - * ****************************************************************************** - * * Copyright (c) 2021 Deeplearning4j Contributors - * * - * * This program and the accompanying materials are made available under the - * * terms of the Apache License, Version 2.0 which is available at - * * https://www.apache.org/licenses/LICENSE-2.0. - * * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. 
See the - * * License for the specific language governing permissions and limitations - * * under the License. - * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ - -package org.deeplearning4j.arbiter.layers; - -import lombok.AccessLevel; -import lombok.Data; -import lombok.EqualsAndHashCode; -import lombok.NoArgsConstructor; -import org.deeplearning4j.arbiter.optimize.api.ParameterSpace; -import org.deeplearning4j.arbiter.optimize.parameter.FixedValue; -import org.deeplearning4j.arbiter.util.LeafUtils; -import org.deeplearning4j.nn.conf.ConvolutionMode; -import org.deeplearning4j.nn.conf.layers.SubsamplingLayer; - -/** - * Layer hyperparameter configuration space for subsampling layers - * - * @author Alex Black - */ -@Data -@EqualsAndHashCode(callSuper = true) -@NoArgsConstructor(access = AccessLevel.PRIVATE) //For Jackson JSON/YAML deserialization -public class SubsamplingLayerSpace extends LayerSpace { - - protected ParameterSpace convolutionMode; - protected ParameterSpace poolingType; - protected ParameterSpace dilation; - protected ParameterSpace kernelSize; - protected ParameterSpace stride; - protected ParameterSpace padding; - protected ParameterSpace pnorm; - protected ParameterSpace eps; - - private SubsamplingLayerSpace(Builder builder) { - super(builder); - this.convolutionMode = builder.convolutionMode; - this.poolingType = builder.poolingType; - this.kernelSize = builder.kernelSize; - this.dilation = builder.dilation; - this.stride = builder.stride; - this.padding = builder.padding; - this.pnorm = builder.pnorm; - this.eps = builder.eps; - - this.numParameters = LeafUtils.countUniqueParameters(collectLeaves()); - } - - @Override - public SubsamplingLayer getValue(double[] values) { - SubsamplingLayer.Builder b = new SubsamplingLayer.Builder(); - setLayerOptionsBuilder(b, values); - return b.build(); - } - - protected void 
setLayerOptionsBuilder(SubsamplingLayer.Builder builder, double[] values) { - super.setLayerOptionsBuilder(builder, values); - if (convolutionMode != null) - builder.convolutionMode(convolutionMode.getValue(values)); - if (poolingType != null) - builder.poolingType(poolingType.getValue(values)); - if (dilation !=null) - builder.dilation(dilation.getValue(values)); - if (kernelSize != null) - builder.kernelSize(kernelSize.getValue(values)); - if (stride != null) - builder.stride(stride.getValue(values)); - if (padding != null) - builder.padding(padding.getValue(values)); - if(pnorm != null) - builder.pnorm(pnorm.getValue(values)); - if(eps != null) - builder.eps(eps.getValue(values)); - } - - @Override - public String toString() { - return toString(", "); - } - - @Override - public String toString(String delim) { - StringBuilder sb = new StringBuilder("SubsamplingLayerSpace("); - if (convolutionMode != null) - sb.append("convolutionMode: ").append(convolutionMode).append(delim); - if (poolingType != null) - sb.append("poolingType: ").append(poolingType).append(delim); - if (dilation != null) - sb.append("dilation: ").append(dilation).append(delim); - if (kernelSize != null) - sb.append("kernelSize: ").append(kernelSize).append(delim); - if (stride != null) - sb.append("stride: ").append(stride).append(delim); - if (padding != null) - sb.append("padding: ").append(padding).append(delim); - if (pnorm != null) - sb.append("pnorm: ").append(pnorm).append(delim); - if (eps != null) - sb.append("eps: ").append(eps).append(delim); - sb.append(super.toString(delim)).append(")"); - return sb.toString(); - } - - - public static class Builder extends FeedForwardLayerSpace.Builder { - - protected ParameterSpace convolutionMode; - protected ParameterSpace poolingType; - protected ParameterSpace dilation; - protected ParameterSpace kernelSize; - protected ParameterSpace stride; - protected ParameterSpace padding; - protected ParameterSpace pnorm; - protected ParameterSpace eps; - 
- public Builder convolutionMode(ConvolutionMode convolutionMode){ - return convolutionMode(new FixedValue<>(convolutionMode)); - } - - public Builder convolutionMode(ParameterSpace convolutionMode){ - this.convolutionMode = convolutionMode; - return this; - } - - public Builder poolingType(SubsamplingLayer.PoolingType poolingType) { - return poolingType(new FixedValue<>(poolingType)); - } - - public Builder poolingType(ParameterSpace poolingType) { - this.poolingType = poolingType; - return this; - } - - public Builder dilation(int... dilation) { - return dilation(new FixedValue<>(dilation)); - } - - public Builder dilation(ParameterSpace dilation) { - this.dilation = dilation; - return this; - } - - public Builder kernelSize(int... kernelSize) { - return kernelSize(new FixedValue<>(kernelSize)); - } - - public Builder kernelSize(ParameterSpace kernelSize) { - this.kernelSize = kernelSize; - return this; - } - - public Builder stride(int... stride) { - return stride(new FixedValue(stride)); - } - - public Builder stride(ParameterSpace stride) { - this.stride = stride; - return this; - } - - public Builder padding(int... 
padding) { - return padding(new FixedValue(padding)); - } - - public Builder padding(ParameterSpace padding) { - this.padding = padding; - return this; - } - - public Builder pnorm(int pnorm){ - return pnorm(new FixedValue<>(pnorm)); - } - - public Builder pnorm(ParameterSpace pnorm){ - this.pnorm = pnorm; - return this; - } - - public Builder eps(double eps){ - return eps(new FixedValue<>(eps)); - } - - public Builder eps(ParameterSpace eps){ - this.eps = eps; - return this; - } - - @SuppressWarnings("unchecked") - public SubsamplingLayerSpace build() { - return new SubsamplingLayerSpace(this); - } - - } - -} diff --git a/contrib/attic/arbiter/arbiter-deeplearning4j/src/main/java/org/deeplearning4j/arbiter/layers/VariationalAutoencoderLayerSpace.java b/contrib/attic/arbiter/arbiter-deeplearning4j/src/main/java/org/deeplearning4j/arbiter/layers/VariationalAutoencoderLayerSpace.java deleted file mode 100644 index 6095e5d43..000000000 --- a/contrib/attic/arbiter/arbiter-deeplearning4j/src/main/java/org/deeplearning4j/arbiter/layers/VariationalAutoencoderLayerSpace.java +++ /dev/null @@ -1,184 +0,0 @@ -/* - * ****************************************************************************** - * * Copyright (c) 2021 Deeplearning4j Contributors - * * - * * This program and the accompanying materials are made available under the - * * terms of the Apache License, Version 2.0 which is available at - * * https://www.apache.org/licenses/LICENSE-2.0. - * * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * * License for the specific language governing permissions and limitations - * * under the License. 
- * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ - -package org.deeplearning4j.arbiter.layers; - -import lombok.AccessLevel; -import lombok.Data; -import lombok.EqualsAndHashCode; -import lombok.NoArgsConstructor; -import org.deeplearning4j.arbiter.optimize.api.ParameterSpace; -import org.deeplearning4j.arbiter.optimize.parameter.FixedValue; -import org.deeplearning4j.arbiter.util.LeafUtils; -import org.deeplearning4j.nn.conf.layers.variational.LossFunctionWrapper; -import org.deeplearning4j.nn.conf.layers.variational.ReconstructionDistribution; -import org.deeplearning4j.nn.conf.layers.variational.VariationalAutoencoder; -import org.nd4j.linalg.activations.Activation; -import org.nd4j.linalg.activations.IActivation; -import org.nd4j.linalg.lossfunctions.ILossFunction; -import org.nd4j.linalg.lossfunctions.LossFunctions; - -/** - * Layer space for {@link VariationalAutoencoder} - * - * @author Alex Black - */ -@Data -@EqualsAndHashCode(callSuper = true) -@NoArgsConstructor(access = AccessLevel.PRIVATE) //For Jackson JSON/YAML deserialization -public class VariationalAutoencoderLayerSpace extends BasePretrainNetworkLayerSpace<VariationalAutoencoder> { - - private ParameterSpace<int[]> encoderLayerSizes; - private ParameterSpace<int[]> decoderLayerSizes; - private ParameterSpace<ReconstructionDistribution> outputDistribution; - private ParameterSpace<IActivation> pzxActivationFn; - private ParameterSpace<Integer> numSamples; - - protected VariationalAutoencoderLayerSpace(Builder builder) { - super(builder); - - this.encoderLayerSizes = builder.encoderLayerSizes; - this.decoderLayerSizes = builder.decoderLayerSizes; - this.outputDistribution = builder.outputDistribution; - this.pzxActivationFn = builder.pzxActivationFn; - this.numSamples = builder.numSamples; - - this.numParameters = LeafUtils.countUniqueParameters(collectLeaves()); - } - - @Override - public VariationalAutoencoder getValue(double[] parameterValues) { - VariationalAutoencoder.Builder b = new 
VariationalAutoencoder.Builder(); - setLayerOptionsBuilder(b, parameterValues); - return b.build(); - } - - protected void setLayerOptionsBuilder(VariationalAutoencoder.Builder builder, double[] values) { - super.setLayerOptionsBuilder(builder, values); - if (encoderLayerSizes != null) - builder.encoderLayerSizes(encoderLayerSizes.getValue(values)); - if (decoderLayerSizes != null) - builder.decoderLayerSizes(decoderLayerSizes.getValue(values)); - if (outputDistribution != null) - builder.reconstructionDistribution(outputDistribution.getValue(values)); - if (pzxActivationFn != null) - builder.pzxActivationFn(pzxActivationFn.getValue(values)); - if (numSamples != null) - builder.numSamples(numSamples.getValue(values)); - } - - @Override - public String toString() { - return toString(", "); - } - - @Override - public String toString(String delim) { - StringBuilder sb = new StringBuilder("VariationalAutoencoderLayerSpace("); - if (encoderLayerSizes != null) - sb.append("encoderLayerSizes: ").append(encoderLayerSizes).append(delim); - if (decoderLayerSizes != null) - sb.append("decoderLayerSizes: ").append(decoderLayerSizes).append(delim); - if (outputDistribution != null) - sb.append("reconstructionDistribution: ").append(outputDistribution).append(delim); - if (pzxActivationFn != null) - sb.append("pzxActivationFn: ").append(pzxActivationFn).append(delim); - if (numSamples != null) - sb.append("numSamples: ").append(numSamples).append(delim); - sb.append(super.toString(delim)).append(")"); - return sb.toString(); - } - - public static class Builder extends BasePretrainNetworkLayerSpace.Builder { - - private ParameterSpace encoderLayerSizes; - private ParameterSpace decoderLayerSizes; - private ParameterSpace outputDistribution; - private ParameterSpace pzxActivationFn; - private ParameterSpace numSamples; - - - public Builder encoderLayerSizes(int... 
encoderLayerSizes) { - return encoderLayerSizes(new FixedValue<>(encoderLayerSizes)); - } - - public Builder encoderLayerSizes(ParameterSpace encoderLayerSizes) { - this.encoderLayerSizes = encoderLayerSizes; - return this; - } - - public Builder decoderLayerSizes(int... decoderLayerSizes) { - return decoderLayerSizes(new FixedValue<>(decoderLayerSizes)); - } - - public Builder decoderLayerSizes(ParameterSpace decoderLayerSizes) { - this.decoderLayerSizes = decoderLayerSizes; - return this; - } - - public Builder reconstructionDistribution(ReconstructionDistribution distribution) { - return reconstructionDistribution(new FixedValue<>(distribution)); - } - - public Builder reconstructionDistribution(ParameterSpace distribution) { - this.outputDistribution = distribution; - return this; - } - - public Builder lossFunction(IActivation outputActivationFn, LossFunctions.LossFunction lossFunction) { - return lossFunction(outputActivationFn, lossFunction.getILossFunction()); - } - - public Builder lossFunction(Activation outputActivationFn, LossFunctions.LossFunction lossFunction) { - return lossFunction(outputActivationFn.getActivationFunction(), lossFunction.getILossFunction()); - } - - public Builder lossFunction(IActivation outputActivationFn, ILossFunction lossFunction) { - return reconstructionDistribution(new LossFunctionWrapper(outputActivationFn, lossFunction)); - } - - public Builder pzxActivationFn(IActivation activationFunction) { - return pzxActivationFn(new FixedValue<>(activationFunction)); - } - - public Builder pzxActivationFn(ParameterSpace activationFunction) { - this.pzxActivationFn = activationFunction; - return this; - } - - public Builder pzxActivationFunction(Activation activation) { - return pzxActivationFn(activation.getActivationFunction()); - } - - public Builder numSamples(int numSamples) { - return numSamples(new FixedValue<>(numSamples)); - } - - public Builder numSamples(ParameterSpace numSamples) { - this.numSamples = numSamples; - return 
this; - } - - - @Override - public E build() { - return (E) new VariationalAutoencoderLayerSpace(this); - } - - } -} diff --git a/contrib/attic/arbiter/arbiter-deeplearning4j/src/main/java/org/deeplearning4j/arbiter/layers/fixed/FixedLayerSpace.java b/contrib/attic/arbiter/arbiter-deeplearning4j/src/main/java/org/deeplearning4j/arbiter/layers/fixed/FixedLayerSpace.java deleted file mode 100644 index 6fdf4d847..000000000 --- a/contrib/attic/arbiter/arbiter-deeplearning4j/src/main/java/org/deeplearning4j/arbiter/layers/fixed/FixedLayerSpace.java +++ /dev/null @@ -1,73 +0,0 @@ -/* - * ****************************************************************************** - * * Copyright (c) 2021 Deeplearning4j Contributors - * * - * * This program and the accompanying materials are made available under the - * * terms of the Apache License, Version 2.0 which is available at - * * https://www.apache.org/licenses/LICENSE-2.0. - * * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * * License for the specific language governing permissions and limitations - * * under the License. 
- * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ - -package org.deeplearning4j.arbiter.layers.fixed; - -import lombok.AllArgsConstructor; -import lombok.Data; -import lombok.EqualsAndHashCode; -import lombok.NoArgsConstructor; -import org.deeplearning4j.arbiter.layers.LayerSpace; -import org.deeplearning4j.arbiter.optimize.api.ParameterSpace; -import org.deeplearning4j.nn.conf.layers.Layer; - -import java.util.Collections; -import java.util.List; - -/** - * A layer space that wraps a DL4J layer, without any optimizable hyperparameters - * - * @param <T> Type of layer - * - * @author Alex Black - */ -@AllArgsConstructor -@NoArgsConstructor -@Data -@EqualsAndHashCode(callSuper = false) -public class FixedLayerSpace<T extends Layer> extends LayerSpace<T> { - - protected T layer; - - @Override - public T getValue(double[] parameterValues) { - return (T)layer.clone(); - } - - @Override - public int numParameters() { - return 0; - } - - @Override - public boolean isLeaf() { - return true; - } - - @Override - public void setIndices(int[] idxs){ - if(idxs != null && idxs.length > 0){ - throw new IllegalStateException("Cannot set indices: no parameters"); - } - } - - @Override - public List<ParameterSpace> collectLeaves() { - return Collections.<ParameterSpace>singletonList(this); - } -} diff --git a/contrib/attic/arbiter/arbiter-deeplearning4j/src/main/java/org/deeplearning4j/arbiter/listener/DL4JArbiterStatusReportingListener.java b/contrib/attic/arbiter/arbiter-deeplearning4j/src/main/java/org/deeplearning4j/arbiter/listener/DL4JArbiterStatusReportingListener.java deleted file mode 100644 index 013bbec5b..000000000 --- a/contrib/attic/arbiter/arbiter-deeplearning4j/src/main/java/org/deeplearning4j/arbiter/listener/DL4JArbiterStatusReportingListener.java +++ /dev/null @@ -1,50 +0,0 @@ -/* - * ****************************************************************************** - * * Copyright (c) 2021 Deeplearning4j Contributors - * * - * * This 
program and the accompanying materials are made available under the - * * terms of the Apache License, Version 2.0 which is available at - * * https://www.apache.org/licenses/LICENSE-2.0. - * * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * * License for the specific language governing permissions and limitations - * * under the License. - * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ - -package org.deeplearning4j.arbiter.listener; - -import lombok.AllArgsConstructor; -import org.deeplearning4j.arbiter.optimize.runner.CandidateInfo; -import org.deeplearning4j.arbiter.optimize.runner.listener.StatusListener; -import org.deeplearning4j.nn.api.Model; -import org.deeplearning4j.optimize.api.BaseTrainingListener; - -import java.util.List; - -/** - * A simple DL4J Iteration listener that calls Arbiter's status listeners - * - * @author Alex Black - */ -@AllArgsConstructor -public class DL4JArbiterStatusReportingListener extends BaseTrainingListener { - - private List<StatusListener> statusListeners; - private CandidateInfo candidateInfo; - - @Override - public void iterationDone(Model model, int iteration, int epoch) { - if (statusListeners == null) { - return; - } - - for (StatusListener sl : statusListeners) { - sl.onCandidateIteration(candidateInfo, model, iteration); - } - } -} diff --git a/contrib/attic/arbiter/arbiter-deeplearning4j/src/main/java/org/deeplearning4j/arbiter/saver/local/FileModelSaver.java b/contrib/attic/arbiter/arbiter-deeplearning4j/src/main/java/org/deeplearning4j/arbiter/saver/local/FileModelSaver.java deleted file mode 100644 index 82d2270be..000000000 --- a/contrib/attic/arbiter/arbiter-deeplearning4j/src/main/java/org/deeplearning4j/arbiter/saver/local/FileModelSaver.java +++ /dev/null 
@@ -1,149 +0,0 @@ -/* - * ****************************************************************************** - * * Copyright (c) 2021 Deeplearning4j Contributors - * * - * * This program and the accompanying materials are made available under the - * * terms of the Apache License, Version 2.0 which is available at - * * https://www.apache.org/licenses/LICENSE-2.0. - * * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * * License for the specific language governing permissions and limitations - * * under the License. - * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ - -package org.deeplearning4j.arbiter.saver.local; - -import lombok.AllArgsConstructor; -import lombok.EqualsAndHashCode; -import lombok.NoArgsConstructor; -import lombok.NonNull; -import lombok.extern.slf4j.Slf4j; -import org.apache.commons.io.FileUtils; -import org.apache.commons.io.FilenameUtils; -import org.deeplearning4j.arbiter.DL4JConfiguration; -import org.deeplearning4j.arbiter.GraphConfiguration; -import org.deeplearning4j.arbiter.optimize.api.OptimizationResult; -import org.deeplearning4j.arbiter.optimize.api.saving.ResultReference; -import org.deeplearning4j.arbiter.optimize.api.saving.ResultSaver; -import org.deeplearning4j.earlystopping.EarlyStoppingConfiguration; -import org.deeplearning4j.nn.api.Model; -import org.deeplearning4j.nn.graph.ComputationGraph; -import org.deeplearning4j.nn.multilayer.MultiLayerNetwork; -import org.deeplearning4j.util.ModelSerializer; -import org.nd4j.shade.jackson.annotation.JsonCreator; -import org.nd4j.shade.jackson.annotation.JsonProperty; - -import java.io.*; -import java.util.Arrays; -import java.util.Collections; -import java.util.List; - -/** - * Basic MultiLayerNetwork saver. 
Saves config, parameters and score to: baseDir/0/, baseDir/1/, etc - * where index is given by OptimizationResult.getIndex() - * - * @author Alex Black - */ -@Slf4j -@NoArgsConstructor -@AllArgsConstructor -@EqualsAndHashCode -public class FileModelSaver implements ResultSaver { - @JsonProperty - private String path; - private File fPath; - - @JsonCreator - public FileModelSaver(@NonNull String path) { - this(new File(path)); - } - - public FileModelSaver(@NonNull File file){ - this.path = file.getPath(); - this.fPath = file; - - if(!fPath.exists()){ - fPath.mkdirs(); - } else if (!fPath.isDirectory()) { - throw new IllegalArgumentException("Invalid path: exists and is not directory. " + path); - } - - log.info("FileModelSaver saving networks to local directory: {}", path); - } - - @Override - public ResultReference saveModel(OptimizationResult result, Object modelResult) throws IOException { - String dir = new File(path, result.getIndex() + "/").getAbsolutePath(); - - File f = new File(dir); - f.mkdir(); - - File modelFile = new File(FilenameUtils.concat(dir, "model.bin")); - File scoreFile = new File(FilenameUtils.concat(dir, "score.txt")); - File additionalResultsFile = new File(FilenameUtils.concat(dir, "additionalResults.bin")); - File esConfigFile = new File(FilenameUtils.concat(dir, "earlyStoppingConfig.bin")); - File numEpochsFile = new File(FilenameUtils.concat(dir, "numEpochs.txt")); - - FileUtils.writeStringToFile(scoreFile, String.valueOf(result.getScore())); - - Model m = (Model) modelResult; - ModelSerializer.writeModel(m, modelFile, true); - - - Object additionalResults = result.getModelSpecificResults(); - if (additionalResults instanceof Serializable) { - try (ObjectOutputStream oos = new ObjectOutputStream(new FileOutputStream(additionalResultsFile))) { - oos.writeObject(additionalResults); - } - } - - //Write early stopping configuration (if present) to file: - int nEpochs; - EarlyStoppingConfiguration esc; - if (result.getCandidate().getValue() 
instanceof DL4JConfiguration) { - DL4JConfiguration c = ((DL4JConfiguration) result.getCandidate().getValue()); - esc = c.getEarlyStoppingConfiguration(); - nEpochs = c.getNumEpochs(); - } else { - GraphConfiguration c = ((GraphConfiguration) result.getCandidate().getValue()); - esc = c.getEarlyStoppingConfiguration(); - nEpochs = c.getNumEpochs(); - } - - - if (esc != null) { - try (ObjectOutputStream oos = new ObjectOutputStream(new FileOutputStream(esConfigFile))) { - oos.writeObject(esc); - } - } else { - FileUtils.writeStringToFile(numEpochsFile, String.valueOf(nEpochs)); - } - - log.debug("Deeplearning4j model result (id={}, score={}) saved to directory: {}", result.getIndex(), - result.getScore(), dir); - - boolean isGraph = m instanceof ComputationGraph; - return new LocalFileNetResultReference(result.getIndex(), dir, isGraph, modelFile, scoreFile, - additionalResultsFile, esConfigFile, numEpochsFile, result.getCandidate()); - } - - @Override - public List<Class<?>> getSupportedCandidateTypes() { - return Collections.<Class<?>>singletonList(Object.class); - } - - @Override - public List<Class<?>> getSupportedModelTypes() { - return Arrays.<Class<?>>asList(MultiLayerNetwork.class, ComputationGraph.class); - } - - @Override - public String toString() { - return "FileModelSaver(path=" + path + ")"; - } -} diff --git a/contrib/attic/arbiter/arbiter-deeplearning4j/src/main/java/org/deeplearning4j/arbiter/saver/local/LocalFileNetResultReference.java b/contrib/attic/arbiter/arbiter-deeplearning4j/src/main/java/org/deeplearning4j/arbiter/saver/local/LocalFileNetResultReference.java deleted file mode 100644 index ce2ac443a..000000000 --- a/contrib/attic/arbiter/arbiter-deeplearning4j/src/main/java/org/deeplearning4j/arbiter/saver/local/LocalFileNetResultReference.java +++ /dev/null @@ -1,105 +0,0 @@ -/* - * ****************************************************************************** - * * Copyright (c) 2021 Deeplearning4j Contributors - * * - * * This program and the accompanying materials are made 
available under the - * * terms of the Apache License, Version 2.0 which is available at - * * https://www.apache.org/licenses/LICENSE-2.0. - * * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * * License for the specific language governing permissions and limitations - * * under the License. - * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ - -package org.deeplearning4j.arbiter.saver.local; - -import lombok.AllArgsConstructor; -import org.apache.commons.io.FileUtils; -import org.deeplearning4j.arbiter.DL4JConfiguration; -import org.deeplearning4j.arbiter.optimize.api.Candidate; -import org.deeplearning4j.arbiter.optimize.api.OptimizationResult; -import org.deeplearning4j.arbiter.optimize.api.saving.ResultReference; -import org.deeplearning4j.earlystopping.EarlyStoppingConfiguration; -import org.deeplearning4j.nn.api.Model; -import org.deeplearning4j.util.ModelSerializer; - -import java.io.File; -import java.io.FileInputStream; -import java.io.IOException; -import java.io.ObjectInputStream; - -/** - * Result reference for MultiLayerNetworks and ComputationGraphs saved to local file system - */ -@AllArgsConstructor -public class LocalFileNetResultReference implements ResultReference { - - private int index; - private String dir; - private boolean isGraph; - private File modelFile; - private File scoreFile; - private File additionalResultsFile; - private File esConfigFile; - private File numEpochsFile; - private Candidate candidate; - - @Override - public OptimizationResult getResult() throws IOException { - - - String scoreStr = FileUtils.readFileToString(scoreFile); - //TODO: properly parsing. Probably want to store additional info other than just score... 
- double d = Double.parseDouble(scoreStr); - - EarlyStoppingConfiguration earlyStoppingConfiguration = null; - if (esConfigFile != null && esConfigFile.exists()) { - try (ObjectInputStream ois = new ObjectInputStream(new FileInputStream(esConfigFile))) { - earlyStoppingConfiguration = (EarlyStoppingConfiguration) ois.readObject(); - } catch (ClassNotFoundException e) { - throw new RuntimeException("Error loading early stopping configuration", e); - } - } - int nEpochs = 1; - if (numEpochsFile != null && numEpochsFile.exists()) { - String numEpochs = FileUtils.readFileToString(numEpochsFile); - nEpochs = Integer.parseInt(numEpochs); - } - - - - Object additionalResults; - if (additionalResultsFile.exists()) { - try (ObjectInputStream ois = new ObjectInputStream(new FileInputStream(additionalResultsFile))) { - additionalResults = ois.readObject(); - } catch (ClassNotFoundException e) { - throw new RuntimeException("Error loading additional results", e); - } - } else { - additionalResults = null; - } - - return new OptimizationResult(candidate, d, index, additionalResults, null, this); - } - - @Override - public Object getResultModel() throws IOException { - Model m; - if (isGraph) { - m = ModelSerializer.restoreComputationGraph(modelFile, false); - } else { - m = ModelSerializer.restoreMultiLayerNetwork(modelFile, false); - } - return m; - } - - @Override - public String toString() { - return "LocalFileNetResultReference(" + dir + ")"; - } -} diff --git a/contrib/attic/arbiter/arbiter-deeplearning4j/src/main/java/org/deeplearning4j/arbiter/scoring/RegressionValue.java b/contrib/attic/arbiter/arbiter-deeplearning4j/src/main/java/org/deeplearning4j/arbiter/scoring/RegressionValue.java deleted file mode 100644 index 3828e0766..000000000 --- a/contrib/attic/arbiter/arbiter-deeplearning4j/src/main/java/org/deeplearning4j/arbiter/scoring/RegressionValue.java +++ /dev/null @@ -1,34 +0,0 @@ -/* - * 
****************************************************************************** - * * Copyright (c) 2021 Deeplearning4j Contributors - * * - * * This program and the accompanying materials are made available under the - * * terms of the Apache License, Version 2.0 which is available at - * * https://www.apache.org/licenses/LICENSE-2.0. - * * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * * License for the specific language governing permissions and limitations - * * under the License. - * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ - -package org.deeplearning4j.arbiter.scoring; - -/** - * Enumeration used to select the type of regression statistics to optimize on, with the various regression score functions - * - MSE: mean squared error
- * - MAE: mean absolute error
- * - RMSE: root mean squared error
- * - RSE: relative squared error
- * - CorrCoeff: correlation coefficient
- * - * @deprecated Use {@link org.deeplearning4j.eval.RegressionEvaluation.Metric} - */ -@Deprecated -public enum RegressionValue { - MSE, MAE, RMSE, RSE, CorrCoeff -} diff --git a/contrib/attic/arbiter/arbiter-deeplearning4j/src/main/java/org/deeplearning4j/arbiter/scoring/ScoreFunctions.java b/contrib/attic/arbiter/arbiter-deeplearning4j/src/main/java/org/deeplearning4j/arbiter/scoring/ScoreFunctions.java deleted file mode 100644 index 112a62fd1..000000000 --- a/contrib/attic/arbiter/arbiter-deeplearning4j/src/main/java/org/deeplearning4j/arbiter/scoring/ScoreFunctions.java +++ /dev/null @@ -1,68 +0,0 @@ -/* - * ****************************************************************************** - * * Copyright (c) 2021 Deeplearning4j Contributors - * * - * * This program and the accompanying materials are made available under the - * * terms of the Apache License, Version 2.0 which is available at - * * https://www.apache.org/licenses/LICENSE-2.0. - * * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * * License for the specific language governing permissions and limitations - * * under the License. 
- * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ - -package org.deeplearning4j.arbiter.scoring; - - -import org.deeplearning4j.arbiter.optimize.api.score.ScoreFunction; -import org.deeplearning4j.arbiter.scoring.impl.TestSetAccuracyScoreFunction; -import org.deeplearning4j.arbiter.scoring.impl.TestSetF1ScoreFunction; -import org.deeplearning4j.arbiter.scoring.impl.TestSetLossScoreFunction; -import org.deeplearning4j.arbiter.scoring.impl.TestSetRegressionScoreFunction; - -/** - * ScoreFunctions provides static methods for getting score functions for DL4J MultiLayerNetwork and ComputationGraph - * - * @author Alex Black - */ -public class ScoreFunctions { - - private ScoreFunctions() {} - - /** - * Calculate the loss (score/loss function value) on a test set, for a MultiLayerNetwork - * - * @param average Average (divide by number of examples) - */ - public static ScoreFunction testSetLoss(boolean average) { - return new TestSetLossScoreFunction(average); - } - - /** - * Calculate the accuracy on a test set, for a MultiLayerNetwork - */ - public static ScoreFunction testSetAccuracy() { - return new TestSetAccuracyScoreFunction(); - } - - - /** - * Calculate the f1 score on a test set - */ - public static ScoreFunction testSetF1() { - return new TestSetF1ScoreFunction(); - } - - /** - * Calculate a regression value (MSE, MAE etc) on a test set - */ - public static ScoreFunction testSetRegression(RegressionValue regressionValue) { - return new TestSetRegressionScoreFunction(regressionValue); - } - -} diff --git a/contrib/attic/arbiter/arbiter-deeplearning4j/src/main/java/org/deeplearning4j/arbiter/scoring/impl/BaseNetScoreFunction.java b/contrib/attic/arbiter/arbiter-deeplearning4j/src/main/java/org/deeplearning4j/arbiter/scoring/impl/BaseNetScoreFunction.java deleted file mode 100644 index b3c2a3de4..000000000 --- 
a/contrib/attic/arbiter/arbiter-deeplearning4j/src/main/java/org/deeplearning4j/arbiter/scoring/impl/BaseNetScoreFunction.java +++ /dev/null @@ -1,104 +0,0 @@ -/* - * ****************************************************************************** - * * Copyright (c) 2021 Deeplearning4j Contributors - * * - * * This program and the accompanying materials are made available under the - * * terms of the Apache License, Version 2.0 which is available at - * * https://www.apache.org/licenses/LICENSE-2.0. - * * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * * License for the specific language governing permissions and limitations - * * under the License. - * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ - -package org.deeplearning4j.arbiter.scoring.impl; - -import lombok.EqualsAndHashCode; -import org.deeplearning4j.arbiter.optimize.api.data.DataProvider; -import org.deeplearning4j.arbiter.optimize.api.data.DataSource; -import org.deeplearning4j.arbiter.optimize.api.score.ScoreFunction; -import org.deeplearning4j.nn.graph.ComputationGraph; -import org.deeplearning4j.nn.multilayer.MultiLayerNetwork; -import org.nd4j.linalg.dataset.api.iterator.DataSetIterator; -import org.nd4j.linalg.dataset.api.iterator.DataSetIteratorFactory; -import org.nd4j.linalg.dataset.api.iterator.MultiDataSetIterator; - -import java.util.Arrays; -import java.util.List; -import java.util.Map; -import java.util.Properties; - -/** - * Created by Alex on 23/07/2017. 
- */ -@EqualsAndHashCode -public abstract class BaseNetScoreFunction implements ScoreFunction { - - - @Override - public double score(Object model, DataProvider dataProvider, Map<String, Object> dataParameters) { - Object testData = dataProvider.testData(dataParameters); - return score(model, testData); - } - - @Override - public double score(Object model, Class<? extends DataSource> dataSource, Properties dataSourceProperties) { - DataSource ds; - try{ - ds = dataSource.newInstance(); - if (dataSourceProperties != null) { - ds.configure(dataSourceProperties); - } - } catch (Exception e){ - throw new RuntimeException("Error creating DataSource instance - missing no-arg constructor?", e); - } - return score(model, ds.testData()); - } - - protected double score(Object model, Object testData){ - if (model instanceof MultiLayerNetwork) { - if (testData instanceof DataSetIterator) { - return score((MultiLayerNetwork) model, (DataSetIterator) testData); - } else if(testData instanceof MultiDataSetIterator){ - return score((MultiLayerNetwork) model, (MultiDataSetIterator) testData); - } else if(testData instanceof DataSetIteratorFactory){ - return score((MultiLayerNetwork)model, ((DataSetIteratorFactory)testData).create()); - } else { - throw new RuntimeException("Unknown type of data: " + testData.getClass()); - } - } else { - if (testData instanceof DataSetIterator) { - return score((ComputationGraph) model, (DataSetIterator) testData); - } else if(testData instanceof DataSetIteratorFactory){ - return score((ComputationGraph) model, ((DataSetIteratorFactory)testData).create()); - } else if(testData instanceof MultiDataSetIterator) { - return score((ComputationGraph) model, (MultiDataSetIterator) testData); - } else { - throw new RuntimeException("Unknown type of data: " + testData.getClass()); - } - } - } - - @Override - public List<Class<?>> getSupportedModelTypes() { - return Arrays.<Class<?>>asList(MultiLayerNetwork.class, ComputationGraph.class); - } - - @Override - public List<Class<?>> getSupportedDataTypes() { - return 
Arrays.<Class<?>>asList(DataSetIterator.class, MultiDataSetIterator.class); - } - - public abstract double score(MultiLayerNetwork net, DataSetIterator iterator); - - public abstract double score(MultiLayerNetwork net, MultiDataSetIterator iterator); - - public abstract double score(ComputationGraph graph, DataSetIterator iterator); - - public abstract double score(ComputationGraph graph, MultiDataSetIterator iterator); -} diff --git a/contrib/attic/arbiter/arbiter-deeplearning4j/src/main/java/org/deeplearning4j/arbiter/scoring/impl/EvaluationScoreFunction.java b/contrib/attic/arbiter/arbiter-deeplearning4j/src/main/java/org/deeplearning4j/arbiter/scoring/impl/EvaluationScoreFunction.java deleted file mode 100644 index 15fd9cf0b..000000000 --- a/contrib/attic/arbiter/arbiter-deeplearning4j/src/main/java/org/deeplearning4j/arbiter/scoring/impl/EvaluationScoreFunction.java +++ /dev/null @@ -1,88 +0,0 @@ -/* - * ****************************************************************************** - * * Copyright (c) 2021 Deeplearning4j Contributors - * * - * * This program and the accompanying materials are made available under the - * * terms of the Apache License, Version 2.0 which is available at - * * https://www.apache.org/licenses/LICENSE-2.0. - * * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * * License for the specific language governing permissions and limitations - * * under the License. 
- * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ - -package org.deeplearning4j.arbiter.scoring.impl; - -import lombok.*; -import org.deeplearning4j.datasets.iterator.MultiDataSetWrapperIterator; -import org.deeplearning4j.nn.graph.ComputationGraph; -import org.deeplearning4j.nn.multilayer.MultiLayerNetwork; -import org.nd4j.evaluation.classification.Evaluation; -import org.nd4j.linalg.dataset.api.iterator.DataSetIterator; -import org.nd4j.linalg.dataset.api.iterator.MultiDataSetIterator; - -/** - * Score function that calculates an evaluation {@link Evaluation.Metric} on the test set for a - * {@link MultiLayerNetwork} or {@link ComputationGraph} - * - * @author Alex Black - */ -@Data -@EqualsAndHashCode(callSuper = true) -@NoArgsConstructor(access = AccessLevel.PROTECTED) //JSON -public class EvaluationScoreFunction extends BaseNetScoreFunction { - - protected Evaluation.Metric metric; - - /** - * @param metric Evaluation metric to calculate - */ - public EvaluationScoreFunction(@NonNull org.deeplearning4j.eval.Evaluation.Metric metric) { - this(metric.toNd4j()); - } - - /** - * @param metric Evaluation metric to calculate - */ - public EvaluationScoreFunction(@NonNull Evaluation.Metric metric) { - this.metric = metric; - } - - @Override - public String toString() { - return "EvaluationScoreFunction(metric=" + metric + ")"; - } - - @Override - public double score(MultiLayerNetwork net, DataSetIterator iterator) { - Evaluation e = net.evaluate(iterator); - return e.scoreForMetric(metric); - } - - @Override - public double score(MultiLayerNetwork net, MultiDataSetIterator iterator) { - return score(net, new MultiDataSetWrapperIterator(iterator)); - } - - @Override - public double score(ComputationGraph graph, DataSetIterator iterator) { - Evaluation e = graph.evaluate(iterator); - return e.scoreForMetric(metric); - } - - @Override - public double score(ComputationGraph graph, 
MultiDataSetIterator iterator) { - Evaluation e = graph.evaluate(iterator); - return e.scoreForMetric(metric); - } - - @Override - public boolean minimize() { - return false; //Want to maximize all evaluation metrics: Accuracy, F1, precision, recall, g-measure, mcc - } -} diff --git a/contrib/attic/arbiter/arbiter-deeplearning4j/src/main/java/org/deeplearning4j/arbiter/scoring/impl/ROCScoreFunction.java b/contrib/attic/arbiter/arbiter-deeplearning4j/src/main/java/org/deeplearning4j/arbiter/scoring/impl/ROCScoreFunction.java deleted file mode 100644 index fd4609ea2..000000000 --- a/contrib/attic/arbiter/arbiter-deeplearning4j/src/main/java/org/deeplearning4j/arbiter/scoring/impl/ROCScoreFunction.java +++ /dev/null @@ -1,124 +0,0 @@ -/* - * ****************************************************************************** - * * Copyright (c) 2021 Deeplearning4j Contributors - * * - * * This program and the accompanying materials are made available under the - * * terms of the Apache License, Version 2.0 which is available at - * * https://www.apache.org/licenses/LICENSE-2.0. - * * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * * License for the specific language governing permissions and limitations - * * under the License. 
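As context for the `EvaluationScoreFunction` removed above, the contract all of these score functions share can be sketched with plain JDK types. Everything below is an illustrative standalone sketch, not DL4J API: `score()` reduces a model plus test data to a single double, and `minimize()` tells the optimizer which direction is better (`EvaluationScoreFunction` returns false because accuracy, F1, etc. are maximized).

```java
// Illustrative sketch of the score-function contract (plain JDK, not DL4J API):
// one double per candidate, plus a direction flag for the optimizer.
interface SimpleScoreFunction<M> {
    double score(M model);  // e.g. accuracy on a held-out test set
    boolean minimize();     // false => higher scores win (accuracy, F1, AUC, ...)
}

public class ScoreContractSketch {
    // Pick the better of two candidates according to the score direction
    static <M> M better(M a, double scoreA, M b, double scoreB, SimpleScoreFunction<M> f) {
        boolean aWins = f.minimize() ? scoreA <= scoreB : scoreA >= scoreB;
        return aWins ? a : b;
    }

    public static void main(String[] args) {
        SimpleScoreFunction<String> accuracy = new SimpleScoreFunction<String>() {
            public double score(String model) { return 0.0; } // placeholder
            public boolean minimize() { return false; }       // maximize accuracy
        };
        // Candidate A at 0.91 accuracy beats candidate B at 0.88
        System.out.println(better("netA", 0.91, "netB", 0.88, accuracy)); // netA
    }
}
```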
- * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ - -package org.deeplearning4j.arbiter.scoring.impl; - -import lombok.*; -import org.deeplearning4j.datasets.iterator.MultiDataSetWrapperIterator; -import org.deeplearning4j.nn.graph.ComputationGraph; -import org.deeplearning4j.nn.multilayer.MultiLayerNetwork; -import org.nd4j.evaluation.classification.ROC; -import org.nd4j.evaluation.classification.ROCBinary; -import org.nd4j.evaluation.classification.ROCMultiClass; -import org.nd4j.linalg.dataset.adapter.MultiDataSetIteratorAdapter; -import org.nd4j.linalg.dataset.api.iterator.DataSetIterator; -import org.nd4j.linalg.dataset.api.iterator.MultiDataSetIterator; - -/** - * Score function that calculates AUC (area under ROC curve) or AUPRC (area under precision/recall curve) on a test set - * for a {@link MultiLayerNetwork} or {@link ComputationGraph} - * - * @author Alex Black - */ -@Data -@EqualsAndHashCode(callSuper = true) -@NoArgsConstructor(access = AccessLevel.PROTECTED) //JSON -public class ROCScoreFunction extends BaseNetScoreFunction { - - /** - * Type of ROC evaluation to perform:
- * ROC: use {@link ROC} to perform evaluation (single output binary classification)<br>
- * BINARY: use {@link ROCBinary} to perform evaluation (multi-output/multi-task binary classification)<br>
- * MULTICLASS: use {@link ROCMultiClass} to perform evaluation (1 vs. all multi-class classification) - * - */ - public enum ROCType {ROC, BINARY, MULTICLASS} - - /** - * Metric to calculate.
- * AUC: Area under ROC curve<br>
- * AUPRC: Area under precision/recall curve - */ - public enum Metric {AUC, AUPRC} - - protected ROCType type; - protected Metric metric; - - /** - * @param type ROC type to use for evaluation - * @param metric Evaluation metric to calculate - */ - public ROCScoreFunction(@NonNull ROCType type, @NonNull Metric metric) { - this.type = type; - this.metric = metric; - } - - @Override - public String toString() { - return "ROCScoreFunction(type=" + type + ",metric=" + metric + ")"; - } - - @Override - public double score(MultiLayerNetwork net, DataSetIterator iterator) { - switch (type){ - case ROC: - ROC r = net.evaluateROC(iterator); - return metric == Metric.AUC ? r.calculateAUC() : r.calculateAUCPR(); - case BINARY: - ROCBinary r2 = net.doEvaluation(iterator, new ROCBinary())[0]; - return metric == Metric.AUC ? r2.calculateAverageAuc() : r2.calculateAverageAUCPR(); - case MULTICLASS: - ROCMultiClass r3 = net.evaluateROCMultiClass(iterator); - return metric == Metric.AUC ? r3.calculateAverageAUC() : r3.calculateAverageAUCPR(); - default: - throw new RuntimeException("Unknown type: " + type); - } - } - - @Override - public double score(MultiLayerNetwork net, MultiDataSetIterator iterator) { - return score(net, new MultiDataSetWrapperIterator(iterator)); - } - - @Override - public double score(ComputationGraph graph, DataSetIterator iterator) { - return score(graph, new MultiDataSetIteratorAdapter(iterator)); - } - - @Override - public double score(ComputationGraph net, MultiDataSetIterator iterator) { - switch (type){ - case ROC: - ROC r = net.evaluateROC(iterator); - return metric == Metric.AUC ? r.calculateAUC() : r.calculateAUCPR(); - case BINARY: - ROCBinary r2 = net.doEvaluation(iterator, new ROCBinary())[0]; - return metric == Metric.AUC ? r2.calculateAverageAuc() : r2.calculateAverageAUCPR(); - case MULTICLASS: - ROCMultiClass r3 = net.evaluateROCMultiClass(iterator, 0); - return metric == Metric.AUC ? 
r3.calculateAverageAUC() : r3.calculateAverageAUCPR(); - default: - throw new RuntimeException("Unknown type: " + type); - } - } - - @Override - public boolean minimize() { - return false; //Want to maximize both ROC metrics: AUC and AUPRC - } -} diff --git a/contrib/attic/arbiter/arbiter-deeplearning4j/src/main/java/org/deeplearning4j/arbiter/scoring/impl/RegressionScoreFunction.java b/contrib/attic/arbiter/arbiter-deeplearning4j/src/main/java/org/deeplearning4j/arbiter/scoring/impl/RegressionScoreFunction.java deleted file mode 100644 index 81618662f..000000000 --- a/contrib/attic/arbiter/arbiter-deeplearning4j/src/main/java/org/deeplearning4j/arbiter/scoring/impl/RegressionScoreFunction.java +++ /dev/null @@ -1,94 +0,0 @@ -/* - * ****************************************************************************** - * * Copyright (c) 2021 Deeplearning4j Contributors - * * - * * This program and the accompanying materials are made available under the - * * terms of the Apache License, Version 2.0 which is available at - * * https://www.apache.org/licenses/LICENSE-2.0. - * * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * * License for the specific language governing permissions and limitations - * * under the License.
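The `Metric.AUC` branch of `ROCScoreFunction` above ultimately reports an area under the ROC curve. As a standalone illustration of that quantity (a toy helper, not the ND4J implementation, which builds the curve points from thresholded predictions), the area can be computed with the trapezoid rule over sorted (FPR, TPR) points:

```java
public class AucSketch {
    // Trapezoid-rule area under a ROC curve given FPR/TPR points sorted by FPR.
    // Toy illustration only; ND4J's ROC classes derive these points internally.
    static double auc(double[] fpr, double[] tpr) {
        double area = 0.0;
        for (int i = 1; i < fpr.length; i++) {
            area += (fpr[i] - fpr[i - 1]) * (tpr[i] + tpr[i - 1]) / 2.0;
        }
        return area;
    }

    public static void main(String[] args) {
        // Perfect classifier: TPR reaches 1.0 before any false positives -> AUC 1.0
        System.out.println(auc(new double[]{0, 0, 1}, new double[]{0, 1, 1}));
        // Random guessing: the diagonal -> AUC 0.5
        System.out.println(auc(new double[]{0, 1}, new double[]{0, 1}));
    }
}
```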
- * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ - -package org.deeplearning4j.arbiter.scoring.impl; - -import lombok.*; -import org.deeplearning4j.datasets.iterator.MultiDataSetWrapperIterator; -import org.deeplearning4j.nn.graph.ComputationGraph; -import org.deeplearning4j.nn.multilayer.MultiLayerNetwork; -import org.nd4j.evaluation.regression.RegressionEvaluation; -import org.nd4j.evaluation.regression.RegressionEvaluation.Metric; -import org.nd4j.linalg.dataset.api.iterator.DataSetIterator; -import org.nd4j.linalg.dataset.api.iterator.MultiDataSetIterator; - -/** - * Score function for regression (including multi-label regression) for a MultiLayerNetwork or ComputationGraph - * on a test set. Supports all regression metrics: {@link Metric} - * - * @author Alex Black - */ -@Data -@EqualsAndHashCode(callSuper = true) -@NoArgsConstructor(access = AccessLevel.PROTECTED) //For JSON -public class RegressionScoreFunction extends BaseNetScoreFunction { - - protected Metric metric; - - public RegressionScoreFunction(@NonNull org.deeplearning4j.eval.RegressionEvaluation.Metric metric) { - this(metric.toNd4j()); - } - - public RegressionScoreFunction(@NonNull Metric metric) { - this.metric = metric; - } - - @Override - public boolean minimize() { - switch (metric) { - case MSE: - case MAE: - case RMSE: - case RSE: - return true; - case PC: - case R2: - return false; - default: - throw new IllegalStateException("Unknown metric: " + metric); - } - } - - @Override - public String toString() { - return "RegressionScoreFunction(metric=" + metric + ")"; - } - - @Override - public double score(MultiLayerNetwork net, DataSetIterator iterator) { - RegressionEvaluation e = net.evaluateRegression(iterator); - return e.scoreForMetric(metric); - } - - @Override - public double score(MultiLayerNetwork net, MultiDataSetIterator iterator) { - return score(net, new MultiDataSetWrapperIterator(iterator)); 
- } - - @Override - public double score(ComputationGraph graph, DataSetIterator iterator) { - RegressionEvaluation e = graph.evaluateRegression(iterator); - return e.scoreForMetric(metric); - } - - @Override - public double score(ComputationGraph graph, MultiDataSetIterator iterator) { - RegressionEvaluation e = graph.evaluateRegression(iterator); - return e.scoreForMetric(metric); - } -} diff --git a/contrib/attic/arbiter/arbiter-deeplearning4j/src/main/java/org/deeplearning4j/arbiter/scoring/impl/TestSetAccuracyScoreFunction.java b/contrib/attic/arbiter/arbiter-deeplearning4j/src/main/java/org/deeplearning4j/arbiter/scoring/impl/TestSetAccuracyScoreFunction.java deleted file mode 100644 index b825f5432..000000000 --- a/contrib/attic/arbiter/arbiter-deeplearning4j/src/main/java/org/deeplearning4j/arbiter/scoring/impl/TestSetAccuracyScoreFunction.java +++ /dev/null @@ -1,74 +0,0 @@ -/* - * ****************************************************************************** - * * Copyright (c) 2021 Deeplearning4j Contributors - * * - * * This program and the accompanying materials are made available under the - * * terms of the Apache License, Version 2.0 which is available at - * * https://www.apache.org/licenses/LICENSE-2.0. - * * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * * License for the specific language governing permissions and limitations - * * under the License. 
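`RegressionScoreFunction.minimize()` above encodes a small but easy-to-get-wrong rule: error metrics shrink toward a good fit, while correlation-style metrics grow. A self-contained restatement of that direction logic (enum and method names here are illustrative, not DL4J API):

```java
public class RegressionDirectionSketch {
    // Mirrors the direction logic in RegressionScoreFunction.minimize():
    // error metrics are minimized, fit-quality metrics are maximized.
    enum Metric { MSE, MAE, RMSE, RSE, PC, R2 }

    static boolean minimize(Metric m) {
        switch (m) {
            case MSE:
            case MAE:
            case RMSE:
            case RSE:
                return true;   // lower error is better
            case PC:
            case R2:
                return false;  // higher correlation / R^2 is better
            default:
                throw new IllegalStateException("Unknown metric: " + m);
        }
    }

    public static void main(String[] args) {
        System.out.println(minimize(Metric.RMSE)); // true
        System.out.println(minimize(Metric.R2));   // false
    }
}
```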
- * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ - -package org.deeplearning4j.arbiter.scoring.impl; - -import lombok.Data; -import lombok.EqualsAndHashCode; -import org.deeplearning4j.eval.Evaluation; -import org.deeplearning4j.nn.graph.ComputationGraph; -import org.deeplearning4j.nn.multilayer.MultiLayerNetwork; -import org.nd4j.linalg.dataset.api.iterator.DataSetIterator; -import org.nd4j.linalg.dataset.api.iterator.MultiDataSetIterator; - -/** - * Score function that calculates the accuracy on a - * test set for a {@link MultiLayerNetwork} or {@link ComputationGraph} - * - * @author Alex Black - * @deprecated Use {@link EvaluationScoreFunction} - */ -@Data -@EqualsAndHashCode(callSuper = true) -@Deprecated -public class TestSetAccuracyScoreFunction extends BaseNetScoreFunction { - - - @Override - public String toString() { - return "TestSetAccuracyScoreFunction()"; - } - - @Override - public double score(MultiLayerNetwork net, DataSetIterator iterator) { - Evaluation e = net.evaluate(iterator); - return e.accuracy(); - } - - @Override - public double score(MultiLayerNetwork net, MultiDataSetIterator iterator) { - throw new UnsupportedOperationException("Cannot evaluate MultiLayerNetwork on MultiDataSetIterator"); - } - - @Override - public double score(ComputationGraph graph, DataSetIterator iterator) { - Evaluation e = graph.evaluate(iterator); - return e.accuracy(); - } - - @Override - public double score(ComputationGraph graph, MultiDataSetIterator iterator) { - Evaluation e = graph.evaluate(iterator); - return e.accuracy(); - } - - @Override - public boolean minimize() { - return false; - } -} diff --git a/contrib/attic/arbiter/arbiter-deeplearning4j/src/main/java/org/deeplearning4j/arbiter/scoring/impl/TestSetF1ScoreFunction.java b/contrib/attic/arbiter/arbiter-deeplearning4j/src/main/java/org/deeplearning4j/arbiter/scoring/impl/TestSetF1ScoreFunction.java deleted file 
mode 100644 index 36e4d25f4..000000000 --- a/contrib/attic/arbiter/arbiter-deeplearning4j/src/main/java/org/deeplearning4j/arbiter/scoring/impl/TestSetF1ScoreFunction.java +++ /dev/null @@ -1,74 +0,0 @@ -/* - * ****************************************************************************** - * * Copyright (c) 2021 Deeplearning4j Contributors - * * - * * This program and the accompanying materials are made available under the - * * terms of the Apache License, Version 2.0 which is available at - * * https://www.apache.org/licenses/LICENSE-2.0. - * * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * * License for the specific language governing permissions and limitations - * * under the License. - * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ - -package org.deeplearning4j.arbiter.scoring.impl; - -import lombok.Data; -import lombok.EqualsAndHashCode; -import org.deeplearning4j.eval.Evaluation; -import org.deeplearning4j.nn.graph.ComputationGraph; -import org.deeplearning4j.nn.multilayer.MultiLayerNetwork; -import org.nd4j.linalg.dataset.api.iterator.DataSetIterator; -import org.nd4j.linalg.dataset.api.iterator.MultiDataSetIterator; - -/** - * Score function that calculates the F1 score - * on a test set for a {@link MultiLayerNetwork} or {@link ComputationGraph} - * - * @author Alex Black - * @deprecated Use {@link EvaluationScoreFunction} - */ -@Data -@EqualsAndHashCode(callSuper = true) -@Deprecated -public class TestSetF1ScoreFunction extends BaseNetScoreFunction { - - @Override - public boolean minimize() { - return false; //false -> maximize - } - - - @Override - public String toString() { - return "TestSetF1ScoreFunction"; - } - - @Override - public double score(MultiLayerNetwork net, 
DataSetIterator iterator) { - Evaluation e = net.evaluate(iterator); - return e.f1(); - } - - @Override - public double score(MultiLayerNetwork net, MultiDataSetIterator iterator) { - throw new UnsupportedOperationException("Cannot evaluate MultiLayerNetwork on MultiDataSetIterator"); - } - - @Override - public double score(ComputationGraph graph, DataSetIterator iterator) { - Evaluation e = graph.evaluate(iterator); - return e.f1(); - } - - @Override - public double score(ComputationGraph graph, MultiDataSetIterator iterator) { - Evaluation e = graph.evaluate(iterator); - return e.f1(); - } -} diff --git a/contrib/attic/arbiter/arbiter-deeplearning4j/src/main/java/org/deeplearning4j/arbiter/scoring/impl/TestSetLossScoreFunction.java b/contrib/attic/arbiter/arbiter-deeplearning4j/src/main/java/org/deeplearning4j/arbiter/scoring/impl/TestSetLossScoreFunction.java deleted file mode 100644 index 1e4be0557..000000000 --- a/contrib/attic/arbiter/arbiter-deeplearning4j/src/main/java/org/deeplearning4j/arbiter/scoring/impl/TestSetLossScoreFunction.java +++ /dev/null @@ -1,80 +0,0 @@ -/* - * ****************************************************************************** - * * Copyright (c) 2021 Deeplearning4j Contributors - * * - * * This program and the accompanying materials are made available under the - * * terms of the Apache License, Version 2.0 which is available at - * * https://www.apache.org/licenses/LICENSE-2.0. - * * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * * License for the specific language governing permissions and limitations - * * under the License. 
- * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ - -package org.deeplearning4j.arbiter.scoring.impl; - -import lombok.Data; -import lombok.EqualsAndHashCode; -import org.deeplearning4j.arbiter.scoring.util.ScoreUtil; -import org.deeplearning4j.nn.graph.ComputationGraph; -import org.deeplearning4j.nn.multilayer.MultiLayerNetwork; -import org.nd4j.linalg.dataset.api.iterator.DataSetIterator; -import org.nd4j.linalg.dataset.api.iterator.MultiDataSetIterator; -import org.nd4j.shade.jackson.annotation.JsonProperty; - -/** - * Score function that calculates the test set loss - * on a test set for a {@link MultiLayerNetwork} or {@link ComputationGraph} - * - * @author Alex Black - */ -@Data -@EqualsAndHashCode(callSuper = false) -public class TestSetLossScoreFunction extends BaseNetScoreFunction { - @JsonProperty - private final boolean average; - - public TestSetLossScoreFunction() { - this(true); - } - - public TestSetLossScoreFunction(boolean average) { - this.average = average; - } - - - @Override - public boolean minimize() { - return true; - } - - @Override - public String toString() { - return "TestSetLossScoreFunction()"; - } - - @Override - public double score(MultiLayerNetwork net, DataSetIterator iterator) { - return ScoreUtil.score(net, iterator, average); - } - - @Override - public double score(MultiLayerNetwork net, MultiDataSetIterator iterator) { - throw new UnsupportedOperationException("Cannot evaluate MultiLayerNetwork on MultiDataSetIterator"); - } - - @Override - public double score(ComputationGraph graph, DataSetIterator iterator) { - return ScoreUtil.score(graph, iterator, average); - } - - @Override - public double score(ComputationGraph graph, MultiDataSetIterator iterator) { - return ScoreUtil.score(graph, iterator, average); - } -} diff --git 
a/contrib/attic/arbiter/arbiter-deeplearning4j/src/main/java/org/deeplearning4j/arbiter/scoring/impl/TestSetRegressionScoreFunction.java b/contrib/attic/arbiter/arbiter-deeplearning4j/src/main/java/org/deeplearning4j/arbiter/scoring/impl/TestSetRegressionScoreFunction.java deleted file mode 100644 index 7aa58fae6..000000000 --- a/contrib/attic/arbiter/arbiter-deeplearning4j/src/main/java/org/deeplearning4j/arbiter/scoring/impl/TestSetRegressionScoreFunction.java +++ /dev/null @@ -1,87 +0,0 @@ -/* - * ****************************************************************************** - * * Copyright (c) 2021 Deeplearning4j Contributors - * * - * * This program and the accompanying materials are made available under the - * * terms of the Apache License, Version 2.0 which is available at - * * https://www.apache.org/licenses/LICENSE-2.0. - * * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * * License for the specific language governing permissions and limitations - * * under the License. 
- * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ - -package org.deeplearning4j.arbiter.scoring.impl; - -import lombok.AccessLevel; -import lombok.Data; -import lombok.EqualsAndHashCode; -import lombok.NoArgsConstructor; -import org.deeplearning4j.arbiter.scoring.RegressionValue; -import org.deeplearning4j.arbiter.scoring.util.ScoreUtil; -import org.deeplearning4j.eval.RegressionEvaluation; -import org.deeplearning4j.nn.graph.ComputationGraph; -import org.deeplearning4j.nn.multilayer.MultiLayerNetwork; -import org.nd4j.linalg.dataset.api.iterator.DataSetIterator; -import org.nd4j.linalg.dataset.api.iterator.MultiDataSetIterator; - -/** - * Score function for regression (including multi-label regression) for a MultiLayerNetwork or ComputationGraph - * on a test set - * - * @author Alex Black - * @deprecated Use {@link RegressionScoreFunction} - */ -@Data -@EqualsAndHashCode(callSuper = true) -@NoArgsConstructor(access = AccessLevel.PROTECTED) //For JSON -@Deprecated -public class TestSetRegressionScoreFunction extends BaseNetScoreFunction { - private RegressionValue regressionValue; - - /** - * @param regressionValue The type of evaluation to do: MSE, MAE, RMSE, etc - */ - public TestSetRegressionScoreFunction(RegressionValue regressionValue) { - this.regressionValue = regressionValue; - } - - - @Override - public boolean minimize() { - return regressionValue != RegressionValue.CorrCoeff; //Maximize correlation coefficient, minimize the remaining ones - } - - @Override - public String toString() { - return "TestSetRegressionScoreFunction(type=" + regressionValue + ")"; - } - - @Override - public double score(MultiLayerNetwork net, DataSetIterator iterator) { - RegressionEvaluation e = net.evaluateRegression(iterator); - return ScoreUtil.getScoreFromRegressionEval(e, regressionValue); - } - - @Override - public double score(MultiLayerNetwork net, MultiDataSetIterator iterator) { - 
throw new UnsupportedOperationException("Cannot evaluate MultiLayerNetwork on MultiDataSetIterator"); - } - - @Override - public double score(ComputationGraph graph, DataSetIterator iterator) { - RegressionEvaluation e = graph.evaluateRegression(iterator); - return ScoreUtil.getScoreFromRegressionEval(e, regressionValue); - } - - @Override - public double score(ComputationGraph graph, MultiDataSetIterator iterator) { - RegressionEvaluation e = graph.evaluateRegression(iterator); - return ScoreUtil.getScoreFromRegressionEval(e, regressionValue); - } -} diff --git a/contrib/attic/arbiter/arbiter-deeplearning4j/src/main/java/org/deeplearning4j/arbiter/scoring/util/ScoreUtil.java b/contrib/attic/arbiter/arbiter-deeplearning4j/src/main/java/org/deeplearning4j/arbiter/scoring/util/ScoreUtil.java deleted file mode 100644 index 6d9646474..000000000 --- a/contrib/attic/arbiter/arbiter-deeplearning4j/src/main/java/org/deeplearning4j/arbiter/scoring/util/ScoreUtil.java +++ /dev/null @@ -1,330 +0,0 @@ -/* - * ****************************************************************************** - * * Copyright (c) 2021 Deeplearning4j Contributors - * * - * * This program and the accompanying materials are made available under the - * * terms of the Apache License, Version 2.0 which is available at - * * https://www.apache.org/licenses/LICENSE-2.0. - * * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * * License for the specific language governing permissions and limitations - * * under the License. 
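The `ScoreUtil.score(...)` helpers in the file below all share one accumulation pattern: each minibatch's score is weighted by its example count, so a smaller final batch does not skew the mean. A self-contained restatement of that pattern (types simplified to primitive arrays; DL4J iterates real DataSet batches):

```java
public class WeightedScoreSketch {
    // Weighted mean of per-batch scores, as in ScoreUtil.score:
    // accumulate sum(n_i * s_i) and, when average == true, divide by sum(n_i).
    static double score(int[] batchSizes, double[] batchScores, boolean average) {
        double sum = 0.0;
        long totalExamples = 0;
        for (int i = 0; i < batchSizes.length; i++) {
            sum += batchSizes[i] * batchScores[i];
            totalExamples += batchSizes[i];
        }
        return average ? sum / totalExamples : sum;
    }

    public static void main(String[] args) {
        // A full batch of 32 and a ragged final batch of 8: the weighted mean
        // leans toward the larger batch's score.
        System.out.println(score(new int[]{32, 8}, new double[]{0.5, 0.9}, true));
    }
}
```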
- * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ - -package org.deeplearning4j.arbiter.scoring.util; - -import org.deeplearning4j.arbiter.scoring.RegressionValue; -import org.deeplearning4j.eval.Evaluation; -import org.deeplearning4j.eval.RegressionEvaluation; -import org.deeplearning4j.nn.graph.ComputationGraph; -import org.deeplearning4j.nn.multilayer.MultiLayerNetwork; -import org.nd4j.linalg.api.ndarray.INDArray; -import org.nd4j.linalg.dataset.DataSet; -import org.nd4j.linalg.dataset.adapter.MultiDataSetIteratorAdapter; -import org.nd4j.linalg.dataset.api.MultiDataSet; -import org.nd4j.linalg.dataset.api.iterator.DataSetIterator; -import org.nd4j.linalg.dataset.api.iterator.DataSetIteratorFactory; -import org.nd4j.linalg.dataset.api.iterator.MultiDataSetIterator; -import org.nd4j.linalg.dataset.api.iterator.MultiDataSetIteratorFactory; - - - -/** - * Various utilities for functions used in arbiter. 
- * - * @author Adam Gibson - */ -public class ScoreUtil { - - - - /** - * Get a {@link MultiDataSetIterator} - * from the given object, whether it's a {@link MultiDataSetIterator}, {@link MultiDataSetIteratorFactory}, - * {@link DataSetIterator} or {@link DataSetIteratorFactory}; any other type will throw - * an {@link IllegalArgumentException} - * @param o the object to get the iterator from - * @return the MultiDataSetIterator from the given object - */ - public static MultiDataSetIterator getMultiIterator(Object o) { - if (o instanceof MultiDataSetIterator) { - return (MultiDataSetIterator) o; - } else if (o instanceof MultiDataSetIteratorFactory) { - MultiDataSetIteratorFactory factory = (MultiDataSetIteratorFactory) o; - return factory.create(); - } else if( o instanceof DataSetIterator ){ - return new MultiDataSetIteratorAdapter((DataSetIterator)o); - } else if( o instanceof DataSetIteratorFactory ){ - return new MultiDataSetIteratorAdapter(((DataSetIteratorFactory)o).create()); - } - - throw new IllegalArgumentException("Type must be MultiDataSetIterator, MultiDataSetIteratorFactory, DataSetIterator or DataSetIteratorFactory"); - } - - - /** - * Get a {@link DataSetIterator} - * from the given object whether it's a {@link DataSetIterator} - * or {@link DataSetIteratorFactory}, any other type will throw - * an {@link IllegalArgumentException} - * @param o the object to get the iterator from - * @return the DataSetIterator from the given object - */ - public static DataSetIterator getIterator(Object o) { - if (o instanceof DataSetIterator) - return (DataSetIterator) o; - else if (o instanceof DataSetIteratorFactory) { - DataSetIteratorFactory factory = (DataSetIteratorFactory) o; - return factory.create(); - } - - throw new IllegalArgumentException("Type must either be DataSetIterator or DataSetIteratorFactory"); - } - - /** - * Get the evaluation - * for the given model and test dataset - * @param model the model to get the evaluation from - * @param testData the test data to do the evaluation on - * @return the evaluation object with accumulated statistics - * for the current test data - */ - public static Evaluation getEvaluation(MultiLayerNetwork model, DataSetIterator testData) { - return model.evaluate(testData); - } - - /** - * Get the evaluation - * for the given model and test
dataset - * @param model the model to get the evaluation from - * @param testData the test data to do the evaluation on - * @return the evaluation object with accumulated statistics - * for the current test data - */ - public static Evaluation getEvaluation(ComputationGraph model, MultiDataSetIterator testData) { - if (model.getNumOutputArrays() != 1) - throw new IllegalStateException("GraphSetSetAccuracyScoreFunction cannot be " - + "applied to ComputationGraphs with more than one output. NumOutputs = " - + model.getNumOutputArrays()); - - return model.evaluate(testData); - } - - - /** - * Get the evaluation - * for the given model and test dataset - * @param model the model to get the evaluation from - * @param testData the test data to do the evaluation on - * @return the evaluation object with accumulated statistics - * for the current test data - */ - public static Evaluation getEvaluation(ComputationGraph model, DataSetIterator testData) { - if (model.getNumOutputArrays() != 1) - throw new IllegalStateException("GraphSetSetAccuracyScoreFunctionDataSet cannot be " - + "applied to ComputationGraphs with more than one output. 
NumOutputs = " - + model.getNumOutputArrays()); - - return model.evaluate(testData); - } - - - - /** - * Score based on the loss function - * @param model the model to score with - * @param testData the test data to score - * @param average whether to average the score - * for the whole batch or not - * @return the score for the given test set - */ - public static double score(ComputationGraph model, MultiDataSetIterator testData, boolean average) { - //TODO: do this properly taking into account division by N, L1/L2 etc - double sumScore = 0.0; - int totalExamples = 0; - while (testData.hasNext()) { - MultiDataSet ds = testData.next(); - long numExamples = ds.getFeatures(0).size(0); - sumScore += numExamples * model.score(ds); - totalExamples += numExamples; - } - - if (!average) - return sumScore; - return sumScore / totalExamples; - } - - /** - * Score based on the loss function - * @param model the model to score with - * @param testData the test data to score - * @param average whether to average the score - * for the whole batch or not - * @return the score for the given test set - */ - public static double score(ComputationGraph model, DataSetIterator testData, boolean average) { - //TODO: do this properly taking into account division by N, L1/L2 etc - double sumScore = 0.0; - int totalExamples = 0; - while (testData.hasNext()) { - DataSet ds = testData.next(); - int numExamples = ds.numExamples(); - - sumScore += numExamples * model.score(ds); - totalExamples += numExamples; - } - - if (!average) - return sumScore; - return sumScore / totalExamples; - } - - - /** - * - * @param model - * @param testSet - * @param regressionValue - * @return - */ - public static double score(ComputationGraph model, MultiDataSetIterator testSet, RegressionValue regressionValue) { - int nOutputs = model.getNumOutputArrays(); - - RegressionEvaluation[] evaluations = new RegressionEvaluation[nOutputs]; - for (int i = 0; i < evaluations.length; i++) - evaluations[i] = new 
RegressionEvaluation(); - - while (testSet.hasNext()) { - MultiDataSet next = testSet.next(); - INDArray[] labels = next.getLabels(); - - if (next.hasMaskArrays()) { - INDArray[] fMasks = next.getFeaturesMaskArrays(); - INDArray[] lMasks = next.getLabelsMaskArrays(); - - model.setLayerMaskArrays(fMasks, lMasks); - - INDArray[] outputs = model.output(false, next.getFeatures()); - for (int i = 0; i < evaluations.length; i++) { - if (lMasks != null && lMasks[i] != null) { - evaluations[i].evalTimeSeries(labels[i], outputs[i], lMasks[i]); - } else { - evaluations[i].evalTimeSeries(labels[i], outputs[i]); - } - } - - model.clearLayerMaskArrays(); - } else { - INDArray[] outputs = model.output(false, next.getFeatures()); - for (int i = 0; i < evaluations.length; i++) { - if (labels[i].rank() == 3) { - evaluations[i].evalTimeSeries(labels[i], outputs[i]); - } else { - evaluations[i].eval(labels[i], outputs[i]); - } - } - } - } - - double sum = 0.0; - int totalColumns = 0; - for (int i = 0; i < evaluations.length; i++) { - int nColumns = evaluations[i].numColumns(); - totalColumns += nColumns; - sum += getScoreFromRegressionEval(evaluations[i], regressionValue); - } - if (regressionValue == RegressionValue.CorrCoeff) - sum /= totalColumns; - - return sum; - } - - - /** - * Run a {@link RegressionEvaluation} - * over a {@link DataSetIterator} - * @param model the model to use - * @param testSet the test set iterator - * @param regressionValue the regression type to use - * @return - */ - public static double score(ComputationGraph model, DataSetIterator testSet, RegressionValue regressionValue) { - RegressionEvaluation evaluation = model.evaluateRegression(testSet); - return getScoreFromRegressionEval(evaluation, regressionValue); - } - - - /** - * Score the given test data - * with the given multi layer network - * @param model model to use - * @param testData the test data to test with - * @param average whether to average the score or not - * @return the score for the 
given test data given the model - */ - public static double score(MultiLayerNetwork model, DataSetIterator testData, boolean average) { - //TODO: do this properly taking into account division by N, L1/L2 etc - double sumScore = 0.0; - int totalExamples = 0; - while (testData.hasNext()) { - DataSet ds = testData.next(); - int numExamples = ds.numExamples(); - - sumScore += numExamples * model.score(ds); - totalExamples += numExamples; - } - - if (!average) - return sumScore; - return sumScore / totalExamples; - } - - - /** - * Score the given multi layer network - * @param model the model to score - * @param testSet the test set - * @param regressionValue the regression function to use - * @return the score from the given test set - */ - public static double score(MultiLayerNetwork model, DataSetIterator testSet, RegressionValue regressionValue) { - RegressionEvaluation eval = model.evaluateRegression(testSet); - return getScoreFromRegressionEval(eval, regressionValue); - } - - - @Deprecated - public static double getScoreFromRegressionEval(RegressionEvaluation eval, RegressionValue regressionValue) { - double sum = 0.0; - int nColumns = eval.numColumns(); - switch (regressionValue) { - case MSE: - for (int i = 0; i < nColumns; i++) - sum += eval.meanSquaredError(i); - break; - case MAE: - for (int i = 0; i < nColumns; i++) - sum += eval.meanAbsoluteError(i); - break; - case RMSE: - for (int i = 0; i < nColumns; i++) - sum += eval.rootMeanSquaredError(i); - break; - case RSE: - for (int i = 0; i < nColumns; i++) - sum += eval.relativeSquaredError(i); - break; - case CorrCoeff: - for (int i = 0; i < nColumns; i++) - sum += eval.correlationR2(i); - sum /= nColumns; - break; - } - - return sum; - } - -} diff --git a/contrib/attic/arbiter/arbiter-deeplearning4j/src/main/java/org/deeplearning4j/arbiter/task/ComputationGraphTaskCreator.java b/contrib/attic/arbiter/arbiter-deeplearning4j/src/main/java/org/deeplearning4j/arbiter/task/ComputationGraphTaskCreator.java deleted 
file mode 100644 index f346db1fe..000000000 --- a/contrib/attic/arbiter/arbiter-deeplearning4j/src/main/java/org/deeplearning4j/arbiter/task/ComputationGraphTaskCreator.java +++ /dev/null @@ -1,269 +0,0 @@ -/* - * ****************************************************************************** - * * Copyright (c) 2021 Deeplearning4j Contributors - * * - * * This program and the accompanying materials are made available under the - * * terms of the Apache License, Version 2.0 which is available at - * * https://www.apache.org/licenses/LICENSE-2.0. - * * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * * License for the specific language governing permissions and limitations - * * under the License. - * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ - -package org.deeplearning4j.arbiter.task; - -import lombok.AllArgsConstructor; -import lombok.Getter; -import lombok.NoArgsConstructor; -import lombok.Setter; -import lombok.extern.slf4j.Slf4j; -import org.apache.commons.lang3.exception.ExceptionUtils; -import org.deeplearning4j.arbiter.GraphConfiguration; -import org.deeplearning4j.arbiter.listener.DL4JArbiterStatusReportingListener; -import org.deeplearning4j.arbiter.optimize.api.Candidate; -import org.deeplearning4j.arbiter.optimize.api.OptimizationResult; -import org.deeplearning4j.arbiter.optimize.api.TaskCreator; -import org.deeplearning4j.arbiter.optimize.api.data.DataProvider; -import org.deeplearning4j.arbiter.optimize.api.data.DataSource; -import org.deeplearning4j.arbiter.optimize.api.evaluation.ModelEvaluator; -import org.deeplearning4j.arbiter.optimize.api.saving.ResultReference; -import org.deeplearning4j.arbiter.optimize.api.saving.ResultSaver; -import 
org.deeplearning4j.arbiter.optimize.api.score.ScoreFunction; -import org.deeplearning4j.arbiter.optimize.runner.CandidateInfo; -import org.deeplearning4j.arbiter.optimize.runner.CandidateStatus; -import org.deeplearning4j.arbiter.optimize.runner.IOptimizationRunner; -import org.deeplearning4j.arbiter.optimize.runner.listener.StatusListener; -import org.deeplearning4j.arbiter.scoring.util.ScoreUtil; -import org.deeplearning4j.earlystopping.EarlyStoppingConfiguration; -import org.deeplearning4j.earlystopping.EarlyStoppingResult; -import org.deeplearning4j.earlystopping.trainer.EarlyStoppingGraphTrainer; -import org.deeplearning4j.nn.graph.ComputationGraph; -import org.nd4j.linalg.dataset.api.iterator.MultiDataSetIterator; -import org.nd4j.linalg.factory.Nd4j; - -import java.io.IOException; -import java.util.List; -import java.util.Properties; -import java.util.concurrent.Callable; - -/** - * Task creator for ComputationGraph - * - * @author Alex Black - */ -@AllArgsConstructor -@NoArgsConstructor -@Slf4j -public class ComputationGraphTaskCreator implements TaskCreator { - - private ModelEvaluator modelEvaluator; - @Getter - @Setter - private TaskListener taskListener; - - public ComputationGraphTaskCreator(ModelEvaluator modelEvaluator){ - this(modelEvaluator, null); - } - - @Override - public Callable create(Candidate candidate, DataProvider dataProvider, - ScoreFunction scoreFunction, List statusListener, - IOptimizationRunner runner) { - - return new GraphLearningTask(candidate, dataProvider, scoreFunction, modelEvaluator, statusListener, - taskListener, runner); - } - - @Override - public Callable create(Candidate candidate, Class dataSource, Properties dataSourceProperties, - ScoreFunction scoreFunction, List statusListeners, IOptimizationRunner runner) { - return new GraphLearningTask(candidate, dataSource, dataSourceProperties, scoreFunction, modelEvaluator, statusListeners, - taskListener, runner); - } - - @AllArgsConstructor - private static class 
GraphLearningTask implements Callable { - - private Candidate candidate; - private DataProvider dataProvider; - private Class dataSource; - private Properties dataSourceProperties; - private ScoreFunction scoreFunction; - private ModelEvaluator modelEvaluator; - private List listeners; - private TaskListener taskListener; - private IOptimizationRunner runner; - - private long startTime; - - public GraphLearningTask(Candidate candidate, DataProvider dataProvider, ScoreFunction scoreFunction, - ModelEvaluator modelEvaluator, List listeners, - TaskListener taskListener, IOptimizationRunner runner) { - this.candidate = candidate; - this.dataProvider = dataProvider; - this.scoreFunction = scoreFunction; - this.modelEvaluator = modelEvaluator; - this.listeners = listeners; - this.taskListener = taskListener; - this.runner = runner; - } - - public GraphLearningTask(Candidate candidate, Class dataSource, Properties dataSourceProperties, - ScoreFunction scoreFunction, ModelEvaluator modelEvaluator, List listeners, - TaskListener taskListener, IOptimizationRunner runner) { - this.candidate = candidate; - this.dataSource = dataSource; - this.dataSourceProperties = dataSourceProperties; - this.scoreFunction = scoreFunction; - this.modelEvaluator = modelEvaluator; - this.listeners = listeners; - this.taskListener = taskListener; - this.runner = runner; - } - - - @Override - public OptimizationResult call() throws Exception { - - try { - OptimizationResult result = callHelper(); - if(listeners != null && !listeners.isEmpty()){ - CandidateInfo ci = new CandidateInfo(candidate.getIndex(), CandidateStatus.Complete, result.getScore(), - startTime, startTime, System.currentTimeMillis(), candidate.getFlatParameters(), null); - for(StatusListener sl : listeners){ - try{ - sl.onCandidateStatusChange(ci, runner, result); - } catch (Exception e){ - log.error("Error in status listener for candidate {}", candidate.getIndex(), e); - } - } - } - return result; - } catch (Throwable e) { - 
String stackTrace = ExceptionUtils.getStackTrace(e); - log.warn("Execution failed for task {}", candidate.getIndex(), e); - - CandidateInfo ci = new CandidateInfo(candidate.getIndex(), CandidateStatus.Failed, null, startTime, - null, null, candidate.getFlatParameters(), stackTrace); - return new OptimizationResult(candidate, null, candidate.getIndex(), null, ci, null); - } finally { - //Destroy workspaces to free memory - Nd4j.getWorkspaceManager().destroyAllWorkspacesForCurrentThread(); - System.gc(); - try { - //Sleep for a few seconds - workspace destruction and memory deallocation happens quickly but doesn't - // happen instantly; if we didn't have this, we may run into a situation where the next thread/task - // tries to allocate before WS memory is fully deallocated, resulting in an OOM in memory constrained - // environments - Thread.sleep(2000L); - } catch (Exception e){ } - } - } - - private OptimizationResult callHelper() throws Exception { - startTime = System.currentTimeMillis(); - CandidateInfo ci = new CandidateInfo(candidate.getIndex(), CandidateStatus.Running, null, startTime, startTime, - null, candidate.getFlatParameters(), null); - - //Create network - ComputationGraph net = new ComputationGraph(((GraphConfiguration) candidate.getValue()).getConfiguration()); - net.init(); - - if(taskListener != null){ - net = taskListener.preProcess(net, candidate); - } - - if (listeners != null) { - net.addListeners(new DL4JArbiterStatusReportingListener(listeners, ci)); - } - - //For DataSetIterator: wraps in a MultiDataSetIterator, hence method can be used for both - MultiDataSetIterator iterator; - if(dataSource != null){ - try { - DataSource dsInstance = dataSource.newInstance(); - if (dataSourceProperties != null) - dsInstance.configure(dataSourceProperties); - iterator = ScoreUtil.getMultiIterator(dsInstance.trainData()); - } catch (Exception e){ - throw new RuntimeException("Error instantiating instance of DataSource for class " + dataSource.getName() + 
- " - no zero-arg constructor?",e); - } - } else { - iterator = ScoreUtil.getMultiIterator(dataProvider.trainData(candidate.getDataParameters())); - } - - - EarlyStoppingConfiguration esConfig = - ((GraphConfiguration) candidate.getValue()).getEarlyStoppingConfiguration(); - EarlyStoppingResult esResult = null; - if (esConfig != null) { - EarlyStoppingGraphTrainer trainer = new EarlyStoppingGraphTrainer(esConfig, net, iterator, null); - esResult = trainer.fit(); - net = esResult.getBestModel(); //Can return null if failed OR if - - switch (esResult.getTerminationReason()) { - case Error: - ci.setCandidateStatus(CandidateStatus.Failed); - ci.setExceptionStackTrace(esResult.getTerminationDetails()); - break; - case IterationTerminationCondition: - case EpochTerminationCondition: - ci.setCandidateStatus(CandidateStatus.Complete); - break; - } - - } else { - //Fixed number of epochs - int nEpochs = ((GraphConfiguration) candidate.getValue()).getNumEpochs(); - for (int i = 0; i < nEpochs; i++) { - net.fit(iterator); - } - ci.setCandidateStatus(CandidateStatus.Complete); - } - Nd4j.getExecutioner().commit(); - - Object additionalEvaluation = null; - if (esConfig != null && esResult.getTerminationReason() != EarlyStoppingResult.TerminationReason.Error) { - additionalEvaluation = - (modelEvaluator != null ? 
modelEvaluator.evaluateModel(net, dataProvider) : null); - } - - Double score = null; - if (net != null) { - if(dataSource != null){ - score = scoreFunction.score(net, dataSource, dataSourceProperties); - } else { - score = scoreFunction.score(net, dataProvider, candidate.getDataParameters()); - } - ci.setScore(score); - } - - if(taskListener != null){ - taskListener.postProcess(net, candidate); - } - - OptimizationResult result = new OptimizationResult(candidate, score, candidate.getIndex(), additionalEvaluation, ci, null); - - //Save the model: - ResultSaver saver = runner.getConfiguration().getResultSaver(); - ResultReference resultReference = null; - if (saver != null) { - try { - resultReference = saver.saveModel(result, net); - } catch (IOException e) { - //TODO: Do we want to warn or fail on IOException? - log.warn("Error saving model (id={}): IOException thrown. ", result.getIndex(), e); - } - } - result.setResultReference(resultReference); - return result; - } - } -} diff --git a/contrib/attic/arbiter/arbiter-deeplearning4j/src/main/java/org/deeplearning4j/arbiter/task/MultiLayerNetworkTaskCreator.java b/contrib/attic/arbiter/arbiter-deeplearning4j/src/main/java/org/deeplearning4j/arbiter/task/MultiLayerNetworkTaskCreator.java deleted file mode 100644 index 79e65946e..000000000 --- a/contrib/attic/arbiter/arbiter-deeplearning4j/src/main/java/org/deeplearning4j/arbiter/task/MultiLayerNetworkTaskCreator.java +++ /dev/null @@ -1,267 +0,0 @@ -/* - * ****************************************************************************** - * * Copyright (c) 2021 Deeplearning4j Contributors - * * - * * This program and the accompanying materials are made available under the - * * terms of the Apache License, Version 2.0 which is available at - * * https://www.apache.org/licenses/LICENSE-2.0. 
- * * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * * License for the specific language governing permissions and limitations - * * under the License. - * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ - -package org.deeplearning4j.arbiter.task; - -import lombok.AllArgsConstructor; -import lombok.Getter; -import lombok.NoArgsConstructor; -import lombok.Setter; -import lombok.extern.slf4j.Slf4j; -import org.apache.commons.lang3.exception.ExceptionUtils; -import org.deeplearning4j.arbiter.DL4JConfiguration; -import org.deeplearning4j.arbiter.listener.DL4JArbiterStatusReportingListener; -import org.deeplearning4j.arbiter.optimize.api.Candidate; -import org.deeplearning4j.arbiter.optimize.api.OptimizationResult; -import org.deeplearning4j.arbiter.optimize.api.TaskCreator; -import org.deeplearning4j.arbiter.optimize.api.data.DataProvider; -import org.deeplearning4j.arbiter.optimize.api.data.DataSource; -import org.deeplearning4j.arbiter.optimize.api.evaluation.ModelEvaluator; -import org.deeplearning4j.arbiter.optimize.api.saving.ResultReference; -import org.deeplearning4j.arbiter.optimize.api.saving.ResultSaver; -import org.deeplearning4j.arbiter.optimize.api.score.ScoreFunction; -import org.deeplearning4j.arbiter.optimize.runner.CandidateInfo; -import org.deeplearning4j.arbiter.optimize.runner.CandidateStatus; -import org.deeplearning4j.arbiter.optimize.runner.IOptimizationRunner; -import org.deeplearning4j.arbiter.optimize.runner.listener.StatusListener; -import org.deeplearning4j.arbiter.scoring.util.ScoreUtil; -import org.deeplearning4j.earlystopping.EarlyStoppingConfiguration; -import org.deeplearning4j.earlystopping.EarlyStoppingResult; -import 
org.deeplearning4j.earlystopping.trainer.EarlyStoppingTrainer; -import org.deeplearning4j.nn.multilayer.MultiLayerNetwork; -import org.nd4j.linalg.dataset.api.iterator.DataSetIterator; -import org.nd4j.linalg.factory.Nd4j; - -import java.io.IOException; -import java.util.List; -import java.util.Properties; -import java.util.concurrent.Callable; - -/** - * Task creator for MultiLayerNetworks - * - * @author Alex Black - */ -@AllArgsConstructor -@NoArgsConstructor -@Slf4j -public class MultiLayerNetworkTaskCreator implements TaskCreator { - - private ModelEvaluator modelEvaluator; - @Getter - @Setter - private TaskListener taskListener; - - public MultiLayerNetworkTaskCreator(ModelEvaluator modelEvaluator){ - this(modelEvaluator, null); - } - - @Override - public Callable create(Candidate candidate, DataProvider dataProvider, - ScoreFunction scoreFunction, List statusListeners, - IOptimizationRunner runner) { - - return new DL4JLearningTask(candidate, dataProvider, scoreFunction, modelEvaluator, statusListeners, taskListener, runner); - } - - @Override - public Callable create(Candidate candidate, Class dataSource, Properties dataSourceProperties, - ScoreFunction scoreFunction, List statusListeners, IOptimizationRunner runner) { - return new DL4JLearningTask(candidate, dataSource, dataSourceProperties, scoreFunction, modelEvaluator, statusListeners, taskListener, runner); - } - - - private static class DL4JLearningTask implements Callable { - - private Candidate candidate; - private DataProvider dataProvider; - private Class dataSource; - private Properties dataSourceProperties; - private ScoreFunction scoreFunction; - private ModelEvaluator modelEvaluator; - private List listeners; - private TaskListener taskListener; - private IOptimizationRunner runner; - - private long startTime; - - public DL4JLearningTask(Candidate candidate, DataProvider dataProvider, ScoreFunction scoreFunction, - ModelEvaluator modelEvaluator, List listeners, TaskListener taskListener, - 
IOptimizationRunner runner) { - this.candidate = candidate; - this.dataProvider = dataProvider; - this.scoreFunction = scoreFunction; - this.modelEvaluator = modelEvaluator; - this.listeners = listeners; - this.taskListener = taskListener; - this.runner = runner; - } - - public DL4JLearningTask(Candidate candidate, Class dataSource, Properties dataSourceProperties, - ScoreFunction scoreFunction, ModelEvaluator modelEvaluator, List listeners, TaskListener taskListener, - IOptimizationRunner runner) { - this.candidate = candidate; - this.dataSource = dataSource; - this.dataSourceProperties = dataSourceProperties; - this.scoreFunction = scoreFunction; - this.modelEvaluator = modelEvaluator; - this.listeners = listeners; - this.taskListener = taskListener; - this.runner = runner; - } - - - @Override - public OptimizationResult call() { - - try { - OptimizationResult result = callHelper(); - if(listeners != null && !listeners.isEmpty()){ - CandidateInfo ci = new CandidateInfo(candidate.getIndex(), CandidateStatus.Complete, result.getScore(), - startTime, startTime, System.currentTimeMillis(), candidate.getFlatParameters(), null); - for(StatusListener sl : listeners){ - try{ - sl.onCandidateStatusChange(ci, runner, result); - } catch (Exception e){ - log.error("Error in status listener for candidate {}", candidate.getIndex(), e); - } - } - } - return result; - } catch (Throwable e) { - String stackTrace = ExceptionUtils.getStackTrace(e); - log.warn( "Execution failed for task {}", candidate.getIndex(), e ); - - CandidateInfo ci = new CandidateInfo(candidate.getIndex(), CandidateStatus.Failed, null, startTime, - null, null, candidate.getFlatParameters(), stackTrace); - return new OptimizationResult(candidate, null, candidate.getIndex(), null, ci, null); - } finally { - //Destroy workspaces to free memory - Nd4j.getWorkspaceManager().destroyAllWorkspacesForCurrentThread(); - System.gc(); - try { - //Sleep for a few seconds - workspace destruction and memory deallocation 
happens quickly but doesn't - // happen instantly; if we didn't have this, we may run into a situation where the next thread/task - // tries to allocate before WS memory is fully deallocated, resulting in an OOM in memory constrained - // environments - Thread.sleep(2000L); - } catch (Exception e){ } - } - } - - private OptimizationResult callHelper() { - startTime = System.currentTimeMillis(); - CandidateInfo ci = new CandidateInfo(candidate.getIndex(), CandidateStatus.Running, null, - startTime, startTime, null, candidate.getFlatParameters(), null); - - //Create network - MultiLayerNetwork net = new MultiLayerNetwork( - ((DL4JConfiguration) candidate.getValue()).getMultiLayerConfiguration()); - net.init(); - - if(taskListener != null){ - net = taskListener.preProcess(net, candidate); - } - - if (listeners != null) { - net.addListeners(new DL4JArbiterStatusReportingListener(listeners, ci)); - } - - //Early stopping or fixed number of epochs: - DataSetIterator dataSetIterator; - if(dataSource != null){ - DataSource dsInstance; - try{ - dsInstance = dataSource.newInstance(); - } catch (Exception e){ - throw new RuntimeException("Error instantiating instance of DataSource for class " + dataSource.getName() + - " - no zero-arg constructor?",e); - } - if(dataSourceProperties != null) - dsInstance.configure(dataSourceProperties); - dataSetIterator = ScoreUtil.getIterator(dsInstance.trainData()); - } else { - dataSetIterator = ScoreUtil.getIterator(dataProvider.trainData(candidate.getDataParameters())); - } - - - EarlyStoppingConfiguration esConfig = - ((DL4JConfiguration) candidate.getValue()).getEarlyStoppingConfiguration(); - EarlyStoppingResult esResult = null; - if (esConfig != null) { - EarlyStoppingTrainer trainer = new EarlyStoppingTrainer(esConfig, net, dataSetIterator, null); - esResult = trainer.fit(); - net = esResult.getBestModel(); //Can return null if failed OR if - - switch (esResult.getTerminationReason()) { - case Error: - 
ci.setCandidateStatus(CandidateStatus.Failed); - ci.setExceptionStackTrace(esResult.getTerminationDetails()); - break; - case IterationTerminationCondition: - case EpochTerminationCondition: - ci.setCandidateStatus(CandidateStatus.Complete); - break; - } - - } else { - //Fixed number of epochs - int nEpochs = ((DL4JConfiguration) candidate.getValue()).getNumEpochs(); - for (int i = 0; i < nEpochs; i++) { - net.fit(dataSetIterator); - } - ci.setCandidateStatus(CandidateStatus.Complete); - } - - Object additionalEvaluation = null; - if (esConfig != null && esResult.getTerminationReason() != EarlyStoppingResult.TerminationReason.Error) { - additionalEvaluation = - (modelEvaluator != null ? modelEvaluator.evaluateModel(net, dataProvider) : null); - } - - Double score = null; - if (net != null) { - if(dataSource != null){ - score = scoreFunction.score(net, dataSource, dataSourceProperties); - } else { - score = scoreFunction.score(net, dataProvider, candidate.getDataParameters()); - } - ci.setScore(score); - } - - if(taskListener != null){ - taskListener.postProcess(net, candidate); - } - - OptimizationResult result = new OptimizationResult(candidate, score, candidate.getIndex(), additionalEvaluation, ci, null); - //Save the model: - ResultSaver saver = runner.getConfiguration().getResultSaver(); - ResultReference resultReference = null; - if (saver != null) { - try { - resultReference = saver.saveModel(result, net); - } catch (IOException e) { - //TODO: Do we want to warn or fail on IOException? - log.warn("Error saving model (id={}): IOException thrown. 
", result.getIndex(), e); - } - } - result.setResultReference(resultReference); - return result; - } - } -} diff --git a/contrib/attic/arbiter/arbiter-deeplearning4j/src/main/java/org/deeplearning4j/arbiter/task/TaskListener.java b/contrib/attic/arbiter/arbiter-deeplearning4j/src/main/java/org/deeplearning4j/arbiter/task/TaskListener.java deleted file mode 100644 index 0b4886d6f..000000000 --- a/contrib/attic/arbiter/arbiter-deeplearning4j/src/main/java/org/deeplearning4j/arbiter/task/TaskListener.java +++ /dev/null @@ -1,51 +0,0 @@ -/* - * ****************************************************************************** - * * Copyright (c) 2021 Deeplearning4j Contributors - * * - * * This program and the accompanying materials are made available under the - * * terms of the Apache License, Version 2.0 which is available at - * * https://www.apache.org/licenses/LICENSE-2.0. - * * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * * License for the specific language governing permissions and limitations - * * under the License. - * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ - -package org.deeplearning4j.arbiter.task; - -import org.deeplearning4j.arbiter.optimize.api.Candidate; -import org.deeplearning4j.nn.api.Model; - -import java.io.Serializable; - -/** - * TaskListener: can be used to preprocess and post process a model (MultiLayerNetwork or ComputationGraph) before/after - * training, in a {@link MultiLayerNetworkTaskCreator} or {@link ComputationGraphTaskCreator} - * - * @author Alex Black - */ -public interface TaskListener extends Serializable { - - /** - * Preprocess the model, before any training has taken place. - *
- * Can be used to (for example) set listeners on a model before training starts - * @param model Model to preprocess - * @param candidate Candidate information, for the current model - * @return The updated model (usually the same one as the input, perhaps with modifications) - */ - <T extends Model> T preProcess(T model, Candidate candidate); - - /** - * Post process the model, after any training has taken place - * @param model Model to postprocess - * @param candidate Candidate information, for the current model - */ - void postProcess(Model model, Candidate candidate); - -} diff --git a/contrib/attic/arbiter/arbiter-deeplearning4j/src/test/java/org/deeplearning4j/arbiter/AssertTestsExtendBaseClass.java b/contrib/attic/arbiter/arbiter-deeplearning4j/src/test/java/org/deeplearning4j/arbiter/AssertTestsExtendBaseClass.java deleted file mode 100644 index 3a13e38da..000000000 --- a/contrib/attic/arbiter/arbiter-deeplearning4j/src/test/java/org/deeplearning4j/arbiter/AssertTestsExtendBaseClass.java +++ /dev/null @@ -1,52 +0,0 @@ -/* - * ****************************************************************************** - * * Copyright (c) 2021 Deeplearning4j Contributors - * * - * * This program and the accompanying materials are made available under the - * * terms of the Apache License, Version 2.0 which is available at - * * https://www.apache.org/licenses/LICENSE-2.0. - * * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * * License for the specific language governing permissions and limitations - * * under the License. 
- * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ -package org.deeplearning4j.arbiter; - -import lombok.extern.slf4j.Slf4j; -import org.deeplearning4j.BaseDL4JTest; -import org.nd4j.common.tests.AbstractAssertTestsClass; - -import java.util.*; - -/** - * This class checks that all test classes (i.e., anything with one or more methods annotated with @Test) - * extend BaseDL4JTest, either directly or indirectly. - * Other than a small set of exceptions, all tests must extend this class. - * - * @author Alex Black - */ - -@Slf4j -public class AssertTestsExtendBaseClass extends AbstractAssertTestsClass { - - @Override - protected Set<Class<?>> getExclusions() { - //Set of classes that are exclusions to the rule (either run manually or have their own logging + timeouts) - return new HashSet<>(); - } - - @Override - protected String getPackageName() { - return "org.deeplearning4j.arbiter"; - } - - @Override - protected Class<?> getBaseClass() { - return BaseDL4JTest.class; - } -} diff --git a/contrib/attic/arbiter/arbiter-deeplearning4j/src/test/java/org/deeplearning4j/arbiter/TestUtils.java b/contrib/attic/arbiter/arbiter-deeplearning4j/src/test/java/org/deeplearning4j/arbiter/TestUtils.java deleted file mode 100644 index 0596d7c1b..000000000 --- a/contrib/attic/arbiter/arbiter-deeplearning4j/src/test/java/org/deeplearning4j/arbiter/TestUtils.java +++ /dev/null @@ -1,245 +0,0 @@ -/* - * ****************************************************************************** - * * Copyright (c) 2021 Deeplearning4j Contributors - * * - * * This program and the accompanying materials are made available under the - * * terms of the Apache License, Version 2.0 which is available at - * * https://www.apache.org/licenses/LICENSE-2.0. 
- * * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * * License for the specific language governing permissions and limitations - * * under the License. - * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ - -package org.deeplearning4j.arbiter; - -import org.apache.commons.compress.utils.IOUtils; -import org.deeplearning4j.nn.conf.ComputationGraphConfiguration; -import org.deeplearning4j.nn.conf.MultiLayerConfiguration; -import org.deeplearning4j.nn.conf.layers.BaseLayer; -import org.deeplearning4j.nn.conf.layers.samediff.AbstractSameDiffLayer; -import org.deeplearning4j.nn.graph.ComputationGraph; -import org.deeplearning4j.nn.multilayer.MultiLayerNetwork; -import org.deeplearning4j.util.ModelSerializer; -import org.nd4j.linalg.api.ndarray.INDArray; -import org.nd4j.linalg.api.ops.random.impl.BernoulliDistribution; -import org.nd4j.linalg.factory.Nd4j; -import org.nd4j.linalg.learning.regularization.L1Regularization; -import org.nd4j.linalg.learning.regularization.L2Regularization; -import org.nd4j.linalg.learning.regularization.Regularization; -import org.nd4j.linalg.learning.regularization.WeightDecay; - -import java.io.*; -import java.util.List; -import java.util.Random; - -import static org.junit.Assert.assertEquals; -import static org.junit.Assert.assertNotNull; - -public class TestUtils { - - public static MultiLayerNetwork testModelSerialization(MultiLayerNetwork net){ - - MultiLayerNetwork restored; - try { - ByteArrayOutputStream baos = new ByteArrayOutputStream(); - ModelSerializer.writeModel(net, baos, true); - byte[] bytes = baos.toByteArray(); - - ByteArrayInputStream bais = new ByteArrayInputStream(bytes); - restored = ModelSerializer.restoreMultiLayerNetwork(bais, true); - - 
assertEquals(net.getLayerWiseConfigurations(), restored.getLayerWiseConfigurations()); - assertEquals(net.params(), restored.params()); - } catch (IOException e){ - //Should never happen - throw new RuntimeException(e); - } - - //Also check the MultiLayerConfiguration is serializable (required by Spark etc) - MultiLayerConfiguration conf = net.getLayerWiseConfigurations(); - serializeDeserializeJava(conf); - - return restored; - } - - public static ComputationGraph testModelSerialization(ComputationGraph net){ - - ComputationGraph restored; - try { - ByteArrayOutputStream baos = new ByteArrayOutputStream(); - ModelSerializer.writeModel(net, baos, true); - byte[] bytes = baos.toByteArray(); - - ByteArrayInputStream bais = new ByteArrayInputStream(bytes); - restored = ModelSerializer.restoreComputationGraph(bais, true); - - assertEquals(net.getConfiguration(), restored.getConfiguration()); - assertEquals(net.params(), restored.params()); - } catch (IOException e){ - //Should never happen - throw new RuntimeException(e); - } - - //Also check the ComputationGraphConfiguration is serializable (required by Spark etc) - ComputationGraphConfiguration conf = net.getConfiguration(); - serializeDeserializeJava(conf); - - return restored; - } - - private static T serializeDeserializeJava(T object){ - byte[] bytes; - try(ByteArrayOutputStream baos = new ByteArrayOutputStream(); ObjectOutputStream oos = new ObjectOutputStream(baos)){ - oos.writeObject(object); - oos.close(); - bytes = baos.toByteArray(); - } catch (IOException e){ - //Should never happen - throw new RuntimeException(e); - } - - T out; - try(ObjectInputStream ois = new ObjectInputStream(new ByteArrayInputStream(bytes))){ - out = (T)ois.readObject(); - } catch (IOException | ClassNotFoundException e){ - throw new RuntimeException(e); - } - - assertEquals(object, out); - return out; - } - - public static INDArray randomOneHot(long examples, long nOut){ - return randomOneHot(examples, nOut, new Random(12345)); - } - 
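The `randomOneHot` helpers in this `TestUtils` file build a one-hot label matrix by setting a single randomly chosen column to 1.0 in each row. A dependency-free sketch of the same idea, using a plain `double[][]` instead of an ND4J `INDArray` (the class name `OneHotSketch` is hypothetical, not part of the original file):

```java
import java.util.Random;

// Hypothetical, dependency-free sketch of TestUtils.randomOneHot:
// one row per example, exactly one column set to 1.0 per row.
public class OneHotSketch {
    public static double[][] randomOneHot(int examples, int nOut, Random rng) {
        double[][] arr = new double[examples][nOut];
        for (int i = 0; i < examples; i++) {
            arr[i][rng.nextInt(nOut)] = 1.0;   // pick one "hot" class index per row
        }
        return arr;
    }

    public static void main(String[] args) {
        double[][] labels = randomOneHot(4, 3, new Random(12345));
        for (double[] row : labels) {
            double sum = 0;
            for (double v : row) sum += v;
            System.out.println(sum);   // each row sums to 1.0
        }
    }
}
```

Seeding the `Random` (as the zero-arg overload above does with `new Random(12345)`) keeps the generated labels reproducible across test runs.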
- public static INDArray randomOneHot(long examples, long nOut, long rngSeed){ - return randomOneHot(examples, nOut, new Random(rngSeed)); - } - - public static INDArray randomOneHot(long examples, long nOut, Random rng){ - INDArray arr = Nd4j.create(examples, nOut); - for( int i=0; i<examples; i++ ){ - arr.putScalar(i, rng.nextInt((int) nOut), 1.0); - } - return arr; - } - - public static INDArray randomBernoulli(long... shape) { - return randomBernoulli(0.5, shape); - } - - public static INDArray randomBernoulli(double p, long... shape) { - INDArray ret = Nd4j.createUninitialized(shape); - Nd4j.getExecutioner().exec(new BernoulliDistribution(ret, p)); - return ret; - } - - public static L1Regularization getL1Reg(BaseLayer baseLayer){ - return getL1Reg(baseLayer.getRegularization()); - } - - public static L1Regularization getL1Reg(List<Regularization> l){ - for(Regularization r : l){ - if(r instanceof L1Regularization){ - return (L1Regularization) r; - } - } - return null; - } - - public static L2Regularization getL2Reg(BaseLayer baseLayer){ - return getL2Reg(baseLayer.getRegularization()); - } - - public static L2Regularization getL2Reg(List<Regularization> l){ - for(Regularization r : l){ - if(r instanceof L2Regularization){ - return (L2Regularization) r; - } - } - return null; - } - - public static WeightDecay getWeightDecayReg(BaseLayer bl){ - return getWeightDecayReg(bl.getRegularization()); - } - - public static WeightDecay getWeightDecayReg(List<Regularization> l){ - for(Regularization r : l){ - if(r instanceof WeightDecay){ - return (WeightDecay) r; - } - } - return null; - } - - public static double getL1(BaseLayer layer) { - List<Regularization> l = layer.getRegularization(); - return getL1(l); - } - - public static double getL1(List<Regularization> l){ - L1Regularization l1Reg = null; - for(Regularization reg : l){ - if(reg instanceof L1Regularization) - l1Reg = (L1Regularization) reg; - } - assertNotNull(l1Reg); - return l1Reg.getL1().valueAt(0,0); - } - - public static double getL2(BaseLayer layer) { - List<Regularization> l = layer.getRegularization(); - return getL2(l); - } - - public static double getL2(List<Regularization> l){ - L2Regularization l2Reg = null; - for(Regularization reg : l){ - if(reg instanceof L2Regularization) - l2Reg = (L2Regularization) reg; - } - assertNotNull(l2Reg); - return l2Reg.getL2().valueAt(0,0); - } - - public static double getL1(AbstractSameDiffLayer layer){ - return getL1(layer.getRegularization()); - } - - public static double getL2(AbstractSameDiffLayer layer){ - return getL2(layer.getRegularization()); - } - - public static double getWeightDecay(BaseLayer layer) { - return 
getWeightDecayReg(layer.getRegularization()).getCoeff().valueAt(0,0); - } -} diff --git a/contrib/attic/arbiter/arbiter-deeplearning4j/src/test/java/org/deeplearning4j/arbiter/computationgraph/TestComputationGraphSpace.java b/contrib/attic/arbiter/arbiter-deeplearning4j/src/test/java/org/deeplearning4j/arbiter/computationgraph/TestComputationGraphSpace.java deleted file mode 100644 index 4101a31eb..000000000 --- a/contrib/attic/arbiter/arbiter-deeplearning4j/src/test/java/org/deeplearning4j/arbiter/computationgraph/TestComputationGraphSpace.java +++ /dev/null @@ -1,170 +0,0 @@ -/* - * ****************************************************************************** - * * Copyright (c) 2021 Deeplearning4j Contributors - * * - * * This program and the accompanying materials are made available under the - * * terms of the Apache License, Version 2.0 which is available at - * * https://www.apache.org/licenses/LICENSE-2.0. - * * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * * License for the specific language governing permissions and limitations - * * under the License. 
- * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ - -package org.deeplearning4j.arbiter.computationgraph; - -import org.deeplearning4j.BaseDL4JTest; -import org.deeplearning4j.arbiter.ComputationGraphSpace; -import org.deeplearning4j.arbiter.TestUtils; -import org.deeplearning4j.arbiter.conf.updater.SgdSpace; -import org.deeplearning4j.arbiter.layers.DenseLayerSpace; -import org.deeplearning4j.arbiter.layers.OutputLayerSpace; -import org.deeplearning4j.arbiter.optimize.api.ParameterSpace; -import org.deeplearning4j.arbiter.optimize.parameter.continuous.ContinuousParameterSpace; -import org.deeplearning4j.arbiter.optimize.parameter.discrete.DiscreteParameterSpace; -import org.deeplearning4j.arbiter.util.LeafUtils; -import org.deeplearning4j.nn.conf.ComputationGraphConfiguration; -import org.deeplearning4j.nn.conf.NeuralNetConfiguration; -import org.deeplearning4j.nn.conf.graph.LayerVertex; -import org.deeplearning4j.nn.conf.inputs.InputType; -import org.deeplearning4j.nn.conf.layers.BaseLayer; -import org.deeplearning4j.nn.conf.layers.DenseLayer; -import org.deeplearning4j.nn.conf.layers.OutputLayer; -import org.junit.Test; -import org.nd4j.linalg.activations.Activation; -import org.nd4j.linalg.activations.IActivation; -import org.nd4j.linalg.learning.config.Sgd; -import org.nd4j.linalg.lossfunctions.LossFunctions.LossFunction; - -import java.util.List; -import java.util.Random; - -import static org.junit.Assert.assertEquals; -import static org.junit.Assert.assertTrue; - -public class TestComputationGraphSpace extends BaseDL4JTest { - - @Test - public void testBasic() { - - ComputationGraphConfiguration expected = new NeuralNetConfiguration.Builder() - .updater(new Sgd(0.005)) - .seed(12345) - .graphBuilder().addInputs("in") - .addLayer("0", new DenseLayer.Builder().nIn(10).nOut(10).build(), "in") - .addLayer("1", new DenseLayer.Builder().nIn(10).nOut(10).build(), "0").addLayer("2", 
- new OutputLayer.Builder().lossFunction(LossFunction.MCXENT) - .activation(Activation.SOFTMAX) - .nIn(10).nOut(5) - .build(), - "1") - .setOutputs("2").build(); - - ComputationGraphSpace cgs = new ComputationGraphSpace.Builder() - .updater(new Sgd(0.005)) - .seed(12345).addInputs("in") - .addLayer("0", new DenseLayerSpace.Builder().nIn(10).nOut(10).build(), "in") - .addLayer("1", new DenseLayerSpace.Builder().nIn(10).nOut(10).build(), "0") - .addLayer("2", new OutputLayerSpace.Builder().lossFunction(LossFunction.MCXENT).activation(Activation.SOFTMAX).nIn(10).nOut(5) - .build(), "1") - .setOutputs("2").setInputTypes(InputType.feedForward(10)) - .build(); - - int nParams = cgs.numParameters(); - assertEquals(0, nParams); - - ComputationGraphConfiguration conf = cgs.getValue(new double[0]).getConfiguration(); - - assertEquals(expected, conf); - } - - @Test - public void testBasic2() { - - ComputationGraphSpace mls = new ComputationGraphSpace.Builder() - .updater(new SgdSpace(new ContinuousParameterSpace(0.0001, 0.1))) - .l2(new ContinuousParameterSpace(0.2, 0.5)) - .addInputs("in").addLayer("0", - new DenseLayerSpace.Builder().nIn(10).nOut(10) - .activation(new DiscreteParameterSpace<>(Activation.RELU, - Activation.TANH)) - .build(), - "in") - .addLayer("1", new OutputLayerSpace.Builder().nIn(10).nOut(10).activation(Activation.SOFTMAX) - .build(), "0") - .setOutputs("1").setInputTypes(InputType.feedForward(10)).build(); - - int nParams = mls.numParameters(); - assertEquals(3, nParams); - - //Assign numbers to each leaf ParameterSpace object (normally done by candidate generator) - List<ParameterSpace> noDuplicatesList = LeafUtils.getUniqueObjects(mls.collectLeaves()); - - //Second: assign each a number - int c = 0; - for (ParameterSpace ps : noDuplicatesList) { - int np = ps.numParameters(); - if (np == 1) { - ps.setIndices(c++); - } else { - int[] values = new int[np]; - for (int j = 0; j < np; j++) - values[c++] = j; - ps.setIndices(values); - } - } - - int reluCount = 0; - int 
tanhCount = 0; - - Random r = new Random(12345); - - for (int i = 0; i < 50; i++) { - - double[] rvs = new double[nParams]; - for (int j = 0; j < rvs.length; j++) - rvs[j] = r.nextDouble(); - - - ComputationGraphConfiguration conf = mls.getValue(rvs).getConfiguration(); - - int nLayers = conf.getVertexInputs().size(); - assertEquals(2, nLayers); - - for (int j = 0; j < nLayers; j++) { - NeuralNetConfiguration layerConf = - ((LayerVertex) conf.getVertices().get(String.valueOf(j))).getLayerConf(); - - double lr = ((Sgd)((BaseLayer) layerConf.getLayer()).getIUpdater()).getLearningRate(); - assertTrue(lr >= 0.0001 && lr <= 0.1); - double l2 = TestUtils.getL2(((BaseLayer) layerConf.getLayer())); - assertTrue(l2 >= 0.2 && l2 <= 0.5); - - if (j == nLayers - 1) { //Output layer - assertEquals(Activation.SOFTMAX.getActivationFunction(), - ((BaseLayer) layerConf.getLayer()).getActivationFn()); - } else { - IActivation actFn = ((BaseLayer) layerConf.getLayer()).getActivationFn(); - assertTrue(Activation.RELU.getActivationFunction().equals(actFn) || - Activation.TANH.getActivationFunction().equals(actFn)); - if (Activation.RELU.getActivationFunction().equals(actFn)) - reluCount++; - else - tanhCount++; - } - } - } - -// System.out.println("ReLU vs. 
Tanh: " + reluCount + "\t" + tanhCount); - assertTrue(reluCount > 0); - assertTrue(tanhCount > 0); - - } - - -} diff --git a/contrib/attic/arbiter/arbiter-deeplearning4j/src/test/java/org/deeplearning4j/arbiter/computationgraph/TestGraphLocalExecution.java b/contrib/attic/arbiter/arbiter-deeplearning4j/src/test/java/org/deeplearning4j/arbiter/computationgraph/TestGraphLocalExecution.java deleted file mode 100644 index dbb2e61ce..000000000 --- a/contrib/attic/arbiter/arbiter-deeplearning4j/src/test/java/org/deeplearning4j/arbiter/computationgraph/TestGraphLocalExecution.java +++ /dev/null @@ -1,376 +0,0 @@ -/* - * ****************************************************************************** - * * Copyright (c) 2021 Deeplearning4j Contributors - * * - * * This program and the accompanying materials are made available under the - * * terms of the Apache License, Version 2.0 which is available at - * * https://www.apache.org/licenses/LICENSE-2.0. - * * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * * License for the specific language governing permissions and limitations - * * under the License. 
- * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ - -package org.deeplearning4j.arbiter.computationgraph; - -import lombok.extern.slf4j.Slf4j; -import org.deeplearning4j.BaseDL4JTest; -import org.deeplearning4j.arbiter.ComputationGraphSpace; -import org.deeplearning4j.arbiter.conf.updater.AdamSpace; -import org.deeplearning4j.arbiter.conf.updater.SgdSpace; -import org.deeplearning4j.arbiter.evaluator.multilayer.ClassificationEvaluator; -import org.deeplearning4j.arbiter.layers.DenseLayerSpace; -import org.deeplearning4j.arbiter.layers.OutputLayerSpace; -import org.deeplearning4j.arbiter.multilayernetwork.TestDL4JLocalExecution; -import org.deeplearning4j.arbiter.optimize.api.CandidateGenerator; -import org.deeplearning4j.arbiter.optimize.api.data.DataProvider; -import org.deeplearning4j.arbiter.optimize.api.data.DataSetIteratorFactoryProvider; -import org.deeplearning4j.arbiter.optimize.api.data.DataSource; -import org.deeplearning4j.arbiter.optimize.api.saving.ResultReference; -import org.deeplearning4j.arbiter.optimize.api.termination.MaxCandidatesCondition; -import org.deeplearning4j.arbiter.optimize.api.termination.MaxTimeCondition; -import org.deeplearning4j.arbiter.optimize.generator.RandomSearchGenerator; -import org.deeplearning4j.arbiter.optimize.config.OptimizationConfiguration; -import org.deeplearning4j.arbiter.optimize.parameter.continuous.ContinuousParameterSpace; -import org.deeplearning4j.arbiter.optimize.parameter.discrete.DiscreteParameterSpace; -import org.deeplearning4j.arbiter.optimize.parameter.integer.IntegerParameterSpace; -import org.deeplearning4j.arbiter.optimize.runner.IOptimizationRunner; -import org.deeplearning4j.arbiter.optimize.runner.LocalOptimizationRunner; -import org.deeplearning4j.arbiter.saver.local.FileModelSaver; -import org.deeplearning4j.arbiter.scoring.ScoreFunctions; -import 
org.deeplearning4j.arbiter.scoring.impl.TestSetLossScoreFunction; -import org.deeplearning4j.arbiter.task.ComputationGraphTaskCreator; -import org.deeplearning4j.arbiter.util.TestDataFactoryProviderMnist; -import org.deeplearning4j.datasets.iterator.MultipleEpochsIterator; -import org.deeplearning4j.datasets.iterator.impl.MnistDataSetIterator; -import org.deeplearning4j.earlystopping.EarlyStoppingConfiguration; -import org.deeplearning4j.earlystopping.saver.InMemoryModelSaver; -import org.deeplearning4j.earlystopping.scorecalc.DataSetLossCalculatorCG; -import org.deeplearning4j.earlystopping.scorecalc.ScoreCalculator; -import org.deeplearning4j.earlystopping.termination.MaxEpochsTerminationCondition; -import org.deeplearning4j.nn.api.OptimizationAlgorithm; -import org.deeplearning4j.nn.conf.inputs.InputType; -import org.deeplearning4j.nn.graph.ComputationGraph; -import org.junit.BeforeClass; -import org.junit.Rule; -import org.junit.Test; -import org.junit.rules.TemporaryFolder; -import org.nd4j.linalg.activations.Activation; -import org.nd4j.linalg.api.buffer.DataType; -import org.nd4j.linalg.dataset.adapter.MultiDataSetIteratorAdapter; -import org.nd4j.linalg.dataset.api.iterator.DataSetIterator; -import org.nd4j.linalg.dataset.api.iterator.MultiDataSetIterator; -import org.nd4j.linalg.factory.Nd4j; -import org.nd4j.common.function.Supplier; -import org.nd4j.linalg.lossfunctions.LossFunctions; -import org.nd4j.shade.jackson.annotation.JsonProperty; - -import java.io.File; -import java.io.IOException; -import java.io.Serializable; -import java.util.HashMap; -import java.util.List; -import java.util.Map; -import java.util.Properties; -import java.util.concurrent.TimeUnit; - -import static org.junit.Assert.assertEquals; -import static org.junit.Assert.assertTrue; - -@Slf4j -public class TestGraphLocalExecution extends BaseDL4JTest { - - @Rule - public TemporaryFolder testDir = new TemporaryFolder(); - - @BeforeClass - public static void before(){ - 
Nd4j.setDefaultDataTypes(DataType.FLOAT, DataType.FLOAT); - } - - @Override - public long getTimeoutMilliseconds() { - return 120_000L; - } - - @Test - public void testLocalExecutionDataSources() throws Exception { - - for( int dataApproach = 0; dataApproach<3; dataApproach++ ) { - log.info("////////////////// Starting Test: {} ///////////////////", dataApproach); - - //Define: network config (hyperparameter space) - ComputationGraphSpace mls = new ComputationGraphSpace.Builder() - .optimizationAlgo(OptimizationAlgorithm.STOCHASTIC_GRADIENT_DESCENT) - .updater(new SgdSpace(new ContinuousParameterSpace(0.0001, 0.1))) - .l2(new ContinuousParameterSpace(0.0001, 0.01)) - .addInputs("in") - .addLayer("0", - new DenseLayerSpace.Builder().nIn(784).nOut(new IntegerParameterSpace(10, 20)) - .activation(new DiscreteParameterSpace<>(Activation.RELU, - Activation.TANH)) - .build(), "in") //1-2 identical layers (except nIn) - .addLayer("1", new OutputLayerSpace.Builder().nOut(10).activation(Activation.SOFTMAX) - .lossFunction(LossFunctions.LossFunction.MCXENT).build(), "0") - .setOutputs("1") - .setInputTypes(InputType.feedForward(784)) - .numEpochs(3).build(); - - DataProvider dp = null; - Class<? extends DataSource> ds = null; - Properties dsP = null; - CandidateGenerator candidateGenerator; - - if(dataApproach == 0){ - ds = TestDL4JLocalExecution.MnistDataSource.class; - dsP = new Properties(); - dsP.setProperty("minibatch", "2"); - candidateGenerator = new RandomSearchGenerator(mls); - } else if(dataApproach == 1) { - //DataProvider approach - dp = new TestDL4JLocalExecution.MnistDataProvider(); - candidateGenerator = new RandomSearchGenerator(mls); - } else { - //Factory approach - Map<String, Object> commands = new HashMap<>(); - commands.put(DataSetIteratorFactoryProvider.FACTORY_KEY, TestDataFactoryProviderMnist.class.getCanonicalName()); - - candidateGenerator = new RandomSearchGenerator(mls, commands); - dp = new DataSetIteratorFactoryProvider(); - } - - File f = testDir.newFolder(); - File modelSave = 
new File(f, "modelSaveDir"); - - OptimizationConfiguration configuration = new OptimizationConfiguration.Builder() - .candidateGenerator(candidateGenerator) - .dataProvider(dp) - .dataSource(ds, dsP) - .modelSaver(new FileModelSaver(modelSave)) - .scoreFunction(new TestSetLossScoreFunction()) - .terminationConditions(new MaxTimeCondition(20, TimeUnit.SECONDS), - new MaxCandidatesCondition(3)) - .build(); - - IOptimizationRunner runner = new LocalOptimizationRunner(configuration,new ComputationGraphTaskCreator(new ClassificationEvaluator())); - - runner.execute(); - - List<ResultReference> results = runner.getResults(); - assertTrue(results.size() > 0); - -// System.out.println("----- COMPLETE - " + results.size() + " results -----"); - } - } - - - @Test - public void testLocalExecution() throws Exception { - Map<String, Object> commands = new HashMap<>(); - commands.put(DataSetIteratorFactoryProvider.FACTORY_KEY, TestDataFactoryProviderMnist.class.getCanonicalName()); - - //Define: network config (hyperparameter space) - ComputationGraphSpace mls = new ComputationGraphSpace.Builder() - .optimizationAlgo(OptimizationAlgorithm.STOCHASTIC_GRADIENT_DESCENT) - .updater(new SgdSpace(new ContinuousParameterSpace(0.0001, 0.1))) - .l2(new ContinuousParameterSpace(0.0001, 0.01)).addInputs("in") - .setInputTypes(InputType.feedForward(4)) - .addLayer("layer0", - new DenseLayerSpace.Builder().nIn(784).nOut(new IntegerParameterSpace(2, 10)) - .activation(new DiscreteParameterSpace<>(Activation.RELU, Activation.TANH)) - .build(), - "in") - .addLayer("out", new OutputLayerSpace.Builder().nOut(10).activation(Activation.SOFTMAX) - .lossFunction(LossFunctions.LossFunction.MCXENT).build(), "layer0") - .setOutputs("out").numEpochs(3).build(); - - //Define configuration: - CandidateGenerator candidateGenerator = new RandomSearchGenerator(mls, commands); - DataProvider dataProvider = new DataSetIteratorFactoryProvider(); - - String modelSavePath = new File(System.getProperty("java.io.tmpdir"), 
"ArbiterDL4JTest\\").getAbsolutePath(); - - File f = new File(modelSavePath); - if (f.exists()) - f.delete(); - f.mkdir(); - f.deleteOnExit(); - if (!f.exists()) - throw new RuntimeException(); - - OptimizationConfiguration configuration = new OptimizationConfiguration.Builder() - .candidateGenerator(candidateGenerator).dataProvider(dataProvider) - .modelSaver(new FileModelSaver(modelSavePath)).scoreFunction(ScoreFunctions.testSetLoss(true)) - .terminationConditions(new MaxTimeCondition(30, TimeUnit.SECONDS), - new MaxCandidatesCondition(3)) - .build(); - - IOptimizationRunner runner = new LocalOptimizationRunner(configuration, - new ComputationGraphTaskCreator(new ClassificationEvaluator())); - - runner.execute(); - - assertEquals(0, runner.numCandidatesFailed()); - assertTrue(runner.numCandidatesCompleted() > 0); - } - - @Test - public void testLocalExecutionMDS() throws Exception { - //Define: network config (hyperparameter space) - ComputationGraphSpace mls = new ComputationGraphSpace.Builder() - .optimizationAlgo(OptimizationAlgorithm.STOCHASTIC_GRADIENT_DESCENT) - .updater(new SgdSpace(new ContinuousParameterSpace(0.0001, 0.1))) - .l2(new ContinuousParameterSpace(0.0001, 0.01)).addInputs("in") - .setInputTypes(InputType.feedForward(784)) - .addLayer("layer0", - new DenseLayerSpace.Builder().nIn(784).nOut(new IntegerParameterSpace(2, 10)) - .activation(new DiscreteParameterSpace<>(Activation.RELU, Activation.TANH)) - .build(), - "in") - .addLayer("out", new OutputLayerSpace.Builder().nOut(10).activation(Activation.SOFTMAX) - .lossFunction(LossFunctions.LossFunction.MCXENT).build(), "layer0") - .setOutputs("out").numEpochs(3).build(); - - //Define configuration: - CandidateGenerator candidateGenerator = new RandomSearchGenerator(mls, null); - - String modelSavePath = new File(System.getProperty("java.io.tmpdir"), "ArbiterDL4JTest\\").getAbsolutePath(); - - File f = new File(modelSavePath); - if (f.exists()) - f.delete(); - f.mkdir(); - f.deleteOnExit(); - if 
(!f.exists()) - throw new RuntimeException(); - - OptimizationConfiguration configuration = new OptimizationConfiguration.Builder() - .candidateGenerator(candidateGenerator) - .dataProvider(new TestMdsDataProvider(1, 32)) - .modelSaver(new FileModelSaver(modelSavePath)).scoreFunction(ScoreFunctions.testSetLoss(true)) - .terminationConditions(new MaxTimeCondition(30, TimeUnit.SECONDS), - new MaxCandidatesCondition(3)) - .scoreFunction(ScoreFunctions.testSetAccuracy()) - .build(); - - IOptimizationRunner runner = new LocalOptimizationRunner(configuration, new ComputationGraphTaskCreator()); - - runner.execute(); - - assertEquals(0, runner.numCandidatesFailed()); - assertTrue(runner.numCandidatesCompleted() > 0); - } - - public static class TestMdsDataProvider implements DataProvider { - private int numEpochs; - private int batchSize; - - public TestMdsDataProvider(@JsonProperty("numEpochs") int numEpochs, @JsonProperty("batchSize") int batchSize) { - this.numEpochs = numEpochs; - this.batchSize = batchSize; - } - - private TestMdsDataProvider() { - } - - - @Override - public Object trainData(Map<String, Object> dataParameters) { - try { - DataSetIterator underlying = new MnistDataSetIterator(batchSize, Math.min(60000, 3 * batchSize), false, true, true, 12345); - return new MultiDataSetIteratorAdapter(new MultipleEpochsIterator(numEpochs, underlying)); - } catch (IOException e) { - throw new RuntimeException(e); - } - } - - @Override - public Object testData(Map<String, Object> dataParameters) { - try { - DataSetIterator underlying = new MnistDataSetIterator(batchSize, Math.min(10000, 2 * batchSize), false, false, false, 12345); - return new MultiDataSetIteratorAdapter(underlying); - } catch (IOException e) { - throw new RuntimeException(e); - } - } - - @Override - public Class<?> getDataType() { - return MultiDataSetIterator.class; - } - } - - @Test - public void testLocalExecutionEarlyStopping() throws Exception { - EarlyStoppingConfiguration esConf = new EarlyStoppingConfiguration.Builder() - 
.epochTerminationConditions(new MaxEpochsTerminationCondition(2)) - .scoreCalculator(new ScoreProvider()) - .modelSaver(new InMemoryModelSaver()).build(); - Map<String, Object> commands = new HashMap<>(); - commands.put(DataSetIteratorFactoryProvider.FACTORY_KEY, TestDataFactoryProviderMnist.class.getCanonicalName()); - - //Define: network config (hyperparameter space) - ComputationGraphSpace cgs = new ComputationGraphSpace.Builder() - .optimizationAlgo(OptimizationAlgorithm.STOCHASTIC_GRADIENT_DESCENT) - .updater(new AdamSpace(new ContinuousParameterSpace(0.0001, 0.1))) - .l2(new ContinuousParameterSpace(0.0001, 0.01)).addInputs("in") - .setInputTypes(InputType.feedForward(784)) - .addLayer("first", - new DenseLayerSpace.Builder().nIn(784).nOut(new IntegerParameterSpace(2, 10)) - .activation(new DiscreteParameterSpace<>(Activation.RELU, - Activation.TANH)) - .build(), - "in") //1-2 identical layers (except nIn) - .addLayer("out", new OutputLayerSpace.Builder().nOut(10).activation(Activation.SOFTMAX) - .lossFunction(LossFunctions.LossFunction.MCXENT).build(), "first") - .setOutputs("out").earlyStoppingConfiguration(esConf).build(); - - //Define configuration: - - CandidateGenerator candidateGenerator = new RandomSearchGenerator(cgs, commands); - DataProvider dataProvider = new DataSetIteratorFactoryProvider(); - - - String modelSavePath = new File(System.getProperty("java.io.tmpdir"), "ArbiterDL4JTest2CG\\").getAbsolutePath(); - - File f = new File(modelSavePath); - if (f.exists()) - f.delete(); - f.mkdir(); - f.deleteOnExit(); - if (!f.exists()) - throw new RuntimeException(); - - OptimizationConfiguration configuration = new OptimizationConfiguration.Builder() - .candidateGenerator(candidateGenerator) - .dataProvider(dataProvider) - .scoreFunction(ScoreFunctions.testSetF1()) - .modelSaver(new FileModelSaver(modelSavePath)) - .terminationConditions(new MaxTimeCondition(15, TimeUnit.SECONDS), - new MaxCandidatesCondition(3)) - .build(); - - - IOptimizationRunner runner = new 
LocalOptimizationRunner(configuration, new ComputationGraphTaskCreator()); - runner.execute(); - - assertEquals(0, runner.numCandidatesFailed()); - assertTrue(runner.numCandidatesCompleted() > 0); - } - - private static class ScoreProvider implements Supplier<ScoreCalculator>, Serializable { - @Override - public ScoreCalculator get() { - try { - return new DataSetLossCalculatorCG(new MnistDataSetIterator(4, 8), true); - } catch (Exception e){ - throw new RuntimeException(e); - } - } - } -} diff --git a/contrib/attic/arbiter/arbiter-deeplearning4j/src/test/java/org/deeplearning4j/arbiter/computationgraph/TestGraphLocalExecutionGenetic.java b/contrib/attic/arbiter/arbiter-deeplearning4j/src/test/java/org/deeplearning4j/arbiter/computationgraph/TestGraphLocalExecutionGenetic.java deleted file mode 100644 index 4b853db55..000000000 --- a/contrib/attic/arbiter/arbiter-deeplearning4j/src/test/java/org/deeplearning4j/arbiter/computationgraph/TestGraphLocalExecutionGenetic.java +++ /dev/null @@ -1,215 +0,0 @@ -/* - * ****************************************************************************** - * * Copyright (c) 2021 Deeplearning4j Contributors - * * - * * This program and the accompanying materials are made available under the - * * terms of the Apache License, Version 2.0 which is available at - * * https://www.apache.org/licenses/LICENSE-2.0. - * * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * * License for the specific language governing permissions and limitations - * * under the License. 
- * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ - -package org.deeplearning4j.arbiter.computationgraph; - -import lombok.extern.slf4j.Slf4j; -import org.deeplearning4j.BaseDL4JTest; -import org.deeplearning4j.arbiter.ComputationGraphSpace; -import org.deeplearning4j.arbiter.conf.updater.SgdSpace; -import org.deeplearning4j.arbiter.evaluator.multilayer.ClassificationEvaluator; -import org.deeplearning4j.arbiter.layers.DenseLayerSpace; -import org.deeplearning4j.arbiter.layers.OutputLayerSpace; -import org.deeplearning4j.arbiter.multilayernetwork.TestDL4JLocalExecution; -import org.deeplearning4j.arbiter.optimize.api.CandidateGenerator; -import org.deeplearning4j.arbiter.optimize.api.data.DataProvider; -import org.deeplearning4j.arbiter.optimize.api.data.DataSetIteratorFactoryProvider; -import org.deeplearning4j.arbiter.optimize.api.data.DataSource; -import org.deeplearning4j.arbiter.optimize.api.saving.ResultReference; -import org.deeplearning4j.arbiter.optimize.api.termination.MaxCandidatesCondition; -import org.deeplearning4j.arbiter.optimize.api.termination.MaxTimeCondition; -import org.deeplearning4j.arbiter.optimize.config.OptimizationConfiguration; -import org.deeplearning4j.arbiter.optimize.generator.GeneticSearchCandidateGenerator; -import org.deeplearning4j.arbiter.optimize.generator.genetic.population.PopulationModel; -import org.deeplearning4j.arbiter.optimize.parameter.continuous.ContinuousParameterSpace; -import org.deeplearning4j.arbiter.optimize.parameter.discrete.DiscreteParameterSpace; -import org.deeplearning4j.arbiter.optimize.parameter.integer.IntegerParameterSpace; -import org.deeplearning4j.arbiter.optimize.runner.IOptimizationRunner; -import org.deeplearning4j.arbiter.optimize.runner.LocalOptimizationRunner; -import org.deeplearning4j.arbiter.saver.local.FileModelSaver; -import org.deeplearning4j.arbiter.scoring.impl.TestSetLossScoreFunction; -import 
org.deeplearning4j.arbiter.task.ComputationGraphTaskCreator; -import org.deeplearning4j.arbiter.util.TestDataFactoryProviderMnist; -import org.deeplearning4j.datasets.iterator.MultipleEpochsIterator; -import org.deeplearning4j.datasets.iterator.impl.MnistDataSetIterator; -import org.deeplearning4j.earlystopping.scorecalc.DataSetLossCalculatorCG; -import org.deeplearning4j.earlystopping.scorecalc.ScoreCalculator; -import org.deeplearning4j.nn.api.OptimizationAlgorithm; -import org.deeplearning4j.nn.conf.inputs.InputType; -import org.junit.Rule; -import org.junit.Test; -import org.junit.rules.TemporaryFolder; -import org.nd4j.linalg.activations.Activation; -import org.nd4j.linalg.dataset.adapter.MultiDataSetIteratorAdapter; -import org.nd4j.linalg.dataset.api.iterator.DataSetIterator; -import org.nd4j.linalg.dataset.api.iterator.MultiDataSetIterator; -import org.nd4j.common.function.Supplier; -import org.nd4j.linalg.lossfunctions.LossFunctions; -import org.nd4j.shade.jackson.annotation.JsonProperty; - -import java.io.File; -import java.io.IOException; -import java.io.Serializable; -import java.util.HashMap; -import java.util.List; -import java.util.Map; -import java.util.Properties; -import java.util.concurrent.TimeUnit; - -import static org.junit.Assert.assertTrue; - -@Slf4j -public class TestGraphLocalExecutionGenetic extends BaseDL4JTest { - - @Rule - public TemporaryFolder testDir = new TemporaryFolder(); - - @Override - public long getTimeoutMilliseconds() { - return 120_000L; - } - - @Test - public void testLocalExecutionDataSources() throws Exception { - for (int dataApproach = 0; dataApproach < 3; dataApproach++) { - log.info("////////////////// Starting Test: {} ///////////////////", dataApproach); - - //Define: network config (hyperparameter space) - ComputationGraphSpace mls = new ComputationGraphSpace.Builder() - .optimizationAlgo(OptimizationAlgorithm.STOCHASTIC_GRADIENT_DESCENT) - .updater(new SgdSpace(new ContinuousParameterSpace(0.0001, 0.1))) - 
.l2(new ContinuousParameterSpace(0.0001, 0.01)) - .addInputs("in") - .addLayer("0", - new DenseLayerSpace.Builder().nIn(784).nOut(new IntegerParameterSpace(5, 32)) - .activation(new DiscreteParameterSpace<>(Activation.RELU, Activation.TANH, Activation.LEAKYRELU)) - .build(), "in") //1-2 identical layers (except nIn) - .addLayer("1", new OutputLayerSpace.Builder().nOut(10).activation(Activation.SOFTMAX) - .lossFunction(LossFunctions.LossFunction.MCXENT).build(), "0") - .setOutputs("1") - .setInputTypes(InputType.feedForward(784)) - .numEpochs(3).build(); - - DataProvider dp = null; - Class<? extends DataSource> ds = null; - Properties dsP = null; - CandidateGenerator candidateGenerator; - - TestSetLossScoreFunction scoreFunction = new TestSetLossScoreFunction(); - - if (dataApproach == 0) { - ds = TestDL4JLocalExecution.MnistDataSource.class; - dsP = new Properties(); - dsP.setProperty("minibatch", "2"); - - candidateGenerator = new GeneticSearchCandidateGenerator.Builder(mls, scoreFunction) - .populationModel(new PopulationModel.Builder().populationSize(5).build()) - .build(); - } else if (dataApproach == 1) { - //DataProvider approach - dp = new TestDL4JLocalExecution.MnistDataProvider(); - - candidateGenerator = new GeneticSearchCandidateGenerator.Builder(mls, scoreFunction) - .populationModel(new PopulationModel.Builder().populationSize(5).build()) - .build(); - } else { - //Factory approach - Map<String, Object> commands = new HashMap<>(); - commands.put(DataSetIteratorFactoryProvider.FACTORY_KEY, TestDataFactoryProviderMnist.class.getCanonicalName()); - - candidateGenerator = new GeneticSearchCandidateGenerator.Builder(mls, scoreFunction) - .dataParameters(commands) - .populationModel(new PopulationModel.Builder().populationSize(5).build()) - .build(); - dp = new DataSetIteratorFactoryProvider(); - } - - File f = testDir.newFolder(); - File modelSave = new File(f, "modelSaveDir"); - - OptimizationConfiguration configuration = new OptimizationConfiguration.Builder() - 
.candidateGenerator(candidateGenerator) - .dataProvider(dp) - .dataSource(ds, dsP) - .modelSaver(new FileModelSaver(modelSave)) - .scoreFunction(new TestSetLossScoreFunction()) - .terminationConditions(new MaxTimeCondition(20, TimeUnit.SECONDS), - new MaxCandidatesCondition(3)) - .build(); - - IOptimizationRunner runner = new LocalOptimizationRunner(configuration, new ComputationGraphTaskCreator(new ClassificationEvaluator())); - - runner.execute(); - - List<ResultReference> results = runner.getResults(); - assertTrue(results.size() > 0); - -// System.out.println("----- COMPLETE - " + results.size() + " results -----"); - } - } - - public static class TestMdsDataProvider implements DataProvider { - private int numEpochs; - private int batchSize; - - public TestMdsDataProvider(@JsonProperty("numEpochs") int numEpochs, @JsonProperty("batchSize") int batchSize) { - this.numEpochs = numEpochs; - this.batchSize = batchSize; - } - - private TestMdsDataProvider() { - } - - - @Override - public Object trainData(Map<String, Object> dataParameters) { - try { - DataSetIterator underlying = new MnistDataSetIterator(batchSize, Math.min(60000, 10 * batchSize), false, true, true, 12345); - return new MultiDataSetIteratorAdapter(new MultipleEpochsIterator(numEpochs, underlying)); - } catch (IOException e) { - throw new RuntimeException(e); - } - } - - @Override - public Object testData(Map<String, Object> dataParameters) { - try { - DataSetIterator underlying = new MnistDataSetIterator(batchSize, Math.min(10000, 5 * batchSize), false, false, false, 12345); - return new MultiDataSetIteratorAdapter(underlying); - } catch (IOException e) { - throw new RuntimeException(e); - } - } - - @Override - public Class<?> getDataType() { - return MultiDataSetIterator.class; - } - } - - private static class ScoreProvider implements Supplier<ScoreCalculator>, Serializable { - @Override - public ScoreCalculator get() { - try { - return new DataSetLossCalculatorCG(new MnistDataSetIterator(128, 1280), true); - } catch (Exception e){ - throw new RuntimeException(e); - 
} - } - } -} diff --git a/contrib/attic/arbiter/arbiter-deeplearning4j/src/test/java/org/deeplearning4j/arbiter/json/TestJson.java b/contrib/attic/arbiter/arbiter-deeplearning4j/src/test/java/org/deeplearning4j/arbiter/json/TestJson.java deleted file mode 100644 index da588d9dd..000000000 --- a/contrib/attic/arbiter/arbiter-deeplearning4j/src/test/java/org/deeplearning4j/arbiter/json/TestJson.java +++ /dev/null @@ -1,265 +0,0 @@ -/* - * ****************************************************************************** - * * Copyright (c) 2021 Deeplearning4j Contributors - * * - * * This program and the accompanying materials are made available under the - * * terms of the Apache License, Version 2.0 which is available at - * * https://www.apache.org/licenses/LICENSE-2.0. - * * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * * License for the specific language governing permissions and limitations - * * under the License. 
- * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ - -package org.deeplearning4j.arbiter.json; - -import org.deeplearning4j.BaseDL4JTest; -import org.deeplearning4j.arbiter.ComputationGraphSpace; -import org.deeplearning4j.arbiter.MultiLayerSpace; -import org.deeplearning4j.arbiter.conf.updater.AdaMaxSpace; -import org.deeplearning4j.arbiter.conf.updater.AdamSpace; -import org.deeplearning4j.arbiter.conf.updater.SgdSpace; -import org.deeplearning4j.arbiter.layers.DenseLayerSpace; -import org.deeplearning4j.arbiter.layers.OutputLayerSpace; -import org.deeplearning4j.arbiter.optimize.api.CandidateGenerator; -import org.deeplearning4j.arbiter.optimize.api.ParameterSpace; -import org.deeplearning4j.arbiter.optimize.api.data.DataProvider; -import org.deeplearning4j.arbiter.optimize.api.data.DataSetIteratorFactoryProvider; -import org.deeplearning4j.arbiter.optimize.api.data.DataSource; -import org.deeplearning4j.arbiter.optimize.api.score.ScoreFunction; -import org.deeplearning4j.arbiter.optimize.api.termination.MaxCandidatesCondition; -import org.deeplearning4j.arbiter.optimize.api.termination.MaxTimeCondition; -import org.deeplearning4j.arbiter.optimize.generator.RandomSearchGenerator; -import org.deeplearning4j.arbiter.optimize.config.OptimizationConfiguration; -import org.deeplearning4j.arbiter.optimize.parameter.continuous.ContinuousParameterSpace; -import org.deeplearning4j.arbiter.optimize.parameter.discrete.DiscreteParameterSpace; -import org.deeplearning4j.arbiter.optimize.parameter.integer.IntegerParameterSpace; -import org.deeplearning4j.arbiter.optimize.serde.jackson.JsonMapper; -import org.deeplearning4j.arbiter.scoring.RegressionValue; -import org.deeplearning4j.arbiter.scoring.ScoreFunctions; -import org.deeplearning4j.arbiter.scoring.impl.TestSetLossScoreFunction; -import org.deeplearning4j.arbiter.util.TestDataFactoryProviderMnist; -import 
org.deeplearning4j.datasets.iterator.impl.IrisDataSetIterator; -import org.deeplearning4j.datasets.iterator.impl.MnistDataSetIterator; -import org.deeplearning4j.earlystopping.EarlyStoppingConfiguration; -import org.deeplearning4j.earlystopping.saver.InMemoryModelSaver; -import org.deeplearning4j.earlystopping.scorecalc.DataSetLossCalculatorCG; -import org.deeplearning4j.earlystopping.termination.MaxEpochsTerminationCondition; -import org.deeplearning4j.nn.api.OptimizationAlgorithm; -import org.deeplearning4j.nn.conf.inputs.InputType; -import org.deeplearning4j.nn.graph.ComputationGraph; -import org.junit.Test; -import org.nd4j.linalg.activations.Activation; -import org.nd4j.linalg.dataset.api.iterator.DataSetIterator; -import org.nd4j.linalg.lossfunctions.LossFunctions; - -import java.util.HashMap; -import java.util.Map; -import java.util.Properties; -import java.util.concurrent.TimeUnit; - -import static org.junit.Assert.assertEquals; -import static org.junit.Assert.assertNotNull; - -/** - * Created by Alex on 14/02/2017. 
- */ -public class TestJson extends BaseDL4JTest { - - @Test - public void testMultiLayerSpaceJson() { - MultiLayerSpace mls = new MultiLayerSpace.Builder() - .optimizationAlgo(OptimizationAlgorithm.STOCHASTIC_GRADIENT_DESCENT) - .updater(new SgdSpace(new ContinuousParameterSpace(0.0001, 0.2))) - .l2(new ContinuousParameterSpace(0.0001, 0.05)) - .addLayer(new DenseLayerSpace.Builder().nIn(1).nOut(new IntegerParameterSpace(5, 30)) - .activation(new DiscreteParameterSpace<>(Activation.RELU, Activation.SOFTPLUS, - Activation.LEAKYRELU)) - .build(), new IntegerParameterSpace(1, 2), true) //1-2 identical layers - .addLayer(new DenseLayerSpace.Builder().nIn(4).nOut(new IntegerParameterSpace(2, 10)) - .activation(new DiscreteParameterSpace<>(Activation.RELU, Activation.TANH)) - .build(), new IntegerParameterSpace(0, 1), true) //0 to 1 layers - .addLayer(new OutputLayerSpace.Builder().nOut(10).activation(Activation.SOFTMAX) - .iLossFunction(LossFunctions.LossFunction.MCXENT.getILossFunction()).build()) - .setInputType(InputType.convolutional(28, 28, 1)).build(); - - String asJson = mls.toJson(); - // System.out.println(asJson); - - MultiLayerSpace fromJson = MultiLayerSpace.fromJson(asJson); - - assertEquals(mls, fromJson); - } - - - - @Test - public void testOptimizationFromJson() { - EarlyStoppingConfiguration esConf = - new EarlyStoppingConfiguration.Builder() - .epochTerminationConditions(new MaxEpochsTerminationCondition(100)) - .scoreCalculator(new DataSetLossCalculatorCG(new IrisDataSetIterator(150, 150), - true)) - .modelSaver(new InMemoryModelSaver()).build(); - - //Define: network config (hyperparameter space) - ComputationGraphSpace cgs = new ComputationGraphSpace.Builder() - .optimizationAlgo(OptimizationAlgorithm.STOCHASTIC_GRADIENT_DESCENT) - .updater(new AdaMaxSpace(new ContinuousParameterSpace(0.0001, 0.1))) - .l2(new ContinuousParameterSpace(0.0001, 0.01)).addInputs("in") - .setInputTypes(InputType.feedForward(4)) - .addLayer("first", - new 
DenseLayerSpace.Builder().nIn(4).nOut(new IntegerParameterSpace(2, 10)) - .activation(new DiscreteParameterSpace<>(Activation.RELU, - Activation.TANH)) - .build(), - "in") //1-2 identical layers (except nIn) - .addLayer("out", new OutputLayerSpace.Builder().nOut(3).activation(Activation.SOFTMAX) - .lossFunction(LossFunctions.LossFunction.MCXENT).build(), "first") - .setOutputs("out").earlyStoppingConfiguration(esConf).build(); - - //Define configuration: - Map commands = new HashMap<>(); - commands.put(DataSetIteratorFactoryProvider.FACTORY_KEY, TestDataFactoryProviderMnist.class.getCanonicalName()); - - CandidateGenerator candidateGenerator = new RandomSearchGenerator(cgs, commands); - DataProvider dataProvider = new DataSetIteratorFactoryProvider(); - - - OptimizationConfiguration configuration = - new OptimizationConfiguration.Builder().candidateGenerator(candidateGenerator) - .dataProvider(dataProvider).scoreFunction(new TestSetLossScoreFunction()) - .terminationConditions(new MaxTimeCondition(2, TimeUnit.MINUTES), - new MaxCandidatesCondition(100)) - .build(); - - String json = configuration.toJson(); - OptimizationConfiguration loadConf = OptimizationConfiguration.fromJson(json); - assertEquals(configuration, loadConf); - } - - @Test - public void testOptimizationFromJsonDataSource() { - for(boolean withProperties : new boolean[]{false, true}) { - //Define: network config (hyperparameter space) - ComputationGraphSpace cgs = new ComputationGraphSpace.Builder() - .optimizationAlgo(OptimizationAlgorithm.STOCHASTIC_GRADIENT_DESCENT) - .updater(new AdaMaxSpace(new ContinuousParameterSpace(0.0001, 0.1))) - .l2(new ContinuousParameterSpace(0.0001, 0.01)).addInputs("in") - .setInputTypes(InputType.feedForward(4)) - .addLayer("first", - new DenseLayerSpace.Builder().nIn(4).nOut(new IntegerParameterSpace(2, 10)) - .activation(new DiscreteParameterSpace<>(Activation.RELU, - Activation.TANH)) - .build(), - "in") //1-2 identical layers (except nIn) - .addLayer("out", new 
OutputLayerSpace.Builder().nOut(3).activation(Activation.SOFTMAX) - .lossFunction(LossFunctions.LossFunction.MCXENT).build(), "first") - .setOutputs("out").build(); - - //Define configuration: - Map commands = new HashMap<>(); - commands.put(DataSetIteratorFactoryProvider.FACTORY_KEY, TestDataFactoryProviderMnist.class.getCanonicalName()); - - CandidateGenerator candidateGenerator = new RandomSearchGenerator(cgs, commands); - - Properties p = new Properties(); - p.setProperty("minibatch", "16"); - - OptimizationConfiguration configuration = - new OptimizationConfiguration.Builder().candidateGenerator(candidateGenerator) - .dataSource(MnistDataSource.class, (withProperties ? p : null)) - .scoreFunction(new TestSetLossScoreFunction()) - .terminationConditions(new MaxTimeCondition(2, TimeUnit.MINUTES), - new MaxCandidatesCondition(100)) - .build(); - - String json = configuration.toJson(); - OptimizationConfiguration loadConf = OptimizationConfiguration.fromJson(json); - assertEquals(configuration, loadConf); - assertNotNull(loadConf.getDataSource()); - if(withProperties){ - assertNotNull(loadConf.getDataSourceProperties()); - } - } - } - - @Test - public void testComputationGraphSpaceJson() { - ParameterSpace p = new IntegerParameterSpace(10, 100); - ComputationGraphSpace cgs = - new ComputationGraphSpace.Builder() - .updater(new AdamSpace(new DiscreteParameterSpace<>(0.1, 0.5, 1.0))) - .seed(12345).addInputs("in") - .addLayer("0", new DenseLayerSpace.Builder() - .nIn(new IntegerParameterSpace(1, 100)).nOut(p).build(), "in") - .addLayer("1", new DenseLayerSpace.Builder().nIn(p).nOut(10).build(), "0") - .addLayer("2", new OutputLayerSpace.Builder().iLossFunction( - LossFunctions.LossFunction.MCXENT.getILossFunction()).nIn(10) - .nOut(5).build(), "1") - .setOutputs("2").build(); - - String asJson = cgs.toJson(); - ComputationGraphSpace fromJson = ComputationGraphSpace.fromJson(asJson); - - assertEquals(cgs, fromJson); - } - - @Test - public void testScoreFunctionJson() 
throws Exception { - - ScoreFunction[] scoreFunctions = new ScoreFunction[]{ - ScoreFunctions.testSetAccuracy(), ScoreFunctions.testSetF1(), - ScoreFunctions.testSetLoss(true), ScoreFunctions.testSetRegression(RegressionValue.MAE), - ScoreFunctions.testSetRegression(RegressionValue.RMSE)}; - - for(ScoreFunction sc : scoreFunctions){ - String json = JsonMapper.getMapper().writeValueAsString(sc); - ScoreFunction fromJson = JsonMapper.getMapper().readValue(json, ScoreFunction.class); - - assertEquals(sc, fromJson); - } - } - - - public static class MnistDataSource implements DataSource { - private int minibatch; - - public MnistDataSource(){ - - } - - @Override - public void configure(Properties properties) { - this.minibatch = Integer.parseInt(properties.getProperty("minibatch", "16")); - } - - @Override - public Object trainData() { - try { - return new MnistDataSetIterator(minibatch, true, 12345); - } catch (Exception e){ - throw new RuntimeException(e); - } - } - - @Override - public Object testData() { - try { - return new MnistDataSetIterator(minibatch, true, 12345); - } catch (Exception e){ - throw new RuntimeException(e); - } - } - - @Override - public Class getDataType() { - return DataSetIterator.class; - } - } -} diff --git a/contrib/attic/arbiter/arbiter-deeplearning4j/src/test/java/org/deeplearning4j/arbiter/multilayernetwork/MNISTOptimizationTest.java b/contrib/attic/arbiter/arbiter-deeplearning4j/src/test/java/org/deeplearning4j/arbiter/multilayernetwork/MNISTOptimizationTest.java deleted file mode 100644 index ffc76c495..000000000 --- a/contrib/attic/arbiter/arbiter-deeplearning4j/src/test/java/org/deeplearning4j/arbiter/multilayernetwork/MNISTOptimizationTest.java +++ /dev/null @@ -1,168 +0,0 @@ -/* - * ****************************************************************************** - * * Copyright (c) 2021 Deeplearning4j Contributors - * * - * * This program and the accompanying materials are made available under the - * * terms of the Apache License, 
Version 2.0 which is available at - * * https://www.apache.org/licenses/LICENSE-2.0. - * * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * * License for the specific language governing permissions and limitations - * * under the License. - * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ - -package org.deeplearning4j.arbiter.multilayernetwork; - -import org.deeplearning4j.BaseDL4JTest; -import org.deeplearning4j.arbiter.MultiLayerSpace; -import org.deeplearning4j.arbiter.conf.updater.SgdSpace; -import org.deeplearning4j.arbiter.layers.ConvolutionLayerSpace; -import org.deeplearning4j.arbiter.layers.DenseLayerSpace; -import org.deeplearning4j.arbiter.layers.OutputLayerSpace; -import org.deeplearning4j.arbiter.optimize.api.CandidateGenerator; -import org.deeplearning4j.arbiter.optimize.api.data.DataProvider; -import org.deeplearning4j.arbiter.optimize.api.data.DataSetIteratorFactoryProvider; -import org.deeplearning4j.arbiter.optimize.api.termination.MaxCandidatesCondition; -import org.deeplearning4j.arbiter.optimize.api.termination.MaxTimeCondition; -import org.deeplearning4j.arbiter.optimize.generator.RandomSearchGenerator; -import org.deeplearning4j.arbiter.optimize.config.OptimizationConfiguration; -import org.deeplearning4j.arbiter.optimize.parameter.continuous.ContinuousParameterSpace; -import org.deeplearning4j.arbiter.optimize.parameter.discrete.DiscreteParameterSpace; -import org.deeplearning4j.arbiter.optimize.parameter.integer.IntegerParameterSpace; -import org.deeplearning4j.arbiter.optimize.runner.IOptimizationRunner; -import org.deeplearning4j.arbiter.optimize.runner.LocalOptimizationRunner; -import org.deeplearning4j.arbiter.saver.local.FileModelSaver; -import 
org.deeplearning4j.arbiter.scoring.impl.TestSetLossScoreFunction; -import org.deeplearning4j.arbiter.task.MultiLayerNetworkTaskCreator; -import org.deeplearning4j.arbiter.util.TestDataFactoryProviderMnist; -import org.deeplearning4j.datasets.iterator.impl.MnistDataSetIterator; -import org.deeplearning4j.earlystopping.EarlyStoppingConfiguration; -import org.deeplearning4j.earlystopping.saver.InMemoryModelSaver; -import org.deeplearning4j.earlystopping.scorecalc.DataSetLossCalculator; -import org.deeplearning4j.earlystopping.termination.MaxEpochsTerminationCondition; -import org.deeplearning4j.earlystopping.termination.MaxScoreIterationTerminationCondition; -import org.deeplearning4j.earlystopping.termination.MaxTimeIterationTerminationCondition; -import org.deeplearning4j.nn.api.OptimizationAlgorithm; -import org.deeplearning4j.nn.multilayer.MultiLayerNetwork; -import org.nd4j.linalg.activations.Activation; -import org.nd4j.linalg.dataset.api.iterator.DataSetIterator; -import org.nd4j.linalg.lossfunctions.LossFunctions; - -import java.io.File; -import java.util.HashMap; -import java.util.Map; -import java.util.concurrent.TimeUnit; - -// import org.deeplearning4j.arbiter.optimize.ui.ArbiterUIServer; -// import org.deeplearning4j.arbiter.optimize.ui.listener.UIOptimizationRunnerStatusListener; - -/** Not strictly a unit test. 
Rather: part example, part debugging on MNIST */ -public class MNISTOptimizationTest extends BaseDL4JTest { - - public static void main(String[] args) throws Exception { - EarlyStoppingConfiguration esConf = - new EarlyStoppingConfiguration.Builder() - .epochTerminationConditions(new MaxEpochsTerminationCondition(3)) - .iterationTerminationConditions( - new MaxTimeIterationTerminationCondition(5, TimeUnit.MINUTES), - new MaxScoreIterationTerminationCondition(4.6) //Random score: -log_e(0.1) ~= 2.3 - ).scoreCalculator(new DataSetLossCalculator(new MnistDataSetIterator(64, 2000, false, false, true, 123), true)).modelSaver(new InMemoryModelSaver()).build(); - - //Define: network config (hyperparameter space) - MultiLayerSpace mls = new MultiLayerSpace.Builder() - .optimizationAlgo(OptimizationAlgorithm.STOCHASTIC_GRADIENT_DESCENT) - .updater(new SgdSpace(new ContinuousParameterSpace(0.0001, 0.2))) - .l2(new ContinuousParameterSpace(0.0001, 0.05)) - .addLayer( - new ConvolutionLayerSpace.Builder().nIn(1) - .nOut(new IntegerParameterSpace(5, 30)) - .kernelSize(new DiscreteParameterSpace<>(new int[] {3, 3}, - new int[] {4, 4}, new int[] {5, 5})) - .stride(new DiscreteParameterSpace<>(new int[] {1, 1}, - new int[] {2, 2})) - .activation(new DiscreteParameterSpace<>(Activation.RELU, - Activation.SOFTPLUS, Activation.LEAKYRELU)) - .build(), - new IntegerParameterSpace(1, 2)) //1-2 identical layers - .addLayer(new DenseLayerSpace.Builder().nIn(4).nOut(new IntegerParameterSpace(2, 10)) - .activation(new DiscreteParameterSpace<>(Activation.RELU, Activation.TANH)) - .build(), new IntegerParameterSpace(0, 1)) //0 to 1 layers - .addLayer(new OutputLayerSpace.Builder().nOut(10).activation(Activation.SOFTMAX) - .lossFunction(LossFunctions.LossFunction.MCXENT).build()) - .earlyStoppingConfiguration(esConf).build(); - Map commands = new HashMap<>(); - commands.put(DataSetIteratorFactoryProvider.FACTORY_KEY, TestDataFactoryProviderMnist.class.getCanonicalName()); - - //Define 
configuration: - CandidateGenerator candidateGenerator = new RandomSearchGenerator(mls, commands); - DataProvider dataProvider = new MnistDataSetProvider(); - - - String modelSavePath = new File(System.getProperty("java.io.tmpdir"), "ArbiterMNISTSmall\\").getAbsolutePath(); - - File f = new File(modelSavePath); - if (f.exists()) - f.delete(); - f.mkdir(); - if (!f.exists()) - throw new RuntimeException(); - - OptimizationConfiguration configuration = new OptimizationConfiguration.Builder() - .candidateGenerator(candidateGenerator) - .dataProvider(dataProvider) - .modelSaver(new FileModelSaver(modelSavePath)).scoreFunction(new TestSetLossScoreFunction(true)) - .terminationConditions(new MaxTimeCondition(120, TimeUnit.MINUTES), - new MaxCandidatesCondition(100)) - .build(); - - IOptimizationRunner runner = new LocalOptimizationRunner(configuration, new MultiLayerNetworkTaskCreator()); - - // ArbiterUIServer server = ArbiterUIServer.getInstance(); - // runner.addListeners(new UIOptimizationRunnerStatusListener(server)); - - runner.execute(); - - - System.out.println("----- COMPLETE -----"); - } - - - private static class MnistDataSetProvider implements DataProvider { - - @Override - public DataSetIterator trainData(Map dataParameters) { - try { - if (dataParameters == null || dataParameters.isEmpty()) { - return new MnistDataSetIterator(64, 10000, false, true, true, 123); - } - if (dataParameters.containsKey("batchsize")) { - int b = (Integer) dataParameters.get("batchsize"); - return new MnistDataSetIterator(b, 10000, false, true, true, 123); - } - return new MnistDataSetIterator(64, 10000, false, true, true, 123); - } catch (Exception e) { - throw new RuntimeException(e); - } - } - - @Override - public DataSetIterator testData(Map dataParameters) { - return trainData(dataParameters); - } - - @Override - public Class getDataType() { - return DataSetIterator.class; - } - - @Override - public String toString() { - return "MnistDataSetProvider()"; - } - } -} diff --git 
a/contrib/attic/arbiter/arbiter-deeplearning4j/src/test/java/org/deeplearning4j/arbiter/multilayernetwork/MnistDataSetIteratorFactory.java b/contrib/attic/arbiter/arbiter-deeplearning4j/src/test/java/org/deeplearning4j/arbiter/multilayernetwork/MnistDataSetIteratorFactory.java deleted file mode 100644 index 563971930..000000000 --- a/contrib/attic/arbiter/arbiter-deeplearning4j/src/test/java/org/deeplearning4j/arbiter/multilayernetwork/MnistDataSetIteratorFactory.java +++ /dev/null @@ -1,44 +0,0 @@ -/* - * ****************************************************************************** - * * Copyright (c) 2021 Deeplearning4j Contributors - * * - * * This program and the accompanying materials are made available under the - * * terms of the Apache License, Version 2.0 which is available at - * * https://www.apache.org/licenses/LICENSE-2.0. - * * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * * License for the specific language governing permissions and limitations - * * under the License. - * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ - -package org.deeplearning4j.arbiter.multilayernetwork; - -import lombok.Data; -import org.deeplearning4j.datasets.iterator.impl.MnistDataSetIterator; -import org.nd4j.linalg.dataset.api.iterator.DataSetIterator; -import org.nd4j.linalg.dataset.api.iterator.DataSetIteratorFactory; - -import java.io.IOException; - -/** - * Created by agibsonccc on 3/13/17. 
- */ -@Data -public class MnistDataSetIteratorFactory implements DataSetIteratorFactory { - /** - * @return - */ - @Override - public DataSetIterator create() { - try { - return new MnistDataSetIterator(1000, 1000); - } catch (IOException e) { - throw new RuntimeException(e); - } - } -} diff --git a/contrib/attic/arbiter/arbiter-deeplearning4j/src/test/java/org/deeplearning4j/arbiter/multilayernetwork/TestDL4JLocalExecution.java b/contrib/attic/arbiter/arbiter-deeplearning4j/src/test/java/org/deeplearning4j/arbiter/multilayernetwork/TestDL4JLocalExecution.java deleted file mode 100644 index acd4a2c72..000000000 --- a/contrib/attic/arbiter/arbiter-deeplearning4j/src/test/java/org/deeplearning4j/arbiter/multilayernetwork/TestDL4JLocalExecution.java +++ /dev/null @@ -1,384 +0,0 @@ -/* - * ****************************************************************************** - * * Copyright (c) 2021 Deeplearning4j Contributors - * * - * * This program and the accompanying materials are made available under the - * * terms of the Apache License, Version 2.0 which is available at - * * https://www.apache.org/licenses/LICENSE-2.0. - * * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * * License for the specific language governing permissions and limitations - * * under the License. 
- * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ - -package org.deeplearning4j.arbiter.multilayernetwork; - -import lombok.extern.slf4j.Slf4j; -import org.deeplearning4j.BaseDL4JTest; -import org.deeplearning4j.arbiter.MultiLayerSpace; -import org.deeplearning4j.arbiter.conf.updater.SgdSpace; -import org.deeplearning4j.arbiter.evaluator.multilayer.ClassificationEvaluator; -import org.deeplearning4j.arbiter.layers.DenseLayerSpace; -import org.deeplearning4j.arbiter.layers.OCNNLayerSpace; -import org.deeplearning4j.arbiter.layers.OutputLayerSpace; -import org.deeplearning4j.arbiter.optimize.api.Candidate; -import org.deeplearning4j.arbiter.optimize.api.CandidateGenerator; -import org.deeplearning4j.arbiter.optimize.api.data.DataProvider; -import org.deeplearning4j.arbiter.optimize.api.data.DataSetIteratorFactoryProvider; -import org.deeplearning4j.arbiter.optimize.api.data.DataSource; -import org.deeplearning4j.arbiter.optimize.api.saving.ResultReference; -import org.deeplearning4j.arbiter.optimize.api.termination.MaxCandidatesCondition; -import org.deeplearning4j.arbiter.optimize.api.termination.MaxTimeCondition; -import org.deeplearning4j.arbiter.optimize.config.OptimizationConfiguration; -import org.deeplearning4j.arbiter.optimize.generator.GridSearchCandidateGenerator; -import org.deeplearning4j.arbiter.optimize.generator.RandomSearchGenerator; -import org.deeplearning4j.arbiter.optimize.parameter.continuous.ContinuousParameterSpace; -import org.deeplearning4j.arbiter.optimize.parameter.discrete.DiscreteParameterSpace; -import org.deeplearning4j.arbiter.optimize.parameter.integer.IntegerParameterSpace; -import org.deeplearning4j.arbiter.optimize.runner.IOptimizationRunner; -import org.deeplearning4j.arbiter.optimize.runner.LocalOptimizationRunner; -import org.deeplearning4j.arbiter.saver.local.FileModelSaver; -import 
org.deeplearning4j.arbiter.scoring.impl.TestSetLossScoreFunction; -import org.deeplearning4j.arbiter.task.MultiLayerNetworkTaskCreator; -import org.deeplearning4j.arbiter.util.TestDataFactoryProviderMnist; -import org.deeplearning4j.datasets.iterator.EarlyTerminationDataSetIterator; -import org.deeplearning4j.datasets.iterator.impl.IrisDataSetIterator; -import org.deeplearning4j.datasets.iterator.impl.MnistDataSetIterator; -import org.deeplearning4j.earlystopping.EarlyStoppingConfiguration; -import org.deeplearning4j.earlystopping.saver.InMemoryModelSaver; -import org.deeplearning4j.earlystopping.scorecalc.DataSetLossCalculator; -import org.deeplearning4j.earlystopping.termination.MaxEpochsTerminationCondition; -import org.deeplearning4j.nn.api.OptimizationAlgorithm; -import org.deeplearning4j.nn.conf.inputs.InputType; -import org.deeplearning4j.nn.multilayer.MultiLayerNetwork; -import org.junit.BeforeClass; -import org.junit.Ignore; -import org.junit.Rule; -import org.junit.Test; -import org.junit.rules.TemporaryFolder; -import org.nd4j.linalg.activations.Activation; -import org.nd4j.linalg.api.buffer.DataType; -import org.nd4j.linalg.dataset.api.iterator.DataSetIterator; -import org.nd4j.linalg.factory.Nd4j; -import org.nd4j.linalg.lossfunctions.LossFunctions; - -import java.io.File; -import java.util.HashMap; -import java.util.List; -import java.util.Map; -import java.util.Properties; -import java.util.concurrent.TimeUnit; - -import static org.junit.Assert.assertTrue; - -@Slf4j -public class TestDL4JLocalExecution extends BaseDL4JTest { - - @Rule - public TemporaryFolder testDir = new TemporaryFolder(); - - @BeforeClass - public static void before(){ - Nd4j.setDefaultDataTypes(DataType.FLOAT, DataType.FLOAT); - } - - @Test - public void testLocalExecution() throws Exception { - - for( int dataApproach = 0; dataApproach<3; dataApproach++ ) { - log.info("////////////////// Starting Test: {} ///////////////////", dataApproach); - - //Define: network config 
(hyperparameter space) - MultiLayerSpace mls = new MultiLayerSpace.Builder() - .optimizationAlgo(OptimizationAlgorithm.STOCHASTIC_GRADIENT_DESCENT) - .updater(new SgdSpace(new ContinuousParameterSpace(0.0001, 0.1))) - .l2(new ContinuousParameterSpace(0.0001, 0.01)) - .addLayer( - new DenseLayerSpace.Builder().nIn(784).nOut(new IntegerParameterSpace(10, 20)) - .activation(new DiscreteParameterSpace<>(Activation.RELU, - Activation.TANH)) - .build()) //1-2 identical layers (except nIn) - .addLayer(new OutputLayerSpace.Builder().nOut(10).activation(Activation.SOFTMAX) - .lossFunction(LossFunctions.LossFunction.MCXENT).build()) - .numEpochs(3).build(); - - DataProvider dp = null; - Class ds = null; - Properties dsP = null; - CandidateGenerator candidateGenerator; - - if(dataApproach == 0){ - ds = MnistDataSource.class; - dsP = new Properties(); - dsP.setProperty("minibatch", "2"); - candidateGenerator = new RandomSearchGenerator(mls); - } else if(dataApproach == 1) { - //DataProvider approach - dp = new MnistDataProvider(); - candidateGenerator = new RandomSearchGenerator(mls); - } else { - //Factory approach - Map commands = new HashMap<>(); - commands.put(DataSetIteratorFactoryProvider.FACTORY_KEY, TestDataFactoryProviderMnist.class.getCanonicalName()); - - candidateGenerator = new RandomSearchGenerator(mls, commands); - dp = new DataSetIteratorFactoryProvider(); - } - - File f = testDir.newFolder(); - File modelSave = new File(f, "modelSaveDir"); - - OptimizationConfiguration configuration = new OptimizationConfiguration.Builder() - .candidateGenerator(candidateGenerator) - .dataProvider(dp) - .dataSource(ds, dsP) - .modelSaver(new FileModelSaver(modelSave)) - .scoreFunction(new TestSetLossScoreFunction()) - .terminationConditions(new MaxTimeCondition(5, TimeUnit.SECONDS), - new MaxCandidatesCondition(5)) - .build(); - - IOptimizationRunner runner = new LocalOptimizationRunner(configuration, - new MultiLayerNetworkTaskCreator(new ClassificationEvaluator())); - - 
runner.execute(); - - List results = runner.getResults(); - assertTrue(results.size() > 0); - - System.out.println("----- COMPLETE - " + results.size() + " results -----"); - } - } - - public static class MnistDataSource implements DataSource { - private int minibatch; - - public MnistDataSource(){ - - } - - @Override - public void configure(Properties properties) { - this.minibatch = Integer.parseInt(properties.getProperty("minibatch", "16")); - } - - @Override - public Object trainData() { - try { - return new EarlyTerminationDataSetIterator(new MnistDataSetIterator(minibatch, true, 12345), 3); - } catch (Exception e){ - throw new RuntimeException(e); - } - } - - @Override - public Object testData() { - try { - return new EarlyTerminationDataSetIterator(new MnistDataSetIterator(minibatch, true, 12345), 3); - } catch (Exception e){ - throw new RuntimeException(e); - } - } - - @Override - public Class getDataType() { - return DataSetIterator.class; - } - } - - public static class MnistDataProvider implements DataProvider { - private int minibatch = 8; - - @Override - public Object trainData(Map dataParameters) { - try { - return new EarlyTerminationDataSetIterator(new MnistDataSetIterator(minibatch, true, 12345), 3); - } catch (Exception e){ - throw new RuntimeException(e); - } - } - - @Override - public Object testData(Map dataParameters) { - try { - return new EarlyTerminationDataSetIterator(new MnistDataSetIterator(minibatch, true, 12345), 3); - } catch (Exception e){ - throw new RuntimeException(e); - } - } - - @Override - public Class getDataType() { - return DataSetIterator.class; - } - } - - @Test - @org.junit.Ignore - public void testLocalExecutionGridSearch() throws Exception { - - //Define: network config (hyperparameter space) - MultiLayerSpace mls = new MultiLayerSpace.Builder() - .optimizationAlgo(OptimizationAlgorithm.STOCHASTIC_GRADIENT_DESCENT) - .updater(new SgdSpace(new ContinuousParameterSpace(0.0001, 0.2))) - .l2(new 
ContinuousParameterSpace(0.0001, 0.01)) - .addLayer( - new DenseLayerSpace.Builder().nIn(4).nOut(new IntegerParameterSpace(2, 10)) - .activation(new DiscreteParameterSpace<>(Activation.RELU, - Activation.TANH)) - .build(), - new IntegerParameterSpace(1, 2)) //1-2 identical layers (except nIn) - .addLayer(new OutputLayerSpace.Builder().nOut(3).activation(Activation.SOFTMAX) - .lossFunction(LossFunctions.LossFunction.MCXENT).build()) - .numEpochs(3).build(); - Map commands = new HashMap<>(); - commands.put(DataSetIteratorFactoryProvider.FACTORY_KEY, TestDataFactoryProviderMnist.class.getCanonicalName()); - - CandidateGenerator candidateGenerator = new GridSearchCandidateGenerator(mls, 5, - GridSearchCandidateGenerator.Mode.Sequential, commands); - DataProvider dataProvider = new DataSetIteratorFactoryProvider(); - - String modelSavePath = new File(System.getProperty("java.io.tmpdir"), "ArbiterDL4JTest/").getAbsolutePath(); - - File f = new File(modelSavePath); - if (f.exists()) - f.delete(); - f.mkdir(); - f.deleteOnExit(); - if (!f.exists()) - throw new RuntimeException(); - - OptimizationConfiguration configuration = new OptimizationConfiguration.Builder() - .candidateGenerator(candidateGenerator).dataProvider(dataProvider) - .modelSaver(new FileModelSaver(modelSavePath)).scoreFunction(new TestSetLossScoreFunction()) - .terminationConditions(new MaxTimeCondition(2, TimeUnit.MINUTES), - new MaxCandidatesCondition(100)) - .build(); - - IOptimizationRunner runner = new LocalOptimizationRunner(configuration, - new MultiLayerNetworkTaskCreator(new ClassificationEvaluator())); - - runner.execute(); - - System.out.println("----- COMPLETE -----"); - } - - @Test - @Ignore - public void testLocalExecutionEarlyStopping() throws Exception { - EarlyStoppingConfiguration esConf = new EarlyStoppingConfiguration.Builder() - .epochTerminationConditions(new MaxEpochsTerminationCondition(100)) - .scoreCalculator(new DataSetLossCalculator(new IrisDataSetIterator(150, 150), true)) - 
.modelSaver(new InMemoryModelSaver()).build(); - Map commands = new HashMap<>(); - commands.put(DataSetIteratorFactoryProvider.FACTORY_KEY, TestDataFactoryProviderMnist.class.getCanonicalName()); - - - //Define: network config (hyperparameter space) - MultiLayerSpace mls = new MultiLayerSpace.Builder() - .optimizationAlgo(OptimizationAlgorithm.STOCHASTIC_GRADIENT_DESCENT) - .updater(new SgdSpace(new ContinuousParameterSpace(0.0001, 0.1))) - .l2(new ContinuousParameterSpace(0.0001, 0.01)) - .addLayer(new DenseLayerSpace.Builder().nIn(4).nOut(new IntegerParameterSpace(2, 10)) - .activation(new DiscreteParameterSpace<>(Activation.RELU, - Activation.TANH)) - .build(), - new IntegerParameterSpace(1, 2)) //1-2 identical layers (except nIn) - .addLayer(new OutputLayerSpace.Builder().nOut(3).activation(Activation.SOFTMAX) - .lossFunction(LossFunctions.LossFunction.MCXENT).build()) - .earlyStoppingConfiguration(esConf).build(); - - //Define configuration: - - CandidateGenerator candidateGenerator = new RandomSearchGenerator(mls, commands); - DataProvider dataProvider = new DataSetIteratorFactoryProvider(); - - - String modelSavePath = new File(System.getProperty("java.io.tmpdir"), "ArbiterDL4JTest2\\").getAbsolutePath(); - - File f = new File(modelSavePath); - if (f.exists()) - f.delete(); - f.mkdir(); - f.deleteOnExit(); - if (!f.exists()) - throw new RuntimeException(); - - OptimizationConfiguration configuration = new OptimizationConfiguration.Builder() - .candidateGenerator(candidateGenerator).dataProvider(dataProvider) - .modelSaver(new FileModelSaver(modelSavePath)).scoreFunction(new TestSetLossScoreFunction()) - .terminationConditions(new MaxTimeCondition(2, TimeUnit.MINUTES), - new MaxCandidatesCondition(100)) - .build(); - - IOptimizationRunner runner = new LocalOptimizationRunner(configuration, - new MultiLayerNetworkTaskCreator(new ClassificationEvaluator())); - - runner.execute(); - System.out.println("----- COMPLETE -----"); - } - - - @Test - public void 
testOcnn() { - Map<String, Object> commands = new HashMap<>(); - commands.put(DataSetIteratorFactoryProvider.FACTORY_KEY, TestDataFactoryProviderMnist.class.getCanonicalName()); - - - //Define: network config (hyperparameter space) - MultiLayerSpace mls = new MultiLayerSpace.Builder() - .optimizationAlgo(OptimizationAlgorithm.STOCHASTIC_GRADIENT_DESCENT) - .updater(new SgdSpace(new ContinuousParameterSpace(0.0001, 0.1))) - .l2(new ContinuousParameterSpace(0.0001, 0.01)) - .addLayer( - new DenseLayerSpace.Builder().nOut(new IntegerParameterSpace(250, 500)) - .activation(new DiscreteParameterSpace<>(Activation.RELU, - Activation.TANH)) - .build(), - new IntegerParameterSpace(1, 2)) //1-2 identical layers (except nIn) - .addLayer(new OCNNLayerSpace.Builder().nu(new ContinuousParameterSpace(0.0001, 0.1)) - .numHidden(new DiscreteParameterSpace<Integer>(784 / 2,784 / 4)) - .activation(Activation.HARDSIGMOID) - .lossFunction(LossFunctions.LossFunction.MCXENT).build()) - .setInputType(InputType.convolutionalFlat(28,28,1)) - .build(); - - //Define configuration: - - CandidateGenerator candidateGenerator = new RandomSearchGenerator(mls, commands); - DataProvider dataProvider = new DataSetIteratorFactoryProvider(); - - - String modelSavePath = new File(System.getProperty("java.io.tmpdir"), "ArbiterDL4JTest3\\").getAbsolutePath(); - - File f = new File(modelSavePath); - if (f.exists()) - f.delete(); - f.mkdir(); - f.deleteOnExit(); - if (!f.exists()) - throw new RuntimeException(); - - OptimizationConfiguration configuration = new OptimizationConfiguration.Builder() - .candidateGenerator(candidateGenerator).dataProvider(dataProvider) - .modelSaver(new FileModelSaver(modelSavePath)).scoreFunction(new TestSetLossScoreFunction()) - .terminationConditions(new MaxTimeCondition(2, TimeUnit.MINUTES), - new MaxCandidatesCondition(100)) - .build(); - - - //candidate generation: uncomment execute if you want to run - IOptimizationRunner runner = new LocalOptimizationRunner(configuration, - new 
MultiLayerNetworkTaskCreator(new ClassificationEvaluator())); - - Candidate candidate = candidateGenerator.getCandidate(); - - // runner.execute(); - System.out.println("----- COMPLETE -----"); - } -} diff --git a/contrib/attic/arbiter/arbiter-deeplearning4j/src/test/java/org/deeplearning4j/arbiter/multilayernetwork/TestErrors.java b/contrib/attic/arbiter/arbiter-deeplearning4j/src/test/java/org/deeplearning4j/arbiter/multilayernetwork/TestErrors.java deleted file mode 100644 index 473adb488..000000000 --- a/contrib/attic/arbiter/arbiter-deeplearning4j/src/test/java/org/deeplearning4j/arbiter/multilayernetwork/TestErrors.java +++ /dev/null @@ -1,159 +0,0 @@ -/* - * ****************************************************************************** - * * Copyright (c) 2021 Deeplearning4j Contributors - * * - * * This program and the accompanying materials are made available under the - * * terms of the Apache License, Version 2.0 which is available at - * * https://www.apache.org/licenses/LICENSE-2.0. - * * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * * License for the specific language governing permissions and limitations - * * under the License. 
- * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ - -package org.deeplearning4j.arbiter.multilayernetwork; - -import org.deeplearning4j.BaseDL4JTest; -import org.deeplearning4j.arbiter.ComputationGraphSpace; -import org.deeplearning4j.arbiter.MultiLayerSpace; -import org.deeplearning4j.arbiter.layers.DenseLayerSpace; -import org.deeplearning4j.arbiter.layers.OutputLayerSpace; -import org.deeplearning4j.arbiter.optimize.api.CandidateGenerator; -import org.deeplearning4j.arbiter.optimize.api.termination.MaxCandidatesCondition; -import org.deeplearning4j.arbiter.optimize.config.OptimizationConfiguration; -import org.deeplearning4j.arbiter.optimize.generator.RandomSearchGenerator; -import org.deeplearning4j.arbiter.optimize.parameter.FixedValue; -import org.deeplearning4j.arbiter.optimize.runner.IOptimizationRunner; -import org.deeplearning4j.arbiter.optimize.runner.LocalOptimizationRunner; -import org.deeplearning4j.arbiter.saver.local.FileModelSaver; -import org.deeplearning4j.arbiter.scoring.impl.TestSetLossScoreFunction; -import org.deeplearning4j.arbiter.task.MultiLayerNetworkTaskCreator; -import org.deeplearning4j.arbiter.util.TestDataProviderMnist; -import org.junit.Rule; -import org.junit.Test; -import org.junit.rules.TemporaryFolder; -import org.nd4j.linalg.activations.Activation; -import org.nd4j.linalg.lossfunctions.LossFunctions; - -import java.io.File; - -public class TestErrors extends BaseDL4JTest { - - @Rule - public TemporaryFolder temp = new TemporaryFolder(); - - @Test(timeout = 20000L) - public void testAllInvalidConfig() throws Exception { - //Invalid config - basically check that this actually terminates - - File f = temp.newFolder(); - MultiLayerSpace mls = new MultiLayerSpace.Builder() - .addLayer(new DenseLayerSpace.Builder().nIn(4).nOut(new FixedValue<>(0)) //INVALID: nOut of 0 - .activation(Activation.TANH) - .build()) - .addLayer(new 
OutputLayerSpace.Builder().nOut(3).activation(Activation.SOFTMAX) - .lossFunction(LossFunctions.LossFunction.MCXENT).build()) - .build(); - - CandidateGenerator candidateGenerator = new RandomSearchGenerator(mls); - - OptimizationConfiguration configuration = new OptimizationConfiguration.Builder() - .candidateGenerator(candidateGenerator).dataProvider(new TestDataProviderMnist(32, 3)) - .modelSaver(new FileModelSaver(f)).scoreFunction(new TestSetLossScoreFunction(true)) - .terminationConditions( - new MaxCandidatesCondition(5)) - .build(); - - IOptimizationRunner runner = new LocalOptimizationRunner(configuration); - runner.execute(); - } - - - @Test(timeout = 20000L) - public void testAllInvalidDataConfigMismatch() throws Exception { - //Valid config - but mismatched with provided data - - File f = temp.newFolder(); - MultiLayerSpace mls = new MultiLayerSpace.Builder() - .addLayer(new DenseLayerSpace.Builder().nIn(4).nOut(10) //INVALID: nOut of 0 - .activation(Activation.TANH) - .build()) - .addLayer(new OutputLayerSpace.Builder().nIn(10).nOut(3).activation(Activation.SOFTMAX) - .lossFunction(LossFunctions.LossFunction.MCXENT).build()) - .build(); - - CandidateGenerator candidateGenerator = new RandomSearchGenerator(mls); - - OptimizationConfiguration configuration = new OptimizationConfiguration.Builder() - .candidateGenerator(candidateGenerator).dataProvider(new TestDataProviderMnist(32, 3)) - .modelSaver(new FileModelSaver(f)).scoreFunction(new TestSetLossScoreFunction(true)) - .terminationConditions( - new MaxCandidatesCondition(5)) - .build(); - - IOptimizationRunner runner = new LocalOptimizationRunner(configuration); - runner.execute(); - } - - - @Test(timeout = 20000L) - public void testAllInvalidConfigCG() throws Exception { - //Invalid config - basically check that this actually terminates - - File f = temp.newFolder(); - ComputationGraphSpace mls = new ComputationGraphSpace.Builder() - .addInputs("in") - .layer("0", new 
DenseLayerSpace.Builder().nIn(4).nOut(new FixedValue<>(0)) //INVALID: nOut of 0 - .activation(Activation.TANH) - .build(), "in") - .layer("1", new OutputLayerSpace.Builder().nOut(3).activation(Activation.SOFTMAX) - .lossFunction(LossFunctions.LossFunction.MCXENT).build(), "0") - .setOutputs("1") - .build(); - - CandidateGenerator candidateGenerator = new RandomSearchGenerator(mls); - - OptimizationConfiguration configuration = new OptimizationConfiguration.Builder() - .candidateGenerator(candidateGenerator).dataProvider(new TestDataProviderMnist(32, 3)) - .modelSaver(new FileModelSaver(f)).scoreFunction(new TestSetLossScoreFunction(true)) - .terminationConditions(new MaxCandidatesCondition(5)) - .build(); - - IOptimizationRunner runner = new LocalOptimizationRunner(configuration); - runner.execute(); - } - - - @Test(timeout = 20000L) - public void testAllInvalidDataConfigMismatchCG() throws Exception { - //Valid config - but mismatched with provided data - - File f = temp.newFolder(); - ComputationGraphSpace mls = new ComputationGraphSpace.Builder() - .addInputs("in") - .layer("0", new DenseLayerSpace.Builder().nIn(4).nOut(10) - .activation(Activation.TANH).build(), "in") - .addLayer("1", new OutputLayerSpace.Builder().nIn(10).nOut(3).activation(Activation.SOFTMAX) - .lossFunction(LossFunctions.LossFunction.MCXENT).build(), "0") - .setOutputs("1") - .build(); - - CandidateGenerator candidateGenerator = new RandomSearchGenerator(mls); - - OptimizationConfiguration configuration = new OptimizationConfiguration.Builder() - .candidateGenerator(candidateGenerator).dataProvider(new TestDataProviderMnist(32, 3)) - .modelSaver(new FileModelSaver(f)).scoreFunction(new TestSetLossScoreFunction(true)) - .terminationConditions( - new MaxCandidatesCondition(5)) - .build(); - - IOptimizationRunner runner = new LocalOptimizationRunner(configuration, new MultiLayerNetworkTaskCreator()); - runner.execute(); - } - -} diff --git 
a/contrib/attic/arbiter/arbiter-deeplearning4j/src/test/java/org/deeplearning4j/arbiter/multilayernetwork/TestLayerSpace.java b/contrib/attic/arbiter/arbiter-deeplearning4j/src/test/java/org/deeplearning4j/arbiter/multilayernetwork/TestLayerSpace.java deleted file mode 100644 index 609158e8c..000000000 --- a/contrib/attic/arbiter/arbiter-deeplearning4j/src/test/java/org/deeplearning4j/arbiter/multilayernetwork/TestLayerSpace.java +++ /dev/null @@ -1,323 +0,0 @@ -/* - * ****************************************************************************** - * * Copyright (c) 2021 Deeplearning4j Contributors - * * - * * This program and the accompanying materials are made available under the - * * terms of the Apache License, Version 2.0 which is available at - * * https://www.apache.org/licenses/LICENSE-2.0. - * * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * * License for the specific language governing permissions and limitations - * * under the License. 
- * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ - -package org.deeplearning4j.arbiter.multilayernetwork; - -import static org.junit.Assert.assertEquals; -import static org.junit.Assert.assertFalse; -import static org.junit.Assert.assertTrue; - -import org.apache.commons.lang3.ArrayUtils; -import org.deeplearning4j.BaseDL4JTest; -import org.deeplearning4j.arbiter.TestUtils; -import org.deeplearning4j.arbiter.conf.updater.SgdSpace; -import org.deeplearning4j.arbiter.layers.ActivationLayerSpace; -import org.deeplearning4j.arbiter.layers.BatchNormalizationSpace; -import org.deeplearning4j.arbiter.layers.ConvolutionLayerSpace; -import org.deeplearning4j.arbiter.layers.Deconvolution2DLayerSpace; -import org.deeplearning4j.arbiter.layers.DenseLayerSpace; -import org.deeplearning4j.arbiter.layers.EmbeddingLayerSpace; -import org.deeplearning4j.arbiter.layers.GravesBidirectionalLSTMLayerSpace; -import org.deeplearning4j.arbiter.optimize.api.ParameterSpace; -import org.deeplearning4j.arbiter.optimize.parameter.BooleanSpace; -import org.deeplearning4j.arbiter.optimize.parameter.continuous.ContinuousParameterSpace; -import org.deeplearning4j.arbiter.optimize.parameter.discrete.DiscreteParameterSpace; -import org.deeplearning4j.arbiter.optimize.parameter.integer.IntegerParameterSpace; -import org.deeplearning4j.nn.api.layers.LayerConstraint; -import org.deeplearning4j.nn.conf.constraint.MaxNormConstraint; -import org.deeplearning4j.nn.conf.constraint.MinMaxNormConstraint; -import org.deeplearning4j.nn.conf.constraint.NonNegativeConstraint; -import org.deeplearning4j.nn.conf.constraint.UnitNormConstraint; -import org.deeplearning4j.nn.conf.layers.ActivationLayer; -import org.deeplearning4j.nn.conf.layers.BatchNormalization; -import org.deeplearning4j.nn.conf.layers.Convolution2D; -import org.deeplearning4j.nn.conf.layers.ConvolutionLayer; -import 
org.deeplearning4j.nn.conf.layers.Deconvolution2D; -import org.deeplearning4j.nn.conf.layers.DenseLayer; -import org.deeplearning4j.nn.conf.layers.EmbeddingLayer; -import org.deeplearning4j.nn.conf.layers.GravesBidirectionalLSTM; -import org.junit.Test; -import org.nd4j.linalg.activations.Activation; -import org.nd4j.linalg.activations.IActivation; -import org.nd4j.linalg.learning.config.Sgd; - -import java.util.ArrayList; -import java.util.Collections; -import java.util.List; -import java.util.Random; - -public class TestLayerSpace extends BaseDL4JTest { - - @Test - public void testBasic1() { - DenseLayer expected = new DenseLayer.Builder().nOut(13).activation(Activation.RELU).build(); - - DenseLayerSpace space = new DenseLayerSpace.Builder().nOut(13).activation(Activation.RELU).build(); - - int nParam = space.numParameters(); - assertEquals(0, nParam); - DenseLayer actual = space.getValue(new double[nParam]); - - assertEquals(expected, actual); - } - - @Test - public void testBasic2() { - - Activation[] actFns = new Activation[]{Activation.SOFTSIGN, Activation.RELU, Activation.LEAKYRELU}; - Random r = new Random(12345); - - for (int i = 0; i < 20; i++) { - - new DenseLayer.Builder().build(); - - DenseLayerSpace ls = - new DenseLayerSpace.Builder().nOut(20) - .updater(new SgdSpace(new ContinuousParameterSpace(0.3, 0.4))) - .l2(new ContinuousParameterSpace(0.01, 0.1)) - .activation(new DiscreteParameterSpace<>(actFns)).build(); - - //Set the parameter numbers... 
- List<ParameterSpace> list = ls.collectLeaves(); - int k = 0; - for (int j = 0; j < list.size(); j++) { - if (list.get(j).numParameters() > 0) { - list.get(j).setIndices(k++); - } - } - - int nParam = ls.numParameters(); - assertEquals(3, nParam); - - double[] d = new double[nParam]; - for (int j = 0; j < d.length; j++) { - d[j] = r.nextDouble(); - } - - DenseLayer l = ls.getValue(d); - - assertEquals(20, l.getNOut()); - double lr = ((Sgd) l.getIUpdater()).getLearningRate(); - double l2 = TestUtils.getL2(l); - IActivation activation = l.getActivationFn(); - -// System.out.println(lr + "\t" + l2 + "\t" + activation); - - assertTrue(lr >= 0.3 && lr <= 0.4); - assertTrue(l2 >= 0.01 && l2 <= 0.1); - assertTrue(containsActivationFunction(actFns, activation)); - } - } - - @Test - public void testBatchNorm() { - BatchNormalizationSpace sp = new BatchNormalizationSpace.Builder().gamma(1.5) - .beta(new ContinuousParameterSpace(2, 3)).lockGammaBeta(true).build(); - - //Set the parameter numbers... - List<ParameterSpace> list = sp.collectLeaves(); - int k = 0; - for (int j = 0; j < list.size(); j++) { - if (list.get(j).numParameters() > 0) { - list.get(j).setIndices(k++); - } - } - - BatchNormalization bn = sp.getValue(new double[]{0.6}); - assertTrue(bn.isLockGammaBeta()); - assertEquals(1.5, bn.getGamma(), 0.0); - assertEquals(0.6 * (3 - 2) + 2, bn.getBeta(), 1e-4); - } - - @Test - public void testBatchNormConstrain() { - - ArrayList<List<LayerConstraint>> constrainListOptions = new ArrayList<List<LayerConstraint>>(); - constrainListOptions.add(Collections.singletonList((LayerConstraint) new MaxNormConstraint(0.5, 1))); - constrainListOptions.add(Collections.singletonList((LayerConstraint) new MinMaxNormConstraint(0.3, 0.4, 1.0, 1))); - constrainListOptions.add(Collections.singletonList((LayerConstraint) new NonNegativeConstraint())); - constrainListOptions.add(Collections.singletonList((LayerConstraint) new UnitNormConstraint(1))); - - DiscreteParameterSpace<List<LayerConstraint>> constrainParamSpace = new DiscreteParameterSpace<>(constrainListOptions); - 
BatchNormalizationSpace sp = new BatchNormalizationSpace.Builder().gamma(1.5) - .beta(0.6).lockGammaBeta(true).constrainBeta(constrainParamSpace).constrainGamma(new NonNegativeConstraint()).build(); - - BatchNormalization bnExpected = new BatchNormalization.Builder().gamma(1.5) - .beta(0.6).lockGammaBeta(true).constrainBeta(new NonNegativeConstraint()).constrainGamma(new NonNegativeConstraint()).build(); - //Set the parameter numbers... - List<ParameterSpace> list = sp.collectLeaves(); - int k = 0; - for (int j = 0; j < list.size(); j++) { - if (list.get(j).numParameters() > 0) { - list.get(j).setIndices(k++); - } - } - - assertEquals(1,sp.getNumParameters()); - BatchNormalization bn = sp.getValue(new double[]{0.6}); - assertEquals(bnExpected,bn); //0.6 should pick the 3rd value in discrete param space - - //assertEquals(bn.getConstraints().size(),2); This throws an NPE but I believe this is an issue with actual impl of BatchNormalization not arbiter -} - - @Test - public void testActivationLayer() { - Activation[] actFns = new Activation[]{Activation.SOFTSIGN, Activation.RELU, Activation.LEAKYRELU}; - - ActivationLayerSpace als = - new ActivationLayerSpace.Builder().activation(new DiscreteParameterSpace<>(actFns)).build(); - //Set the parameter numbers... 
- List<ParameterSpace> list = als.collectLeaves(); - for (int j = 0; j < list.size(); j++) { - list.get(j).setIndices(j); - } - - int nParam = als.numParameters(); - assertEquals(1, nParam); - - Random r = new Random(12345); - - for (int i = 0; i < 20; i++) { - - double[] d = new double[nParam]; - for (int j = 0; j < d.length; j++) { - d[j] = r.nextDouble(); - } - - ActivationLayer al = als.getValue(d); - IActivation activation = al.getActivationFn(); - - assertTrue(containsActivationFunction(actFns, activation)); - } - } - - @Test - public void testEmbeddingLayer() { - - Activation[] actFns = new Activation[]{Activation.SOFTSIGN, Activation.RELU, Activation.LEAKYRELU}; - - EmbeddingLayerSpace els = new EmbeddingLayerSpace.Builder().activation(new DiscreteParameterSpace<>(actFns)) - .nIn(10).nOut(new IntegerParameterSpace(10, 20)).build(); - //Set the parameter numbers... - List<ParameterSpace> list = els.collectLeaves(); - int k = 0; - for (int j = 0; j < list.size(); j++) { - if (list.get(j).numParameters() > 0) { - list.get(j).setIndices(k++); - } - } - - int nParam = els.numParameters(); - assertEquals(2, nParam); - - Random r = new Random(12345); - - for (int i = 0; i < 20; i++) { - - double[] d = new double[nParam]; - for (int j = 0; j < d.length; j++) { - d[j] = r.nextDouble(); - } - - EmbeddingLayer el = els.getValue(d); - IActivation activation = el.getActivationFn(); - long nOut = el.getNOut(); - - assertTrue(containsActivationFunction(actFns, activation)); - assertTrue(nOut >= 10 && nOut <= 20); - } - } - - @Test - public void testSimpleConv() { - ConvolutionLayer conv2d = new Convolution2D.Builder().dilation(1,2).kernelSize(2,2).nIn(2).nOut(3).build(); - ConvolutionLayerSpace conv2dSpace = new ConvolutionLayerSpace.Builder().dilation(1,2).kernelSize(2,2).nIn(2).nOut(3).build(); - assertEquals(0,conv2dSpace.getNumParameters()); - assertEquals(conv2d, conv2dSpace.getValue(new double[0])); - - Deconvolution2DLayerSpace deconvd2dls = new 
Deconvolution2DLayerSpace.Builder().dilation(2,1).nIn(2).nOut(2).hasBias(new BooleanSpace()).build(); - assertEquals(1, deconvd2dls.getNumParameters()); - //Set the parameter numbers... - List<ParameterSpace> list = deconvd2dls.collectLeaves(); - int k = 0; - for (int j = 0; j < list.size(); j++) { - if (list.get(j).numParameters() > 0) { - list.get(j).setIndices(k++); - } - } - Deconvolution2D actual = deconvd2dls.getValue(new double[]{0.9}); - assertFalse(actual.hasBias()); - assertEquals(ArrayUtils.toString(new int[] {2,1} ),ArrayUtils.toString(actual.getDilation())); - } - - @Test - public void testGravesBidirectionalLayer() { - - Activation[] actFns = new Activation[]{Activation.SOFTSIGN, Activation.RELU, Activation.LEAKYRELU}; - - GravesBidirectionalLSTMLayerSpace ls = - new GravesBidirectionalLSTMLayerSpace.Builder().activation(new DiscreteParameterSpace<>(actFns)) - .forgetGateBiasInit(new ContinuousParameterSpace(0.5, 0.8)).nIn(10) - .nOut(new IntegerParameterSpace(10, 20)).build(); - //Set the parameter numbers... - List<ParameterSpace> list = ls.collectLeaves(); - int k = 0; - for (int j = 0; j < list.size(); j++) { - if (list.get(j).numParameters() > 0) { - list.get(j).setIndices(k++); - } - } - - int nParam = ls.numParameters(); - assertEquals(3, nParam); //Excluding fixed value for nIn - - Random r = new Random(12345); - - for (int i = 0; i < 20; i++) { - - double[] d = new double[nParam]; - for (int j = 0; j < d.length; j++) { - d[j] = r.nextDouble(); - } - - GravesBidirectionalLSTM el = ls.getValue(d); - IActivation activation = el.getActivationFn(); - long nOut = el.getNOut(); - double forgetGate = el.getForgetGateBiasInit(); - - assertTrue(containsActivationFunction(actFns, activation)); - assertTrue(nOut >= 10 && nOut <= 20); - assertTrue(forgetGate >= 0.5 && forgetGate <= 0.8); - } - } - - private static boolean containsActivationFunction(Activation[] activationFunctions, - IActivation activationFunction) { - for (Activation af : activationFunctions) { - if (activationFunction.equals(af.getActivationFunction())) - return true; - } 
- return false; - } -} diff --git a/contrib/attic/arbiter/arbiter-deeplearning4j/src/test/java/org/deeplearning4j/arbiter/multilayernetwork/TestMultiLayerSpace.java b/contrib/attic/arbiter/arbiter-deeplearning4j/src/test/java/org/deeplearning4j/arbiter/multilayernetwork/TestMultiLayerSpace.java deleted file mode 100644 index 226d3a471..000000000 --- a/contrib/attic/arbiter/arbiter-deeplearning4j/src/test/java/org/deeplearning4j/arbiter/multilayernetwork/TestMultiLayerSpace.java +++ /dev/null @@ -1,829 +0,0 @@ -/* - * ****************************************************************************** - * * Copyright (c) 2021 Deeplearning4j Contributors - * * - * * This program and the accompanying materials are made available under the - * * terms of the Apache License, Version 2.0 which is available at - * * https://www.apache.org/licenses/LICENSE-2.0. - * * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * * License for the specific language governing permissions and limitations - * * under the License. 
- * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ - -package org.deeplearning4j.arbiter.multilayernetwork; - -import org.deeplearning4j.BaseDL4JTest; -import org.deeplearning4j.arbiter.DL4JConfiguration; -import org.deeplearning4j.arbiter.MultiLayerSpace; -import org.deeplearning4j.arbiter.TestUtils; -import org.deeplearning4j.arbiter.conf.updater.AdamSpace; -import org.deeplearning4j.arbiter.conf.updater.NesterovsSpace; -import org.deeplearning4j.arbiter.conf.updater.SgdSpace; -import org.deeplearning4j.arbiter.layers.*; -import org.deeplearning4j.arbiter.optimize.api.Candidate; -import org.deeplearning4j.arbiter.optimize.api.CandidateGenerator; -import org.deeplearning4j.arbiter.optimize.api.ParameterSpace; -import org.deeplearning4j.arbiter.optimize.api.data.DataProvider; -import org.deeplearning4j.arbiter.optimize.api.saving.ResultSaver; -import org.deeplearning4j.arbiter.optimize.api.score.ScoreFunction; -import org.deeplearning4j.arbiter.optimize.api.termination.MaxCandidatesCondition; -import org.deeplearning4j.arbiter.optimize.api.termination.TerminationCondition; -import org.deeplearning4j.arbiter.optimize.config.OptimizationConfiguration; -import org.deeplearning4j.arbiter.optimize.generator.GridSearchCandidateGenerator; -import org.deeplearning4j.arbiter.optimize.generator.RandomSearchGenerator; -import org.deeplearning4j.arbiter.optimize.parameter.FixedValue; -import org.deeplearning4j.arbiter.optimize.parameter.continuous.ContinuousParameterSpace; -import org.deeplearning4j.arbiter.optimize.parameter.discrete.DiscreteParameterSpace; -import org.deeplearning4j.arbiter.optimize.parameter.integer.IntegerParameterSpace; -import org.deeplearning4j.arbiter.optimize.parameter.math.MathOp; -import org.deeplearning4j.arbiter.optimize.parameter.math.Op; -import org.deeplearning4j.arbiter.optimize.runner.IOptimizationRunner; -import 
org.deeplearning4j.arbiter.optimize.runner.LocalOptimizationRunner; -import org.deeplearning4j.arbiter.saver.local.FileModelSaver; -import org.deeplearning4j.arbiter.scoring.impl.TestSetAccuracyScoreFunction; -import org.deeplearning4j.arbiter.task.MultiLayerNetworkTaskCreator; -import org.deeplearning4j.arbiter.util.LeafUtils; -import org.deeplearning4j.datasets.iterator.ExistingDataSetIterator; -import org.deeplearning4j.nn.conf.ConvolutionMode; -import org.deeplearning4j.nn.conf.MultiLayerConfiguration; -import org.deeplearning4j.nn.conf.NeuralNetConfiguration; -import org.deeplearning4j.nn.conf.constraint.NonNegativeConstraint; -import org.deeplearning4j.nn.conf.constraint.UnitNormConstraint; -import org.deeplearning4j.nn.conf.dropout.Dropout; -import org.deeplearning4j.nn.conf.dropout.IDropout; -import org.deeplearning4j.nn.conf.inputs.InputType; -import org.deeplearning4j.nn.conf.layers.BaseLayer; -import org.deeplearning4j.nn.conf.layers.ConvolutionLayer; -import org.deeplearning4j.nn.conf.layers.DenseLayer; -import org.deeplearning4j.nn.conf.layers.FeedForwardLayer; -import org.deeplearning4j.nn.conf.layers.GlobalPoolingLayer; -import org.deeplearning4j.nn.conf.layers.GravesLSTM; -import org.deeplearning4j.nn.conf.layers.LSTM; -import org.deeplearning4j.nn.conf.layers.OutputLayer; -import org.deeplearning4j.nn.conf.layers.PoolingType; -import org.deeplearning4j.nn.conf.layers.SubsamplingLayer; -import org.deeplearning4j.nn.conf.layers.variational.BernoulliReconstructionDistribution; -import org.deeplearning4j.nn.conf.layers.variational.GaussianReconstructionDistribution; -import org.deeplearning4j.nn.conf.layers.variational.ReconstructionDistribution; -import org.deeplearning4j.nn.conf.layers.variational.VariationalAutoencoder; -import org.deeplearning4j.nn.layers.recurrent.BidirectionalLayer; -import org.deeplearning4j.nn.multilayer.MultiLayerNetwork; -import org.deeplearning4j.nn.weights.WeightInit; -import org.junit.BeforeClass; -import org.junit.Rule; 
-import org.junit.Test; -import org.junit.rules.TemporaryFolder; -import org.nd4j.linalg.activations.Activation; -import org.nd4j.linalg.activations.IActivation; -import org.nd4j.linalg.api.buffer.DataType; -import org.nd4j.linalg.dataset.DataSet; -import org.nd4j.linalg.dataset.api.iterator.DataSetIterator; -import org.nd4j.linalg.factory.Nd4j; -import org.nd4j.linalg.learning.config.Nesterovs; -import org.nd4j.linalg.learning.config.Sgd; -import org.nd4j.linalg.lossfunctions.ILossFunction; -import org.nd4j.linalg.lossfunctions.LossFunctions; -import org.nd4j.linalg.lossfunctions.LossFunctions.LossFunction; -import org.nd4j.linalg.lossfunctions.impl.LossMCXENT; -import org.nd4j.linalg.lossfunctions.impl.LossMSE; -import org.nd4j.common.primitives.Pair; - -import java.io.File; -import java.lang.reflect.Field; -import java.util.*; - -import static org.junit.Assert.*; - -public class TestMultiLayerSpace extends BaseDL4JTest { - - @Rule - public TemporaryFolder testDir = new TemporaryFolder(); - - @BeforeClass - public static void before(){ - Nd4j.setDefaultDataTypes(DataType.FLOAT, DataType.FLOAT); - } - - @Test - public void testBasic() { - - MultiLayerConfiguration expected = - new NeuralNetConfiguration.Builder() - .updater(new Sgd(0.005)).seed(12345).list() - .layer(0, new DenseLayer.Builder().nIn(10).nOut(10).build()) - .layer(1, new DenseLayer.Builder().nIn(10).nOut(10).build()).layer(2, - new OutputLayer.Builder().lossFunction(LossFunction.MCXENT) - .activation(Activation.SOFTMAX).nIn(10).nOut(5).build()) - - .build(); - - MultiLayerSpace mls = - new MultiLayerSpace.Builder() - .updater(new Sgd(0.005)).seed(12345) - .addLayer(new DenseLayerSpace.Builder().nIn(10).nOut(10).build(), - new FixedValue<>(2)) //2 identical layers - .addLayer(new OutputLayerSpace.Builder().lossFunction(LossFunction.MCXENT) - .activation(Activation.SOFTMAX) - .nIn(10).nOut(5).build()).build(); - - int nParams = mls.numParameters(); - assertEquals(0, nParams); - - 
MultiLayerConfiguration conf = mls.getValue(new double[0]).getMultiLayerConfiguration(); - - assertEquals(expected, conf); - } - - @Test - public void testBasic0() { - MultiLayerConfiguration expected = - new NeuralNetConfiguration.Builder() - .l1Bias(0.4) - .l2Bias(0.5) - .constrainBias(new NonNegativeConstraint()) - .updater(new Sgd(0.005)).seed(12345).list() - .layer(0, new DenseLayer.Builder().l1Bias(0.6).nIn(10).nOut(10).build()) - .layer(1, new DenseLayer.Builder().l2Bias(0.7).constrainBias(new UnitNormConstraint()).nIn(10).nOut(10).build()).layer(2, - new OutputLayer.Builder().lossFunction(LossFunction.MCXENT).activation(Activation.SOFTMAX) - .nIn(10).nOut(5).build()) - .build(); - - MultiLayerSpace mls = - new MultiLayerSpace.Builder() - .l1Bias(0.4) - .l2Bias(0.5) - .constrainBias(new NonNegativeConstraint()) - .updater(new Sgd(0.005)).seed(12345) - .addLayer(new DenseLayerSpace.Builder().l1Bias(new ContinuousParameterSpace(0,1)).nIn(10).nOut(10).build()) - .addLayer(new DenseLayerSpace.Builder().l2Bias(0.7).constrainBias(new UnitNormConstraint()).nIn(10).nOut(10).build()) - .addLayer(new OutputLayerSpace.Builder().lossFunction(LossFunction.MCXENT).activation(Activation.SOFTMAX) - .nIn(10).nOut(5).build()) - .build(); - - int nParams = mls.numParameters(); - assertEquals(1, nParams); - - //Assign numbers to each leaf ParameterSpace object (normally done by candidate generator - manual here for testing) - List<ParameterSpace> noDuplicatesList = LeafUtils.getUniqueObjects(mls.collectLeaves()); - - //Second: assign each a number - int c = 0; - for (ParameterSpace ps : noDuplicatesList) { - int np = ps.numParameters(); - if (np == 1) { - ps.setIndices(c++); - } else { - int[] values = new int[np]; - for (int j = 0; j < np; j++) - values[c++] = j; - ps.setIndices(values); - } - } - MultiLayerConfiguration conf = mls.getValue(new double[] {0.6}).getMultiLayerConfiguration(); - - assertEquals(expected, conf); - } - - @Test - public void testILossFunctionGetsSet() { - 
ILossFunction lossFunction = new LossMCXENT(Nd4j.create(new float[] {1f, 2f}, new long[]{1,2})); - - MultiLayerConfiguration expected = - new NeuralNetConfiguration.Builder().updater(new Sgd(0.005)).seed(12345).list() - .layer(0, new DenseLayer.Builder().nIn(10).nOut(10).build()) - .layer(1, new DenseLayer.Builder().nIn(10).nOut(10).build()).layer(2, - new OutputLayer.Builder().lossFunction(lossFunction) - .activation(Activation.SOFTMAX).nIn(10).nOut(5).build()) - .build(); - - MultiLayerSpace mls = new MultiLayerSpace.Builder().updater(new Sgd(0.005)).seed(12345) - .addLayer(new DenseLayerSpace.Builder().nIn(10).nOut(10).build(), new FixedValue<>(2)) //2 identical layers - .addLayer(new OutputLayerSpace.Builder().iLossFunction(lossFunction).activation(Activation.SOFTMAX).nIn(10).nOut(5).build()) - .build(); - - int nParams = mls.numParameters(); - assertEquals(0, nParams); - - MultiLayerConfiguration conf = mls.getValue(new double[0]).getMultiLayerConfiguration(); - - assertEquals(expected, conf); - } - - @Test - public void testBasic2() { - - MultiLayerSpace mls = - new MultiLayerSpace.Builder().updater(new SgdSpace(new ContinuousParameterSpace(0.0001, 0.1))) - .l2(new ContinuousParameterSpace(0.2, 0.5)) - .convolutionMode(ConvolutionMode.Same) - .addLayer(new ConvolutionLayerSpace.Builder().nIn(3).nOut(3).kernelSize(2, 2) - .stride(1, 1).build()) - .addLayer(new DenseLayerSpace.Builder().nIn(10).nOut(10) - .activation(new DiscreteParameterSpace<>(Activation.RELU, - Activation.TANH)) - .build(), new IntegerParameterSpace(1, 3)) //1-3 identical layers - .addLayer(new OutputLayerSpace.Builder().nIn(10).nOut(10) - .activation(Activation.SOFTMAX).build()) - .build(); - - int nParams = mls.numParameters(); - assertEquals(4, nParams); - - //Assign numbers to each leaf ParameterSpace object (normally done by candidate generator - manual here for testing) - List<ParameterSpace> noDuplicatesList = LeafUtils.getUniqueObjects(mls.collectLeaves()); - - //Second: assign each a number - int c 
= 0; - for (ParameterSpace ps : noDuplicatesList) { - int np = ps.numParameters(); - if (np == 1) { - ps.setIndices(c++); - } else { - int[] values = new int[np]; - for (int j = 0; j < np; j++) - values[c++] = j; - ps.setIndices(values); - } - } - - - int[] nLayerCounts = new int[3]; - int reluCount = 0; - int tanhCount = 0; - - Random r = new Random(12345); - - for (int i = 0; i < 50; i++) { - - double[] rvs = new double[nParams]; - for (int j = 0; j < rvs.length; j++) - rvs[j] = r.nextDouble(); - - - MultiLayerConfiguration conf = mls.getValue(rvs).getMultiLayerConfiguration(); - - int nLayers = conf.getConfs().size(); - assertTrue(nLayers >= 3 && nLayers <= 5); //1 conv + 1-3 dense layers + 1 output layer: 2 to 4 - - int nLayersExOutputLayer = nLayers - 1; - nLayerCounts[nLayersExOutputLayer - 2]++; - - for (int j = 0; j < nLayers; j++) { - NeuralNetConfiguration layerConf = conf.getConf(j); - - double lr = ((Sgd)((BaseLayer) layerConf.getLayer()).getIUpdater()).getLearningRate(); - assertTrue(lr >= 0.0001 && lr <= 0.1); - double l2 = TestUtils.getL2((BaseLayer) layerConf.getLayer()); - assertTrue(l2 >= 0.2 && l2 <= 0.5); - - if (j == nLayers - 1) { //Output layer - assertEquals(Activation.SOFTMAX.getActivationFunction(), ((BaseLayer) layerConf.getLayer()).getActivationFn()); - } else if (j == 0) { - //Conv layer - ConvolutionLayer cl = (ConvolutionLayer) layerConf.getLayer(); - assertEquals(3, cl.getNIn()); - assertEquals(3, cl.getNOut()); - assertEquals(ConvolutionMode.Same, cl.getConvolutionMode()); - } else { - IActivation actFn = ((BaseLayer) layerConf.getLayer()).getActivationFn(); - assertTrue(Activation.RELU.getActivationFunction().equals(actFn) || - Activation.TANH.getActivationFunction().equals(actFn)); - if (Activation.RELU.getActivationFunction().equals(actFn)) - reluCount++; - else - tanhCount++; - } - } - } - - for (int i = 0; i < 3; i++) { - assertTrue(nLayerCounts[i] >= 5); //Expect approx equal (50/3 each), but some variation randomly - } - -// 
System.out.println("Number of layers: " + Arrays.toString(nLayerCounts)); -// System.out.println("ReLU vs. Tanh: " + reluCount + "\t" + tanhCount); - - } - - @Test - public void testGlobalPoolingBasic() { - - MultiLayerConfiguration expected = new NeuralNetConfiguration.Builder().updater(new Sgd(0.005)).seed(12345).list() - .layer(0, new GravesLSTM.Builder().nIn(10).nOut(10).build()) - .layer(1, new GlobalPoolingLayer.Builder().poolingType(PoolingType.SUM).pnorm(7).build()) - .layer(2, new OutputLayer.Builder().lossFunction(LossFunction.MCXENT).activation(Activation.SOFTMAX).nIn(10).nOut(5).build()) - .build(); - - MultiLayerSpace mls = - new MultiLayerSpace.Builder().updater(new Sgd(0.005)).seed(12345) - .addLayer(new GravesLSTMLayerSpace.Builder().nIn(10).nOut(10).build()) - .addLayer(new GlobalPoolingLayerSpace.Builder().poolingType(PoolingType.SUM) - .pNorm(7).build()) - .addLayer(new OutputLayerSpace.Builder().lossFunction(LossFunction.MCXENT) - .activation(Activation.SOFTMAX) - .nIn(10).nOut(5).build()) - .build(); - - int nParams = mls.numParameters(); - assertEquals(0, nParams); - - MultiLayerConfiguration conf = mls.getValue(new double[0]).getMultiLayerConfiguration(); - - assertEquals(expected, conf); - } - - - @Test - public void testVariationalAutoencoderLayerSpaceBasic() { - MultiLayerSpace mls = - new MultiLayerSpace.Builder() - .updater(new Sgd(0.005)).seed( - 12345) - .addLayer(new VariationalAutoencoderLayerSpace.Builder() - .nIn(new IntegerParameterSpace(50, 75)).nOut(200) - .encoderLayerSizes(234, 567).decoderLayerSizes(123, 456) - .reconstructionDistribution( - new DiscreteParameterSpace( - new GaussianReconstructionDistribution(), - new BernoulliReconstructionDistribution())) - .build()) - .build(); - - int numParams = mls.numParameters(); - - //Assign numbers to each leaf ParameterSpace object (normally done by candidate generator - manual here for testing) - List noDuplicatesList = LeafUtils.getUniqueObjects(mls.collectLeaves()); - - 
//Second: assign each a number - int c = 0; - for (ParameterSpace ps : noDuplicatesList) { - int np = ps.numParameters(); - if (np == 1) { - ps.setIndices(c++); - } else { - int[] values = new int[np]; - for (int j = 0; j < np; j++) - values[c++] = j; - ps.setIndices(values); - } - } - - double[] zeros = new double[numParams]; - - DL4JConfiguration configuration = mls.getValue(zeros); - - MultiLayerConfiguration conf = configuration.getMultiLayerConfiguration(); - assertEquals(1, conf.getConfs().size()); - - NeuralNetConfiguration nnc = conf.getConf(0); - VariationalAutoencoder vae = (VariationalAutoencoder) nnc.getLayer(); - - assertEquals(50, vae.getNIn()); - assertEquals(200, vae.getNOut()); - - assertArrayEquals(new int[] {234, 567}, vae.getEncoderLayerSizes()); - assertArrayEquals(new int[] {123, 456}, vae.getDecoderLayerSizes()); - - assertTrue(vae.getOutputDistribution() instanceof GaussianReconstructionDistribution); - - - - double[] ones = new double[numParams]; - Arrays.fill(ones, 1.0); - - configuration = mls.getValue(ones); - - conf = configuration.getMultiLayerConfiguration(); - assertEquals(1, conf.getConfs().size()); - - nnc = conf.getConf(0); - vae = (VariationalAutoencoder) nnc.getLayer(); - - assertEquals(75, vae.getNIn()); - assertEquals(200, vae.getNOut()); - - assertArrayEquals(new int[] {234, 567}, vae.getEncoderLayerSizes()); - assertArrayEquals(new int[] {123, 456}, vae.getDecoderLayerSizes()); - - assertTrue(vae.getOutputDistribution() instanceof BernoulliReconstructionDistribution); - } - - @Test - public void testInputTypeBasic() throws Exception { - - ParameterSpace layerSizeHyperparam = new IntegerParameterSpace(20, 60); - - MultiLayerSpace hyperparameterSpace = new MultiLayerSpace.Builder().l2(0.0001) - .weightInit(WeightInit.XAVIER).updater(new Nesterovs()) - .addLayer(new ConvolutionLayerSpace.Builder().kernelSize(5, 5).nIn(1).stride(1, 1) - .nOut(layerSizeHyperparam).activation(Activation.IDENTITY).build()) - .addLayer(new 
SubsamplingLayerSpace.Builder().poolingType(SubsamplingLayer.PoolingType.MAX) - .kernelSize(2, 2).stride(2, 2).build()) - .addLayer(new ConvolutionLayerSpace.Builder().kernelSize(5, 5) - //Note that nIn need not be specified in later layers - .stride(1, 1).nOut(50).activation(Activation.IDENTITY).build()) - .addLayer(new SubsamplingLayerSpace.Builder().poolingType(SubsamplingLayer.PoolingType.MAX) - .kernelSize(2, 2).stride(2, 2).build()) - .addLayer(new DenseLayerSpace.Builder().activation(Activation.RELU).nOut(500).build()) - .addLayer(new OutputLayerSpace.Builder() - .lossFunction(LossFunctions.LossFunction.NEGATIVELOGLIKELIHOOD).nOut(10) - .activation(Activation.SOFTMAX).build()) - .setInputType(InputType.convolutionalFlat(28, 28, 1)).build(); - - - DataProvider dataProvider = new TestDataSetProvider(); - - File f = testDir.newFolder(); - if (f.exists()) - f.delete(); - f.mkdir(); - ResultSaver modelSaver = new FileModelSaver(f.getAbsolutePath()); - - ScoreFunction scoreFunction = new TestSetAccuracyScoreFunction(); - - int maxCandidates = 4; - TerminationCondition[] terminationConditions; - terminationConditions = new TerminationCondition[] {new MaxCandidatesCondition(maxCandidates)}; - - //Given these configuration options, let's put them all together: - OptimizationConfiguration configuration = new OptimizationConfiguration.Builder() - .candidateGenerator(new RandomSearchGenerator(hyperparameterSpace, null)) - .dataProvider(dataProvider).modelSaver(modelSaver).scoreFunction(scoreFunction) - .terminationConditions(terminationConditions).build(); - - IOptimizationRunner runner = new LocalOptimizationRunner(configuration, new MultiLayerNetworkTaskCreator()); - runner.execute(); - - assertEquals(maxCandidates, runner.getResults().size()); - } - - - @Test - public void testSameRanges() { - - ParameterSpace l1Hyperparam = new ContinuousParameterSpace(0.001, 0.1); - ParameterSpace l2Hyperparam = new ContinuousParameterSpace(0.001, 0.1); - - MultiLayerSpace 
hyperparameterSpace = - new MultiLayerSpace.Builder().addLayer(new DenseLayerSpace.Builder().nIn(10).nOut(10).build()) - .l1(l1Hyperparam).l2(l2Hyperparam).build(); - - CandidateGenerator c = new RandomSearchGenerator(hyperparameterSpace, null); - - Candidate candidate = c.getCandidate(); - } - - @Test - public void testWeightedLossFunction() { - - MultiLayerConfiguration expected = - new NeuralNetConfiguration.Builder().updater(new Sgd(0.005)).seed(12345).list() - .layer(0, new DenseLayer.Builder().nIn(10).nOut(10).build()) - .layer(1, new DenseLayer.Builder().nIn(10).nOut(10).build()).layer(2, - new OutputLayer.Builder() - .lossFunction(new LossMSE(Nd4j.create( - new double[] {1, 2, 3, 4, 5}, new long[]{1,5}))) - .nIn(10).nOut(5).build()) - .build(); - - MultiLayerSpace mls = - new MultiLayerSpace.Builder().updater(new Sgd(0.005)).seed(12345) - .addLayer(new DenseLayerSpace.Builder().nIn(10).nOut(10).build(), - new FixedValue<>(2)) //2 identical layers - .addLayer(new OutputLayerSpace.Builder() - .iLossFunction(new LossMSE(Nd4j.create(new double[] {1, 2, 3, 4, 5}, new long[]{1,5}))) - .nIn(10).nOut(5).build()) - .build(); - - int nParams = mls.numParameters(); - assertEquals(0, nParams); - - MultiLayerConfiguration conf = mls.getValue(new double[0]).getMultiLayerConfiguration(); - - assertEquals(expected, conf); - - String json = mls.toJson(); - MultiLayerSpace fromJson = MultiLayerSpace.fromJson(json); - - assertEquals(mls, fromJson); - } - - - @Test - public void testBidirectional() throws Exception { - - MultiLayerSpace mls = - new MultiLayerSpace.Builder().updater(new Sgd(0.005)) - .seed(12345) - .layer(new Bidirectional(new LSTMLayerSpace.Builder() - .nIn(10).nOut(10).build())) - .build(); - - DL4JConfiguration conf = mls.getValue(new double[0]); - MultiLayerConfiguration c2 = conf.getMultiLayerConfiguration(); - - MultiLayerNetwork net = new MultiLayerNetwork(c2); - net.init(); - - assertEquals(1, net.getnLayers()); - assertTrue(net.getLayer(0) instanceof 
BidirectionalLayer); - BidirectionalLayer bl = (BidirectionalLayer)net.getLayer(0); - - Field f = BidirectionalLayer.class.getDeclaredField("fwd"); - Field b = BidirectionalLayer.class.getDeclaredField("bwd"); - f.setAccessible(true); - b.setAccessible(true); - org.deeplearning4j.nn.layers.recurrent.LSTM lstmFwd = (org.deeplearning4j.nn.layers.recurrent.LSTM) f.get(bl); - org.deeplearning4j.nn.layers.recurrent.LSTM lstmBwd = (org.deeplearning4j.nn.layers.recurrent.LSTM) b.get(bl); - - assertEquals(10, ((LSTM)lstmFwd.conf().getLayer()).getNIn()); - assertEquals(10, ((LSTM)lstmFwd.conf().getLayer()).getNOut()); - assertEquals(10, ((LSTM)lstmBwd.conf().getLayer()).getNIn()); - assertEquals(10, ((LSTM)lstmBwd.conf().getLayer()).getNOut()); - } - - - @Test - public void testMathOps() { - - ParameterSpace firstLayerSize = new IntegerParameterSpace(10,30); - ParameterSpace secondLayerSize = new MathOp<>(firstLayerSize, Op.MUL, 3); - ParameterSpace firstLayerLR = new ContinuousParameterSpace(0.01, 0.1); - ParameterSpace secondLayerLR = new MathOp<>(firstLayerLR, Op.ADD, 0.2); - - MultiLayerSpace mls = - new MultiLayerSpace.Builder().updater(new Sgd(0.005)) - .seed(12345) - .layer(new DenseLayerSpace.Builder().nOut(firstLayerSize) - .updater(new AdamSpace(firstLayerLR)) - .build()) - .layer(new OutputLayerSpace.Builder().nOut(secondLayerSize) - .updater(new AdamSpace(secondLayerLR)) - .activation(Activation.SOFTMAX) - .build()) - .setInputType(InputType.feedForward(10)) - .build(); - - int nParams = mls.numParameters(); - assertEquals(2, nParams); - - new RandomSearchGenerator(mls, null); //Initializes the indices - - Random r = new Random(12345); - for( int i=0; i<10; i++ ){ - double[] d = new double[nParams]; - for( int j=0; j dropout = new DiscreteParameterSpace<>(0.0, 0.5); - - MultiLayerSpace mls = - new MultiLayerSpace.Builder().updater(new Sgd(0.005)) - .dropOut(dropout) - .seed(12345) - .layer(new DenseLayerSpace.Builder().nOut(10) - .build()) - .layer(new 
OutputLayerSpace.Builder().nOut(10).activation(Activation.SOFTMAX) - .build()) - .setInputType(InputType.feedForward(10)) - .build(); - - int nParams = mls.numParameters(); - assertEquals(1, nParams); - - new RandomSearchGenerator(mls, null); //Initializes the indices - - Random r = new Random(12345); - int countNull = 0; - int count05 = 0; - for( int i=0; i<10; i++ ){ - double[] d = new double[nParams]; - for( int j=0; j 0); - assertTrue(count05 > 0); - } - - - private static class TestDataSetProvider implements DataProvider { - - @Override - public Object trainData(Map dataParameters) { - return new ExistingDataSetIterator( - Collections.singletonList(new DataSet(Nd4j.create(1, 1, 28, 28), Nd4j.create(1,10)))); - } - - @Override - public Object testData(Map dataParameters) { - return new ExistingDataSetIterator( - Collections.singletonList(new DataSet(Nd4j.create(1, 1, 28, 28), Nd4j.create(1,10)))); - } - - @Override - public Class getDataType() { - return DataSetIterator.class; - } - } - - - @Test - public void testDropout(){ - - MultiLayerSpace mls = new MultiLayerSpace.Builder().updater(new Sgd(0.005)).seed(12345) - .addLayer(new ConvolutionLayerSpace.Builder().nOut(2) - .dropOut(new ContinuousParameterSpace(0.4,0.6)) - .build()) - .addLayer(new GlobalPoolingLayerSpace.Builder().dropOut(new ContinuousParameterSpace(0.4,0.6)).build()) - .addLayer(new OutputLayerSpace.Builder().activation(Activation.SOFTMAX).nIn(10).nOut(5).build()) - .setInputType(InputType.convolutional(28, 28, 1)) - .build(); - - int nParams = mls.numParameters(); - List l = LeafUtils.getUniqueObjects(mls.collectLeaves()); - int x=0; - for( ParameterSpace p : l){ - int n = p.numParameters(); - int[] arr = new int[n]; - for(int i=0; i l = LeafUtils.getUniqueObjects(mls.collectLeaves()); - int x=0; - for( ParameterSpace p : l){ - int n = p.numParameters(); - int[] arr = new int[n]; - for(int i=0; i learningRateHyperparam = new DiscreteParameterSpace<>(0.003, 0.005, 0.01, 0.05); - ParameterSpace 
layerSizeHyperparam1 = new DiscreteParameterSpace<>(32, 64, 96, 128); - ParameterSpace layerSizeHyperparam2 = new DiscreteParameterSpace<>(32, 64, 96, 128); - ParameterSpace dropoutHyperparam = new DiscreteParameterSpace<>(0.8, 0.9); - - MultiLayerSpace mls = new MultiLayerSpace.Builder() - .updater(new AdamSpace(learningRateHyperparam)) - .weightInit(WeightInit.XAVIER) - .l2(0.0001) - .addLayer(new DenseLayerSpace.Builder() - .nIn(10) - .nOut(layerSizeHyperparam1) - .build()) - .addLayer(new BatchNormalizationSpace.Builder() - .nOut(layerSizeHyperparam1) - .activation(Activation.RELU) - .build()) - .addLayer(new DropoutLayerSpace.Builder() - .dropOut(dropoutHyperparam) - .build()) - .addLayer(new DenseLayerSpace.Builder() - .nOut(layerSizeHyperparam2) - .build()) - .addLayer(new BatchNormalizationSpace.Builder() - .nOut(layerSizeHyperparam2) - .activation(Activation.RELU) - .build()) - .addLayer(new DropoutLayerSpace.Builder() - .dropOut(dropoutHyperparam) - .build()) - .addLayer(new OutputLayerSpace.Builder() - .nOut(10) - .activation(Activation.SOFTMAX) - .lossFunction(LossFunction.MCXENT) - .build()) - .build(); - - assertEquals(4, mls.getNumParameters()); - - for( int discreteCount : new int[]{1, 5}) { - GridSearchCandidateGenerator generator = new GridSearchCandidateGenerator(mls, discreteCount, GridSearchCandidateGenerator.Mode.Sequential, null); - - int expCandidates = 4 * 4 * 4 * 2; - assertEquals(expCandidates, generator.getTotalNumCandidates()); - - int count = 0; - while (generator.hasMoreCandidates()) { - generator.getCandidate(); - count++; - } - - - assertEquals(expCandidates, count); - } - } - - - @Test - public void testGridCandidateGenerator(){ - ParameterSpace layerSizeParam = new DiscreteParameterSpace<>(32, 48, 64); - ParameterSpace learningRateParam = new DiscreteParameterSpace<>(0.005, 0.007, 0.01); - - MultiLayerSpace hyperParamaterSpace = new MultiLayerSpace.Builder() - .seed(12345) - .biasInit(1) - .l2(1e-4) - .updater(new 
NesterovsSpace(learningRateParam)) - .addLayer(new DenseLayerSpace.Builder().nIn(10).nOut(layerSizeParam) - .weightInit(WeightInit.XAVIER) - .activation(Activation.RELU) - .build()) - .addLayer(new DenseLayerSpace.Builder().nIn(layerSizeParam).nOut(layerSizeParam) - .weightInit(WeightInit.XAVIER) - .activation(Activation.RELU) - .build()) - .addLayer(new OutputLayerSpace.Builder() - .lossFunction(LossFunctions.LossFunction.MSE) - .weightInit(WeightInit.XAVIER) - .activation(Activation.SOFTMAX) - .nIn(layerSizeParam).nOut(10).build()) - .build(); - - CandidateGenerator candidateGenerator = new GridSearchCandidateGenerator(hyperParamaterSpace, 30, GridSearchCandidateGenerator.Mode.Sequential, null); -// CandidateGenerator candidateGenerator = new RandomSearchGenerator(hyperParamaterSpace); - - Set<Pair<Double, Integer>> expCandidates = new HashSet<>(); - for(double d : new double[]{0.005, 0.007, 0.01}){ - for(int i : new int[]{32, 48, 64}){ - expCandidates.add(new Pair<>(d, i)); - } - } - - Set<Pair<Double, Integer>> actCandidates = new HashSet<>(); - while(candidateGenerator.hasMoreCandidates()) { - Candidate<DL4JConfiguration> conf = candidateGenerator.getCandidate(); - MultiLayerConfiguration mlc = conf.getValue().getMultiLayerConfiguration(); - FeedForwardLayer ffl = ((FeedForwardLayer) mlc.getConf(0).getLayer()); - actCandidates.add(new Pair<>(ffl.getIUpdater().getLearningRate(0,0), (int)ffl.getNOut())); - } - - assertEquals(expCandidates, actCandidates); - } -} diff --git a/contrib/attic/arbiter/arbiter-deeplearning4j/src/test/java/org/deeplearning4j/arbiter/multilayernetwork/TestScoreFunctions.java b/contrib/attic/arbiter/arbiter-deeplearning4j/src/test/java/org/deeplearning4j/arbiter/multilayernetwork/TestScoreFunctions.java deleted file mode 100644 index e3417f13b..000000000 --- a/contrib/attic/arbiter/arbiter-deeplearning4j/src/test/java/org/deeplearning4j/arbiter/multilayernetwork/TestScoreFunctions.java +++ /dev/null @@ -1,222 +0,0 @@ -/* - *
****************************************************************************** - * * Copyright (c) 2021 Deeplearning4j Contributors - * * - * * This program and the accompanying materials are made available under the - * * terms of the Apache License, Version 2.0 which is available at - * * https://www.apache.org/licenses/LICENSE-2.0. - * * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * * License for the specific language governing permissions and limitations - * * under the License. - * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ - -package org.deeplearning4j.arbiter.multilayernetwork; - -import lombok.AllArgsConstructor; -import lombok.extern.slf4j.Slf4j; -import org.deeplearning4j.BaseDL4JTest; -import org.deeplearning4j.arbiter.MultiLayerSpace; -import org.deeplearning4j.arbiter.conf.updater.AdamSpace; -import org.deeplearning4j.arbiter.layers.OutputLayerSpace; -import org.deeplearning4j.arbiter.optimize.api.CandidateGenerator; -import org.deeplearning4j.arbiter.optimize.api.OptimizationResult; -import org.deeplearning4j.arbiter.optimize.api.ParameterSpace; -import org.deeplearning4j.arbiter.optimize.api.data.DataProvider; -import org.deeplearning4j.arbiter.optimize.api.saving.InMemoryResultSaver; -import org.deeplearning4j.arbiter.optimize.api.saving.ResultReference; -import org.deeplearning4j.arbiter.optimize.api.saving.ResultSaver; -import org.deeplearning4j.arbiter.optimize.api.score.ScoreFunction; -import org.deeplearning4j.arbiter.optimize.api.termination.MaxCandidatesCondition; -import org.deeplearning4j.arbiter.optimize.config.OptimizationConfiguration; -import org.deeplearning4j.arbiter.optimize.generator.RandomSearchGenerator; -import 
org.deeplearning4j.arbiter.optimize.parameter.continuous.ContinuousParameterSpace; -import org.deeplearning4j.arbiter.optimize.runner.IOptimizationRunner; -import org.deeplearning4j.arbiter.optimize.runner.LocalOptimizationRunner; -import org.deeplearning4j.arbiter.scoring.impl.ROCScoreFunction; -import org.deeplearning4j.arbiter.task.MultiLayerNetworkTaskCreator; -import org.deeplearning4j.datasets.iterator.impl.MnistDataSetIterator; -import org.deeplearning4j.eval.ROC; -import org.deeplearning4j.eval.ROCBinary; -import org.deeplearning4j.eval.ROCMultiClass; -import org.deeplearning4j.nn.conf.WorkspaceMode; -import org.deeplearning4j.nn.multilayer.MultiLayerNetwork; -import org.deeplearning4j.nn.weights.WeightInit; -import org.junit.Test; -import org.nd4j.linalg.activations.Activation; -import org.nd4j.linalg.api.ndarray.INDArray; -import org.nd4j.linalg.dataset.api.DataSet; -import org.nd4j.linalg.dataset.api.DataSetPreProcessor; -import org.nd4j.linalg.dataset.api.iterator.DataSetIterator; -import org.nd4j.linalg.factory.Nd4j; -import org.nd4j.linalg.lossfunctions.LossFunctions; - -import java.io.IOException; -import java.util.List; -import java.util.Map; - -import static org.junit.Assert.assertEquals; - -@Slf4j -public class TestScoreFunctions extends BaseDL4JTest { - - - @Override - public long getTimeoutMilliseconds() { - return 60000L; - } - - @Test - public void testROCScoreFunctions() throws Exception { - - - for (boolean auc : new boolean[]{true, false}) { - for (ROCScoreFunction.ROCType rocType : ROCScoreFunction.ROCType.values()) { - String msg = (auc ? "AUC" : "AUPRC") + " - " + rocType; - log.info("Starting: " + msg); - - ParameterSpace lr = new ContinuousParameterSpace(1e-5, 1e-3); - - int nOut = (rocType == ROCScoreFunction.ROCType.ROC ? 2 : 10); - LossFunctions.LossFunction lf = (rocType == ROCScoreFunction.ROCType.BINARY ? 
- LossFunctions.LossFunction.XENT : LossFunctions.LossFunction.MCXENT); - Activation a = (rocType == ROCScoreFunction.ROCType.BINARY ? Activation.SIGMOID : Activation.SOFTMAX); - MultiLayerSpace mls = new MultiLayerSpace.Builder() - .trainingWorkspaceMode(WorkspaceMode.NONE) - .inferenceWorkspaceMode(WorkspaceMode.NONE) - .updater(new AdamSpace(lr)) - .weightInit(WeightInit.XAVIER) - .layer(new OutputLayerSpace.Builder().nIn(784).nOut(nOut) - .activation(a) - .lossFunction(lf).build()) - .build(); - - CandidateGenerator cg = new RandomSearchGenerator(mls); - ResultSaver rs = new InMemoryResultSaver(); - ScoreFunction sf = new ROCScoreFunction(rocType, (auc ? ROCScoreFunction.Metric.AUC : ROCScoreFunction.Metric.AUPRC)); - - - OptimizationConfiguration oc = new OptimizationConfiguration.Builder() - .candidateGenerator(cg) - .dataProvider(new DP(rocType)) - .modelSaver(rs) - .scoreFunction(sf) - .terminationConditions(new MaxCandidatesCondition(3)) - .rngSeed(12345) - .build(); - - IOptimizationRunner runner = new LocalOptimizationRunner(oc, new MultiLayerNetworkTaskCreator()); - runner.execute(); - - List list = runner.getResults(); - - for (ResultReference rr : list) { - DataSetIterator testIter = new MnistDataSetIterator(4, 16, false, false, false, 12345); - testIter.setPreProcessor(new PreProc(rocType)); - - OptimizationResult or = rr.getResult(); - MultiLayerNetwork net = (MultiLayerNetwork) or.getResultReference().getResultModel(); - - double expScore; - switch (rocType){ - case ROC: - if(auc){ - expScore = net.doEvaluation(testIter, new ROC())[0].calculateAUC(); - } else { - expScore = net.doEvaluation(testIter, new ROC())[0].calculateAUCPR(); - } - break; - case BINARY: - if(auc){ - expScore = net.doEvaluation(testIter, new ROCBinary())[0].calculateAverageAuc(); - } else { - expScore = net.doEvaluation(testIter, new ROCBinary())[0].calculateAverageAUCPR(); - } - break; - case MULTICLASS: - if(auc){ - expScore = net.doEvaluation(testIter, new 
ROCMultiClass())[0].calculateAverageAUC(); - } else { - expScore = net.doEvaluation(testIter, new ROCMultiClass())[0].calculateAverageAUCPR(); - } - break; - default: - throw new RuntimeException(); - } - - - DataSetIterator iter = new MnistDataSetIterator(4, 16, false, false, false, 12345); - iter.setPreProcessor(new PreProc(rocType)); - - assertEquals(msg, expScore, or.getScore(), 1e-4); - } - } - } - } - - @AllArgsConstructor - public static class DP implements DataProvider { - - protected ROCScoreFunction.ROCType rocType; - - @Override - public Object trainData(Map<String, Object> dataParameters) { - try { - DataSetIterator iter = new MnistDataSetIterator(4, 16, false, false, false, 12345); - iter.setPreProcessor(new PreProc(rocType)); - return iter; - } catch (IOException e){ - throw new RuntimeException(e); - } - } - - @Override - public Object testData(Map<String, Object> dataParameters) { - try { - DataSetIterator iter = new MnistDataSetIterator(4, 16, false, false, false, 12345); - iter.setPreProcessor(new PreProc(rocType)); - return iter; - } catch (IOException e){ - throw new RuntimeException(e); - } - } - - @Override - public Class<?> getDataType() { - return DataSetIterator.class; - } - } - - @AllArgsConstructor - public static class PreProc implements DataSetPreProcessor { - protected ROCScoreFunction.ROCType rocType; - - @Override - public void preProcess(DataSet toPreProcess) { - switch (rocType){ - case ROC: - //Convert to binary - long mb = toPreProcess.getLabels().size(0); - INDArray argMax = Nd4j.argMax(toPreProcess.getLabels(), 1); - INDArray newLabel = Nd4j.create(mb, 2); - for( int i=0; i dataParameters) { - try { - return new EarlyTerminationDataSetIterator(new MnistDataSetIterator(batchSize, true, 12345), terminationIter); - } catch (Exception e){ - throw new RuntimeException(e); - } - } - - @Override - public Object testData(Map<String, Object> dataParameters) { - try { - return new EarlyTerminationDataSetIterator(new MnistDataSetIterator(batchSize, false, 12345), terminationIter); - }
catch (Exception e){ - throw new RuntimeException(e); - } - } - - @Override - public Class<?> getDataType() { - return DataSetIterator.class; - } - - -} diff --git a/contrib/attic/arbiter/arbiter-deeplearning4j/src/test/resources/logback.xml b/contrib/attic/arbiter/arbiter-deeplearning4j/src/test/resources/logback.xml deleted file mode 100644 index bc7ffbbb5..000000000 --- a/contrib/attic/arbiter/arbiter-deeplearning4j/src/test/resources/logback.xml +++ /dev/null @@ -1,55 +0,0 @@ - - - - - - logs/application.log - - %date - [%level] - from %logger in %thread - %n%message%n%xException%n - - - - - - %logger{15} - %message%n%xException{5} - - - - - - - - - - - - - - - - - - - - - - \ No newline at end of file diff --git a/contrib/attic/arbiter/arbiter-server/pom.xml b/contrib/attic/arbiter/arbiter-server/pom.xml deleted file mode 100644 index cc6879397..000000000 --- a/contrib/attic/arbiter/arbiter-server/pom.xml +++ /dev/null @@ -1,58 +0,0 @@ - <project xmlns="http://maven.apache.org/POM/4.0.0"> - <modelVersion>4.0.0</modelVersion> - - <parent> - <groupId>org.deeplearning4j</groupId> - <artifactId>arbiter</artifactId> - <version>1.0.0-SNAPSHOT</version> - </parent> - - <artifactId>arbiter-server</artifactId> - - <name>arbiter-server</name> - - <dependencies> - <dependency> - <groupId>org.deeplearning4j</groupId> - <artifactId>arbiter-deeplearning4j</artifactId> - <version>${project.version}</version> - </dependency> - <dependency> - <groupId>com.beust</groupId> - <artifactId>jcommander</artifactId> - </dependency> - </dependencies> - - <profiles> - <profile> - <id>test-nd4j-native</id> - </profile> - <profile> - <id>test-nd4j-cuda-11.0</id> - </profile> - </profiles> - </project> - diff --git a/contrib/attic/arbiter/arbiter-server/src/main/java/org/deeplearning4j/arbiter/server/ArbiterCliGenerator.java b/contrib/attic/arbiter/arbiter-server/src/main/java/org/deeplearning4j/arbiter/server/ArbiterCliGenerator.java deleted file mode 100644 index d39eb28a6..000000000 --- a/contrib/attic/arbiter/arbiter-server/src/main/java/org/deeplearning4j/arbiter/server/ArbiterCliGenerator.java +++ /dev/null @@ -1,286 +0,0 @@ -/* - * ****************************************************************************** - * * Copyright (c) 2021 Deeplearning4j Contributors - * * - * * This program and the accompanying materials are made available under the - * * terms of the Apache License, Version 2.0 which is available at - * *
https://www.apache.org/licenses/LICENSE-2.0. - * * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * * License for the specific language governing permissions and limitations - * * under the License. - * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ - -package org.deeplearning4j.arbiter.server; - -import com.beust.jcommander.JCommander; -import com.beust.jcommander.Parameter; -import com.beust.jcommander.ParameterException; -import org.apache.commons.io.FileUtils; -import org.deeplearning4j.arbiter.ComputationGraphSpace; -import org.deeplearning4j.arbiter.MultiLayerSpace; -import org.deeplearning4j.arbiter.optimize.api.CandidateGenerator; -import org.deeplearning4j.arbiter.optimize.api.data.DataProvider; -import org.deeplearning4j.arbiter.optimize.api.data.DataSetIteratorFactoryProvider; -import org.deeplearning4j.arbiter.optimize.api.score.ScoreFunction; -import org.deeplearning4j.arbiter.optimize.api.termination.MaxCandidatesCondition; -import org.deeplearning4j.arbiter.optimize.api.termination.MaxTimeCondition; -import org.deeplearning4j.arbiter.optimize.api.termination.TerminationCondition; -import org.deeplearning4j.arbiter.optimize.generator.GridSearchCandidateGenerator; -import org.deeplearning4j.arbiter.optimize.generator.RandomSearchGenerator; -import org.deeplearning4j.arbiter.optimize.config.OptimizationConfiguration; -import org.deeplearning4j.arbiter.saver.local.FileModelSaver; -import org.deeplearning4j.arbiter.scoring.RegressionValue; -import org.deeplearning4j.arbiter.scoring.ScoreFunctions; - -import java.io.File; -import java.util.ArrayList; -import java.util.HashMap; -import java.util.List; -import java.util.Map; -import java.util.concurrent.TimeUnit; - -/** - * Generate an 
{@link OptimizationConfiguration} - * via the command line interface. - * You can then use this configuration json file from - * {@link ArbiterCliRunner} - * - * @author Adam Gibson - */ -public class ArbiterCliGenerator { - @Parameter(names = {"--searchSpacePath"}) - private String searchSpacePath = null; - @Parameter(names = {"--candidateType"},required = true) - private String candidateType = null; - @Parameter(names = {"--discretizationCount"}) - private int discretizationCount = 5; - @Parameter(names = {"--gridSearchOrder"}) - private String gridSearchOrder = null; - @Parameter(names = {"--neuralNetType"},required = true) - private String neuralNetType = null; - @Parameter(names = {"--dataSetIteratorClass"},required = true) - private String dataSetIteratorClass = null; - @Parameter(names = {"--modelOutputPath"},required = true) - private String modelOutputPath = null; - @Parameter(names = {"--score"},required = true) - private String score = null; - @Parameter(names = {"--problemType"},required = true) - private String problemType = CLASSIFICIATION; - @Parameter(names = {"--configSavePath"},required = true) - private String configSavePath = null; - - @Parameter(names = {"--duration"},description = "The number of minutes to run for. Default is -1 which means run till convergence.") - private long duration = -1; - @Parameter(names = {"--numCandidates"},description = "The number of candidates to generate. 
Default is 1.") - private int numCandidates = 1; - - public final static String REGRESSION_MULTI = "regression"; - public final static String REGRESSION = "regression"; - public final static String CLASSIFICIATION = "classification"; - - public final static String RANDOM_CANDIDATE = "random"; - public final static String GRID_SEARCH_CANDIDATE = "gridsearch"; - - public final static String SEQUENTIAL_ORDER = "sequence"; - public final static String RANDOM_ORDER = "random"; - - public final static String COMP_GRAPH = "compgraph"; - public final static String MULTI_LAYER = "multilayer"; - - public final static String ACCURACY = "accuracy"; - public final static String F1 = "f1"; - - public final static String ACCURACY_MULTI = "accuracy_multi"; - public final static String F1_MULTI = "f1_multi"; - - - public final static String REGRESSION_SCORE = "regression_score"; - public final static String REGRESSION_SCORE_MULTI = "regression_score_multi"; - - public void runMain(String...args) throws Exception { - JCommander jcmdr = new JCommander(this); - - try { - jcmdr.parse(args); - } catch(ParameterException e) { - System.err.println(e.getMessage()); - //User provides invalid input -> print the usage info - jcmdr.usage(); - try{ Thread.sleep(500); } catch(Exception e2){ } - System.exit(1); - } - - - DataProvider dataProvider = new DataSetIteratorFactoryProvider(); - Map commands = new HashMap<>(); - commands.put(DataSetIteratorFactoryProvider.FACTORY_KEY,dataSetIteratorClass); - - - if(neuralNetType.equals(MULTI_LAYER)) { - MultiLayerSpace multiLayerSpace = loadMultiLayer(); - CandidateGenerator candidateGenerator = null; - if(candidateType.equals(GRID_SEARCH_CANDIDATE)) { - candidateGenerator = new RandomSearchGenerator(multiLayerSpace,commands); - - - - } - else if(candidateType.equals(RANDOM_CANDIDATE)) { - candidateGenerator = new RandomSearchGenerator(multiLayerSpace,commands); - - } - - if(problemType.equals(CLASSIFICIATION)) { - OptimizationConfiguration configuration 
- = new OptimizationConfiguration.Builder() - .candidateGenerator(candidateGenerator) - .dataProvider(dataProvider) - .modelSaver(new FileModelSaver(modelOutputPath)) - .scoreFunction(scoreFunctionMultiLayerNetwork()) - .terminationConditions(getConditions()) - .build(); - FileUtils.writeStringToFile(new File(configSavePath),configuration.toJson()); - - } - else if(problemType.equals(REGRESSION)) { - OptimizationConfiguration configuration - = new OptimizationConfiguration.Builder() - .candidateGenerator(candidateGenerator) - .dataProvider(dataProvider) - .modelSaver(new FileModelSaver(modelOutputPath)) - .scoreFunction(scoreFunctionMultiLayerNetwork()) - .terminationConditions(getConditions()) - .build(); - FileUtils.writeStringToFile(new File(configSavePath),configuration.toJson()); - - } - - - } - else if(neuralNetType.equals(COMP_GRAPH)) { - ComputationGraphSpace computationGraphSpace = loadCompGraph(); - CandidateGenerator candidateGenerator = null; - if(candidateType.equals(GRID_SEARCH_CANDIDATE)) { - candidateGenerator = new RandomSearchGenerator(computationGraphSpace,commands); - - } - else if(candidateType.equals(RANDOM_CANDIDATE)) { - candidateGenerator = new RandomSearchGenerator(computationGraphSpace,commands); - - } - - - if(problemType.equals(CLASSIFICIATION)) { - OptimizationConfiguration configuration - = new OptimizationConfiguration.Builder() - .candidateGenerator(candidateGenerator) - .dataProvider(dataProvider) - .modelSaver(new FileModelSaver(modelOutputPath)) - .scoreFunction(scoreFunctionCompGraph()) - .terminationConditions(getConditions()) - .build(); - - FileUtils.writeStringToFile(new File(configSavePath),configuration.toJson()); - } - else { - OptimizationConfiguration configuration - = new OptimizationConfiguration.Builder() - .candidateGenerator(candidateGenerator) - .dataProvider(dataProvider) - .modelSaver(new FileModelSaver(modelOutputPath)) - .scoreFunction(scoreFunctionCompGraph()) - .terminationConditions(getConditions()) - 
.build(); - FileUtils.writeStringToFile(new File(configSavePath),configuration.toJson()); - - - } - - - } - - - } - - public static void main(String...args) throws Exception { - new ArbiterCliGenerator().runMain(args); - } - - private List getConditions() { - List ret = new ArrayList<>(); - if(duration > 0) { - ret.add(new MaxTimeCondition(duration,TimeUnit.MINUTES)); - } - - if(numCandidates > 0) - ret.add(new MaxCandidatesCondition(numCandidates)); - if(ret.isEmpty()) { - ret.add(new MaxCandidatesCondition(1)); - } - return ret; - } - - - private GridSearchCandidateGenerator.Mode getMode() { - if(gridSearchOrder.equals(RANDOM_ORDER)) - return GridSearchCandidateGenerator.Mode.RandomOrder; - else if(gridSearchOrder.equals(SEQUENTIAL_ORDER)) { - return GridSearchCandidateGenerator.Mode.Sequential; - } - else throw new IllegalArgumentException("Illegal mode " + gridSearchOrder); - } - - private ScoreFunction scoreFunctionCompGraph() { - if(problemType.equals(CLASSIFICIATION)) { - switch(score) { - case ACCURACY: return ScoreFunctions.testSetAccuracy(); - case F1: return ScoreFunctions.testSetF1(); - case F1_MULTI : return ScoreFunctions.testSetF1(); - case ACCURACY_MULTI: return ScoreFunctions.testSetAccuracy(); - - default: throw new IllegalArgumentException("Score " + score + " not valid for type " + problemType); - } - } - else if(problemType.equals(REGRESSION)) { - switch(score) { - case REGRESSION_SCORE: return ScoreFunctions.testSetRegression(RegressionValue.valueOf(score)); - case REGRESSION_SCORE_MULTI: return ScoreFunctions.testSetRegression(RegressionValue.valueOf(score)); - default: throw new IllegalArgumentException("Score " + score + " not valid for type " + problemType); - } - } - throw new IllegalStateException("Illegal problem type " + problemType); - } - - private ScoreFunction scoreFunctionMultiLayerNetwork() { - if(problemType.equals(CLASSIFICIATION)) { - switch(score) { - case ACCURACY: return ScoreFunctions.testSetAccuracy(); - case F1: return 
ScoreFunctions.testSetF1(); - - default: throw new IllegalArgumentException("Score " + score + " not valid for type " + problemType); - } - } - else if(problemType.equals(REGRESSION)) { - switch(score) { - case REGRESSION_SCORE: return ScoreFunctions.testSetRegression(RegressionValue.valueOf(score)); - default: throw new IllegalArgumentException("Score " + score + " not valid for type " + problemType); - - } - } - throw new IllegalStateException("Illegal problem type " + problemType); - } - - private ComputationGraphSpace loadCompGraph() throws Exception { - return ComputationGraphSpace.fromJson(FileUtils.readFileToString(new File(searchSpacePath))); - } - - private MultiLayerSpace loadMultiLayer() throws Exception { - return MultiLayerSpace.fromJson(FileUtils.readFileToString(new File(searchSpacePath))); - } -} diff --git a/contrib/attic/arbiter/arbiter-server/src/main/java/org/deeplearning4j/arbiter/server/ArbiterCliRunner.java b/contrib/attic/arbiter/arbiter-server/src/main/java/org/deeplearning4j/arbiter/server/ArbiterCliRunner.java deleted file mode 100644 index 8fc57d77e..000000000 --- a/contrib/attic/arbiter/arbiter-server/src/main/java/org/deeplearning4j/arbiter/server/ArbiterCliRunner.java +++ /dev/null @@ -1,154 +0,0 @@ -/* - * ****************************************************************************** - * * Copyright (c) 2021 Deeplearning4j Contributors - * * - * * This program and the accompanying materials are made available under the - * * terms of the Apache License, Version 2.0 which is available at - * * https://www.apache.org/licenses/LICENSE-2.0. - * * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * * License for the specific language governing permissions and limitations - * * under the License. 
- * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ - -package org.deeplearning4j.arbiter.server; - -import com.beust.jcommander.JCommander; -import com.beust.jcommander.Parameter; -import com.beust.jcommander.ParameterException; -import org.apache.commons.io.FileUtils; -import org.deeplearning4j.arbiter.evaluator.multilayer.ClassificationEvaluator; -import org.deeplearning4j.arbiter.evaluator.multilayer.RegressionDataEvaluator; -import org.deeplearning4j.arbiter.optimize.api.data.DataSetIteratorFactoryProvider; -import org.deeplearning4j.arbiter.optimize.config.OptimizationConfiguration; -import org.deeplearning4j.arbiter.optimize.runner.IOptimizationRunner; -import org.deeplearning4j.arbiter.optimize.runner.LocalOptimizationRunner; -import org.deeplearning4j.arbiter.scoring.RegressionValue; -import org.deeplearning4j.arbiter.server.cli.NeuralNetTypeValidator; -import org.deeplearning4j.arbiter.server.cli.ProblemTypeValidator; -import org.deeplearning4j.arbiter.task.ComputationGraphTaskCreator; -import org.deeplearning4j.arbiter.task.MultiLayerNetworkTaskCreator; - -import java.io.File; -import java.util.HashMap; -import java.util.Map; - -/** - * Options: - * --dataSetIteratorClass - --modelSavePath - Default: /tmp - * --neuralNetType - --optimizationConfigPath - --problemType - Default: classification - --regressionType - - - - @author Adam Gibson - */ -public class ArbiterCliRunner { - @Parameter(names = {"--modelSavePath"}) - private String modelSavePath = System.getProperty("java.io.tmpdir"); - @Parameter(names = {"--optimizationConfigPath"}) - private String optimizationConfigPath = null; - @Parameter(names = {"--problemType"},validateWith = ProblemTypeValidator.class) - private String problemType = CLASSIFICATION; - @Parameter(names = {"--regressionType"}) - private String regressionType = null; - @Parameter(names = {"--dataSetIteratorClass"},required = true) - private String 
dataSetIteratorClass = null; - @Parameter(names = {"--neuralNetType"},required = true,validateWith = NeuralNetTypeValidator.class) - private String neuralNetType = null; - - public final static String CLASSIFICATION = "classification"; - public final static String REGRESSION = "regression"; - - - public final static String COMP_GRAPH = "compgraph"; - public final static String MULTI_LAYER_NETWORK = "multilayernetwork"; - - public void runMain(String...args) throws Exception { - JCommander jcmdr = new JCommander(this); - - try { - jcmdr.parse(args); - } catch(ParameterException e) { - System.err.println(e.getMessage()); - //User provides invalid input -> print the usage info - jcmdr.usage(); - try{ Thread.sleep(500); } catch(Exception e2){ } - System.exit(1); - } - - Map commands = new HashMap<>(); - commands.put(DataSetIteratorFactoryProvider.FACTORY_KEY,dataSetIteratorClass); - - File f = new File(modelSavePath); - - if(f.exists()) f.delete(); - f.mkdir(); - f.deleteOnExit(); - - if(problemType.equals(REGRESSION)) { - if(neuralNetType.equals(COMP_GRAPH)) { - OptimizationConfiguration configuration - = OptimizationConfiguration.fromJson( - FileUtils.readFileToString(new File(optimizationConfigPath))); - - IOptimizationRunner runner - = new LocalOptimizationRunner(configuration, new ComputationGraphTaskCreator( - new RegressionDataEvaluator(RegressionValue.valueOf(regressionType),commands))); - runner.execute(); - } - else if(neuralNetType.equals(MULTI_LAYER_NETWORK)) { - OptimizationConfiguration configuration = OptimizationConfiguration. 
- fromJson(FileUtils.readFileToString(new File(optimizationConfigPath))); - - IOptimizationRunner runner - = new LocalOptimizationRunner( - configuration, - new MultiLayerNetworkTaskCreator( - new RegressionDataEvaluator( - RegressionValue.valueOf(regressionType), - commands))); - runner.execute(); - } - } - - else if(problemType.equals(CLASSIFICATION)) { - if(neuralNetType.equals(COMP_GRAPH)) { - OptimizationConfiguration configuration - = OptimizationConfiguration.fromJson(FileUtils.readFileToString(new File(optimizationConfigPath))); - - IOptimizationRunner runner - = new LocalOptimizationRunner( - configuration,new ComputationGraphTaskCreator(new ClassificationEvaluator())); - - runner.execute(); - } - else if(neuralNetType.equals(MULTI_LAYER_NETWORK)) { - OptimizationConfiguration configuration = OptimizationConfiguration - .fromJson(FileUtils.readFileToString(new File(optimizationConfigPath))); - - IOptimizationRunner runner - = new LocalOptimizationRunner(configuration, - new MultiLayerNetworkTaskCreator( - new ClassificationEvaluator()) - ); - - runner.execute(); - } - } - } - public static void main(String...args) throws Exception { - new ArbiterCliRunner().runMain(args); - } - -} diff --git a/contrib/attic/arbiter/arbiter-server/src/main/java/org/deeplearning4j/arbiter/server/cli/NeuralNetTypeValidator.java b/contrib/attic/arbiter/arbiter-server/src/main/java/org/deeplearning4j/arbiter/server/cli/NeuralNetTypeValidator.java deleted file mode 100644 index 548def0fa..000000000 --- a/contrib/attic/arbiter/arbiter-server/src/main/java/org/deeplearning4j/arbiter/server/cli/NeuralNetTypeValidator.java +++ /dev/null @@ -1,43 +0,0 @@ -/* - * ****************************************************************************** - * * Copyright (c) 2021 Deeplearning4j Contributors - * * - * * This program and the accompanying materials are made available under the - * * terms of the Apache License, Version 2.0 which is available at - * * 
https://www.apache.org/licenses/LICENSE-2.0. - * * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * * License for the specific language governing permissions and limitations - * * under the License. - * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ - -package org.deeplearning4j.arbiter.server.cli; - -import com.beust.jcommander.IParameterValidator; -import com.beust.jcommander.ParameterException; -import org.deeplearning4j.arbiter.server.ArbiterCliRunner; - -/** - * Created by agibsonccc on 3/13/17. - */ -public class NeuralNetTypeValidator implements IParameterValidator { - /** - * Validate the parameter. - * - * @param name The name of the parameter (e.g. "-host"). - * @param value The value of the parameter that we need to validate - * @throws ParameterException Thrown if the value of the parameter is invalid. 
- */ - @Override - public void validate(String name, String value) throws ParameterException { - if(!value.equals(ArbiterCliRunner.MULTI_LAYER_NETWORK) && !value.equals(ArbiterCliRunner.COMP_GRAPH)) { - throw new ParameterException("Neural net type can only be " + ArbiterCliRunner.COMP_GRAPH + " or " + ArbiterCliRunner.MULTI_LAYER_NETWORK); - - } - } -} diff --git a/contrib/attic/arbiter/arbiter-server/src/main/java/org/deeplearning4j/arbiter/server/cli/ProblemTypeValidator.java b/contrib/attic/arbiter/arbiter-server/src/main/java/org/deeplearning4j/arbiter/server/cli/ProblemTypeValidator.java deleted file mode 100644 index 1bc04a5d0..000000000 --- a/contrib/attic/arbiter/arbiter-server/src/main/java/org/deeplearning4j/arbiter/server/cli/ProblemTypeValidator.java +++ /dev/null @@ -1,43 +0,0 @@ -/* - * ****************************************************************************** - * * Copyright (c) 2021 Deeplearning4j Contributors - * * - * * This program and the accompanying materials are made available under the - * * terms of the Apache License, Version 2.0 which is available at - * * https://www.apache.org/licenses/LICENSE-2.0. - * * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * * License for the specific language governing permissions and limitations - * * under the License. - * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ - -package org.deeplearning4j.arbiter.server.cli; - -import com.beust.jcommander.IParameterValidator; -import com.beust.jcommander.ParameterException; -import org.deeplearning4j.arbiter.server.ArbiterCliGenerator; - -/** - * Created by agibsonccc on 3/13/17. - */ -public class ProblemTypeValidator implements IParameterValidator { - /** - * Validate the parameter.
- * - * @param name The name of the parameter (e.g. "-host"). - * @param value The value of the parameter that we need to validate - * @throws ParameterException Thrown if the value of the parameter is invalid. - */ - @Override - public void validate(String name, String value) throws ParameterException { - if(!value.equals(ArbiterCliGenerator.REGRESSION) && !value.equals(ArbiterCliGenerator.CLASSIFICIATION)) { - throw new ParameterException("Problem type can only be " + ArbiterCliGenerator.REGRESSION + " or " + ArbiterCliGenerator.CLASSIFICIATION); - - } - } -} diff --git a/contrib/attic/arbiter/arbiter-server/src/test/java/org/deeplearning4j/arbiter/server/ArbiterCLIRunnerTest.java b/contrib/attic/arbiter/arbiter-server/src/test/java/org/deeplearning4j/arbiter/server/ArbiterCLIRunnerTest.java deleted file mode 100644 index 57abeb65e..000000000 --- a/contrib/attic/arbiter/arbiter-server/src/test/java/org/deeplearning4j/arbiter/server/ArbiterCLIRunnerTest.java +++ /dev/null @@ -1,122 +0,0 @@ -/* - * ****************************************************************************** - * * Copyright (c) 2021 Deeplearning4j Contributors - * * - * * This program and the accompanying materials are made available under the - * * terms of the Apache License, Version 2.0 which is available at - * * https://www.apache.org/licenses/LICENSE-2.0. - * * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * * License for the specific language governing permissions and limitations - * * under the License.
- * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ - -package org.deeplearning4j.arbiter.server; - -import lombok.extern.slf4j.Slf4j; -import org.apache.commons.io.FileUtils; -import org.deeplearning4j.BaseDL4JTest; -import org.deeplearning4j.arbiter.MultiLayerSpace; -import org.deeplearning4j.arbiter.conf.updater.SgdSpace; -import org.deeplearning4j.arbiter.layers.DenseLayerSpace; -import org.deeplearning4j.arbiter.layers.OutputLayerSpace; -import org.deeplearning4j.arbiter.optimize.api.CandidateGenerator; -import org.deeplearning4j.arbiter.optimize.api.data.DataProvider; -import org.deeplearning4j.arbiter.optimize.api.data.DataSetIteratorFactoryProvider; -import org.deeplearning4j.arbiter.optimize.api.termination.MaxCandidatesCondition; -import org.deeplearning4j.arbiter.optimize.api.termination.MaxTimeCondition; -import org.deeplearning4j.arbiter.optimize.generator.RandomSearchGenerator; -import org.deeplearning4j.arbiter.optimize.config.OptimizationConfiguration; -import org.deeplearning4j.arbiter.optimize.parameter.continuous.ContinuousParameterSpace; -import org.deeplearning4j.arbiter.optimize.parameter.discrete.DiscreteParameterSpace; -import org.deeplearning4j.arbiter.optimize.parameter.integer.IntegerParameterSpace; -import org.deeplearning4j.arbiter.saver.local.FileModelSaver; -import org.deeplearning4j.arbiter.scoring.impl.TestSetLossScoreFunction; -import org.deeplearning4j.nn.api.OptimizationAlgorithm; -import org.junit.Test; -import org.nd4j.linalg.activations.Activation; -import org.nd4j.linalg.lossfunctions.LossFunctions; - -import java.io.File; -import java.util.HashMap; -import java.util.Map; -import java.util.UUID; -import java.util.concurrent.TimeUnit; - -import static org.junit.Assert.assertEquals; - -/** - * Created by agibsonccc on 3/12/17. 
- */ -@Slf4j -public class ArbiterCLIRunnerTest extends BaseDL4JTest { - - @Override - public long getTimeoutMilliseconds() { - return 90000; - } - - @Test - public void testCliRunner() throws Exception { - ArbiterCliRunner cliRunner = new ArbiterCliRunner(); - - //Define: network config (hyperparameter space) - MultiLayerSpace mls = new MultiLayerSpace.Builder() - .optimizationAlgo(OptimizationAlgorithm.STOCHASTIC_GRADIENT_DESCENT) - .updater(new SgdSpace(new ContinuousParameterSpace(0.0001, 0.1))) - .l2(new ContinuousParameterSpace(0.0001, 0.01)) - .addLayer(new DenseLayerSpace.Builder().nIn(784).nOut(new IntegerParameterSpace(2,10)) - .activation(new DiscreteParameterSpace<>(Activation.RELU, Activation.TANH)) - .build()) - .addLayer(new OutputLayerSpace.Builder().nOut(10).activation(Activation.SOFTMAX) - .lossFunction(LossFunctions.LossFunction.MCXENT).build()) - .numEpochs(3).build(); - assertEquals(mls,MultiLayerSpace.fromJson(mls.toJson())); - //Define configuration: - Map commands = new HashMap<>(); - commands.put(DataSetIteratorFactoryProvider.FACTORY_KEY,TestDataFactoryProviderMnist.class.getCanonicalName()); - - CandidateGenerator candidateGenerator = new RandomSearchGenerator(mls,commands); - DataProvider dataProvider = new DataSetIteratorFactoryProvider(); - - -// String modelSavePath = FilenameUtils.concat(System.getProperty("java.io.tmpdir"),"ArbiterDL4JTest/"); - String modelSavePath = new File(System.getProperty("java.io.tmpdir"),"ArbiterDL4JTest/").getAbsolutePath(); - File dir = new File(modelSavePath); - if(!dir.exists()) - dir.mkdirs(); - String configPath = System.getProperty("java.io.tmpdir") + File.separator + UUID.randomUUID().toString() + ".json"; - OptimizationConfiguration configuration - = new OptimizationConfiguration.Builder() - .candidateGenerator(candidateGenerator) - .dataProvider(dataProvider) - .modelSaver(new FileModelSaver(modelSavePath)) - .scoreFunction(new TestSetLossScoreFunction()) - .terminationConditions(new 
MaxTimeCondition(30, TimeUnit.SECONDS), - new MaxCandidatesCondition(5)) - .build(); - assertEquals(configuration,OptimizationConfiguration.fromJson(configuration.toJson())); - - FileUtils.writeStringToFile(new File(configPath),configuration.toJson()); -// System.out.println(configuration.toJson()); - configuration.toJson(); - - log.info("Starting test"); - cliRunner.runMain( - "--dataSetIteratorClass", - TestDataFactoryProviderMnist.class.getCanonicalName(), - "--neuralNetType", - ArbiterCliRunner.MULTI_LAYER_NETWORK, - "--optimizationConfigPath", - configPath - ); - } - - - -} diff --git a/contrib/attic/arbiter/arbiter-server/src/test/java/org/deeplearning4j/arbiter/server/AssertTestsExtendBaseClass.java b/contrib/attic/arbiter/arbiter-server/src/test/java/org/deeplearning4j/arbiter/server/AssertTestsExtendBaseClass.java deleted file mode 100644 index db422ba65..000000000 --- a/contrib/attic/arbiter/arbiter-server/src/test/java/org/deeplearning4j/arbiter/server/AssertTestsExtendBaseClass.java +++ /dev/null @@ -1,52 +0,0 @@ -/* - * ****************************************************************************** - * * Copyright (c) 2021 Deeplearning4j Contributors - * * - * * This program and the accompanying materials are made available under the - * * terms of the Apache License, Version 2.0 which is available at - * * https://www.apache.org/licenses/LICENSE-2.0. - * * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * * License for the specific language governing permissions and limitations - * * under the License. 
- * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ -package org.deeplearning4j.arbiter.server; - -import lombok.extern.slf4j.Slf4j; -import org.deeplearning4j.BaseDL4JTest; -import org.nd4j.common.tests.AbstractAssertTestsClass; - -import java.util.*; - -/** - * This class checks that all test classes (i.e., anything with one or more methods annotated with @Test) - * extends BaseDl4jTest - either directly or indirectly. - * Other than a small set of exceptions, all tests must extend this - * - * @author Alex Black - */ - -@Slf4j -public class AssertTestsExtendBaseClass extends AbstractAssertTestsClass { - - @Override - protected Set> getExclusions() { - //Set of classes that are exclusions to the rule (either run manually or have their own logging + timeouts) - return new HashSet<>(); - } - - @Override - protected String getPackageName() { - return "org.deeplearning4j.arbiter.server"; - } - - @Override - protected Class getBaseClass() { - return BaseDL4JTest.class; - } -} diff --git a/contrib/attic/arbiter/arbiter-server/src/test/java/org/deeplearning4j/arbiter/server/MnistDataSetIteratorFactory.java b/contrib/attic/arbiter/arbiter-server/src/test/java/org/deeplearning4j/arbiter/server/MnistDataSetIteratorFactory.java deleted file mode 100644 index b6cc8618c..000000000 --- a/contrib/attic/arbiter/arbiter-server/src/test/java/org/deeplearning4j/arbiter/server/MnistDataSetIteratorFactory.java +++ /dev/null @@ -1,45 +0,0 @@ -/* - * ****************************************************************************** - * * Copyright (c) 2021 Deeplearning4j Contributors - * * - * * This program and the accompanying materials are made available under the - * * terms of the Apache License, Version 2.0 which is available at - * * https://www.apache.org/licenses/LICENSE-2.0. 
- * * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * * License for the specific language governing permissions and limitations - * * under the License. - * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ - -package org.deeplearning4j.arbiter.server; - -import lombok.Data; -import org.deeplearning4j.BaseDL4JTest; -import org.deeplearning4j.datasets.iterator.impl.MnistDataSetIterator; -import org.nd4j.linalg.dataset.api.iterator.DataSetIterator; -import org.nd4j.linalg.dataset.api.iterator.DataSetIteratorFactory; - -import java.io.IOException; - -/** - * Created by agibsonccc on 3/13/17. - */ -@Data -public class MnistDataSetIteratorFactory extends BaseDL4JTest implements DataSetIteratorFactory { - /** - * @return - */ - @Override - public DataSetIterator create() { - try { - return new MnistDataSetIterator(1000,1000); - } catch (IOException e) { - throw new RuntimeException(e); - } - } -} diff --git a/contrib/attic/arbiter/arbiter-server/src/test/java/org/deeplearning4j/arbiter/server/TestDataFactoryProviderMnist.java b/contrib/attic/arbiter/arbiter-server/src/test/java/org/deeplearning4j/arbiter/server/TestDataFactoryProviderMnist.java deleted file mode 100644 index cfada444d..000000000 --- a/contrib/attic/arbiter/arbiter-server/src/test/java/org/deeplearning4j/arbiter/server/TestDataFactoryProviderMnist.java +++ /dev/null @@ -1,46 +0,0 @@ -/* - * ****************************************************************************** - * * Copyright (c) 2021 Deeplearning4j Contributors - * * - * * This program and the accompanying materials are made available under the - * * terms of the Apache License, Version 2.0 which is available at - * * https://www.apache.org/licenses/LICENSE-2.0. 
- * * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * * License for the specific language governing permissions and limitations - * * under the License. - * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ - -package org.deeplearning4j.arbiter.server; - -import lombok.AllArgsConstructor; -import org.deeplearning4j.BaseDL4JTest; -import org.deeplearning4j.datasets.iterator.EarlyTerminationDataSetIterator; -import org.deeplearning4j.datasets.iterator.impl.MnistDataSetIterator; -import org.nd4j.linalg.dataset.api.iterator.DataSetIterator; -import org.nd4j.linalg.dataset.api.iterator.DataSetIteratorFactory; - -@AllArgsConstructor -public class TestDataFactoryProviderMnist extends BaseDL4JTest implements DataSetIteratorFactory { - - private int batchSize; - private int terminationIter; - - public TestDataFactoryProviderMnist(){ - this(16, 10); - } - - @Override - public DataSetIterator create() { - try { - return new EarlyTerminationDataSetIterator(new MnistDataSetIterator(batchSize, true, 12345), terminationIter); - } catch (Exception e){ - throw new RuntimeException(e); - } - } -} diff --git a/contrib/attic/arbiter/arbiter-ui/pom.xml b/contrib/attic/arbiter/arbiter-ui/pom.xml deleted file mode 100644 index c85ba52c3..000000000 --- a/contrib/attic/arbiter/arbiter-ui/pom.xml +++ /dev/null @@ -1,68 +0,0 @@ - - - - - - 4.0.0 - - - org.deeplearning4j - arbiter - 1.0.0-SNAPSHOT - - - arbiter-ui - - arbiter-ui - - - 1.8 - 1.8 - - - - - org.deeplearning4j - arbiter-core - ${project.version} - - - org.deeplearning4j - arbiter-deeplearning4j - ${project.version} - - - org.deeplearning4j - deeplearning4j-ui - - - - - - test-nd4j-native - - - test-nd4j-cuda-11.0 - - - diff --git 
a/contrib/attic/arbiter/arbiter-ui/src/main/java/org/deeplearning4j/arbiter/ui/UpdateStatus.java b/contrib/attic/arbiter/arbiter-ui/src/main/java/org/deeplearning4j/arbiter/ui/UpdateStatus.java deleted file mode 100644 index c86f70903..000000000 --- a/contrib/attic/arbiter/arbiter-ui/src/main/java/org/deeplearning4j/arbiter/ui/UpdateStatus.java +++ /dev/null @@ -1,35 +0,0 @@ -/* - * ****************************************************************************** - * * Copyright (c) 2021 Deeplearning4j Contributors - * * - * * This program and the accompanying materials are made available under the - * * terms of the Apache License, Version 2.0 which is available at - * * https://www.apache.org/licenses/LICENSE-2.0. - * * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * * License for the specific language governing permissions and limitations - * * under the License. 
- * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ - -package org.deeplearning4j.arbiter.ui; - -import lombok.AllArgsConstructor; -import lombok.Data; -import lombok.EqualsAndHashCode; -import lombok.NoArgsConstructor; - -@AllArgsConstructor -@NoArgsConstructor -@EqualsAndHashCode -@Data -public class UpdateStatus { - - private long statusUpdateTime; - private long settingsUpdateTime; - private long resultsUpdateTime; -} diff --git a/contrib/attic/arbiter/arbiter-ui/src/main/java/org/deeplearning4j/arbiter/ui/data/BaseJavaPersistable.java b/contrib/attic/arbiter/arbiter-ui/src/main/java/org/deeplearning4j/arbiter/ui/data/BaseJavaPersistable.java deleted file mode 100644 index 33367d03e..000000000 --- a/contrib/attic/arbiter/arbiter-ui/src/main/java/org/deeplearning4j/arbiter/ui/data/BaseJavaPersistable.java +++ /dev/null @@ -1,161 +0,0 @@ -/* - * ****************************************************************************** - * * Copyright (c) 2021 Deeplearning4j Contributors - * * - * * This program and the accompanying materials are made available under the - * * terms of the Apache License, Version 2.0 which is available at - * * https://www.apache.org/licenses/LICENSE-2.0. - * * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * * License for the specific language governing permissions and limitations - * * under the License. 
- * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ - -package org.deeplearning4j.arbiter.ui.data; - -import lombok.AllArgsConstructor; -import org.apache.commons.compress.utils.IOUtils; -import org.deeplearning4j.core.storage.Persistable; -import org.deeplearning4j.arbiter.ui.module.ArbiterModule; - -import java.io.*; -import java.lang.reflect.Field; -import java.lang.reflect.Modifier; -import java.nio.ByteBuffer; -import java.util.ArrayList; -import java.util.List; - -/** - * Common implementation - * - * @author Alex Black - */ -@AllArgsConstructor -public abstract class BaseJavaPersistable implements Persistable { - - private String sessionId; - private long timestamp; - - public BaseJavaPersistable(Builder builder){ - this.sessionId = builder.sessionId; - this.timestamp = builder.timestamp; - } - - protected BaseJavaPersistable(){ - //No-arg constructor for Persistable encoding/decoding - } - - @Override - public String getTypeID() { - return ArbiterModule.ARBITER_UI_TYPE_ID; - } - - @Override - public long getTimeStamp() { - return timestamp; - } - - @Override - public String getSessionID() { - return sessionId; - } - - @Override - public int encodingLengthBytes() { - //TODO - presumably a more efficient way to do this - byte[] encoded = encode(); - return encoded.length; - } - - @Override - public byte[] encode() { - ByteArrayOutputStream baos = new ByteArrayOutputStream(); - try (ObjectOutputStream oos = new ObjectOutputStream(baos)) { - oos.writeObject(this); - } catch (IOException e) { - throw new RuntimeException(e); //Should never happen - } - return baos.toByteArray(); - } - - @Override - public void encode(ByteBuffer buffer) { - buffer.put(encode()); - } - - @Override - public void encode(OutputStream outputStream) throws IOException { - try (ObjectOutputStream oos = new ObjectOutputStream(outputStream)) { - oos.writeObject(this); - } - } - - @Override - public void
decode(byte[] decode) { - BaseJavaPersistable r; - try (ObjectInputStream ois = new ObjectInputStream(new ByteArrayInputStream(decode))) { - r = (BaseJavaPersistable) ois.readObject(); - } catch (IOException | ClassNotFoundException e) { - throw new RuntimeException(e); //Should never happen - } - - //Need to manually build and walk the class hierarchy... - Class<?> currClass = this.getClass(); - List<Class<?>> classHierarchy = new ArrayList<>(); - while (currClass != Object.class) { - classHierarchy.add(currClass); - currClass = currClass.getSuperclass(); - } - - for (int i = classHierarchy.size() - 1; i >= 0; i--) { - //Use reflection here to avoid a mass of boilerplate code... - Field[] allFields = classHierarchy.get(i).getDeclaredFields(); - - for (Field f : allFields) { - if (Modifier.isStatic(f.getModifiers())) { - //Skip static fields - continue; - } - f.setAccessible(true); - try { - f.set(this, f.get(r)); - } catch (IllegalAccessException e) { - throw new RuntimeException(e); //Should never happen - } - } - } - } - - @Override - public void decode(ByteBuffer buffer) { - byte[] bytes = new byte[buffer.remaining()]; - buffer.get(bytes); - decode(bytes); - } - - @Override - public void decode(InputStream inputStream) throws IOException { - decode(IOUtils.toByteArray(inputStream)); - } - - public static abstract class Builder<T extends Builder<T>> { - protected String sessionId; - protected long timestamp; - - public T sessionId(String sessionId){ - this.sessionId = sessionId; - return (T) this; - } - - public T timestamp(long timestamp){ - this.timestamp = timestamp; - return (T) this; - } - - } -} diff --git a/contrib/attic/arbiter/arbiter-ui/src/main/java/org/deeplearning4j/arbiter/ui/data/GlobalConfigPersistable.java b/contrib/attic/arbiter/arbiter-ui/src/main/java/org/deeplearning4j/arbiter/ui/data/GlobalConfigPersistable.java deleted file mode 100644 index 6087c30f1..000000000 ---
a/contrib/attic/arbiter/arbiter-ui/src/main/java/org/deeplearning4j/arbiter/ui/data/GlobalConfigPersistable.java +++ /dev/null @@ -1,121 +0,0 @@ -/* - * ****************************************************************************** - * * Copyright (c) 2021 Deeplearning4j Contributors - * * - * * This program and the accompanying materials are made available under the - * * terms of the Apache License, Version 2.0 which is available at - * * https://www.apache.org/licenses/LICENSE-2.0. - * * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * * License for the specific language governing permissions and limitations - * * under the License. - * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ - -package org.deeplearning4j.arbiter.ui.data; - -import lombok.Getter; -import org.deeplearning4j.arbiter.optimize.config.OptimizationConfiguration; -import org.deeplearning4j.arbiter.optimize.serde.jackson.JsonMapper; -import org.deeplearning4j.arbiter.ui.module.ArbiterModule; -import org.deeplearning4j.core.storage.Persistable; - -import java.io.IOException; - -/** - * - * A {@link Persistable} implementation for global settings - * @author Alex Black - */ -@Getter -public class GlobalConfigPersistable extends BaseJavaPersistable { - public static final String GLOBAL_WORKER_ID = "global"; - - private String optimizationConfigJson; - private int[] candidateCounts; //queued, completed, failed, total - private String optimizationRunner; - - public GlobalConfigPersistable(String sessionId, long timestamp){ - super(sessionId, timestamp); - } - - public GlobalConfigPersistable(Builder builder){ - super(builder); - this.optimizationConfigJson = builder.optimizationConfigJson; - this.candidateCounts = 
builder.candidateCounts; - if(this.candidateCounts == null){ - this.candidateCounts = new int[4]; - } - this.optimizationRunner = builder.optimizationRunner; - } - - public GlobalConfigPersistable(){ - //No-arg constructor for Persistable encoding/decoding - } - - @Override - public String getTypeID() { - return ArbiterModule.ARBITER_UI_TYPE_ID; - } - - @Override - public String getWorkerID() { - return GLOBAL_WORKER_ID; - } - - - public OptimizationConfiguration getOptimizationConfiguration(){ - try { - return JsonMapper.getMapper().readValue(optimizationConfigJson, OptimizationConfiguration.class); - } catch (IOException e){ - throw new RuntimeException(e); - } - } - - public int getCandidatesQueued(){ - return candidateCounts[0]; - } - - public int getCandidatesCompleted(){ - return candidateCounts[1]; - } - - public int getCandidatesFailed(){ - return candidateCounts[2]; - } - - public int getCandidatesTotal(){ - return candidateCounts[3]; - } - - public static class Builder extends BaseJavaPersistable.Builder<Builder>{ - - private String optimizationConfigJson; - private int[] candidateCounts; //queued, completed, failed, total - private String optimizationRunner; - - public Builder optimizationConfigJson(String optimizationConfigJson){ - this.optimizationConfigJson = optimizationConfigJson; - return this; - } - - public Builder candidateCounts(int queued, int completed, int failed, int total){ - this.candidateCounts = new int[]{queued, completed, failed, total}; - return this; - } - - public Builder optimizationRunner(String optimizationRunner){ - this.optimizationRunner = optimizationRunner; - return this; - } - - public GlobalConfigPersistable build(){ - return new GlobalConfigPersistable(this); - } - - } -} diff --git a/contrib/attic/arbiter/arbiter-ui/src/main/java/org/deeplearning4j/arbiter/ui/data/ModelInfoPersistable.java b/contrib/attic/arbiter/arbiter-ui/src/main/java/org/deeplearning4j/arbiter/ui/data/ModelInfoPersistable.java deleted file mode 100644 index 
7929c55e5..000000000 --- a/contrib/attic/arbiter/arbiter-ui/src/main/java/org/deeplearning4j/arbiter/ui/data/ModelInfoPersistable.java +++ /dev/null @@ -1,165 +0,0 @@ -/* - * ****************************************************************************** - * * Copyright (c) 2021 Deeplearning4j Contributors - * * - * * This program and the accompanying materials are made available under the - * * terms of the Apache License, Version 2.0 which is available at - * * https://www.apache.org/licenses/LICENSE-2.0. - * * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * * License for the specific language governing permissions and limitations - * * under the License. - * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ - -package org.deeplearning4j.arbiter.ui.data; - -import lombok.Data; -import org.deeplearning4j.arbiter.optimize.runner.CandidateStatus; -import org.deeplearning4j.core.storage.Persistable; - -/** - * A {@link Persistable} implementation for model results - i.e., results for - * each model - * - * @author Alex Black - */ -@Data -public class ModelInfoPersistable extends BaseJavaPersistable { - - private String workerId; - private Integer modelIdx; - private Double score; - private CandidateStatus status; - private long lastUpdateTime; - private long numParameters; - private int numLayers; - //From candidate generator - this + model hyperparam space means we can work out specific hyperparam - // settings for this model - private double[] paramSpaceValues; - private int totalNumUpdates; - //Values for score vs. 
iteration chart - private int[] iter; - private float[] scoreVsIter; - private String modelConfigJson; - private String exceptionStackTrace; - - public ModelInfoPersistable(String sessionId, String workerId, long timeStamp){ - super(sessionId, timeStamp); - - this.workerId = workerId; - } - - private ModelInfoPersistable(Builder builder){ - super(builder); - this.workerId = builder.workerId; - this.modelIdx = builder.modelIdx; - this.score = builder.score; - this.status = builder.status; - this.iter = builder.iter; - this.scoreVsIter = builder.scoreVsIter; - this.lastUpdateTime = builder.lastUpdateTime; - this.numParameters = builder.numParameters; - this.numLayers = builder.numLayers; - this.paramSpaceValues = builder.paramSpaceValues; - this.modelConfigJson = builder.modelConfigJson; - this.totalNumUpdates = builder.totalNumUpdates; - this.exceptionStackTrace = builder.exceptionStackTrace; - } - - public ModelInfoPersistable(){ - //No-arg constructor for Persistable encoding/decoding - } - - @Override - public String getWorkerID() { - return workerId; - } - - - public static class Builder extends BaseJavaPersistable.Builder<Builder> { - - private String workerId; - private Integer modelIdx; - private Double score; - private CandidateStatus status; - private long lastUpdateTime; - private long numParameters; - private int numLayers; - private int totalNumUpdates; - private double[] paramSpaceValues; - private int[] iter; - private float[] scoreVsIter; - private String modelConfigJson; - private String exceptionStackTrace; - - public Builder workerId(String workerId){ - this.workerId = workerId; - return this; - } - - public Builder modelIdx(Integer idx){ - this.modelIdx = idx; - return this; - } - - public Builder score(Double score){ - this.score = score; - return this; - } - - public Builder status(CandidateStatus status){ - this.status = status; - return this; - } - - public Builder scoreVsIter(int[] iter, float[] scoreVsIter){ - this.iter = iter; - this.scoreVsIter = 
scoreVsIter; - return this; - } - - public Builder lastUpdateTime(long lastUpdateTime){ - this.lastUpdateTime = lastUpdateTime; - return this; - } - - public Builder numParameters(long numParameters){ - this.numParameters = numParameters; - return this; - } - - public Builder numLayers(int numLayers){ - this.numLayers = numLayers; - return this; - } - - public Builder totalNumUpdates(int totalNumUpdates){ - this.totalNumUpdates = totalNumUpdates; - return this; - } - - public Builder paramSpaceValues(double[] paramSpaceValues){ - this.paramSpaceValues = paramSpaceValues; - return this; - } - - public Builder modelConfigJson(String modelConfigJson){ - this.modelConfigJson = modelConfigJson; - return this; - } - - public Builder exceptionStackTrace(String exceptionStackTrace){ - this.exceptionStackTrace = exceptionStackTrace; - return this; - } - - public ModelInfoPersistable build(){ - return new ModelInfoPersistable(this); - } - } -} diff --git a/contrib/attic/arbiter/arbiter-ui/src/main/java/org/deeplearning4j/arbiter/ui/listener/ArbiterStatusListener.java b/contrib/attic/arbiter/arbiter-ui/src/main/java/org/deeplearning4j/arbiter/ui/listener/ArbiterStatusListener.java deleted file mode 100644 index ca7249357..000000000 --- a/contrib/attic/arbiter/arbiter-ui/src/main/java/org/deeplearning4j/arbiter/ui/listener/ArbiterStatusListener.java +++ /dev/null @@ -1,238 +0,0 @@ -/* - * ****************************************************************************** - * * Copyright (c) 2021 Deeplearning4j Contributors - * * - * * This program and the accompanying materials are made available under the - * * terms of the Apache License, Version 2.0 which is available at - * * https://www.apache.org/licenses/LICENSE-2.0. - * * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. 
See the - * * License for the specific language governing permissions and limitations - * * under the License. - * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ - -package org.deeplearning4j.arbiter.ui.listener; - -import it.unimi.dsi.fastutil.floats.FloatArrayList; -import it.unimi.dsi.fastutil.ints.IntArrayList; -import lombok.NonNull; -import lombok.extern.slf4j.Slf4j; -import org.deeplearning4j.core.storage.Persistable; -import org.deeplearning4j.core.storage.StatsStorageRouter; -import org.deeplearning4j.arbiter.optimize.api.OptimizationResult; -import org.deeplearning4j.arbiter.optimize.runner.CandidateInfo; -import org.deeplearning4j.arbiter.optimize.runner.IOptimizationRunner; -import org.deeplearning4j.arbiter.optimize.runner.listener.StatusListener; -import org.deeplearning4j.arbiter.optimize.serde.jackson.JsonMapper; -import org.deeplearning4j.arbiter.ui.data.GlobalConfigPersistable; -import org.deeplearning4j.arbiter.ui.data.ModelInfoPersistable; -import org.deeplearning4j.nn.graph.ComputationGraph; -import org.deeplearning4j.nn.multilayer.MultiLayerNetwork; -import org.nd4j.common.primitives.Pair; - -import java.io.IOException; -import java.util.Map; -import java.util.UUID; -import java.util.concurrent.ConcurrentHashMap; - -/** - * A {@link StatusListener} for reporting Arbiter/DL4J optimization results to a {@link StatsStorageRouter} - * - * @author Alex Black - */ -@Slf4j -public class ArbiterStatusListener implements StatusListener { - - public static final int MAX_SCORE_VS_ITER_PTS = 1024; //Above this: subsample... 
every 2nd, 4th, 8th etc - - private final String sessionId; - private final StatsStorageRouter statsStorage; - - private String ocJson; - private long startTime = 0; - - private Map<Integer, Integer> candidateScoreVsIterSubsampleFreq = new ConcurrentHashMap<>(); - private Map<Integer, Pair<IntArrayList, FloatArrayList>> candidateScoreVsIter = new ConcurrentHashMap<>(); - - private Map<Integer, ModelInfoPersistable> lastModelInfoPersistable = new ConcurrentHashMap<>(); - - public ArbiterStatusListener(@NonNull StatsStorageRouter statsStorage) { - this(UUID.randomUUID().toString(), statsStorage); - } - - public ArbiterStatusListener(@NonNull String sessionId, @NonNull StatsStorageRouter statsStorage){ - this.sessionId = sessionId; - this.statsStorage = statsStorage; - } - - @Override - public void onInitialization(IOptimizationRunner r) { - Persistable p = getNewStatusPersistable(r); - statsStorage.putStaticInfo(p); - } - - @Override - public void onShutdown(IOptimizationRunner runner) { - //No op? - - } - - @Override - public void onRunnerStatusChange(IOptimizationRunner r) { - Persistable p = getNewStatusPersistable(r); - statsStorage.putStaticInfo(p); - } - - @Override - public void onCandidateStatusChange(CandidateInfo candidateInfo, IOptimizationRunner runner, OptimizationResult result) { - ModelInfoPersistable p = lastModelInfoPersistable.get(candidateInfo.getIndex()); - if(p == null){ - p = new ModelInfoPersistable.Builder() - .timestamp(candidateInfo.getCreatedTime()) - .sessionId(sessionId) - .workerId(String.valueOf(candidateInfo.getIndex())) - .modelIdx(candidateInfo.getIndex()) - .score(candidateInfo.getScore()) - .status(candidateInfo.getCandidateStatus()) - .exceptionStackTrace(candidateInfo.getExceptionStackTrace()) - .build(); - - lastModelInfoPersistable.put(candidateInfo.getIndex(), p); - } - - if(p.getScore() == null){ - p.setScore(candidateInfo.getScore()); - } - - if(result != null && p.getExceptionStackTrace() == null && result.getCandidateInfo().getExceptionStackTrace() != null){ - //Update exceptions that may have occurred since 
earlier model info instance - p.setExceptionStackTrace(result.getCandidateInfo().getExceptionStackTrace()); - } - - p.setStatus(candidateInfo.getCandidateStatus()); - - statsStorage.putUpdate(p); - } - - @Override - public void onCandidateIteration(CandidateInfo candidateInfo, Object candidate, int iteration) { - double score; - long numParams; - int numLayers; - String modelConfigJson; - int totalNumUpdates; - if(candidate instanceof MultiLayerNetwork){ - MultiLayerNetwork m = (MultiLayerNetwork)candidate; - score = m.score(); - numParams = m.numParams(); - numLayers = m.getnLayers(); - modelConfigJson = m.getLayerWiseConfigurations().toJson(); - totalNumUpdates = m.getLayerWiseConfigurations().getIterationCount(); - } else if(candidate instanceof ComputationGraph) { - ComputationGraph cg = (ComputationGraph)candidate; - score = cg.score(); - numParams = cg.numParams(); - numLayers = cg.getNumLayers(); - modelConfigJson = cg.getConfiguration().toJson(); - totalNumUpdates = cg.getConfiguration().getIterationCount(); - } else { - score = 0; - numParams = 0; - numLayers = 0; - totalNumUpdates = 0; - modelConfigJson = ""; - } - - int idx = candidateInfo.getIndex(); - - Pair<IntArrayList, FloatArrayList> pair = candidateScoreVsIter.computeIfAbsent(idx, k -> new Pair<>(new IntArrayList(), new FloatArrayList())); - - IntArrayList iter = pair.getFirst(); - FloatArrayList scores = pair.getSecond(); - - //Do we need subsampling to avoid having too many data points? 
- int subsamplingFreq = candidateScoreVsIterSubsampleFreq.computeIfAbsent(idx, k -> 1); - if(iteration / subsamplingFreq > MAX_SCORE_VS_ITER_PTS){ - //Double subsampling frequency and re-parse data - subsamplingFreq *= 2; - candidateScoreVsIterSubsampleFreq.put(idx, subsamplingFreq); - - IntArrayList newIter = new IntArrayList(); - FloatArrayList newScores = new FloatArrayList(); - for( int i=0; i<iter.size(); i++ ){ - if(iter.getInt(i) % subsamplingFreq == 0){ - newIter.add(iter.getInt(i)); - newScores.add(scores.getFloat(i)); - } - } - iter = newIter; - scores = newScores; - candidateScoreVsIter.put(idx, new Pair<>(iter, scores)); - } - - if(iteration % subsamplingFreq == 0) { - iter.add(iteration); - scores.add((float) score); - } - - - int[] iters = iter.toIntArray(); - float[] fScores = new float[iters.length]; - for( int i=0; i T fromJson(String json, Class type){ - try{ - return getMapper().readValue(json, type); - } catch (IOException e) { - throw new RuntimeException(e); - } - } - - public static ObjectMapper getInstance(){ - return MAPPER; - } - -} diff --git a/contrib/attic/arbiter/arbiter-ui/src/main/java/org/deeplearning4j/arbiter/ui/misc/UIUtils.java b/contrib/attic/arbiter/arbiter-ui/src/main/java/org/deeplearning4j/arbiter/ui/misc/UIUtils.java deleted file mode 100644 index fb99c552a..000000000 --- a/contrib/attic/arbiter/arbiter-ui/src/main/java/org/deeplearning4j/arbiter/ui/misc/UIUtils.java +++ /dev/null @@ -1,114 +0,0 @@ -/* - * ****************************************************************************** - * * Copyright (c) 2021 Deeplearning4j Contributors - * * - * * This program and the accompanying materials are made available under the - * * terms of the Apache License, Version 2.0 which is available at - * * https://www.apache.org/licenses/LICENSE-2.0. - * * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * * License for the specific language governing permissions and limitations - * * under the License. 
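The `ArbiterStatusListener` code above caps the per-candidate score-vs-iteration history at `MAX_SCORE_VS_ITER_PTS` points by doubling the sampling period and thinning the already-stored points whenever the cap would be exceeded. A self-contained sketch of that scheme (plain `java.util` lists instead of fastutil; the class and method names here are illustrative, not from Arbiter):

```java
import java.util.ArrayList;
import java.util.List;

// Sketch of the score-history capping used by ArbiterStatusListener:
// once more than maxPoints iterations would be stored, the sampling
// period doubles and previously stored points are thinned to the new period.
class SubsampleSketch {
    private final int maxPoints;
    private int period = 1;
    private final List<Integer> iters = new ArrayList<>();
    private final List<Float> scores = new ArrayList<>();

    SubsampleSketch(int maxPoints) { this.maxPoints = maxPoints; }

    void record(int iteration, float score) {
        if (iteration / period > maxPoints) {
            period *= 2;                  // halve the stored resolution
            List<Integer> keptIters = new ArrayList<>();
            List<Float> keptScores = new ArrayList<>();
            for (int i = 0; i < iters.size(); i++) {
                if (iters.get(i) % period == 0) {   // keep every period-th point
                    keptIters.add(iters.get(i));
                    keptScores.add(scores.get(i));
                }
            }
            iters.clear(); iters.addAll(keptIters);
            scores.clear(); scores.addAll(keptScores);
        }
        if (iteration % period == 0) {
            iters.add(iteration);
            scores.add(score);
        }
    }

    int size() { return iters.size(); }
    int period() { return period; }
}
```

The effect is that storage stays bounded at roughly `maxPoints` entries while the retained points remain evenly spaced over the whole training run.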
- * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ - -package org.deeplearning4j.arbiter.ui.misc; - -import org.joda.time.Period; -import org.joda.time.PeriodType; -import org.joda.time.format.PeriodFormatter; -import org.joda.time.format.PeriodFormatterBuilder; - -/** - * Created by Alex on 20/07/2017. - */ -public class UIUtils { - - /** - * Convert the "messy" min/max values on a dataset to something clean. For example, 0.895732 becomes 1.0 - * - * @param max Maximum data point value - * @param min Minimum data point value - * @param nTick Number of tick marks desired on chart (good setting: 5) - * @return double[] of length 2 - with new minimum and maximum - */ - public static double[] graphNiceRange(double max, double min, int nTick){ - if(max == min || !Double.isFinite(max)){ - if(max == 0.0 || !Double.isFinite(max)){ - return new double[]{0.0, 1.0}; - } - - return graphNiceRange(1.5 * max, 0.5 * max, nTick); - } - - double range = niceNum(max-min, false); - double d = niceNum(range / (nTick-1), true ); - double graphMin = Math.floor(min/d)*d; - double graphMax = Math.ceil(max/d)*d; - - - return new double[]{graphMin, graphMax}; - } - - public static double niceNum(double x, boolean round){ - double exp = Math.floor(Math.log10(x)); - double f = x / Math.pow(10, exp); - - double nf; - if(round){ - if(f < 1.5 ){ - nf = 1; - } else if( f < 3){ - nf = 2; - } else if( f < 7){ - nf = 5; - } else { - nf = 10; - } - } else { - if(f <= 1 ){ - nf = 1; - } else if( f <= 2){ - nf = 2; - } else if( f <= 5){ - nf = 5; - } else { - nf = 10; - } - } - return nf * Math.pow(10, exp); - } - - /** - * Format the duration in milliseconds to a human readable String, with "yr", "days", "hr" etc prefixes - * - * - * @param durationMs Duration in milliseconds - * @return Human readable string - */ - public static String formatDuration(long durationMs){ - Period period = 
Period.seconds((int)(durationMs/1000L)); - Period p2 = period.normalizedStandard(PeriodType.yearMonthDayTime()); - - PeriodFormatter formatter = new PeriodFormatterBuilder() - .appendYears() - .appendSuffix(" yr ") - .appendMonths() - .appendSuffix(" months ") - .appendDays() - .appendSuffix(" days ") - .appendHours() - .appendSuffix(" hr ") - .appendMinutes() - .appendSuffix(" min ") - .appendSeconds() - .appendSuffix(" sec") - .toFormatter(); - - return formatter.print(p2); - } -} diff --git a/contrib/attic/arbiter/arbiter-ui/src/main/java/org/deeplearning4j/arbiter/ui/module/ArbiterModule.java b/contrib/attic/arbiter/arbiter-ui/src/main/java/org/deeplearning4j/arbiter/ui/module/ArbiterModule.java deleted file mode 100644 index 29611e24f..000000000 --- a/contrib/attic/arbiter/arbiter-ui/src/main/java/org/deeplearning4j/arbiter/ui/module/ArbiterModule.java +++ /dev/null @@ -1,944 +0,0 @@ -/* - * ****************************************************************************** - * * Copyright (c) 2021 Deeplearning4j Contributors - * * - * * This program and the accompanying materials are made available under the - * * terms of the Apache License, Version 2.0 which is available at - * * https://www.apache.org/licenses/LICENSE-2.0. - * * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * * License for the specific language governing permissions and limitations - * * under the License. 
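`UIUtils.graphNiceRange` and `niceNum` above follow the classic "nice numbers for graph labels" heuristic: pick an axis step that is 1, 2, or 5 times a power of ten, then snap the chart bounds outward to multiples of that step. A standalone copy for experimentation (the logic mirrors the methods in the diff, but this is an illustrative re-implementation, not the Arbiter class itself):

```java
// Re-implementation of the UIUtils axis-range heuristic for illustration.
class NiceRangeSketch {

    // Returns {graphMin, graphMax} covering [min, max] with "clean" bounds
    static double[] graphNiceRange(double max, double min, int nTick) {
        if (max == min || !Double.isFinite(max)) {
            if (max == 0.0 || !Double.isFinite(max)) {
                return new double[]{0.0, 1.0};
            }
            return graphNiceRange(1.5 * max, 0.5 * max, nTick);
        }
        double range = niceNum(max - min, false);
        double d = niceNum(range / (nTick - 1), true);   // tick spacing
        double graphMin = Math.floor(min / d) * d;
        double graphMax = Math.ceil(max / d) * d;
        return new double[]{graphMin, graphMax};
    }

    // Snap x to 1, 2, or 5 times a power of ten
    static double niceNum(double x, boolean round) {
        double exp = Math.floor(Math.log10(x));
        double f = x / Math.pow(10, exp);   // fraction in [1, 10)
        double nf;
        if (round) {
            if (f < 1.5) nf = 1;
            else if (f < 3) nf = 2;
            else if (f < 7) nf = 5;
            else nf = 10;
        } else {
            if (f <= 1) nf = 1;
            else if (f <= 2) nf = 2;
            else if (f <= 5) nf = 5;
            else nf = 10;
        }
        return nf * Math.pow(10, exp);
    }
}
```

For example, with 5 ticks a data range of [0, 0.895732] snaps to [0.0, 1.0], matching the "0.895732 becomes 1.0" example in the javadoc above.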
- * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ - -package org.deeplearning4j.arbiter.ui.module; - -import io.netty.handler.codec.http.HttpResponseStatus; -import io.vertx.ext.web.RoutingContext; -import lombok.extern.slf4j.Slf4j; -import org.deeplearning4j.core.storage.Persistable; -import org.deeplearning4j.core.storage.StatsStorage; -import org.deeplearning4j.core.storage.StatsStorageEvent; -import org.deeplearning4j.core.storage.StatsStorageListener; -import org.deeplearning4j.arbiter.BaseNetworkSpace; -import org.deeplearning4j.arbiter.layers.LayerSpace; -import org.deeplearning4j.arbiter.optimize.api.ParameterSpace; -import org.deeplearning4j.arbiter.optimize.api.termination.TerminationCondition; -import org.deeplearning4j.arbiter.optimize.config.OptimizationConfiguration; -import org.deeplearning4j.arbiter.optimize.runner.CandidateStatus; -import org.deeplearning4j.arbiter.ui.UpdateStatus; -import org.deeplearning4j.arbiter.ui.data.GlobalConfigPersistable; -import org.deeplearning4j.arbiter.ui.data.ModelInfoPersistable; -import org.deeplearning4j.arbiter.ui.misc.UIUtils; -import org.deeplearning4j.arbiter.util.ObjectUtils; -import org.deeplearning4j.nn.conf.serde.JsonMappers; -import org.deeplearning4j.ui.VertxUIServer; -import org.deeplearning4j.ui.api.Component; -import org.deeplearning4j.ui.api.*; -import org.deeplearning4j.ui.components.chart.ChartLine; -import org.deeplearning4j.ui.components.chart.ChartScatter; -import org.deeplearning4j.ui.components.chart.style.StyleChart; -import org.deeplearning4j.ui.components.component.ComponentDiv; -import org.deeplearning4j.ui.components.component.style.StyleDiv; -import org.deeplearning4j.ui.components.table.ComponentTable; -import org.deeplearning4j.ui.components.table.style.StyleTable; -import org.deeplearning4j.ui.components.text.ComponentText; -import org.deeplearning4j.ui.components.text.style.StyleText; -import 
org.deeplearning4j.ui.i18n.I18NResource; -import org.joda.time.format.DateTimeFormat; -import org.joda.time.format.DateTimeFormatter; -import org.nd4j.common.function.Function; -import org.nd4j.common.primitives.Pair; -import org.nd4j.shade.jackson.core.JsonProcessingException; - -import java.awt.*; -import java.text.DecimalFormat; -import java.util.List; -import java.util.*; -import java.util.concurrent.atomic.AtomicBoolean; - -/** - * A Deeplearning4j {@link UIModule}, for integration with DL4J's user interface - * - * @author Alex Black - */ -@Slf4j -public class ArbiterModule implements UIModule { - - private static final DecimalFormat DECIMAL_FORMAT_2DP = new DecimalFormat("#.00"); - private static final DateTimeFormatter TIME_FORMATTER = DateTimeFormat.forPattern("YYYY-MM-dd HH:mm ZZ"); - public static final String ARBITER_UI_TYPE_ID = "ArbiterUI"; - - private AtomicBoolean loggedArbiterAddress = new AtomicBoolean(false); - private Map knownSessionIDs = Collections.synchronizedMap(new LinkedHashMap<>()); - private String currentSessionID; - - private Map lastUpdateForSession = Collections.synchronizedMap(new HashMap<>()); - - //Styles for UI: - private static final StyleTable STYLE_TABLE = new StyleTable.Builder() - .width(100, LengthUnit.Percent) - .backgroundColor(Color.WHITE) - .borderWidth(1) - .columnWidths(LengthUnit.Percent, 30, 70) - .build(); - - private static final StyleTable STYLE_TABLE3_25_25_50 = new StyleTable.Builder() - .width(100, LengthUnit.Percent) - .backgroundColor(Color.WHITE) - .borderWidth(1) - .columnWidths(LengthUnit.Percent, 25, 25, 50) - .build(); - - private static final StyleDiv STYLE_DIV_WIDTH_100_PC = new StyleDiv.Builder() - .width(100, LengthUnit.Percent) - .build(); - - private static final ComponentDiv DIV_SPACER_20PX = new ComponentDiv(new StyleDiv.Builder() - .width(100,LengthUnit.Percent) - .height(20, LengthUnit.Px).build()); - - private static final ComponentDiv DIV_SPACER_60PX = new ComponentDiv(new 
StyleDiv.Builder() - .width(100,LengthUnit.Percent) - .height(60, LengthUnit.Px).build()); - - private static final StyleChart STYLE_CHART_560_320 = new StyleChart.Builder() - .width(560, LengthUnit.Px) - .height(320, LengthUnit.Px) - .build(); - - private static final StyleChart STYLE_CHART_800_400 = new StyleChart.Builder() - .width(800, LengthUnit.Px) - .height(400, LengthUnit.Px) - .build(); - - - private StyleText STYLE_TEXT_SZ12 = new StyleText.Builder() - .fontSize(12) - .build(); - - //Set whitespacePre(true) to avoid losing new lines, tabs, multiple spaces etc - private StyleText STYLE_TEXT_SZ10_WHITESPACE_PRE = new StyleText.Builder() - .fontSize(10) - .whitespacePre(true) - .build(); - - - @Override - public List getCallbackTypeIDs() { - return Collections.singletonList(ARBITER_UI_TYPE_ID); - } - - @Override - public List getRoutes() { - boolean multiSession = VertxUIServer.getMultiSession().get(); - List r = new ArrayList<>(); - r.add(new Route("/arbiter/multisession", HttpMethod.GET, - (path, rc) -> rc.response().end(multiSession ? 
"true" : "false"))); - if (multiSession) { - r.add(new Route("/arbiter", HttpMethod.GET, (path, rc) -> this.listSessions(rc))); - r.add(new Route("/arbiter/:sessionId", HttpMethod.GET, (path, rc) -> { - if (knownSessionIDs.containsKey(path.get(0))) { - rc.response() - .putHeader("content-type", "text/html; charset=utf-8") - .sendFile("templates/ArbiterUI.html"); - } else { - sessionNotFound(path.get(0), rc.request().path(), rc); - } - })); - - r.add(new Route("/arbiter/:sessionId/lastUpdate", HttpMethod.GET, (path, rc) -> { - if (knownSessionIDs.containsKey(path.get(0))) { - this.getLastUpdateTime(path.get(0), rc); - } else { - sessionNotFound(path.get(0), rc.request().path(), rc); - } - })); - r.add(new Route("/arbiter/:sessionId/candidateInfo/:id", HttpMethod.GET, (path, rc) -> { - if (knownSessionIDs.containsKey(path.get(0))) { - this.getCandidateInfo(path.get(0), path.get(1), rc); - } else { - sessionNotFound(path.get(0), rc.request().path(), rc); - } - })); - r.add(new Route("/arbiter/:sessionId/config", HttpMethod.GET, (path, rc) -> { - if (knownSessionIDs.containsKey(path.get(0))) { - this.getOptimizationConfig(path.get(0), rc); - } else { - sessionNotFound(path.get(0), rc.request().path(), rc); - } - })); - r.add(new Route("/arbiter/:sessionId/results", HttpMethod.GET, (path, rc) -> { - if (knownSessionIDs.containsKey(path.get(0))) { - this.getSummaryResults(path.get(0), rc); - } else { - sessionNotFound(path.get(0), rc.request().path(), rc); - } - })); - r.add(new Route("/arbiter/:sessionId/summary", HttpMethod.GET, (path, rc) -> { - if (knownSessionIDs.containsKey(path.get(0))) { - this.getSummaryStatus(path.get(0), rc); - } else { - sessionNotFound(path.get(0), rc.request().path(), rc); - } - })); - } else { - r.add(new Route("/arbiter", HttpMethod.GET, (path, rc) -> rc.response() - .putHeader("content-type", "text/html; charset=utf-8") - .sendFile("templates/ArbiterUI.html"))); - r.add(new Route("/arbiter/lastUpdate", HttpMethod.GET, (path, rc) -> 
this.getLastUpdateTime(null, rc))); - r.add(new Route("/arbiter/candidateInfo/:id", HttpMethod.GET, - (path, rc) -> this.getCandidateInfo(null, path.get(0), rc))); - r.add(new Route("/arbiter/config", HttpMethod.GET, (path, rc) -> this.getOptimizationConfig(null, rc))); - r.add(new Route("/arbiter/results", HttpMethod.GET, (path, rc) -> this.getSummaryResults(null, rc))); - r.add(new Route("/arbiter/summary", HttpMethod.GET, (path, rc) -> this.getSummaryStatus(null, rc))); - - r.add(new Route("/arbiter/sessions/current", HttpMethod.GET, (path, rc) -> this.currentSession(rc))); - r.add(new Route("/arbiter/sessions/set/:to", HttpMethod.GET, - (path, rc) -> this.setSession(path.get(0), rc))); - } - // common for single- and multi-session mode - r.add(new Route("/arbiter/sessions/all", HttpMethod.GET, (path, rc) -> this.sessionInfo(rc))); - - return r; - } - - - /** - * Load StatsStorage via provider, or return "not found" - * - * @param sessionId session ID to look for with provider - * @param targetPath one of overview / model / system, or null - * @param rc routing context - */ - private void sessionNotFound(String sessionId, String targetPath, RoutingContext rc) { - Function<String, Boolean> loader = VertxUIServer.getInstance().getStatsStorageLoader(); - if (loader != null && loader.apply(sessionId)) { - if (targetPath != null) { - rc.reroute(targetPath); - } else { - rc.response().end(); - } - } else { - rc.response().setStatusCode(HttpResponseStatus.NOT_FOUND.code()) - .end("Unknown session ID: " + sessionId); - } - } - - - /** - * List optimization sessions. Returns an HTML list of arbiter sessions - */ - private synchronized void listSessions(RoutingContext rc) { - StringBuilder sb = new StringBuilder("\n" + - "\n" + - "\n" + - " \n" + - " Optimization sessions - DL4J Arbiter UI\n" + - " \n" + - "\n" + - " \n" + - "

DL4J Arbiter UI

\n" + - "

UI server is in multi-session mode." + - " To visualize an optimization session, please select one from the following list.

\n" + - "

List of attached optimization sessions

\n"); - if (!knownSessionIDs.isEmpty()) { - sb.append(" "); - } else { - sb.append("No optimization session attached."); - } - - sb.append(" \n" + - "\n"); - - rc.response() - .putHeader("content-type", "text/html; charset=utf-8") - .end(sb.toString()); - } - - @Override - public void reportStorageEvents(Collection events) { - boolean attachedArbiter = false; - for (StatsStorageEvent sse : events) { - if (ARBITER_UI_TYPE_ID.equals(sse.getTypeID())) { - if (sse.getEventType() == StatsStorageListener.EventType.PostStaticInfo) { - knownSessionIDs.put(sse.getSessionID(), sse.getStatsStorage()); - } - - Long lastUpdate = lastUpdateForSession.get(sse.getSessionID()); - if (lastUpdate == null) { - lastUpdateForSession.put(sse.getSessionID(), sse.getTimestamp()); - } else if (sse.getTimestamp() > lastUpdate) { - lastUpdateForSession.put(sse.getSessionID(), sse.getTimestamp()); //Should be thread safe - read only elsewhere - } - attachedArbiter = true; - } - } - - if(currentSessionID == null){ - getDefaultSession(); - } - - if(attachedArbiter && !loggedArbiterAddress.getAndSet(true)){ - String address = UIServer.getInstance().getAddress(); - address += "/arbiter"; - log.info("DL4J Arbiter Hyperparameter Optimization UI: {}", address); - } - } - - @Override - public synchronized void onAttach(StatsStorage statsStorage) { - for (String sessionID : statsStorage.listSessionIDs()) { - for (String typeID : statsStorage.listTypeIDsForSession(sessionID)) { - if (!ARBITER_UI_TYPE_ID.equals(typeID)) - continue; - knownSessionIDs.put(sessionID, statsStorage); - } - } - - if (currentSessionID == null) - getDefaultSession(); - } - - private void currentSession(RoutingContext rc) { - String sid = currentSessionID == null ? 
"" : currentSessionID; - rc.response() - .putHeader("content-type", "application/json") - .end(asJson(sid)); - } - - private void sessionInfo(RoutingContext rc) { - rc.response() - .putHeader("content-type", "application/json") - .end(asJson(knownSessionIDs.keySet())); - } - - private void setSession(String newSessionID, RoutingContext rc) { - log.debug("Arbiter UI: Set to session {}", newSessionID); - - if (knownSessionIDs.containsKey(newSessionID)) { - currentSessionID = newSessionID; - rc.response().end(); - } else { - rc.response().setStatusCode(HttpResponseStatus.BAD_REQUEST.code()).end("Unknown session ID: " + newSessionID); - } - } - - private void getDefaultSession() { - if (currentSessionID != null) - return; - - long mostRecentTime = Long.MIN_VALUE; - String sessionID = null; - for (Map.Entry entry : knownSessionIDs.entrySet()) { - List staticInfos = entry.getValue().getAllStaticInfos(entry.getKey(), ARBITER_UI_TYPE_ID); - if (staticInfos == null || staticInfos.size() == 0) - continue; - Persistable p = staticInfos.get(0); - long thisTime = p.getTimeStamp(); - if (thisTime > mostRecentTime) { - mostRecentTime = thisTime; - sessionID = entry.getKey(); - } - } - - if (sessionID != null) { - currentSessionID = sessionID; - } - } - - @Override - public void onDetach(StatsStorage statsStorage) { - for (String s : knownSessionIDs.keySet()) { - if (knownSessionIDs.get(s) == statsStorage) { - knownSessionIDs.remove(s); - } - } - } - - @Override - public List getInternationalizationResources() { - return Collections.emptyList(); - } - - /** - * Return the last update time for the page - * @param sessionId session ID (optional, for multi-session mode) - * @param rc routing context - */ - private void getLastUpdateTime(String sessionId, RoutingContext rc){ - if (sessionId == null) { - sessionId = currentSessionID; - } - StatsStorage ss = knownSessionIDs.get(sessionId); - List latestUpdates = ss.getLatestUpdateAllWorkers(sessionId, ARBITER_UI_TYPE_ID); - long t = 0; 
- if (latestUpdates.isEmpty()) { - t = System.currentTimeMillis(); - } else { - for (Persistable update : latestUpdates) { - if (update.getTimeStamp() > t) { - t = update.getTimeStamp(); - } - } - } - UpdateStatus us = new UpdateStatus(t, t, t); - - rc.response().putHeader("content-type", "application/json").end(asJson(us)); - } - - private String asJson(Object o){ - try{ - return JsonMappers.getMapper().writeValueAsString(o); - } catch (JsonProcessingException e){ - throw new RuntimeException("Error converting object to JSON", e); - } - } - - /** - * Get the info for a specific candidate - last section in the UI - * @param sessionId session ID (optional, for multi-session mode) - * @param candidateId ID for the candidate - * @param rc routing context - */ - private void getCandidateInfo(String sessionId, String candidateId, RoutingContext rc){ - if (sessionId == null) { - sessionId = currentSessionID; - } - StatsStorage ss = knownSessionIDs.get(sessionId); - if(ss == null){ - log.debug("getModelLastUpdateTimes(): Session ID is unknown: {}", sessionId); - rc.response().end(); - return; - } - - GlobalConfigPersistable gcp = (GlobalConfigPersistable)ss - .getStaticInfo(sessionId, ARBITER_UI_TYPE_ID, GlobalConfigPersistable.GLOBAL_WORKER_ID); - OptimizationConfiguration oc = gcp.getOptimizationConfiguration(); - - Persistable p = ss.getLatestUpdate(sessionId, ARBITER_UI_TYPE_ID, candidateId); - if(p == null){ - String title = "No results found for model " + candidateId + "."; - ComponentText ct = new ComponentText.Builder(title,STYLE_TEXT_SZ12).build(); - rc.response() - .putHeader("content-type", "application/json") - .end(asJson(ct)); - return; - } - - ModelInfoPersistable mip = (ModelInfoPersistable)p; - - //First: static info - // Hyperparameter configuration/settings - // Number of parameters - // Maybe memory info in the future? - - //Second: dynamic info - //Runtime - // Performance stats (total minibatches, total time, - // Score vs. 
time - - List components = new ArrayList<>(); - - //First table: mix of static + dynamic in a table - long runtimeDurationMs = mip.getLastUpdateTime() - mip.getTimeStamp(); - double avgMinibatchesPerSec = mip.getTotalNumUpdates() / (runtimeDurationMs/1000.0); - String avgMinibatchesPerSecStr = DECIMAL_FORMAT_2DP.format(avgMinibatchesPerSec); - String runtimeStr = UIUtils.formatDuration(runtimeDurationMs); - - if(mip.getStatus() == CandidateStatus.Failed){ - runtimeStr = ""; - avgMinibatchesPerSecStr = ""; - } - - String[][] table = new String[][]{ - {"Model Index", String.valueOf(mip.getModelIdx())}, - {"Status", mip.getStatus().toString()}, - {"Model Score", mip.getScore() == null ? "" : String.valueOf(mip.getScore())}, - {"Created", TIME_FORMATTER.print(mip.getTimeStamp())}, - {"Runtime", runtimeStr}, - {"Total Number of Model Updates", String.valueOf(mip.getTotalNumUpdates())}, - {"Average # Updates / Sec", avgMinibatchesPerSecStr}, - {"Number of Parameters", String.valueOf(mip.getNumParameters())}, - {"Number of Layers", String.valueOf(mip.getNumLayers())} - }; - - ComponentTable cTable = new ComponentTable.Builder(STYLE_TABLE) - .content(table) - .header("Model Information", "") - .build(); - components.add(cTable); - - - //Second: parameter space values, in multiple tables - double[] paramSpaceValues = mip.getParamSpaceValues(); - if(paramSpaceValues != null){ - BaseNetworkSpace bns = (BaseNetworkSpace)oc.getCandidateGenerator().getParameterSpace(); - Map m = bns.getNestedSpaces(); - - String[][] hSpaceTable = new String[m.size()][3]; - int i=0; - for(Map.Entry e : m.entrySet()){ - hSpaceTable[i][0] = e.getKey(); - Object currCandidateValue = e.getValue().getValue(paramSpaceValues); - hSpaceTable[i][1] = ObjectUtils.valueToString(currCandidateValue); - hSpaceTable[i][2] = e.getValue().toString(); - i++; - } - - String[] hSpaceTableHeader = new String[]{"Hyperparameter", "Model Value", "Hyperparameter Space"}; - - ComponentTable ct2 = new 
ComponentTable.Builder(STYLE_TABLE3_25_25_50) - .content(hSpaceTable) - .header(hSpaceTableHeader) - .build(); - - - String title = "Global Network Configuration"; - components.add(DIV_SPACER_20PX); - components.add(new ComponentText.Builder(title, STYLE_TEXT_SZ12).build()); - components.add(ct2); - - List layerConfs = bns.getLayerSpaces(); - - for(BaseNetworkSpace.LayerConf l : layerConfs){ - LayerSpace ls = l.getLayerSpace(); - Map lpsm = ls.getNestedSpaces(); - - String[][] t = new String[lpsm.size()][3]; - i=0; - for(Map.Entry e : lpsm.entrySet()){ - t[i][0] = e.getKey(); - Object currCandidateValue = e.getValue().getValue(paramSpaceValues); - t[i][1] = ObjectUtils.valueToString(currCandidateValue); - t[i][2] = e.getValue().toString(); - i++; - } - - ComponentTable ct3 = new ComponentTable.Builder(STYLE_TABLE3_25_25_50) - .content(t) - .header(hSpaceTableHeader) - .build(); - - title = "Layer Space: " + ls.getClass().getSimpleName() + ", Name: " + l.getLayerName(); - - components.add(DIV_SPACER_20PX); - components.add(new ComponentText.Builder(title, STYLE_TEXT_SZ12).build()); - components.add(ct3); - } - } - - - //Third: Score vs. time chart - int[] iters = mip.getIter(); - float[] scores = mip.getScoreVsIter(); - - if(iters != null) { - double[] si = new double[iters.length]; - double[] scoresD = new double[iters.length]; - - double minScore = Double.MAX_VALUE; - double maxScore = -Double.MAX_VALUE; - for( int i=0; i components = new ArrayList<>(); - - GlobalConfigPersistable gcp = (GlobalConfigPersistable)p; - OptimizationConfiguration oc = gcp.getOptimizationConfiguration(); - - //Report optimization settings/configuration. 
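The per-candidate throughput shown in the model-information table earlier is total update count divided by wall-clock runtime in seconds, formatted to two decimal places. A minimal sketch of that arithmetic (method and parameter names are illustrative, not DL4J's API; the zero-runtime guard is an added assumption standing in for the original's blanking of failed candidates):

```java
import java.util.Locale;

public class Throughput {
    /**
     * Average updates per second over [startMillis, lastMillis], formatted to 2 dp.
     * Returns an empty string when the runtime is zero or negative (assumed guard).
     */
    public static String avgUpdatesPerSec(long startMillis, long lastMillis, int totalUpdates) {
        long runtimeMs = lastMillis - startMillis;
        if (runtimeMs <= 0) {
            return "";
        }
        double perSec = totalUpdates / (runtimeMs / 1000.0);
        return String.format(Locale.US, "%.2f", perSec);
    }
}
```

For example, 250 updates over a 10-second window yields "25.00".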
- String[] tableHeader = {"Configuration", "Value"}; - String [] dataSourceOrProvider; - if (oc.getDataProvider() != null) { - dataSourceOrProvider = new String[] {"Data Provider", oc.getDataProvider().toString()}; - } - else { - dataSourceOrProvider = new String[] {"Data Source", oc.getDataSource().getCanonicalName()}; - } - String[][] table = new String[][]{ - {"Candidate Generator", oc.getCandidateGenerator().getClass().getSimpleName()}, - dataSourceOrProvider, - {"Score Function", oc.getScoreFunction().toString()}, - {"Result Saver", oc.getResultSaver().toString()}, - }; - - ComponentTable ct = new ComponentTable.Builder(STYLE_TABLE) - .content(table) - .header(tableHeader) - .build(); - components.add(ct); - - - String title = "Global Network Configuration"; - components.add(DIV_SPACER_20PX); - components.add(new ComponentText.Builder(title, STYLE_TEXT_SZ12).build()); - BaseNetworkSpace ps = (BaseNetworkSpace)oc.getCandidateGenerator().getParameterSpace(); - Map m = ps.getNestedSpaces(); - - String[][] hSpaceTable = new String[m.size()][2]; - int i=0; - for(Map.Entry e : m.entrySet()){ - hSpaceTable[i][0] = e.getKey(); - hSpaceTable[i][1] = e.getValue().toString(); - i++; - } - - components.add(DIV_SPACER_20PX); - String[] hSpaceTableHeader = new String[]{"Hyperparameter", "Hyperparameter Configuration"}; - - ComponentTable ct2 = new ComponentTable.Builder(STYLE_TABLE) - .content(hSpaceTable) - .header(hSpaceTableHeader) - .build(); - components.add(ct2); - - //Configuration for each layer: - List layerConfs = ps.getLayerSpaces(); - for(BaseNetworkSpace.LayerConf l : layerConfs){ - LayerSpace ls = l.getLayerSpace(); - Map lpsm = ls.getNestedSpaces(); - - String[][] t = new String[lpsm.size()][2]; - i=0; - for(Map.Entry e : lpsm.entrySet()){ - t[i][0] = e.getKey(); - t[i][1] = e.getValue().toString(); - i++; - } - - ComponentTable ct3 = new ComponentTable.Builder(STYLE_TABLE) - .content(t) - .header(hSpaceTableHeader) - .build(); - - title = "Layer Space: " + 
ls.getClass().getSimpleName() + ", Name: " + l.getLayerName(); - - components.add(DIV_SPACER_20PX); - components.add(new ComponentText.Builder(title, STYLE_TEXT_SZ12).build()); - components.add(ct3); - } - - ComponentDiv cd = new ComponentDiv(STYLE_DIV_WIDTH_100_PC, components); - - rc.response().putHeader("content-type", "application/json").end(asJson(cd)); - } - - /** - * Get candidates summary results list - third section on the page: Results table - * @param sessionId session ID (optional, for multi-session mode) - * @param rc routing context - */ - private void getSummaryResults(String sessionId, RoutingContext rc){ - if (sessionId == null) { - sessionId = currentSessionID; - } - StatsStorage ss = knownSessionIDs.get(sessionId); - if(ss == null){ - log.debug("getSummaryResults(): Session ID is unknown: {}", sessionId); - rc.response().end(); - return; - } - - List allModelInfoTemp = new ArrayList<>(ss.getLatestUpdateAllWorkers(sessionId, ARBITER_UI_TYPE_ID)); - List table = new ArrayList<>(); - for(Persistable per : allModelInfoTemp){ - ModelInfoPersistable mip = (ModelInfoPersistable)per; - String score = (mip.getScore() == null ? 
"" : mip.getScore().toString()); - table.add(new String[]{mip.getModelIdx().toString(), score, mip.getStatus().toString()}); - } - - rc.response().putHeader("content-type", "application/json").end(asJson(table)); - } - - /** - * Get summary status information: first section in the page - * @param sessionId session ID (optional, for multi-session mode) - * @param rc routing context - */ - private void getSummaryStatus(String sessionId, RoutingContext rc){ - if (sessionId == null) { - sessionId = currentSessionID; - } - StatsStorage ss = knownSessionIDs.get(sessionId); - if(ss == null){ - log.debug("getOptimizationConfig(): Session ID is unknown: {}", sessionId); - rc.response().end(); - return; - } - - Persistable p = ss.getStaticInfo(sessionId, ARBITER_UI_TYPE_ID, GlobalConfigPersistable.GLOBAL_WORKER_ID); - - if(p == null){ - log.info("No static info"); - rc.response().end(); - return; - } - - GlobalConfigPersistable gcp = (GlobalConfigPersistable)p; - OptimizationConfiguration oc = gcp.getOptimizationConfiguration(); - long execStartTime = oc.getExecutionStartTime(); - - - - //Charts: - //Best model score vs. time - //All candidate scores (scatter plot vs. time) - - //How to get this? query all model infos... 
- - List allModelInfoTemp = new ArrayList<>(ss.getLatestUpdateAllWorkers(sessionId, ARBITER_UI_TYPE_ID)); - List allModelInfo = new ArrayList<>(); - for(Persistable per : allModelInfoTemp){ - ModelInfoPersistable mip = (ModelInfoPersistable)per; - if(mip.getStatus() == CandidateStatus.Complete && mip.getScore() != null && Double.isFinite(mip.getScore())){ - allModelInfo.add(mip); - } - } - - allModelInfo.sort(Comparator.comparingLong(Persistable::getTimeStamp)); - - Pair, ModelInfoPersistable> chartsAndBest = getSummaryChartsAndBest(allModelInfo, oc.getScoreFunction().minimize(), execStartTime ); - - //First: table - number completed, queued, running, failed, total - //Best model index, score, and time - //Total runtime - //Termination conditions - List components = new ArrayList<>(); - - - - List tcs = oc.getTerminationConditions(); - - //TODO: I18N - - long bestTime; - Double bestScore = null; - String bestModelString = null; - if(chartsAndBest.getSecond() != null){ - bestTime = chartsAndBest.getSecond().getTimeStamp(); - bestScore = chartsAndBest.getSecond().getScore(); - String sinceBest = UIUtils.formatDuration(System.currentTimeMillis() - bestTime); - - bestModelString = "Model " + chartsAndBest.getSecond().getModelIdx() + ", Found at " + - TIME_FORMATTER.print(bestTime) + " (" + sinceBest + " ago)"; - } - - String execStartTimeStr = ""; - String execTotalRuntimeStr = ""; - if(execStartTime > 0){ - execStartTimeStr = TIME_FORMATTER.print(execStartTime); - // allModelInfo is sorted by Persistable::getTimeStamp - long lastCompleteTime = execStartTime; - if (!allModelInfo.isEmpty()) { - lastCompleteTime = allModelInfo.get(allModelInfo.size() - 1).getTimeStamp(); - } - execTotalRuntimeStr = UIUtils.formatDuration(lastCompleteTime - execStartTime); - } - - - String[][] table = new String[][]{ - {"Models Completed", String.valueOf(gcp.getCandidatesCompleted())}, - {"Models Queued/Running", String.valueOf(gcp.getCandidatesQueued())}, - {"Models Failed", 
String.valueOf(gcp.getCandidatesFailed())}, - {"Models Total", String.valueOf(gcp.getCandidatesTotal())}, - {"Best Score", (bestScore != null ? String.valueOf(bestScore) : "")}, - {"Best Scoring Model", bestModelString != null ? bestModelString : ""}, - {"Optimization Runner", gcp.getOptimizationRunner()}, - {"Execution Start Time", execStartTimeStr}, - {"Total Runtime", execTotalRuntimeStr} - }; - - - - ComponentTable ct = new ComponentTable.Builder(STYLE_TABLE) - .content(table) - .header("Status", "") - .build(); - - components.add(ct); - - String[][] tcTable = new String[tcs.size()][2]; - for( int i=0; i,ModelInfoPersistable> getSummaryChartsAndBest(List allModelInfo, - boolean minimize, long execStartTime){ - List bestX = new ArrayList<>(); - List bestY = new ArrayList<>(); - - double[] allX = new double[allModelInfo.size()]; - double[] allY = new double[allModelInfo.size()]; - - double bestScore = (minimize ? Double.MAX_VALUE : -Double.MAX_VALUE); - double worstScore = (minimize ? -Double.MAX_VALUE : Double.MAX_VALUE); - double lastTime = -1L; - ModelInfoPersistable bestModel = null; - for(int i=0; i bestScore) || (minimize && currScore < bestScore)){ - bestX.add(t); - bestY.add(bestScore); - bestX.add(t); //TODO non-real time rendering support... 
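`getSummaryChartsAndBest`, begun above, maintains a running best score while scanning candidates in timestamp order, emitting a step whenever a new best appears. The core bookkeeping, reduced to the minimization case (an illustrative sketch, not the DL4J implementation):

```java
import java.util.ArrayList;
import java.util.List;

public class RunningBest {
    /** For a minimizing score function: the running best score after each candidate. */
    public static List<Double> runningMin(double[] scores) {
        List<Double> best = new ArrayList<>();
        double bestSoFar = Double.MAX_VALUE;
        for (double s : scores) {
            if (s < bestSoFar) {
                bestSoFar = s;  // new best found; the chart adds a step here
            }
            best.add(bestSoFar);
        }
        return best;
    }
}
```

The original additionally records two (x, y) points per improvement (old best, then new best, at the same timestamp) so the line chart renders as a staircase rather than a slope.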
- bestY.add(currScore); - - bestScore = currScore; - bestModel = mip; - } - - if((!minimize && currScore < worstScore) || (minimize && currScore > worstScore)){ - worstScore = currScore; - } - - if(t > lastTime){ - lastTime = t; - } - } - - - double[] scatterGraphMinMax = UIUtils.graphNiceRange(Math.max(bestScore, worstScore), Math.min(bestScore, worstScore), 5); - double[] lineGraphMinMax = UIUtils.graphNiceRange( - bestY.stream().mapToDouble(s -> s).max().orElse(0),bestY.stream().mapToDouble(s -> s).min().orElse(0), 5 - ); - - if(bestX.size() > 0) { - bestX.add(lastTime); - bestY.add(bestY.get(bestY.size() - 1)); - } - - - double[] bestXd = new double[bestX.size()]; - double[] bestYd = new double[bestXd.length]; - for( int i=0; i components = new ArrayList<>(2); - - ChartLine cl = new ChartLine.Builder("Best Model Score vs. Time (Minutes)", STYLE_CHART_560_320) - .addSeries("Best Score vs. Time", bestXd, bestYd) - .setYMin(lineGraphMinMax[0]) - .setYMax(lineGraphMinMax[1]) - .build(); - components.add(cl); - - ChartScatter cs = new ChartScatter.Builder("All Candidate Scores vs. 
Time (Minutes)", STYLE_CHART_560_320) - .addSeries("Candidates", allX, allY) - .setYMin(scatterGraphMinMax[0]) - .setYMax(scatterGraphMinMax[1]) - .build(); - - components.add(cs); - - return new Pair<>(components, bestModel); - } -} diff --git a/contrib/attic/arbiter/arbiter-ui/src/main/resources/META-INF/services/org.deeplearning4j.ui.api.UIModule b/contrib/attic/arbiter/arbiter-ui/src/main/resources/META-INF/services/org.deeplearning4j.ui.api.UIModule deleted file mode 100644 index 8b6d38db0..000000000 --- a/contrib/attic/arbiter/arbiter-ui/src/main/resources/META-INF/services/org.deeplearning4j.ui.api.UIModule +++ /dev/null @@ -1,193 +0,0 @@ -# -# /* ****************************************************************************** -# * -# * -# * This program and the accompanying materials are made available under the -# * terms of the Apache License, Version 2.0 which is available at -# * https://www.apache.org/licenses/LICENSE-2.0. -# * -# * See the NOTICE file distributed with this work for additional -# * information regarding copyright ownership. -# * Unless required by applicable law or agreed to in writing, software -# * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT -# * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the -# * License for the specific language governing permissions and limitations -# * under the License. -# * -# * SPDX-License-Identifier: Apache-2.0 -# ******************************************************************************/ -# - -# -# /* ****************************************************************************** -# * -# * -# * This program and the accompanying materials are made available under the -# * terms of the Apache License, Version 2.0 which is available at -# * https://www.apache.org/licenses/LICENSE-2.0. -# * -# * See the NOTICE file distributed with this work for additional -# * information regarding copyright ownership. 
-# * Unless required by applicable law or agreed to in writing, software
-# * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
-# * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
-# * License for the specific language governing permissions and limitations
-# * under the License.
-# *
-# * SPDX-License-Identifier: Apache-2.0
-# ******************************************************************************/
-#
-
-#
-# /* ******************************************************************************
-# * Copyright (c) 2021 Deeplearning4j Contributors
-# *
-# * This program and the accompanying materials are made available under the
-# * terms of the Apache License, Version 2.0 which is available at
-# * https://www.apache.org/licenses/LICENSE-2.0.
-# *
-# * Unless required by applicable law or agreed to in writing, software
-# * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
-# * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
-# * License for the specific language governing permissions and limitations
-# * under the License.
-# *
-# * SPDX-License-Identifier: Apache-2.0
-# ******************************************************************************/
-#
-
-################################################################################
-# Copyright (c) 2015-2018 Skymind, Inc.
-#
-# This program and the accompanying materials are made available under the
-# terms of the Apache License, Version 2.0 which is available at
-# https://www.apache.org/licenses/LICENSE-2.0.
-#
-# Unless required by applicable law or agreed to in writing, software
-# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
-# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
-# License for the specific language governing permissions and limitations
-# under the License.
-# -# SPDX-License-Identifier: Apache-2.0 -################################################################################ - -org.deeplearning4j.arbiter.ui.module.ArbiterModule \ No newline at end of file diff --git a/contrib/attic/arbiter/arbiter-ui/src/main/resources/deeplearning4jUiAssets/dl4j-ui.js b/contrib/attic/arbiter/arbiter-ui/src/main/resources/deeplearning4jUiAssets/dl4j-ui.js deleted file mode 100644 index 6b549bce9..000000000 --- a/contrib/attic/arbiter/arbiter-ui/src/main/resources/deeplearning4jUiAssets/dl4j-ui.js +++ /dev/null @@ -1,1339 +0,0 @@ -/* - * ****************************************************************************** - * * - * * - * * This program and the accompanying materials are made available under the - * * terms of the Apache License, Version 2.0 which is available at - * * https://www.apache.org/licenses/LICENSE-2.0. - * * - * * See the NOTICE file distributed with this work for additional - * * information regarding copyright ownership. - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * * License for the specific language governing permissions and limitations - * * under the License. - * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ - -var __extends = (this && this.__extends) || function (d, b) { - for (var p in b) if (b.hasOwnProperty(p)) d[p] = b[p]; - function __() { this.constructor = d; } - d.prototype = b === null ? 
Object.create(b) : (__.prototype = b.prototype, new __()); -}; -var Style = (function () { - function Style(jsonObj) { - var _this = this; - this.getWidth = function () { return _this.width; }; - this.getHeight = function () { return _this.height; }; - this.getWidthUnit = function () { return _this.widthUnit; }; - this.getHeightUnit = function () { return _this.heightUnit; }; - this.getMarginTop = function () { return _this.marginTop; }; - this.getMarginBottom = function () { return _this.marginBottom; }; - this.getMarginLeft = function () { return _this.marginLeft; }; - this.getMarginRight = function () { return _this.marginRight; }; - this.getBackgroundColor = function () { return _this.backgroundColor; }; - this.width = jsonObj['width']; - this.height = jsonObj['height']; - this.widthUnit = TSUtils.normalizeLengthUnit(jsonObj['widthUnit']); - this.heightUnit = TSUtils.normalizeLengthUnit(jsonObj['heightUnit']); - this.marginTop = jsonObj['marginTop']; - this.marginBottom = jsonObj['marginBottom']; - this.marginLeft = jsonObj['marginLeft']; - this.marginRight = jsonObj['marginRight']; - this.backgroundColor = jsonObj['backgroundColor']; - } - Style.getMargins = function (s) { - var mTop = (s ? s.getMarginTop() : 0); - var mBottom = (s ? s.getMarginBottom() : 0); - var mLeft = (s ? s.getMarginLeft() : 0); - var mRight = (s ? 
s.getMarginRight() : 0);
- return { top: mTop,
- right: mRight,
- bottom: mBottom,
- left: mLeft,
- widthExMargins: s.getWidth() - mLeft - mRight,
- heightExMargins: s.getHeight() - mTop - mBottom };
- };
- return Style;
-}());
-var ComponentType;
-(function (ComponentType) {
- ComponentType[ComponentType["ComponentText"] = 0] = "ComponentText";
- ComponentType[ComponentType["ComponentTable"] = 1] = "ComponentTable";
- ComponentType[ComponentType["ComponentDiv"] = 2] = "ComponentDiv";
- ComponentType[ComponentType["ChartHistogram"] = 3] = "ChartHistogram";
- ComponentType[ComponentType["ChartHorizontalBar"] = 4] = "ChartHorizontalBar";
- ComponentType[ComponentType["ChartLine"] = 5] = "ChartLine";
- ComponentType[ComponentType["ChartScatter"] = 6] = "ChartScatter";
- ComponentType[ComponentType["ChartStackedArea"] = 7] = "ChartStackedArea";
- ComponentType[ComponentType["ChartTimeline"] = 8] = "ChartTimeline";
- ComponentType[ComponentType["DecoratorAccordion"] = 9] = "DecoratorAccordion";
-})(ComponentType || (ComponentType = {}));
-var Component = (function () {
- function Component(componentType) {
- this.componentType = componentType;
- }
- Component.prototype.getComponentType = function () {
- return this.componentType;
- };
- Component.getComponent = function (jsonStr) {
- var json = JSON.parse(jsonStr);
- var key;
- if (json["componentType"])
- key = json["componentType"];
- else
- key = Object.keys(json)[0];
- switch (key) {
- case ComponentType[ComponentType.ComponentText]:
- return new ComponentText(jsonStr);
- case ComponentType[ComponentType.ComponentTable]:
- return new ComponentTable(jsonStr);
- case ComponentType[ComponentType.ChartHistogram]:
- return new ChartHistogram(jsonStr);
- case ComponentType[ComponentType.ChartHorizontalBar]:
- throw new Error("Horizontal bar chart: not yet implemented");
- case ComponentType[ComponentType.ChartLine]:
- return new ChartLine(jsonStr);
- case ComponentType[ComponentType.ChartScatter]:
- return new ChartScatter(jsonStr);
- case ComponentType[ComponentType.ChartStackedArea]:
- return new ChartStackedArea(jsonStr);
- case ComponentType[ComponentType.ChartTimeline]:
- return new ChartTimeline(jsonStr);
- case ComponentType[ComponentType.DecoratorAccordion]:
- return new DecoratorAccordion(jsonStr);
- case ComponentType[ComponentType.ComponentDiv]:
- return new ComponentDiv(jsonStr);
- default:
- throw new Error("Unknown component type \"" + key + "\" or invalid JSON: \"" + jsonStr + "\"");
- }
- };
- return Component;
-}());
-var ChartConstants = (function () {
- function ChartConstants() {
- }
- ChartConstants.DEFAULT_CHART_STROKE_WIDTH = 1.0;
- ChartConstants.DEFAULT_CHART_POINT_SIZE = 3.0;
- ChartConstants.DEFAULT_AXIS_STROKE_WIDTH = 1.0;
- ChartConstants.DEFAULT_TITLE_COLOR = "#000000";
- return ChartConstants;
-}());
-var TSUtils = (function () {
- function TSUtils() {
- }
- TSUtils.max = function (input) {
- var max = -Number.MAX_VALUE;
- for (var i = 0; i < input.length; i++) {
- for (var j = 0; j < input[i].length; j++) {
- max = Math.max(max, input[i][j]);
- }
- }
- return max;
- };
- TSUtils.min = function (input) {
- var min = Number.MAX_VALUE;
- for (var i = 0; i < input.length; i++) {
- for (var j = 0; j < input[i].length; j++) {
- min = Math.min(min, input[i][j]);
- }
- }
- return min;
- };
- TSUtils.normalizeLengthUnit = function (input) {
- if (input == null)
- return input;
- switch (input.toLowerCase()) {
- case "px":
- return "px";
- case "percent":
- case "%":
- return "%";
- case "cm":
- return "cm";
- case "mm":
- return "mm";
- case "in":
- return "in";
- default:
- return input;
- }
- };
- return TSUtils;
-}());
-var Chart = (function (_super) {
- __extends(Chart, _super);
- function Chart(componentType, jsonStr) {
- _super.call(this, componentType);
- var jsonOrig = JSON.parse(jsonStr);
- var json = JSON.parse(jsonStr);
- if (!json["componentType"])
- json = json[ComponentType[componentType]];
- this.suppressAxisHorizontal = json['suppressAxisHorizontal'];
- this.suppressAxisVertical = json['suppressAxisVertical'];
- this.showLegend = json['showLegend'];
- this.title = json['title'];
- this.setXMin = json['setXMin'];
- this.setXMax = json['setXMax'];
- this.setYMin = json['setYMin'];
- this.setYMax = json['setYMax'];
- this.gridVerticalStrokeWidth = json['gridVerticalStrokeWidth'];
- this.gridHorizontalStrokeWidth = json['gridHorizontalStrokeWidth'];
- if (json['style'])
- this.style = new StyleChart(json['style']);
- }
- Chart.prototype.getStyle = function () {
- return this.style;
- };
- Chart.appendTitle = function (svg, title, margin, titleStyle) {
- var text = svg.append("text")
- .text(title)
- .attr("x", (margin.widthExMargins / 2))
- .attr("y", 0 - ((margin.top - 30) / 2))
- .attr("text-anchor", "middle");
- if (titleStyle) {
- if (titleStyle.getFont())
- text.attr("font-family", titleStyle.getFont);
- if (titleStyle.getFontSize() != null)
- text.attr("font-size", titleStyle.getFontSize() + "pt");
- if (titleStyle.getUnderline() != null)
- text.style("text-decoration", "underline");
- if (titleStyle.getColor())
- text.style("fill", titleStyle.getColor);
- else
- text.style("fill", ChartConstants.DEFAULT_TITLE_COLOR);
- }
- else {
- text.style("text-decoration", "underline");
- text.style("fill", ChartConstants.DEFAULT_TITLE_COLOR);
- }
- };
- return Chart;
-}(Component));
-var ChartHistogram = (function (_super) {
- __extends(ChartHistogram, _super);
- function ChartHistogram(jsonStr) {
- _super.call(this, ComponentType.ChartHistogram, jsonStr);
- this.render = function (appendToObject) {
- var s = this.getStyle();
- var margin = Style.getMargins(s);
- var xMin;
- var xMax;
- var yMin;
- var yMax;
- if (this.setXMin)
- xMin = this.setXMin;
- else
- xMin = (this.lowerBounds ? d3.min(this.lowerBounds) : 0);
- if (this.setXMax)
- xMax = this.setXMax;
- else
- xMax = (this.upperBounds ? d3.max(this.upperBounds) : 1);
- if (this.setYMin)
- yMin = this.setYMin;
- else
- yMin = 0;
- if (this.setYMax)
- yMax = this.setYMax;
- else
- yMax = (this.yValues ? d3.max(this.yValues) : 1);
- var xScale = d3.scale.linear()
- .domain([xMin, xMax])
- .range([0, margin.widthExMargins]);
- var xAxis = d3.svg.axis().scale(xScale)
- .orient("bottom").ticks(5);
- if (this.gridVerticalStrokeWidth && this.gridVerticalStrokeWidth > 0) {
- xAxis.innerTickSize(-margin.heightExMargins);
- }
- var yScale = d3.scale.linear()
- .domain([0, yMax])
- .range([margin.heightExMargins, 0]);
- var yAxis = d3.svg.axis().scale(yScale)
- .orient("left").ticks(5);
- if (this.gridHorizontalStrokeWidth && this.gridHorizontalStrokeWidth > 0) {
- yAxis.innerTickSize(-margin.widthExMargins);
- }
- if (this.suppressAxisHorizontal === true)
- xAxis.tickValues([]);
- if (this.suppressAxisVertical === true)
- yAxis.tickValues([]);
- var lowerBounds = this.lowerBounds;
- var upperBounds = this.upperBounds;
- var yValues = this.yValues;
- var data = lowerBounds.map(function (d, i) {
- return { 'width': upperBounds[i] - lowerBounds[i], 'height': yValues[i], 'offset': lowerBounds[i] };
- });
- var svg = d3.select("#" + appendToObject.attr("id"))
- .append("svg")
- .style("fill", "none")
- .attr("width", s.getWidth())
- .attr("height", s.getHeight())
- .attr("padding", "20px")
- .append("g")
- .attr("transform", "translate(" + margin.left + "," + margin.top + ")");
- svg.selectAll(".bin")
- .data(data)
- .enter().append("rect")
- .attr("class", "bin")
- .style("fill", "steelblue")
- .attr("x", function (d) { return xScale(d.offset); })
- .attr("width", function (d) { return xScale(xMin + d.width) - 1; })
- .attr("y", function (d) { return yScale(d.height); })
- .attr("height", function (d) { return margin.heightExMargins - yScale(d.height); });
- var xAxisNode = svg.append("g")
- .attr("class", "x axis")
- .attr("transform", "translate(0," + margin.heightExMargins + ")")
- .style("stroke", "#000")
- .style("stroke-width", (s != null && s.getAxisStrokeWidth() != null ? s.getAxisStrokeWidth() : ChartConstants.DEFAULT_AXIS_STROKE_WIDTH))
- .style("fill", "none")
- .call(xAxis);
- xAxisNode.selectAll('text').style("stroke-width", 0).style("fill", "#000000");
- if (this.gridVerticalStrokeWidth != null)
- xAxisNode.selectAll('.axis line').style({ 'stroke-width': this.gridVerticalStrokeWidth });
- var yAxisNode = svg.append("g")
- .attr("class", "y axis")
- .style("stroke", "#000")
- .style("stroke-width", (s != null && s.getAxisStrokeWidth() != null ? s.getAxisStrokeWidth() : ChartConstants.DEFAULT_AXIS_STROKE_WIDTH))
- .style("fill", "none")
- .call(yAxis);
- yAxisNode.selectAll('text').style("stroke-width", 0).style("fill", "#000000");
- if (this.gridHorizontalStrokeWidth != null)
- yAxisNode.selectAll('.axis line').style({ 'stroke-width': this.gridHorizontalStrokeWidth });
- if (this.title) {
- var titleStyle;
- if (this.style)
- titleStyle = this.style.getTitleStyle();
- Chart.appendTitle(svg, this.title, margin, titleStyle);
- }
- };
- var json = JSON.parse(jsonStr);
- if (!json["componentType"])
- json = json[ComponentType[ComponentType.ChartHistogram]];
- this.lowerBounds = json['lowerBounds'];
- this.upperBounds = json['upperBounds'];
- this.yValues = json['yvalues'];
- }
- return ChartHistogram;
-}(Chart));
-var ChartLine = (function (_super) {
- __extends(ChartLine, _super);
- function ChartLine(jsonStr) {
- _super.call(this, ComponentType.ChartLine, jsonStr);
- this.render = function (appendToObject) {
- var nSeries = (!this.xData ? 0 : this.xData.length);
- var s = this.getStyle();
- var margin = Style.getMargins(s);
- var xScale = d3.scale.linear().range([0, margin.widthExMargins]);
- var yScale = d3.scale.linear().range([margin.heightExMargins, 0]);
- var xAxis = d3.svg.axis().scale(xScale)
- .orient("bottom").ticks(5);
- if (this.gridVerticalStrokeWidth != null && this.gridVerticalStrokeWidth > 0) {
- xAxis.innerTickSize(-margin.heightExMargins);
- }
- var yAxis = d3.svg.axis().scale(yScale)
- .orient("left").ticks(5);
- if (this.gridHorizontalStrokeWidth != null && this.gridHorizontalStrokeWidth > 0) {
- yAxis.innerTickSize(-margin.widthExMargins);
- }
- if (this.suppressAxisHorizontal === true)
- xAxis.tickValues([]);
- if (this.suppressAxisVertical === true)
- yAxis.tickValues([]);
- var valueline = d3.svg.line()
- .x(function (d) {
- return xScale(d.xPos);
- })
- .y(function (d) {
- return yScale(d.yPos);
- });
- var svg = d3.select("#" + appendToObject.attr("id"))
- .append("svg")
- .style("stroke-width", (s && s.getStrokeWidth() ? s.getStrokeWidth() : ChartConstants.DEFAULT_CHART_STROKE_WIDTH))
- .style("fill", "none")
- .attr("width", s.getWidth())
- .attr("height", s.getHeight())
- .append("g")
- .attr("transform", "translate(" + margin.left + "," + margin.top + ")");
- var xMin;
- var xMax;
- var yMin;
- var yMax;
- if (this.setXMin != null)
- xMin = this.setXMin;
- else
- xMin = (this.xData ? TSUtils.min(this.xData) : 0);
- if (this.setXMax != null)
- xMax = this.setXMax;
- else
- xMax = (this.xData ? TSUtils.max(this.xData) : 1);
- if (this.setYMin != null)
- yMin = this.setYMin;
- else
- yMin = (this.yData ? TSUtils.min(this.yData) : 0);
- if (this.setYMax != null)
- yMax = this.setYMax;
- else
- yMax = (this.yData ? TSUtils.max(this.yData) : 1);
- xScale.domain([xMin, xMax]);
- yScale.domain([yMin, yMax]);
- var defaultColor = d3.scale.category10();
- for (var i = 0; i < nSeries; i++) {
- var xVals = this.xData[i];
- var yVals = this.yData[i];
- var data = xVals.map(function (d, i) {
- return { 'xPos': xVals[i], 'yPos': yVals[i] };
- });
- svg.append("path")
- .attr("class", "line")
- .style("stroke", (s && s.getSeriesColor(i) ? s.getSeriesColor(i) : defaultColor(String(i))))
- .attr("d", valueline(data));
- }
- var xAxisNode = svg.append("g")
- .attr("class", "x axis")
- .attr("transform", "translate(0," + margin.heightExMargins + ")")
- .style("stroke", "#000")
- .style("stroke-width", (s != null && s.getAxisStrokeWidth() != null ? s.getAxisStrokeWidth() : ChartConstants.DEFAULT_AXIS_STROKE_WIDTH))
- .style("fill", "none")
- .call(xAxis);
- xAxisNode.selectAll('text').style("stroke-width", 0).style("fill", "#000000");
- if (this.gridVerticalStrokeWidth != null)
- xAxisNode.selectAll('.axis line').style({ 'stroke-width': this.gridVerticalStrokeWidth });
- var yAxisNode = svg.append("g")
- .attr("class", "y axis")
- .style("stroke", "#000")
- .style("stroke-width", (s != null && s.getAxisStrokeWidth() != null ? s.getAxisStrokeWidth() : ChartConstants.DEFAULT_AXIS_STROKE_WIDTH))
- .style("fill", "none")
- .call(yAxis);
- yAxisNode.selectAll('text').style("stroke-width", 0).style("fill", "#000000");
- if (this.gridHorizontalStrokeWidth != null)
- yAxisNode.selectAll('.axis line').style({ 'stroke-width': this.gridHorizontalStrokeWidth });
- if (this.seriesNames && this.showLegend === true) {
- var legendSpace = margin.widthExMargins / i;
- for (var i = 0; i < nSeries; i++) {
- var values = this.xData[i];
- var yValues = this.yData[i];
- var lastX = values[values.length - 1];
- var lastY = yValues[yValues.length - 1];
- var toDisplay = this.seriesNames[i];
- svg.append("text")
- .attr("x", (legendSpace / 2) + i * legendSpace)
- .attr("y", margin.heightExMargins + (margin.bottom / 2) + 5)
- .attr("class", "legend")
- .style("fill", (s && s.getSeriesColor(i) ? s.getSeriesColor(i) : defaultColor(String(i))))
- .text(toDisplay);
- }
- }
- if (this.title) {
- var titleStyle;
- if (this.style)
- titleStyle = this.style.getTitleStyle();
- Chart.appendTitle(svg, this.title, margin, titleStyle);
- }
- };
- var json = JSON.parse(jsonStr);
- if (!json["componentType"])
- json = json[ComponentType[ComponentType.ChartLine]];
- this.xData = json['x'];
- this.yData = json['y'];
- this.seriesNames = json['seriesNames'];
- }
- return ChartLine;
-}(Chart));
-var ChartScatter = (function (_super) {
- __extends(ChartScatter, _super);
- function ChartScatter(jsonStr) {
- _super.call(this, ComponentType.ChartScatter, jsonStr);
- this.render = function (appendToObject) {
- var nSeries = (!this.xData ? 0 : this.xData.length);
- var s = this.getStyle();
- var margin = Style.getMargins(s);
- var xScale = d3.scale.linear().range([0, margin.widthExMargins]);
- var yScale = d3.scale.linear().range([margin.heightExMargins, 0]);
- var xAxis = d3.svg.axis().scale(xScale)
- .innerTickSize(-margin.heightExMargins)
- .orient("bottom").ticks(5);
- var yAxis = d3.svg.axis().scale(yScale)
- .innerTickSize(-margin.widthExMargins)
- .orient("left").ticks(5);
- if (this.suppressAxisHorizontal === true)
- xAxis.tickValues([]);
- if (this.suppressAxisVertical === true)
- yAxis.tickValues([]);
- var svg = d3.select("#" + appendToObject.attr("id"))
- .append("svg")
- .style("stroke-width", (s && s.getStrokeWidth() ? s.getStrokeWidth() : 1))
- .style("fill", "none")
- .attr("width", s.getWidth())
- .attr("height", s.getHeight())
- .attr("padding", "20px")
- .append("g")
- .attr("transform", "translate(" + margin.left + "," + margin.top + ")");
- var xMin;
- var xMax;
- var yMin;
- var yMax;
- if (this.setXMin)
- xMin = this.setXMin;
- else
- xMin = (this.xData ? TSUtils.min(this.xData) : 0);
- if (this.setXMax)
- xMax = this.setXMax;
- else
- xMax = (this.xData ? TSUtils.max(this.xData) : 1);
- if (this.setYMin)
- yMin = this.setYMin;
- else
- yMin = (this.yData ? TSUtils.min(this.yData) : 0);
- if (this.setYMax)
- yMax = this.setYMax;
- else
- yMax = (this.yData ? TSUtils.max(this.yData) : 1);
- xScale.domain([xMin, xMax]);
- yScale.domain([yMin, yMax]);
- var defaultColor = d3.scale.category10();
- for (var i = 0; i < nSeries; i++) {
- var xVals = this.xData[i];
- var yVals = this.yData[i];
- var data = xVals.map(function (d, i) {
- return { 'xPos': xVals[i], 'yPos': yVals[i] };
- });
- svg.selectAll("circle")
- .data(data)
- .enter()
- .append("circle")
- .style("fill", (s && s.getSeriesColor(i) ? s.getSeriesColor(i) : defaultColor(String(i))))
- .attr("r", (s && s.getPointSize() ? s.getPointSize() : ChartConstants.DEFAULT_CHART_POINT_SIZE))
- .attr("cx", function (d) {
- return xScale(d['xPos']);
- })
- .attr("cy", function (d) {
- return yScale(d['yPos']);
- });
- }
- var xAxisNode = svg.append("g")
- .attr("class", "x axis")
- .attr("transform", "translate(0," + margin.heightExMargins + ")")
- .style("stroke", "#000")
- .style("stroke-width", (s != null && s.getAxisStrokeWidth() != null ? s.getAxisStrokeWidth() : ChartConstants.DEFAULT_AXIS_STROKE_WIDTH))
- .style("fill", "none")
- .call(xAxis);
- xAxisNode.selectAll('text').style("stroke-width", 0).style("fill", "#000000");
- if (this.gridVerticalStrokeWidth != null)
- xAxisNode.selectAll('.axis line').style({ 'stroke-width': this.gridVerticalStrokeWidth });
- var yAxisNode = svg.append("g")
- .attr("class", "y axis")
- .style("stroke", "#000")
- .style("stroke-width", (s != null && s.getAxisStrokeWidth() != null ? s.getAxisStrokeWidth() : ChartConstants.DEFAULT_AXIS_STROKE_WIDTH))
- .style("fill", "none")
- .call(yAxis);
- yAxisNode.selectAll('text').style("stroke-width", 0).style("fill", "#000000");
- if (this.gridHorizontalStrokeWidth != null)
- yAxisNode.selectAll('.axis line').style({ 'stroke-width': this.gridHorizontalStrokeWidth });
- if (this.seriesNames && this.showLegend === true) {
- var legendSpace = margin.widthExMargins / i;
- for (var i = 0; i < nSeries; i++) {
- var values = this.xData[i];
- var yValues = this.yData[i];
- var lastX = values[values.length - 1];
- var lastY = yValues[yValues.length - 1];
- var toDisplay;
- if (!lastX || !lastY)
- toDisplay = this.seriesNames[i] + " (no data)";
- else
- toDisplay = this.seriesNames[i] + " (" + lastX.toPrecision(5) + "," + lastY.toPrecision(5) + ")";
- svg.append("text")
- .attr("x", (legendSpace / 2) + i * legendSpace)
- .attr("y", margin.heightExMargins + (margin.bottom / 2) + 5)
- .attr("class", "legend")
- .style("fill", (s && s.getSeriesColor(i) ? s.getSeriesColor(i) : defaultColor(String(i))))
- .text(toDisplay);
- }
- }
- if (this.title) {
- var titleStyle;
- if (this.style)
- titleStyle = this.style.getTitleStyle();
- Chart.appendTitle(svg, this.title, margin, titleStyle);
- }
- };
- var json = JSON.parse(jsonStr);
- if (!json["componentType"])
- json = json[ComponentType[ComponentType.ChartScatter]];
- this.xData = json['x'];
- this.yData = json['y'];
- this.seriesNames = json['seriesNames'];
- }
- return ChartScatter;
-}(Chart));
-var Legend = (function () {
- function Legend() {
- }
- Legend.offsetX = 15;
- Legend.offsetY = 15;
- Legend.padding = 8;
- Legend.separation = 12;
- Legend.boxSize = 10;
- Legend.fillColor = "#FFFFFF";
- Legend.legendOpacity = 0.75;
- Legend.borderStrokeColor = "#000000";
- Legend.legendFn = (function (g) {
- var svg = d3.select(g.property("nearestViewportElement"));
- var legendBox = g.selectAll(".outerRect").data([true]);
- var legendItems = g.selectAll(".legendElement").data([true]);
- legendBox.enter().append("rect").attr("class", "outerRect");
- legendItems.enter().append("g").attr("class", "legendElement");
- var legendElements = [];
- svg.selectAll("[data-legend]").each(function () {
- var thisVar = d3.select(this);
- legendElements.push({
- label: thisVar.attr("data-legend"),
- color: thisVar.style("fill")
- });
- });
- legendItems.selectAll("rect")
- .data(legendElements, function (d) { return d.label; })
- .call(function (d) { d.enter().append("rect"); })
- .call(function (d) { d.exit().remove(); })
- .attr("x", 0)
- .attr("y", function (d, i) { return i * Legend.separation - Legend.boxSize + "px"; })
- .attr("width", Legend.boxSize)
- .attr("height", Legend.boxSize)
- .style("fill", function (d) { return d.color; });
- legendItems.selectAll("text")
- .data(legendElements, function (d) { return d.label; })
- .call(function (d) { d.enter().append("text"); })
- .call(function (d) { d.exit().remove(); })
- .attr("y", function (d, i) { return i * Legend.separation + "px"; })
- .attr("x", (Legend.padding + Legend.boxSize) + "px")
- .text(function (d) { return d.label; });
- var legendBoundingBox = legendItems[0][0].getBBox();
- legendBox.attr("x", (legendBoundingBox.x - Legend.padding))
- .attr("y", (legendBoundingBox.y - Legend.padding))
- .attr("height", (legendBoundingBox.height + 2 * Legend.padding))
- .attr("width", (legendBoundingBox.width + 2 * Legend.padding))
- .style("fill", Legend.fillColor)
- .style("stroke", Legend.borderStrokeColor)
- .style("opacity", Legend.legendOpacity);
- svg.selectAll(".legend").attr("transform", "translate(" + Legend.offsetX + "," + Legend.offsetY + ")");
- });
- return Legend;
-}());
-var ChartStackedArea = (function (_super) {
- __extends(ChartStackedArea, _super);
- function ChartStackedArea(jsonStr) {
- _super.call(this, ComponentType.ChartStackedArea, jsonStr);
- this.render = function (appendToObject) {
- var nSeries = (!this.xData ? 0 : this.xData.length);
- var s = this.getStyle();
- var margin = Style.getMargins(s);
- var xScale = d3.scale.linear().range([0, margin.widthExMargins]);
- var yScale = d3.scale.linear().range([margin.heightExMargins, 0]);
- var xAxis = d3.svg.axis().scale(xScale)
- .orient("bottom").ticks(5);
- if (this.gridVerticalStrokeWidth != null && this.gridVerticalStrokeWidth > 0) {
- xAxis.innerTickSize(-margin.heightExMargins);
- }
- var yAxis = d3.svg.axis().scale(yScale)
- .orient("left").ticks(5);
- if (this.gridHorizontalStrokeWidth != null && this.gridHorizontalStrokeWidth > 0) {
- yAxis.innerTickSize(-margin.widthExMargins);
- }
- if (this.suppressAxisHorizontal === true)
- xAxis.tickValues([]);
- if (this.suppressAxisVertical === true)
- yAxis.tickValues([]);
- var data = [];
- for (var i = 0; i < this.xData.length; i++) {
- var obj = {};
- for (var j = 0; j < this.labels.length; j++) {
- obj[this.labels[j]] = this.yData[j][i];
- obj['xValue'] = this.xData[i];
- }
- data.push(obj);
- }
- var area = d3.svg.area()
- .x(function (d) { return xScale(d.xValue); })
- .y0(function (d) { return yScale(d.y0); })
- .y1(function (d) { return yScale(d.y0 + d.y); });
- var stack = d3.layout.stack()
- .values(function (d) { return d.values; });
- var svg = d3.select("#" + appendToObject.attr("id")).append("svg")
- .attr("width", margin.widthExMargins + margin.left + margin.right)
- .attr("height", margin.heightExMargins + margin.top + margin.bottom)
- .append("g")
- .attr("transform", "translate(" + margin.left + "," + margin.top + ")");
- var color = d3.scale.category20();
- color.domain(d3.keys(data[0]).filter(function (key) {
- return key !== "xValue";
- }));
- var browsers = stack(color.domain().map(function (name) {
- return {
- name: name,
- values: data.map(function (d) {
- return { xValue: d.xValue, y: d[name] * 1 };
- })
- };
- }));
- var maxX = d3.max(data, function (d) {
- var vals = d3.keys(d).map(function (key) {
- return key !== "xValue" ? d[key] : 0;
- });
- return d3.sum(vals);
- });
- xScale.domain(d3.extent(data, function (d) {
- return d.xValue;
- }));
- yScale.domain([0, maxX]);
- var browser = svg.selectAll(".browser")
- .data(browsers)
- .enter().append("g")
- .attr("class", "browser");
- var tempLabels = this.labels;
- var defaultColor = d3.scale.category20();
- browser.append("path")
- .attr("class", "area")
- .attr("data-legend", function (d) { return d.name; })
- .attr("d", function (d) {
- return area(d.values);
- })
- .style("fill", function (d) {
- if (s && s.getSeriesColor(tempLabels.indexOf(d.name))) {
- return s.getSeriesColor(tempLabels.indexOf(d.name));
- }
- else {
- return defaultColor(String(tempLabels.indexOf(d.name)));
- }
- })
- .style({ "stroke-width": "0px" });
- var xAxisNode = svg.append("g")
- .attr("class", "x axis")
- .style("stroke", "#000")
- .style("stroke-width", (s != null && s.getAxisStrokeWidth() != null ? s.getAxisStrokeWidth() : ChartConstants.DEFAULT_AXIS_STROKE_WIDTH))
- .style("fill", "none")
- .attr("transform", "translate(0," + margin.heightExMargins + ")")
- .call(xAxis);
- xAxisNode.selectAll('text').style("stroke-width", 0).style("fill", "#000000");
- var yAxisNode = svg.append("g")
- .attr("class", "y axis")
- .style("stroke", "#000")
- .style("stroke-width", (s != null && s.getAxisStrokeWidth() != null ? s.getAxisStrokeWidth() : ChartConstants.DEFAULT_AXIS_STROKE_WIDTH))
- .style("fill", "none")
- .call(yAxis);
- yAxisNode.selectAll('text').style("stroke-width", 0).style("fill", "#000000");
- if (this.title) {
- var titleStyle;
- if (this.style)
- titleStyle = this.style.getTitleStyle();
- Chart.appendTitle(svg, this.title, margin, titleStyle);
- }
- var legend = svg.append("g")
- .attr("class", "legend")
- .attr("transform", "translate(40,40)")
- .style("font-size", "12px")
- .call(Legend.legendFn);
- };
- var json = JSON.parse(jsonStr);
- if (!json["componentType"])
- json = json[ComponentType[ComponentType.ChartStackedArea]];
- this.xData = json['x'];
- this.yData = json['y'];
- this.labels = json['labels'];
- }
- return ChartStackedArea;
-}(Chart));
-var ChartTimeline = (function (_super) {
- __extends(ChartTimeline, _super);
- function ChartTimeline(jsonStr) {
- _super.call(this, ComponentType.ChartTimeline, jsonStr);
- this.render = function (appendToObject) {
- var instance = this;
- var s = this.getStyle();
- var margin = Style.getMargins(s);
- this.itemData = [];
- var count = 0;
- for (var i = 0; i < this.laneData.length; i++) {
- for (var j = 0; j < this.laneData[i].length; j++) {
- var obj = {};
- obj["start"] = this.laneData[i][j]["startTimeMs"];
- obj["end"] = this.laneData[i][j]["endTimeMs"];
- obj["id"] = count++;
- obj["lane"] = i;
- obj["color"] = this.laneData[i][j]["color"];
- obj["label"] = this.laneData[i][j]["entryLabel"];
- this.itemData.push(obj);
- }
- }
- this.lanes = [];
- for (var i = 0; i < this.laneNames.length; i++) {
- var obj = {};
- obj["label"] = this.laneNames[i];
- obj["id"] = i;
- this.lanes.push(obj);
- }
- var svg = d3.select("#" + appendToObject.attr("id"))
- .append("svg")
- .style("stroke-width", (s && s.getStrokeWidth() ? s.getStrokeWidth() : ChartConstants.DEFAULT_CHART_STROKE_WIDTH))
- .style("fill", "none")
- .attr("width", s.getWidth())
- .attr("height", s.getHeight())
- .append("g");
- var heightExMargins = s.getHeight() - margin.top - margin.bottom;
- var widthExMargins = s.getWidth() - margin.left - margin.right;
- var miniHeight = this.laneNames.length * ChartTimeline.MINI_LANE_HEIGHT_PX;
- var mainHeight = s.getHeight() - miniHeight - margin.top - margin.bottom - 25;
- var minTime = d3.min(this.itemData, function (d) { return d.start; });
- var maxTime = d3.max(this.itemData, function (d) { return d.end; });
- this.x = d3.time.scale()
- .domain([minTime, maxTime])
- .range([0, widthExMargins]);
- this.x1 = d3.time.scale().range([0, widthExMargins]);
- this.y1 = d3.scale.linear().domain([0, this.laneNames.length]).range([0, mainHeight]);
- this.y2 = d3.scale.linear().domain([0, this.laneNames.length]).range([0, miniHeight]);
- this.rect = svg.append('defs').append('clipPath')
- .attr('id', 'clip')
- .append('rect')
- .attr('width', widthExMargins)
- .attr('height', s.getHeight() - 100);
- this.mainView = svg.append('g')
- .attr('transform', 'translate(' + margin.left + ',' + margin.top + ')')
- .attr('width', widthExMargins)
- .attr('height', mainHeight)
- .attr('font-size', '12px')
- .attr('font', 'sans-serif');
- this.miniView = svg.append('g')
- .attr('transform', 'translate(' + margin.left + ',' + (mainHeight + margin.top + 25) + ')')
- .attr('width', widthExMargins)
- .attr('height', miniHeight)
- .attr('font-size', '10px')
- .attr('font', 'sans-serif');
- this.mainView.append('g').selectAll('.laneLines')
- .data(this.lanes)
- .enter().append('line')
- .attr('x1', 0)
- .attr('y1', function (d) {
- return d3.round(instance.y1(d.id)) + 0.5;
- })
- .attr('x2', widthExMargins)
- .attr('y2', function (d) {
- return d3.round(instance.y1(d.id)) + 0.5;
- })
- .attr('stroke', 'lightgray')
- .attr('stroke-width', 1);
- this.mainView.append('g').selectAll('.laneText')
- .data(this.lanes)
- .enter().append('text')
- .text(function (d) {
- if (d.label)
- return d.label;
- return "";
- })
- .attr('x', -10)
- .attr('y', function (d) {
- return instance.y1(d.id + .5);
- })
- .attr('text-anchor', 'end')
- .attr("font", "8pt sans-serif")
- .attr('fill', 'black');
- this.miniView.append('g').selectAll('.laneLines')
- .data(this.lanes)
- .enter().append('line')
- .attr('x1', 0)
- .attr('y1', function (d) { return d3.round(instance.y2(d.id)) + 0.5; })
- .attr('x2', widthExMargins)
- .attr('y2', function (d) { return d3.round(instance.y2(d.id)) + 0.5; })
- .attr('stroke', 'gray')
- .attr('stroke-width', 1.0);
- this.miniView.append('g').selectAll('.laneText')
- .data(this.lanes)
- .enter().append('text')
- .text(function (d) {
- if (d.label)
- return d.label;
- return "";
- })
- .attr('x', -10)
- .attr('y', function (d) {
- return instance.y2(d.id + .5);
- })
- .attr('dy', '0.5ex')
- .attr('text-anchor', 'end')
- .attr('fill', 'black');
- this.xTimeAxis = d3.svg.axis()
- .scale(this.x1)
- .orient('bottom')
- .ticks(d3.time.days, 1)
- .tickFormat(d3.time.format('%a %d'))
- .tickSize(6, 0);
- var temp = this.mainView.append('g')
- .attr('transform', 'translate(0,' + mainHeight + ')')
- .attr('class', 'timeAxis')
- .attr('fill', 'black')
- .style("stroke", "black").style("stroke-width", 1.0).style("fill", "black")
- .attr("font", "10px sans-serif")
- .call(this.xTimeAxis);
- temp.selectAll('text').style("stroke-width", 0.0).attr('stroke-width', 0.0);
- this.itemRects = this.mainView.append('g')
- .attr('clip-path', 'url(#clip)');
- this.miniView.append('g').selectAll('miniItems')
- .data(this.getMiniViewPaths(this.itemData))
- .enter().append('path')
- .attr('class', function (d) {
- return 'miniItem ' + d.class;
- })
- .attr('d', function (d) {
- return d.path;
- })
- .attr('stroke', 'black')
- .attr('stroke-width', 'black');
- this.miniView.append('rect')
- .attr('pointer-events', 'painted')
- .attr('width', widthExMargins)
- .attr('height', miniHeight)
- .attr('visibility', 'hidden')
- .on('mouseup', this.moveBrush);
- this.brush = d3.svg.brush()
- .x(this.x)
- .extent([minTime, maxTime])
- .on("brush", this.renderChart);
- this.miniView.append('g')
- .attr('class', 'x brush')
- .call(this.brush)
- .selectAll('rect')
- .attr('y', 1)
- .attr('height', miniHeight - 1)
- .style('fill', 'gray')
- .style('fill-opacity', '0.2')
- .style('stroke', 'DarkSlateGray')
- .style('stroke-width', 1);
- this.miniView.selectAll('rect.background').remove();
- this.renderChart();
- if (this.title) {
- var titleStyle;
- if (this.style)
- titleStyle = this.style.getTitleStyle();
- var text = svg.append("text")
- .text(this.title)
- .attr("x", (s.getWidth() / 2))
- .attr("y", ((margin.top - 30) / 2))
- .attr("text-anchor", "middle");
- if (titleStyle) {
- if (titleStyle.getFont())
- text.attr("font-family", titleStyle.getFont);
- if (titleStyle.getFontSize() != null)
- text.attr("font-size", titleStyle.getFontSize() + "pt");
- if (titleStyle.getUnderline() != null)
- text.style("text-decoration", "underline");
- if (titleStyle.getColor())
- text.style("fill", titleStyle.getColor);
- else
- text.style("fill", ChartConstants.DEFAULT_TITLE_COLOR);
- }
- else {
- text.style("text-decoration", "underline");
- text.style("fill", ChartConstants.DEFAULT_TITLE_COLOR);
- }
- }
- };
- this.renderChart = function () {
- var instance = this;
- var extent = this.brush.extent();
- var minExtent = extent[0];
- var maxExtent = extent[1];
- var visibleItems = this.itemData.filter(function (d) {
- return d.start < maxExtent && d.end > minExtent;
- });
- this.miniView.select('.brush').call(this.brush.extent([minExtent, maxExtent]));
- this.x1.domain([minExtent, maxExtent]);
- var range = maxExtent - minExtent;
- if (range > 2 * ChartTimeline.MILLISEC_PER_WEEK) {
- this.xTimeAxis.ticks(d3.time.mondays, 1).tickFormat(d3.time.format('%a %d'));
- }
- else if (range > 2 * ChartTimeline.MILLISEC_PER_DAY) {
- this.xTimeAxis.ticks(d3.time.days, 1).tickFormat(d3.time.format('%a %d'));
- }
- else if (range > 2 * ChartTimeline.MILLISEC_PER_HOUR) {
- this.xTimeAxis.ticks(d3.time.hours, 4).tickFormat(d3.time.format('%H %p'));
- }
- else if (range > 2 * ChartTimeline.MILLISEC_PER_MINUTE) {
- this.xTimeAxis.ticks(d3.time.minutes, 1).tickFormat(d3.time.format('%H:%M'));
- }
- else if (range >= 30000) {
- this.xTimeAxis.ticks(d3.time.seconds, 10).tickFormat(d3.time.format('%H:%M:%S'));
- }
- else {
- this.xTimeAxis.ticks(d3.time.seconds, 1).tickFormat(d3.time.format('%H:%M:%S'));
- }
- this.mainView.select('.timeAxis').call(this.xTimeAxis);
- var rects = this.itemRects.selectAll('rect')
- .data(visibleItems, function (d) { return d.id; })
- .attr('x', function (d) { return instance.x1(d.start); })
- .attr('width', function (d) { return instance.x1(d.end) - instance.x1(d.start); });
- rects.enter().append('rect')
- .attr('x', function (d) { return instance.x1(d.start); })
- .attr('y', function (d) { return instance.y1(d.lane) + ChartTimeline.ENTRY_LANE_HEIGHT_OFFSET_FRACTION * instance.y1(1) + 0.5; })
- .attr('width', function (d) { return instance.x1(d.end) - instance.x1(d.start); })
- .attr('height', function (d) { return ChartTimeline.ENTRY_LANE_HEIGHT_TOTAL_FRACTION * instance.y1(1); })
- .attr('stroke', 'black')
- .attr('fill', function (d) {
- if (d.color)
- return d.color;
- return ChartTimeline.DEFAULT_COLOR;
- })
- .attr('stroke-width', 1);
- rects.exit().remove();
- var labels = this.itemRects.selectAll('text')
- .data(visibleItems, function (d) {
- return d.id;
- })
- .attr('x', function (d) {
- return instance.x1(Math.max(d.start, minExtent)) + 2;
- })
- .attr('fill', 'black');
- labels.enter().append('text')
- .text(function (d) {
- if (instance.x1(d.end) - instance.x1(d.start) <= 30)
- return "";
- if (d.label)
- return d.label;
- return "";
- })
- .attr('x', function (d) {
- return instance.x1(Math.max(d.start, minExtent)) + 2;
- })
- .attr('y', function (d) {
- return instance.y1(d.lane) + .4 * instance.y1(1) + 0.5;
- })
- .attr('text-anchor', 'start')
- .attr('class', 'itemLabel')
- .attr('fill', 'black');
- labels.exit().remove();
- };
- this.moveBrush = function () {
- var origin = d3.mouse(this.rect[0]);
- var time = this.x.invert(origin[0]).getTime();
- var halfExtent = (this.brush.extent()[1].getTime() - this.brush.extent()[0].getTime()) / 2;
- this.brush.extent([new Date(time - halfExtent), new Date(time + halfExtent)]);
- this.renderChart();
- };
- this.getMiniViewPaths = function (items) {
- var paths = {}, d, offset = .5 * this.y2(1) + 0.5, result = [];
- for (var i = 0; i < items.length; i++) {
- d = items[i];
- if (!paths[d.class])
- paths[d.class] = '';
- paths[d.class] += ['M', this.x(d.start), (this.y2(d.lane) + offset), 'H', this.x(d.end)].join(' ');
- }
- for (var className in paths) {
- result.push({ class: className, path: paths[className] });
- }
- return result;
- };
- var json = JSON.parse(jsonStr);
- if (!json["componentType"])
- json = json[ComponentType[ComponentType.ChartTimeline]];
- this.laneNames = json['laneNames'];
- this.laneData = json['laneData'];
- }
- ChartTimeline.MINI_LANE_HEIGHT_PX = 12;
- ChartTimeline.ENTRY_LANE_HEIGHT_OFFSET_FRACTION = 0.05;
- ChartTimeline.ENTRY_LANE_HEIGHT_TOTAL_FRACTION = 0.90;
- ChartTimeline.MILLISEC_PER_MINUTE = 60 * 1000;
- ChartTimeline.MILLISEC_PER_HOUR = 60 * ChartTimeline.MILLISEC_PER_MINUTE;
- ChartTimeline.MILLISEC_PER_DAY = 24 * ChartTimeline.MILLISEC_PER_HOUR;
- ChartTimeline.MILLISEC_PER_WEEK = 7 * ChartTimeline.MILLISEC_PER_DAY;
- ChartTimeline.DEFAULT_COLOR = "LightGrey";
- return ChartTimeline;
-}(Chart));
-var StyleChart = (function (_super) {
- __extends(StyleChart, _super);
- function StyleChart(jsonObj) {
- var _this = this;
- _super.call(this, jsonObj['StyleChart']);
this.getStrokeWidth = function () { return _this.strokeWidth; }; - this.getPointSize = function () { return _this.pointSize; }; - this.getSeriesColors = function () { return _this.seriesColors; }; - this.getSeriesColor = function (idx) { - if (!this.seriesColors || idx < 0 || idx > this.seriesColors.length) - return null; - return _this.seriesColors[idx]; - }; - this.getAxisStrokeWidth = function () { return _this.axisStrokeWidth; }; - this.getTitleStyle = function () { return _this.titleStyle; }; - var style = jsonObj['StyleChart']; - if (style) { - this.strokeWidth = style['strokeWidth']; - this.pointSize = style['pointSize']; - this.seriesColors = style['seriesColors']; - if (style['titleStyle']) - this.titleStyle = new StyleText(style['titleStyle']); - } - } - return StyleChart; -}(Style)); -var ComponentDiv = (function (_super) { - __extends(ComponentDiv, _super); - function ComponentDiv(jsonStr) { - _super.call(this, ComponentType.ComponentDiv); - this.render = function (appendToObject) { - var newDiv = $('
'); - newDiv.uniqueId(); - if (this.style) { - if (this.style.getWidth()) { - var unit = this.style.getWidthUnit(); - newDiv.width(this.style.getWidth() + (unit ? unit : "")); - } - if (this.style.getHeight()) { - var unit = this.style.getHeightUnit(); - newDiv.height(this.style.getHeight() + (unit ? unit : "")); - } - if (this.style.getBackgroundColor()) - newDiv.css("background-color", this.style.getBackgroundColor()); - if (this.style.getFloatValue()) - newDiv.css("float", this.style.getFloatValue()); - } - appendToObject.append(newDiv); - if (this.components) { - for (var i = 0; i < this.components.length; i++) { - this.components[i].render(newDiv); - } - } - }; - var json = JSON.parse(jsonStr); - if (!json["componentType"]) - json = json[ComponentType[ComponentType.ComponentDiv]]; - var components = json['components']; - if (components) { - this.components = []; - for (var i = 0; i < components.length; i++) { - var asStr = JSON.stringify(components[i]); - this.components.push(Component.getComponent(asStr)); - } - } - if (json['style']) - this.style = new StyleDiv(json['style']); - } - return ComponentDiv; -}(Component)); -var StyleDiv = (function (_super) { - __extends(StyleDiv, _super); - function StyleDiv(jsonObj) { - var _this = this; - _super.call(this, jsonObj['StyleDiv']); - this.getFloatValue = function () { return _this.floatValue; }; - if (jsonObj && jsonObj['StyleDiv']) - this.floatValue = jsonObj['StyleDiv']['floatValue']; - } - return StyleDiv; -}(Style)); -var DecoratorAccordion = (function (_super) { - __extends(DecoratorAccordion, _super); - function DecoratorAccordion(jsonStr) { - _super.call(this, ComponentType.DecoratorAccordion); - this.render = function (appendToObject) { - var s = this.style; - var outerDiv = $('<div></div>
'); - outerDiv.uniqueId(); - var titleDiv; - if (this.title) - titleDiv = $('<div>' + this.title + '</div>'); - else - titleDiv = $('<div></div>'); - titleDiv.uniqueId(); - outerDiv.append(titleDiv); - var innerDiv = $('<div></div>
'); - innerDiv.uniqueId(); - outerDiv.append(innerDiv); - if (this.innerComponents) { - for (var i = 0; i < this.innerComponents.length; i++) { - this.innerComponents[i].render(innerDiv); - } - } - appendToObject.append(outerDiv); - if (this.defaultCollapsed) - outerDiv.accordion({ collapsible: true, heightStyle: "content", active: false }); - else - outerDiv.accordion({ collapsible: true, heightStyle: "content" }); - }; - var json = JSON.parse(jsonStr); - if (!json["componentType"]) - json = json[ComponentType[ComponentType.DecoratorAccordion]]; - this.title = json['title']; - this.defaultCollapsed = json['defaultCollapsed']; - var innerCs = json['innerComponents']; - if (innerCs) { - this.innerComponents = []; - for (var i = 0; i < innerCs.length; i++) { - var asStr = JSON.stringify(innerCs[i]); - this.innerComponents.push(Component.getComponent(asStr)); - } - } - if (json['style']) - this.style = new StyleAccordion(json['style']); - } - return DecoratorAccordion; -}(Component)); -var StyleAccordion = (function (_super) { - __extends(StyleAccordion, _super); - function StyleAccordion(jsonObj) { - _super.call(this, jsonObj['StyleAccordion']); - } - return StyleAccordion; -}(Style)); -var ComponentTable = (function (_super) { - __extends(ComponentTable, _super); - function ComponentTable(jsonStr) { - _super.call(this, ComponentType.ComponentTable); - this.render = function (appendToObject) { - var s = this.style; - var margin = Style.getMargins(s); - var tbl = document.createElement('table'); - tbl.style.width = '100%'; - if (s && s.getBorderWidthPx() != null) - tbl.setAttribute('border', String(s.getBorderWidthPx())); - if (s && s.getBackgroundColor()) - tbl.style.backgroundColor = s.getBackgroundColor(); - if (s && s.getWhitespaceMode()) - tbl.style.whiteSpace = s.getWhitespaceMode(); - if (s && s.getColumnWidths()) { - var colWidths = s.getColumnWidths(); - var unit = TSUtils.normalizeLengthUnit(s.getColumnWidthUnit()); - for (var i = 0; i < colWidths.length; 
i++) { - var col = document.createElement('col'); - col.setAttribute('width', colWidths[i] + unit); - tbl.appendChild(col); - } - } - var padTop = 1; - var padRight = 1; - var padBottom = 1; - var padLeft = 1; - if (this.header) { - var theader = document.createElement('thead'); - var headerRow = document.createElement('tr'); - if (s && s.getHeaderColor()) - headerRow.style.backgroundColor = s.getHeaderColor(); - for (var i = 0; i < this.header.length; i++) { - var headerd = document.createElement('th'); - headerd.style.padding = padTop + 'px ' + padRight + 'px ' + padBottom + 'px ' + padLeft + 'px'; - headerd.appendChild(document.createTextNode(this.header[i])); - headerRow.appendChild(headerd); - } - tbl.appendChild(headerRow); - } - if (this.content) { - var tbdy = document.createElement('tbody'); - for (var i = 0; i < this.content.length; i++) { - var tr = document.createElement('tr'); - for (var j = 0; j < this.content[i].length; j++) { - var td = document.createElement('td'); - td.style.padding = padTop + 'px ' + padRight + 'px ' + padBottom + 'px ' + padLeft + 'px'; - td.appendChild(document.createTextNode(this.content[i][j])); - tr.appendChild(td); - } - tbdy.appendChild(tr); - } - tbl.appendChild(tbdy); - } - appendToObject.append(tbl); - }; - var json = JSON.parse(jsonStr); - if (!json["componentType"]) - json = json[ComponentType[ComponentType.ComponentTable]]; - this.header = json['header']; - this.content = json['content']; - if (json['style']) - this.style = new StyleTable(json['style']); - } - return ComponentTable; -}(Component)); -var StyleTable = (function (_super) { - __extends(StyleTable, _super); - function StyleTable(jsonObj) { - var _this = this; - _super.call(this, jsonObj['StyleTable']); - this.getColumnWidths = function () { return _this.columnWidths; }; - this.getColumnWidthUnit = function () { return _this.columnWidthUnit; }; - this.getBorderWidthPx = function () { return _this.borderWidthPx; }; - this.getHeaderColor = function () { 
return _this.headerColor; }; - this.getWhitespaceMode = function () { return _this.whitespaceMode; }; - var style = jsonObj['StyleTable']; - if (style) { - this.columnWidths = jsonObj['StyleTable']['columnWidths']; - this.borderWidthPx = jsonObj['StyleTable']['borderWidthPx']; - this.headerColor = jsonObj['StyleTable']['headerColor']; - this.columnWidthUnit = jsonObj['StyleTable']['columnWidthUnit']; - this.whitespaceMode = jsonObj['StyleTable']['whitespaceMode']; - } - } - return StyleTable; -}(Style)); -var ComponentText = (function (_super) { - __extends(ComponentText, _super); - function ComponentText(jsonStr) { - var _this = this; - _super.call(this, ComponentType.ComponentText); - this.render = function (appendToObject) { - var textNode = document.createTextNode(_this.text); - if (_this.style) { - var newSpan = document.createElement('span'); - if (_this.style.getFont()) - newSpan.style.font = _this.style.getFont(); - if (_this.style.getFontSize() != null) - newSpan.style.fontSize = _this.style.getFontSize() + "pt"; - if (_this.style.getUnderline() != null) - newSpan.style.textDecoration = 'underline'; - if (_this.style.getColor()) - newSpan.style.color = _this.style.getColor(); - if (_this.style.getMarginTop()) - newSpan.style.marginTop = _this.style.getMarginTop() + "px"; - if (_this.style.getMarginBottom()) - newSpan.style.marginBottom = _this.style.getMarginBottom() + "px"; - if (_this.style.getMarginLeft()) - newSpan.style.marginLeft = _this.style.getMarginLeft() + "px"; - if (_this.style.getMarginRight()) - newSpan.style.marginRight = _this.style.getMarginRight() + "px"; - if (_this.style.getWhitespacePre()) - newSpan.style.whiteSpace = 'pre'; - newSpan.appendChild(textNode); - appendToObject.append(newSpan); - } - else { - var newSpan = document.createElement('span'); - newSpan.appendChild(textNode); - appendToObject.append(newSpan); - } - }; - var json = JSON.parse(jsonStr); - if (!json["componentType"]) - json = 
json[ComponentType[ComponentType.ComponentText]]; - this.text = json['text']; - if (json['style']) - this.style = new StyleText(json['style']); - } - return ComponentText; -}(Component)); -var StyleText = (function (_super) { - __extends(StyleText, _super); - function StyleText(jsonObj) { - var _this = this; - _super.call(this, jsonObj['StyleText']); - this.getFont = function () { return _this.font; }; - this.getFontSize = function () { return _this.fontSize; }; - this.getUnderline = function () { return _this.underline; }; - this.getColor = function () { return _this.color; }; - this.getWhitespacePre = function () { return _this.whitespacePre; }; - var style = jsonObj['StyleText']; - if (style) { - this.font = style['font']; - this.fontSize = style['fontSize']; - this.underline = style['underline']; - this.color = style['color']; - this.whitespacePre = style['whitespacePre']; - } - } - return StyleText; -}(Style)); -//# sourceMappingURL=dl4j-ui.js.map \ No newline at end of file diff --git a/contrib/attic/arbiter/arbiter-ui/src/main/resources/deeplearning4jUiAssets/dl4j-ui.js.map b/contrib/attic/arbiter/arbiter-ui/src/main/resources/deeplearning4jUiAssets/dl4j-ui.js.map deleted file mode 100644 index 3545aed31..000000000 --- a/contrib/attic/arbiter/arbiter-ui/src/main/resources/deeplearning4jUiAssets/dl4j-ui.js.map +++ /dev/null @@ -1 +0,0 @@ 
-{"version":3,"file":"dl4j-ui.js","sourceRoot":"","sources":["../../typescript/org/deeplearning4j/ui/api/Style.ts","../../typescript/org/deeplearning4j/ui/api/ComponentType.ts","../../typescript/org/deeplearning4j/ui/api/Component.ts","../../typescript/org/deeplearning4j/ui/api/Constants.ts","../../typescript/org/deeplearning4j/ui/api/Margin.ts","../../typescript/org/deeplearning4j/ui/api/Renderable.ts","../../typescript/org/deeplearning4j/ui/util/TSUtils.ts","../../typescript/org/deeplearning4j/ui/components/chart/Chart.ts","../../typescript/org/deeplearning4j/ui/components/chart/ChartHistogram.ts","../../typescript/org/deeplearning4j/ui/components/chart/ChartLine.ts","../../typescript/org/deeplearning4j/ui/components/chart/ChartScatter.ts","../../typescript/org/deeplearning4j/ui/components/chart/Legend.ts","../../typescript/org/deeplearning4j/ui/components/chart/ChartStackedArea.ts","../../typescript/org/deeplearning4j/ui/components/chart/ChartTimeline.ts","../../typescript/org/deeplearning4j/ui/components/chart/style/StyleChart.ts","../../typescript/org/deeplearning4j/ui/components/component/ComponentDiv.ts","../../typescript/org/deeplearning4j/ui/components/component/style/StyleDiv.ts","../../typescript/org/deeplearning4j/ui/components/decorator/DecoratorAccordion.ts","../../typescript/org/deeplearning4j/ui/components/decorator/style/StyleAccordion.ts","../../typescript/org/deeplearning4j/ui/components/table/ComponentTable.ts","../../typescript/org/deeplearning4j/ui/components/table/style/StyleTable.ts","../../typescript/org/deeplearning4j/ui/components/text/ComponentText.ts","../../typescript/org/deeplearning4j/ui/components/text/style/StyleText.ts"],"names":[],"mappings":";;;;;AAkBA;IAcI,eAAa,OAAY;QAd7B,iBAmDC;QAzBG,aAAQ,GAAG,cAAM,OAAA,KAAI,CAAC,KAAK,EAAV,CAAU,CAAC;QAC5B,cAAS,GAAG,cAAM,OAAA,KAAI,CAAC,MAAM,EAAX,CAAW,CAAC;QAC9B,iBAAY,GAAG,cAAM,OAAA,KAAI,CAAC,SAAS,EAAd,CAAc,CAAC;QACpC,kBAAa,GAAG,cAAM,OAAA,KAAI,CAAC,UAAU,EAAf,CAAe,CAAC;QACtC,iBAAY,GAAG,cAAM,OAAA,K
5B,IAAI,KAAK,GAAG,MAAM,CAAC,MAAM,CAAC,MAAM,GAAG,CAAC,CAAC,CAAC;oBACtC,IAAI,KAAK,GAAG,OAAO,CAAC,OAAO,CAAC,MAAM,GAAG,CAAC,CAAC,CAAC;oBACxC,IAAI,SAAS,CAAC;oBACd,EAAE,CAAC,CAAC,CAAC,KAAK,IAAI,CAAC,KAAK,CAAC;wBAAC,SAAS,GAAG,IAAI,CAAC,WAAW,CAAC,CAAC,CAAC,GAAG,YAAY,CAAC;oBACrE,IAAI;wBAAC,SAAS,GAAG,IAAI,CAAC,WAAW,CAAC,CAAC,CAAC,GAAG,IAAI,GAAG,KAAK,CAAC,WAAW,CAAC,CAAC,CAAC,GAAG,GAAG,GAAG,KAAK,CAAC,WAAW,CAAC,CAAC,CAAC,GAAG,GAAG,CAAC;oBACtG,GAAG,CAAC,MAAM,CAAC,MAAM,CAAC;yBACb,IAAI,CAAC,GAAG,EAAE,CAAC,WAAW,GAAG,CAAC,CAAC,GAAG,CAAC,GAAG,WAAW,CAAC;yBAC9C,IAAI,CAAC,GAAG,EAAE,MAAM,CAAC,eAAe,GAAG,CAAC,MAAM,CAAC,MAAM,GAAG,CAAC,CAAC,GAAG,CAAC,CAAC;yBAC3D,IAAI,CAAC,OAAO,EAAE,QAAQ,CAAC;yBACvB,KAAK,CAAC,MAAM,EAAE,CAAC,CAAC,IAAI,CAAC,CAAC,cAAc,CAAC,CAAC,CAAC,GAAG,CAAC,CAAC,cAAc,CAAC,CAAC,CAAC,GAAG,YAAY,CAAC,MAAM,CAAC,CAAC,CAAC,CAAC,CAAC,CAAC;yBACzF,IAAI,CAAC,SAAS,CAAC,CAAC;gBACzB,CAAC;YACL,CAAC;YAGD,EAAE,CAAC,CAAC,IAAI,CAAC,KAAK,CAAC,CAAC,CAAC;gBACb,IAAI,UAAqB,CAAC;gBAC1B,EAAE,CAAA,CAAC,IAAI,CAAC,KAAK,CAAC;oBAAC,UAAU,GAAG,IAAI,CAAC,KAAK,CAAC,aAAa,EAAE,CAAC;gBACvD,KAAK,CAAC,WAAW,CAAC,GAAG,EAAE,IAAI,CAAC,KAAK,EAAE,MAAM,EAAE,UAAU,CAAC,CAAC;YAC3D,CAAC;QACL,CAAC,CAAA;QAtIG,IAAI,IAAI,GAAG,IAAI,CAAC,KAAK,CAAC,OAAO,CAAC,CAAC;QAC/B,EAAE,CAAA,CAAC,CAAC,IAAI,CAAC,eAAe,CAAC,CAAC;YAAC,IAAI,GAAG,IAAI,CAAC,aAAa,CAAC,aAAa,CAAC,YAAY,CAAC,CAAC,CAAC;QAElF,IAAI,CAAC,KAAK,GAAG,IAAI,CAAC,GAAG,CAAC,CAAC;QACvB,IAAI,CAAC,KAAK,GAAG,IAAI,CAAC,GAAG,CAAC,CAAC;QACvB,IAAI,CAAC,WAAW,GAAG,IAAI,CAAC,aAAa,CAAC,CAAC;IAC3C,CAAC;IAiIL,mBAAC;AAAD,CAAC,AAhJD,CAA2B,KAAK,GAgJ/B;ACpJD;IAAA;IAiEA,CAAC;IA9DkB,cAAO,GAAW,EAAE,CAAC;IACrB,cAAO,GAAW,EAAE,CAAC;IACrB,cAAO,GAAW,CAAC,CAAC;IACpB,iBAAU,GAAW,EAAE,CAAC;IACxB,cAAO,GAAW,EAAE,CAAC;IACrB,gBAAS,GAAW,SAAS,CAAC;IAC9B,oBAAa,GAAW,IAAI,CAAC;IAC7B,wBAAiB,GAAW,SAAS,CAAC;IAG9C,eAAQ,GAAG,CAAC,UAAS,CAAM;QAE9B,IAAI,GAAG,GAAG,EAAE,CAAC,MAAM,CAAC,CAAC,CAAC,QAAQ,CAAC,wBAAwB,CAAC,CAAC,CAAC;QAC1D,IAAI,SAAS,GAAG,CAAC,CAAC,SAAS,CAAC,YAAY,CAAC,CAAC,IAAI,CAAC,CAAC,IAAI,CAAC,CAAC,CAAC;QACvD,IAAI,WAAW,GAA
G,CAAC,CAAC,SAAS,CAAC,gBAAgB,CAAC,CAAC,IAAI,CAAC,CAAC,IAAI,CAAC,CAAC,CAAC;QAE7D,SAAS,CAAC,KAAK,EAAE,CAAC,MAAM,CAAC,MAAM,CAAC,CAAC,IAAI,CAAC,OAAO,EAAC,WAAW,CAAC,CAAC;QAC3D,WAAW,CAAC,KAAK,EAAE,CAAC,MAAM,CAAC,GAAG,CAAC,CAAC,IAAI,CAAC,OAAO,EAAC,eAAe,CAAC,CAAC;QAE9D,IAAI,cAAc,GAAU,EAAE,CAAC;QAC/B,GAAG,CAAC,SAAS,CAAC,eAAe,CAAC,CAAC,IAAI,CAAC;YAChC,IAAI,OAAO,GAAG,EAAE,CAAC,MAAM,CAAC,IAAI,CAAC,CAAC;YAC9B,cAAc,CAAC,IAAI,CAAC;gBAChB,KAAK,EAAE,OAAO,CAAC,IAAI,CAAC,aAAa,CAAC;gBAClC,KAAK,EAAE,OAAO,CAAC,KAAK,CAAC,MAAM,CAAC;aAC/B,CAAC,CAAC;QACP,CAAC,CAAC,CAAC;QAIH,WAAW,CAAC,SAAS,CAAC,MAAM,CAAC;aACxB,IAAI,CAAC,cAAc,EAAC,UAAS,CAAC,IAAI,MAAM,CAAC,CAAC,CAAC,KAAK,CAAA,CAAA,CAAC,CAAC;aAClD,IAAI,CAAC,UAAS,CAAC,IAAI,CAAC,CAAC,KAAK,EAAE,CAAC,MAAM,CAAC,MAAM,CAAC,CAAA,CAAA,CAAC,CAAC;aAC7C,IAAI,CAAC,UAAS,CAAC,IAAI,CAAC,CAAC,IAAI,EAAE,CAAC,MAAM,EAAE,CAAA,CAAA,CAAC,CAAC;aACtC,IAAI,CAAC,GAAG,EAAC,CAAC,CAAC;aACX,IAAI,CAAC,GAAG,EAAC,UAAS,CAAC,EAAC,CAAC,IAAI,MAAM,CAAC,CAAC,GAAC,MAAM,CAAC,UAAU,GAAC,MAAM,CAAC,OAAO,GAAC,IAAI,CAAA,CAAA,CAAC,CAAC;aACzE,IAAI,CAAC,OAAO,EAAC,MAAM,CAAC,OAAO,CAAC;aAC5B,IAAI,CAAC,QAAQ,EAAC,MAAM,CAAC,OAAO,CAAC;aAE7B,KAAK,CAAC,MAAM,EAAC,UAAS,CAAC,IAAI,MAAM,CAAC,CAAC,CAAC,KAAK,CAAA,CAAA,CAAC,CAAC,CAAC;QAGjD,WAAW,CAAC,SAAS,CAAC,MAAM,CAAC;aACxB,IAAI,CAAC,cAAc,EAAC,UAAS,CAAC,IAAI,MAAM,CAAC,CAAC,CAAC,KAAK,CAAA,CAAA,CAAC,CAAC;aAClD,IAAI,CAAC,UAAS,CAAC,IAAI,CAAC,CAAC,KAAK,EAAE,CAAC,MAAM,CAAC,MAAM,CAAC,CAAA,CAAA,CAAC,CAAC;aAC7C,IAAI,CAAC,UAAS,CAAC,IAAI,CAAC,CAAC,IAAI,EAAE,CAAC,MAAM,EAAE,CAAA,CAAA,CAAC,CAAC;aACtC,IAAI,CAAC,GAAG,EAAC,UAAS,CAAC,EAAC,CAAC,IAAI,MAAM,CAAC,CAAC,GAAC,MAAM,CAAC,UAAU,GAAG,IAAI,CAAA,CAAA,CAAC,CAAC;aAC5D,IAAI,CAAC,GAAG,EAAC,CAAC,MAAM,CAAC,OAAO,GAAG,MAAM,CAAC,OAAO,CAAC,GAAG,IAAI,CAAC;aAClD,IAAI,CAAC,UAAS,CAAC,IAAI,MAAM,CAAC,CAAC,CAAC,KAAK,CAAA,CAAA,CAAC,CAAC,CAAC;QAGzC,IAAI,iBAAiB,GAAQ,WAAW,CAAC,CAAC,CAAC,CAAC,CAAC,CAAC,CAAC,OAAO,EAAE,CAAC;QACzD,SAAS,CAAC,IAAI,CAAC,GAAG,EAAC,CAAC,iBAAiB,CAAC,CAAC,GAAC,MAAM,CAAC,OAAO,CAAC,CAAC;aACnD,IAAI,CAAC,GAAG,EAAC,CAAC,iBAAiB,CA
AC,CAAC,GAAC,MAAM,CAAC,OAAO,CAAC,CAAC;aAC9C,IAAI,CAAC,QAAQ,EAAC,CAAC,iBAAiB,CAAC,MAAM,GAAC,CAAC,GAAC,MAAM,CAAC,OAAO,CAAC,CAAC;aAC1D,IAAI,CAAC,OAAO,EAAC,CAAC,iBAAiB,CAAC,KAAK,GAAC,CAAC,GAAC,MAAM,CAAC,OAAO,CAAC,CAAC;aACxD,KAAK,CAAC,MAAM,EAAC,MAAM,CAAC,SAAS,CAAC;aAC9B,KAAK,CAAC,QAAQ,EAAC,MAAM,CAAC,iBAAiB,CAAC;aACxC,KAAK,CAAC,SAAS,EAAC,MAAM,CAAC,aAAa,CAAC,CAAC;QAE3C,GAAG,CAAC,SAAS,CAAC,SAAS,CAAC,CAAC,IAAI,CAAC,WAAW,EAAC,YAAY,GAAG,MAAM,CAAC,OAAO,GAAG,GAAG,GAAG,MAAM,CAAC,OAAO,GAAG,GAAG,CAAC,CAAC;IAC1G,CAAC,CAAC,CAAC;IACP,aAAC;AAAD,CAAC,AAjED,IAiEC;AC1DD;IAA+B,oCAAK;IAKhC,0BAAY,OAAe;QACvB,kBAAM,aAAa,CAAC,gBAAgB,EAAE,OAAO,CAAC,CAAC;QAYnD,WAAM,GAAG,UAAC,cAAsB;YAE5B,IAAI,OAAO,GAAW,CAAC,CAAC,IAAI,CAAC,KAAK,GAAG,CAAC,GAAG,IAAI,CAAC,KAAK,CAAC,MAAM,CAAC,CAAC;YAC5D,IAAI,CAAC,GAAe,IAAI,CAAC,QAAQ,EAAE,CAAC;YACpC,IAAI,MAAM,GAAW,KAAK,CAAC,UAAU,CAAC,CAAC,CAAC,CAAC;YAGzC,IAAI,MAAM,GAAmC,EAAE,CAAC,KAAK,CAAC,MAAM,EAAE,CAAC,KAAK,CAAC,CAAC,CAAC,EAAE,MAAM,CAAC,cAAc,CAAC,CAAC,CAAC;YACjG,IAAI,MAAM,GAAmC,EAAE,CAAC,KAAK,CAAC,MAAM,EAAE,CAAC,KAAK,CAAC,CAAC,MAAM,CAAC,eAAe,EAAE,CAAC,CAAC,CAAC,CAAC;YAGlG,IAAI,KAAK,GAAG,EAAE,CAAC,GAAG,CAAC,IAAI,EAAE,CAAC,KAAK,CAAC,MAAM,CAAC;iBAClC,MAAM,CAAC,QAAQ,CAAC,CAAC,KAAK,CAAC,CAAC,CAAC,CAAC;YAC/B,EAAE,CAAA,CAAC,IAAI,CAAC,uBAAuB,IAAI,IAAI,IAAI,IAAI,CAAC,uBAAuB,GAAG,CAAC,CAAC,CAAA,CAAC;gBACzE,KAAK,CAAC,aAAa,CAAC,CAAC,MAAM,CAAC,eAAe,CAAC,CAAC;YACjD,CAAC;YAGD,IAAI,KAAK,GAAG,EAAE,CAAC,GAAG,CAAC,IAAI,EAAE,CAAC,KAAK,CAAC,MAAM,CAAC;iBAClC,MAAM,CAAC,MAAM,CAAC,CAAC,KAAK,CAAC,CAAC,CAAC,CAAC;YAC7B,EAAE,CAAA,CAAC,IAAI,CAAC,yBAAyB,IAAI,IAAI,IAAI,IAAI,CAAC,yBAAyB,GAAG,CAAC,CAAC,CAAA,CAAC;gBAC7E,KAAK,CAAC,aAAa,CAAC,CAAC,MAAM,CAAC,cAAc,CAAC,CAAC;YAChD,CAAC;YAED,EAAE,CAAA,CAAC,IAAI,CAAC,sBAAsB,KAAK,IAAI,CAAC;gBAAC,KAAK,CAAC,UAAU,CAAC,EAAE,CAAC,CAAC;YAE9D,EAAE,CAAA,CAAC,IAAI,CAAC,oBAAoB,KAAK,IAAI,CAAC;gBAAC,KAAK,CAAC,UAAU,CAAC,EAAE,CAAC,CAAC;YAE5D,IAAI,IAAI,GAAU,EAAE,CAAC;YACrB,GAAG,CAAA,CAAC,IAAI,CAAC,GAAC,CAAC,EAAE,CAAC,GAAC,IAAI,CAAC,KAAK,CAAC,MAAM,EAAE,CAAC,EAAE,EAAE,CAAC;gBAC
pC,IAAI,GAAG,GAAG,EAAE,CAAC;gBACb,GAAG,CAAA,CAAE,IAAI,CAAC,GAAC,CAAC,EAAE,CAAC,GAAC,IAAI,CAAC,MAAM,CAAC,MAAM,EAAE,CAAC,EAAE,EAAE,CAAC;oBACtC,GAAG,CAAC,IAAI,CAAC,MAAM,CAAC,CAAC,CAAC,CAAC,GAAG,IAAI,CAAC,KAAK,CAAC,CAAC,CAAC,CAAC,CAAC,CAAC,CAAC;oBACvC,GAAG,CAAC,QAAQ,CAAC,GAAG,IAAI,CAAC,KAAK,CAAC,CAAC,CAAC,CAAC;gBAClC,CAAC;gBACD,IAAI,CAAC,IAAI,CAAC,GAAG,CAAC,CAAC;YACnB,CAAC;YAED,IAAI,IAAI,GAAG,EAAE,CAAC,GAAG,CAAC,IAAI,EAAE;iBACnB,CAAC,CAAC,UAAS,CAAM,IAAI,MAAM,CAAC,MAAM,CAAC,CAAC,CAAC,MAAM,CAAC,CAAC,CAAC,CAAC,CAAC;iBAChD,EAAE,CAAC,UAAS,CAAM,IAAI,MAAM,CAAC,MAAM,CAAC,CAAC,CAAC,EAAE,CAAC,CAAC,CAAC,CAAC,CAAC;iBAC7C,EAAE,CAAC,UAAS,CAAM,IAAI,MAAM,CAAC,MAAM,CAAC,CAAC,CAAC,EAAE,GAAG,CAAC,CAAC,CAAC,CAAC,CAAC,CAAC,CAAC,CAAC,CAAC;YAEzD,IAAI,KAAK,GAAG,EAAE,CAAC,MAAM,CAAC,KAAK,EAAE;iBACxB,MAAM,CAAC,UAAS,CAAM,IAAI,MAAM,CAAC,CAAC,CAAC,MAAM,CAAC,CAAC,CAAC,CAAC,CAAC;YAEnD,IAAI,GAAG,GAAG,EAAE,CAAC,MAAM,CAAC,GAAG,GAAG,cAAc,CAAC,IAAI,CAAC,IAAI,CAAC,CAAC,CAAC,MAAM,CAAC,KAAK,CAAC;iBAC7D,IAAI,CAAC,OAAO,EAAE,MAAM,CAAC,cAAc,GAAG,MAAM,CAAC,IAAI,GAAG,MAAM,CAAC,KAAK,CAAC;iBACjE,IAAI,CAAC,QAAQ,EAAE,MAAM,CAAC,eAAe,GAAG,MAAM,CAAC,GAAG,GAAG,MAAM,CAAC,MAAM,CAAC;iBACnE,MAAM,CAAC,GAAG,CAAC;iBACX,IAAI,CAAC,WAAW,EAAE,YAAY,GAAG,MAAM,CAAC,IAAI,GAAG,GAAG,GAAG,MAAM,CAAC,GAAG,GAAG,GAAG,CAAC,CAAC;YAE5E,IAAI,KAAK,GAAQ,EAAE,CAAC,KAAK,CAAC,UAAU,EAAE,CAAC;YACvC,KAAK,CAAC,MAAM,CAAC,EAAE,CAAC,IAAI,CAAC,IAAI,CAAC,CAAC,CAAC,CAAC,CAAC,MAAM,CAAC,UAAU,GAAG;gBAC9C,MAAM,CAAC,GAAG,KAAK,QAAQ,CAAC;YAC5B,CAAC,CAAC,CAAC,CAAC;YAEJ,IAAI,QAAQ,GAAG,KAAK,CAAC,KAAK,CAAC,MAAM,EAAE,CAAC,GAAG,CAAC,UAAU,IAAI;gBAClD,MAAM,CAAC;oBACH,IAAI,EAAE,IAAI;oBACV,MAAM,EAAE,IAAI,CAAC,GAAG,CAAC,UAAU,CAAC;wBACxB,MAAM,CAAC,EAAC,MAAM,EAAE,CAAC,CAAC,MAAM,EAAE,CAAC,EAAE,CAAC,CAAC,IAAI,CAAC,GAAG,CAAC,EAAC,CAAC;oBAC9C,CAAC,CAAC;iBACL,CAAC;YACN,CAAC,CAAC,CAAC,CAAC;YAGJ,IAAI,IAAI,GAAG,EAAE,CAAC,GAAG,CAAC,IAAI,EAAE,UAAU,CAAC;gBAC/B,IAAI,IAAI,GAAG,EAAE,CAAC,IAAI,CAAC,CAAC,CAAC,CAAC,GAAG,CAAC,UAAU,GAAG;oBACnC,MAAM,CAAC,GAAG,KAAK,QAAQ,GAAG,CAAC,CAAC,GAAG,CAAC,GAAG,CAA
C,CAAA;gBACxC,CAAC,CAAC,CAAC;gBACH,MAAM,CAAC,EAAE,CAAC,GAAG,CAAC,IAAI,CAAC,CAAC;YACxB,CAAC,CAAC,CAAC;YAGH,MAAM,CAAC,MAAM,CAAC,EAAE,CAAC,MAAM,CAAC,IAAI,EAAE,UAAU,CAAC;gBACrC,MAAM,CAAC,CAAC,CAAC,MAAM,CAAC;YACpB,CAAC,CAAC,CAAC,CAAC;YAEJ,MAAM,CAAC,MAAM,CAAC,CAAC,CAAC,EAAE,IAAI,CAAC,CAAC,CAAC;YAEzB,IAAI,OAAO,GAAG,GAAG,CAAC,SAAS,CAAC,UAAU,CAAC;iBAClC,IAAI,CAAC,QAAQ,CAAC;iBACd,KAAK,EAAE,CAAC,MAAM,CAAC,GAAG,CAAC;iBACnB,IAAI,CAAC,OAAO,EAAE,SAAS,CAAC,CAAC;YAE9B,IAAI,UAAU,GAAG,IAAI,CAAC,MAAM,CAAC;YAE7B,IAAI,YAAY,GAA2B,EAAE,CAAC,KAAK,CAAC,UAAU,EAAE,CAAC;YACjE,OAAO,CAAC,MAAM,CAAC,MAAM,CAAC;iBACjB,IAAI,CAAC,OAAO,EAAE,MAAM,CAAC;iBACrB,IAAI,CAAC,aAAa,EAAC,UAAS,CAAM,IAAI,MAAM,CAAC,CAAC,CAAC,IAAI,CAAA,CAAA,CAAC,CAAC;iBACrD,IAAI,CAAC,GAAG,EAAE,UAAU,CAAM;gBACvB,MAAM,CAAC,IAAI,CAAC,CAAC,CAAC,MAAM,CAAC,CAAC;YAC1B,CAAC,CAAC;iBACD,KAAK,CAAC,MAAM,EAAE,UAAS,CAAM;gBAC1B,EAAE,CAAA,CAAC,CAAC,IAAI,CAAC,CAAC,cAAc,CAAC,UAAU,CAAC,OAAO,CAAC,CAAC,CAAC,IAAI,CAAC,CAAC,CAAC,CAAA,CAAC;oBAClD,MAAM,CAAC,CAAC,CAAC,cAAc,CAAC,UAAU,CAAC,OAAO,CAAC,CAAC,CAAC,IAAI,CAAC,CAAC,CAAC;gBACxD,CAAC;gBAAC,IAAI,CAAA,CAAC;oBACH,MAAM,CAAC,YAAY,CAAC,MAAM,CAAC,UAAU,CAAC,OAAO,CAAC,CAAC,CAAC,IAAI,CAAC,CAAC,CAAC,CAAA;gBAC3D,CAAC;YACL,CAAC,CAAC;iBACD,KAAK,CAAC,EAAC,cAAc,EAAE,KAAK,EAAC,CAAC,CAAC;YAGpC,IAAI,SAAS,GAAG,GAAG,CAAC,MAAM,CAAC,GAAG,CAAC;iBAC1B,IAAI,CAAC,OAAO,EAAE,QAAQ,CAAC;iBACvB,KAAK,CAAC,QAAQ,EAAC,MAAM,CAAC;iBACtB,KAAK,CAAC,cAAc,EAAE,CAAC,CAAC,IAAI,IAAI,IAAI,CAAC,CAAC,kBAAkB,EAAE,IAAI,IAAI,GAAG,CAAC,CAAC,kBAAkB,EAAE,GAAG,cAAc,CAAC,yBAAyB,CAAC,CAAC;iBACxI,KAAK,CAAC,MAAM,EAAC,MAAM,CAAC;iBACpB,IAAI,CAAC,WAAW,EAAE,cAAc,GAAG,MAAM,CAAC,eAAe,GAAG,GAAG,CAAC;iBAChE,IAAI,CAAC,KAAK,CAAC,CAAC;YACjB,SAAS,CAAC,SAAS,CAAC,MAAM,CAAC,CAAC,KAAK,CAAC,cAAc,EAAC,CAAC,CAAC,CAAC,KAAK,CAAC,MAAM,EAAC,SAAS,CAAC,CAAC;YAG5E,IAAI,SAAS,GAAG,GAAG,CAAC,MAAM,CAAC,GAAG,CAAC;iBAC1B,IAAI,CAAC,OAAO,EAAE,QAAQ,CAAC;iBACvB,KAAK,CAAC,QAAQ,EAAC,MAAM,CAAC;iBACtB,KAAK,CAAC,cAAc,EAAE,CAAC,CAAC,IAAI,IAAI,IAAI,CAAC,CAAC,kBAAkB,EAAE,IAAI,IAAI,GAAG,CAAC,CAAC,kBAAkB,EAAE
,GAAG,cAAc,CAAC,yBAAyB,CAAC,CAAC;iBACxI,KAAK,CAAC,MAAM,EAAC,MAAM,CAAC;iBACpB,IAAI,CAAC,KAAK,CAAC,CAAC;YACjB,SAAS,CAAC,SAAS,CAAC,MAAM,CAAC,CAAC,KAAK,CAAC,cAAc,EAAC,CAAC,CAAC,CAAC,KAAK,CAAC,MAAM,EAAC,SAAS,CAAC,CAAC;YAG5E,EAAE,CAAC,CAAC,IAAI,CAAC,KAAK,CAAC,CAAC,CAAC;gBACb,IAAI,UAAqB,CAAC;gBAC1B,EAAE,CAAA,CAAC,IAAI,CAAC,KAAK,CAAC;oBAAC,UAAU,GAAG,IAAI,CAAC,KAAK,CAAC,aAAa,EAAE,CAAC;gBACvD,KAAK,CAAC,WAAW,CAAC,GAAG,EAAE,IAAI,CAAC,KAAK,EAAE,MAAM,EAAE,UAAU,CAAC,CAAC;YAC3D,CAAC;YAGD,IAAI,MAAM,GAAQ,GAAG,CAAC,MAAM,CAAC,GAAG,CAAC;iBAC5B,IAAI,CAAC,OAAO,EAAC,QAAQ,CAAC;iBACtB,IAAI,CAAC,WAAW,EAAC,kBAAkB,CAAC;iBACpC,KAAK,CAAC,WAAW,EAAC,MAAM,CAAC;iBACzB,IAAI,CAAC,MAAM,CAAC,QAAQ,CAAC,CAAC;QAC/B,CAAC,CAAA;QAlJG,IAAI,IAAI,GAAG,IAAI,CAAC,KAAK,CAAC,OAAO,CAAC,CAAC;QAC/B,EAAE,CAAA,CAAC,CAAC,IAAI,CAAC,eAAe,CAAC,CAAC;YAAC,IAAI,GAAG,IAAI,CAAC,aAAa,CAAC,aAAa,CAAC,gBAAgB,CAAC,CAAC,CAAC;QAGtF,IAAI,CAAC,KAAK,GAAG,IAAI,CAAC,GAAG,CAAC,CAAC;QACvB,IAAI,CAAC,KAAK,GAAG,IAAI,CAAC,GAAG,CAAC,CAAC;QACvB,IAAI,CAAC,MAAM,GAAG,IAAI,CAAC,QAAQ,CAAC,CAAC;IACjC,CAAC;IA4IL,uBAAC;AAAD,CAAC,AA3JD,CAA+B,KAAK,GA2JnC;AC9JD;IAA4B,iCAAK;IAgC7B,uBAAY,OAAc;QACtB,kBAAM,aAAa,CAAC,aAAa,EAAE,OAAO,CAAC,CAAC;QAUhD,WAAM,GAAG,UAAC,cAAqB;YAC3B,IAAI,QAAQ,GAAG,IAAI,CAAC;YACpB,IAAI,CAAC,GAAc,IAAI,CAAC,QAAQ,EAAE,CAAC;YACnC,IAAI,MAAM,GAAU,KAAK,CAAC,UAAU,CAAC,CAAC,CAAC,CAAC;YAGxC,IAAI,CAAC,QAAQ,GAAG,EAAE,CAAC;YACnB,IAAI,KAAK,GAAG,CAAC,CAAC;YACd,GAAG,CAAC,CAAC,IAAI,CAAC,GAAG,CAAC,EAAE,CAAC,GAAG,IAAI,CAAC,QAAQ,CAAC,MAAM,EAAE,CAAC,EAAE,EAAE,CAAC;gBAC5C,GAAG,CAAC,CAAC,IAAI,CAAC,GAAG,CAAC,EAAE,CAAC,GAAG,IAAI,CAAC,QAAQ,CAAC,CAAC,CAAC,CAAC,MAAM,EAAE,CAAC,EAAE,EAAE,CAAC;oBAC/C,IAAI,GAAG,GAAG,EAAE,CAAC;oBACb,GAAG,CAAC,OAAO,CAAC,GAAG,IAAI,CAAC,QAAQ,CAAC,CAAC,CAAC,CAAC,CAAC,CAAC,CAAC,aAAa,CAAC,CAAC;oBAClD,GAAG,CAAC,KAAK,CAAC,GAAG,IAAI,CAAC,QAAQ,CAAC,CAAC,CAAC,CAAC,CAAC,CAAC,CAAC,WAAW,CAAC,CAAC;oBAC9C,GAAG,CAAC,IAAI,CAAC,GAAG,KAAK,EAAE,CAAC;oBACpB,GAAG,CAAC,MAAM,CAAC,GAAG,CAAC,CAAC;oBAChB,GAAG,CAAC,OAAO,CAAC,GAAG,IAAI,CAAC,QAAQ,CAAC,CAAC,CAAC,CAAC,CAAC,
CAAC,CAAC,OAAO,CAAC,CAAC;oBAC5C,GAAG,CAAC,OAAO,CAAC,GAAG,IAAI,CAAC,QAAQ,CAAC,CAAC,CAAC,CAAC,CAAC,CAAC,CAAC,YAAY,CAAC,CAAC;oBACjD,IAAI,CAAC,QAAQ,CAAC,IAAI,CAAC,GAAG,CAAC,CAAC;gBAC5B,CAAC;YACL,CAAC;YAED,IAAI,CAAC,KAAK,GAAG,EAAE,CAAC;YAChB,GAAG,CAAC,CAAC,IAAI,CAAC,GAAG,CAAC,EAAE,CAAC,GAAG,IAAI,CAAC,SAAS,CAAC,MAAM,EAAE,CAAC,EAAE,EAAE,CAAC;gBAC7C,IAAI,GAAG,GAAG,EAAE,CAAC;gBACb,GAAG,CAAC,OAAO,CAAC,GAAG,IAAI,CAAC,SAAS,CAAC,CAAC,CAAC,CAAC;gBACjC,GAAG,CAAC,IAAI,CAAC,GAAG,CAAC,CAAC;gBACd,IAAI,CAAC,KAAK,CAAC,IAAI,CAAC,GAAG,CAAC,CAAC;YACzB,CAAC;YAID,IAAI,GAAG,GAAG,EAAE,CAAC,MAAM,CAAC,GAAG,GAAG,cAAc,CAAC,IAAI,CAAC,IAAI,CAAC,CAAC;iBAC/C,MAAM,CAAC,KAAK,CAAC;iBACb,KAAK,CAAC,cAAc,EAAE,CAAE,CAAC,IAAI,CAAC,CAAC,cAAc,EAAE,GAAG,CAAC,CAAC,cAAc,EAAE,GAAG,cAAc,CAAC,0BAA0B,CAAC,CAAC;iBAClH,KAAK,CAAC,MAAM,EAAE,MAAM,CAAC;iBACrB,IAAI,CAAC,OAAO,EAAE,CAAC,CAAC,QAAQ,EAAE,CAAC;iBAC3B,IAAI,CAAC,QAAQ,EAAE,CAAC,CAAC,SAAS,EAAE,CAAC;iBAC7B,MAAM,CAAC,GAAG,CAAC,CAAC;YAEjB,IAAI,eAAe,GAAG,CAAC,CAAC,SAAS,EAAE,GAAG,MAAM,CAAC,GAAG,GAAG,MAAM,CAAC,MAAM,CAAC;YACjE,IAAI,cAAc,GAAG,CAAC,CAAC,QAAQ,EAAE,GAAG,MAAM,CAAC,IAAI,GAAG,MAAM,CAAC,KAAK,CAAC;YAC/D,IAAI,UAAU,GAAG,IAAI,CAAC,SAAS,CAAC,MAAM,GAAG,aAAa,CAAC,mBAAmB,CAAC;YAC3E,IAAI,UAAU,GAAG,CAAC,CAAC,SAAS,EAAE,GAAG,UAAU,GAAG,MAAM,CAAC,GAAG,GAAG,MAAM,CAAC,MAAM,GAAG,EAAE,CAAC;YAE9E,IAAI,OAAO,GAAU,EAAE,CAAC,GAAG,CAAC,IAAI,CAAC,QAAQ,EAAE,UAAU,CAAK,IAAI,MAAM,CAAC,CAAC,CAAC,KAAK,CAAC,CAAC,CAAC,CAAC,CAAC;YACjF,IAAI,OAAO,GAAU,EAAE,CAAC,GAAG,CAAC,IAAI,CAAC,QAAQ,EAAE,UAAU,CAAK,IAAI,MAAM,CAAC,CAAC,CAAC,GAAG,CAAC,CAAC,CAAC,CAAC,CAAC;YAC/E,IAAI,CAAC,CAAC,GAAG,EAAE,CAAC,IAAI,CAAC,KAAK,EAAE;iBACnB,MAAM,CAAC,CAAC,OAAO,EAAE,OAAO,CAAC,CAAC;iBAC1B,KAAK,CAAC,CAAC,CAAC,EAAE,cAAc,CAAC,CAAC,CAAC;YAChC,IAAI,CAAC,EAAE,GAAG,EAAE,CAAC,IAAI,CAAC,KAAK,EAAE,CAAC,KAAK,CAAC,CAAC,CAAC,EAAE,cAAc,CAAC,CAAC,CAAC;YAErD,IAAI,CAAC,EAAE,GAAG,EAAE,CAAC,KAAK,CAAC,MAAM,EAAE,CAAC,MAAM,CAAC,CAAC,CAAC,EAAE,IAAI,CAAC,SAAS,CAAC,MAAM,CAAC,CAAC,CAAC,KAAK,CAAC,CAAC,CAAC,EAAE,UAAU,CAAC,CAAC,CAAC;YACtF,IAAI,CAAC,EAAE,GA
AG,EAAE,CAAC,KAAK,CAAC,MAAM,EAAE,CAAC,MAAM,CAAC,CAAC,CAAC,EAAE,IAAI,CAAC,SAAS,CAAC,MAAM,CAAC,CAAC,CAAC,KAAK,CAAC,CAAC,CAAC,EAAE,UAAU,CAAC,CAAC,CAAC;YAGtF,IAAI,CAAC,IAAI,GAAG,GAAG,CAAC,MAAM,CAAC,MAAM,CAAC,CAAC,MAAM,CAAC,UAAU,CAAC;iBAC5C,IAAI,CAAC,IAAI,EAAE,MAAM,CAAC;iBAClB,MAAM,CAAC,MAAM,CAAC;iBACd,IAAI,CAAC,OAAO,EAAE,cAAc,CAAC;iBAC7B,IAAI,CAAC,QAAQ,EAAE,CAAC,CAAC,SAAS,EAAE,GAAG,GAAG,CAAC,CAAC;YAEzC,IAAI,CAAC,QAAQ,GAAG,GAAG,CAAC,MAAM,CAAC,GAAG,CAAC;iBAC1B,IAAI,CAAC,WAAW,EAAE,YAAY,GAAG,MAAM,CAAC,IAAI,GAAG,GAAG,GAAG,MAAM,CAAC,GAAG,GAAG,GAAG,CAAC;iBACtE,IAAI,CAAC,OAAO,EAAE,cAAc,CAAC;iBAC7B,IAAI,CAAC,QAAQ,EAAE,UAAU,CAAC;iBAC1B,IAAI,CAAC,WAAW,EAAE,MAAM,CAAC;iBACzB,IAAI,CAAC,MAAM,EAAE,YAAY,CAAC,CAAC;YAEhC,IAAI,CAAC,QAAQ,GAAG,GAAG,CAAC,MAAM,CAAC,GAAG,CAAC;iBAC1B,IAAI,CAAC,WAAW,EAAE,YAAY,GAAG,MAAM,CAAC,IAAI,GAAG,GAAG,GAAG,CAAC,UAAU,GAAG,MAAM,CAAC,GAAG,GAAG,EAAE,CAAC,GAAG,GAAG,CAAC;iBAC1F,IAAI,CAAC,OAAO,EAAE,cAAc,CAAC;iBAC7B,IAAI,CAAC,QAAQ,EAAE,UAAU,CAAC;iBAC1B,IAAI,CAAC,WAAW,EAAE,MAAM,CAAC;iBACzB,IAAI,CAAC,MAAM,EAAE,YAAY,CAAC,CAAC;YAGhC,IAAI,CAAC,QAAQ,CAAC,MAAM,CAAC,GAAG,CAAC,CAAC,SAAS,CAAC,YAAY,CAAC;iBAC5C,IAAI,CAAC,IAAI,CAAC,KAAK,CAAC;iBAChB,KAAK,EAAE,CAAC,MAAM,CAAC,MAAM,CAAC;iBACtB,IAAI,CAAC,IAAI,EAAE,CAAC,CAAC;iBACb,IAAI,CAAC,IAAI,EAAE,UAAU,CAAK;gBACvB,MAAM,CAAC,EAAE,CAAC,KAAK,CAAC,QAAQ,CAAC,EAAE,CAAC,CAAC,CAAC,EAAE,CAAC,CAAC,GAAG,GAAG,CAAC;YAC7C,CAAC,CAAC;iBACD,IAAI,CAAC,IAAI,EAAE,cAAc,CAAC;iBAC1B,IAAI,CAAC,IAAI,EAAE,UAAU,CAAK;gBACvB,MAAM,CAAC,EAAE,CAAC,KAAK,CAAC,QAAQ,CAAC,EAAE,CAAC,CAAC,CAAC,EAAE,CAAC,CAAC,GAAG,GAAG,CAAC;YAC7C,CAAC,CAAC;iBACD,IAAI,CAAC,QAAQ,EAAE,WAAW,CAAC;iBAC3B,IAAI,CAAC,cAAc,EAAE,CAAC,CAAC,CAAC;YAG7B,IAAI,CAAC,QAAQ,CAAC,MAAM,CAAC,GAAG,CAAC,CAAC,SAAS,CAAC,WAAW,CAAC;iBAC3C,IAAI,CAAC,IAAI,CAAC,KAAK,CAAC;iBAChB,KAAK,EAAE,CAAC,MAAM,CAAC,MAAM,CAAC;iBACtB,IAAI,CAAC,UAAU,CAAK;gBACjB,EAAE,CAAA,CAAC,CAAC,CAAC,KAAK,CAAC;oBAAC,MAAM,CAAC,CAAC,CAAC,KAAK,CAAC;gBAC3B,MAAM,CAAC,EAAE,CAAC;YACd,CAAC,CAAC;iBACD,IAAI,CAAC,GAAG,EAAE,CAAC,EAAE,CAAC;iBACd,IAAI,CAAC,GAAG,EAA
E,UAAU,CAAK;gBACtB,MAAM,CAAC,QAAQ,CAAC,EAAE,CAAC,CAAC,CAAC,EAAE,GAAG,EAAE,CAAC,CAAC;YAClC,CAAC,CAAC;iBACD,IAAI,CAAC,aAAa,EAAE,KAAK,CAAC;iBAC1B,IAAI,CAAC,MAAM,EAAC,gBAAgB,CAAC;iBAC7B,IAAI,CAAC,MAAM,EAAE,OAAO,CAAC,CAAC;YAG3B,IAAI,CAAC,QAAQ,CAAC,MAAM,CAAC,GAAG,CAAC,CAAC,SAAS,CAAC,YAAY,CAAC;iBAC5C,IAAI,CAAC,IAAI,CAAC,KAAK,CAAC;iBAChB,KAAK,EAAE,CAAC,MAAM,CAAC,MAAM,CAAC;iBACtB,IAAI,CAAC,IAAI,EAAE,CAAC,CAAC;iBACb,IAAI,CAAC,IAAI,EAAE,UAAU,CAAK,IAAI,MAAM,CAAC,EAAE,CAAC,KAAK,CAAC,QAAQ,CAAC,EAAE,CAAC,CAAC,CAAC,EAAE,CAAC,CAAC,GAAG,GAAG,CAAC,CAAC,CAAC,CAAC;iBAC1E,IAAI,CAAC,IAAI,EAAE,cAAc,CAAC;iBAC1B,IAAI,CAAC,IAAI,EAAE,UAAU,CAAK,IAAI,MAAM,CAAC,EAAE,CAAC,KAAK,CAAC,QAAQ,CAAC,EAAE,CAAC,CAAC,CAAC,EAAE,CAAC,CAAC,GAAG,GAAG,CAAC,CAAC,CAAC,CAAC;iBAC1E,IAAI,CAAC,QAAQ,EAAE,MAAM,CAAC;iBACtB,IAAI,CAAC,cAAc,EAAE,GAAG,CAAC,CAAC;YAG/B,IAAI,CAAC,QAAQ,CAAC,MAAM,CAAC,GAAG,CAAC,CAAC,SAAS,CAAC,WAAW,CAAC;iBAC3C,IAAI,CAAC,IAAI,CAAC,KAAK,CAAC;iBAChB,KAAK,EAAE,CAAC,MAAM,CAAC,MAAM,CAAC;iBACtB,IAAI,CAAC,UAAU,CAAK;gBACjB,EAAE,CAAA,CAAC,CAAC,CAAC,KAAK,CAAC;oBAAC,MAAM,CAAC,CAAC,CAAC,KAAK,CAAC;gBAC3B,MAAM,CAAC,EAAE,CAAC;YACd,CAAC,CAAC;iBACD,IAAI,CAAC,GAAG,EAAE,CAAC,EAAE,CAAC;iBACd,IAAI,CAAC,GAAG,EAAE,UAAU,CAAK;gBACtB,MAAM,CAAC,QAAQ,CAAC,EAAE,CAAC,CAAC,CAAC,EAAE,GAAG,EAAE,CAAC,CAAC;YAClC,CAAC,CAAC;iBACD,IAAI,CAAC,IAAI,EAAE,OAAO,CAAC;iBACnB,IAAI,CAAC,aAAa,EAAE,KAAK,CAAC;iBAC1B,IAAI,CAAC,MAAM,EAAE,OAAO,CAAC,CAAC;YAG3B,IAAI,CAAC,SAAS,GAAG,EAAE,CAAC,GAAG,CAAC,IAAI,EAAE;iBACzB,KAAK,CAAC,IAAI,CAAC,EAAE,CAAC;iBACd,MAAM,CAAC,QAAQ,CAAC;iBAChB,KAAK,CAAC,EAAE,CAAC,IAAI,CAAC,IAAI,EAAE,CAAC,CAAC;iBACtB,UAAU,CAAC,EAAE,CAAC,IAAI,CAAC,MAAM,CAAC,OAAO,CAAC,CAAC;iBACnC,QAAQ,CAAC,CAAC,EAAE,CAAC,CAAC,CAAC;YAGpB,IAAI,IAAI,GAAO,IAAI,CAAC,QAAQ,CAAC,MAAM,CAAC,GAAG,CAAC;iBACnC,IAAI,CAAC,WAAW,EAAE,cAAc,GAAG,UAAU,GAAG,GAAG,CAAC;iBAEpD,IAAI,CAAC,OAAO,EAAE,UAAU,CAAC;iBACzB,IAAI,CAAC,MAAM,EAAE,OAAO,CAAC;iBACrB,KAAK,CAAC,QAAQ,EAAE,OAAO,CAAC,CAAC,KAAK,CAAC,cAAc,EAAE,GAAG,CAAC,CAAC,KAAK,CAAC,MAAM,EAAE,OAAO,CAAC;iBAC1E,IAAI,CAAC,MAAM,EAAE,iBAA
iB,CAAC;iBAC/B,IAAI,CAAC,IAAI,CAAC,SAAS,CAAC,CAAC;YAC1B,IAAI,CAAC,SAAS,CAAC,MAAM,CAAC,CAAC,KAAK,CAAC,cAAc,EAAE,GAAG,CAAC,CAAC,IAAI,CAAC,cAAc,EAAE,GAAG,CAAC,CAAC;YAG5E,IAAI,CAAC,SAAS,GAAG,IAAI,CAAC,QAAQ,CAAC,MAAM,CAAC,GAAG,CAAC;iBACrC,IAAI,CAAC,WAAW,EAAE,YAAY,CAAC,CAAC;YAGrC,IAAI,CAAC,QAAQ,CAAC,MAAM,CAAC,GAAG,CAAC,CAAC,SAAS,CAAC,WAAW,CAAC;iBAC3C,IAAI,CAAC,IAAI,CAAC,gBAAgB,CAAC,IAAI,CAAC,QAAQ,CAAC,CAAC;iBAC1C,KAAK,EAAE,CAAC,MAAM,CAAC,MAAM,CAAC;iBACtB,IAAI,CAAC,OAAO,EAAE,UAAU,CAAK;gBAC1B,MAAM,CAAC,WAAW,GAAG,CAAC,CAAC,KAAK,CAAC;YACjC,CAAC,CAAC;iBACD,IAAI,CAAC,GAAG,EAAE,UAAU,CAAK;gBACtB,MAAM,CAAC,CAAC,CAAC,IAAI,CAAC;YAClB,CAAC,CAAC;iBACD,IAAI,CAAC,QAAQ,EAAE,OAAO,CAAC;iBACvB,IAAI,CAAC,cAAc,EAAE,OAAO,CAAC,CAAC;YAGnC,IAAI,CAAC,QAAQ,CAAC,MAAM,CAAC,MAAM,CAAC;iBACvB,IAAI,CAAC,gBAAgB,EAAE,SAAS,CAAC;iBACjC,IAAI,CAAC,OAAO,EAAE,cAAc,CAAC;iBAC7B,IAAI,CAAC,QAAQ,EAAE,UAAU,CAAC;iBAC1B,IAAI,CAAC,YAAY,EAAE,QAAQ,CAAC;iBAC5B,EAAE,CAAC,SAAS,EAAE,IAAI,CAAC,SAAS,CAAC,CAAC;YACnC,IAAI,CAAC,KAAK,GAAG,EAAE,CAAC,GAAG,CAAC,KAAK,EAAE;iBACtB,CAAC,CAAC,IAAI,CAAC,CAAC,CAAC;iBACT,MAAM,CAAC,CAAC,OAAO,EAAE,OAAO,CAAC,CAAC;iBAC1B,EAAE,CAAC,OAAO,EAAE,IAAI,CAAC,WAAW,CAAC,CAAC;YACnC,IAAI,CAAC,QAAQ,CAAC,MAAM,CAAC,GAAG,CAAC;iBACpB,IAAI,CAAC,OAAO,EAAE,SAAS,CAAC;iBACxB,IAAI,CAAC,IAAI,CAAC,KAAK,CAAC;iBAChB,SAAS,CAAC,MAAM,CAAC;iBACjB,IAAI,CAAC,GAAG,EAAE,CAAC,CAAC;iBACZ,IAAI,CAAC,QAAQ,EAAE,UAAU,GAAG,CAAC,CAAC;iBAC9B,KAAK,CAAC,MAAM,EAAC,MAAM,CAAC;iBACpB,KAAK,CAAC,cAAc,EAAC,KAAK,CAAC;iBAC3B,KAAK,CAAC,QAAQ,EAAC,eAAe,CAAC;iBAC/B,KAAK,CAAC,cAAc,EAAC,CAAC,CAAC,CAAC;YAG7B,IAAI,CAAC,QAAQ,CAAC,SAAS,CAAC,iBAAiB,CAAC,CAAC,MAAM,EAAE,CAAC;YACpD,IAAI,CAAC,WAAW,EAAE,CAAC;YAGnB,EAAE,CAAC,CAAC,IAAI,CAAC,KAAK,CAAC,CAAC,CAAC;gBACb,IAAI,UAAoB,CAAC;gBACzB,EAAE,CAAC,CAAC,IAAI,CAAC,KAAK,CAAC;oBAAC,UAAU,GAAG,IAAI,CAAC,KAAK,CAAC,aAAa,EAAE,CAAC;gBACxD,IAAI,IAAI,GAAG,GAAG,CAAC,MAAM,CAAC,MAAM,CAAC;qBACxB,IAAI,CAAC,IAAI,CAAC,KAAK,CAAC;qBAChB,IAAI,CAAC,GAAG,EAAE,CAAC,CAAC,CAAC,QAAQ,EAAE,GAAG,CAAC,CAAC,CAAC;qBAC7B,IAAI,CAAC,GAAG,EAAE,CAAC,CAAC,MA
AM,CAAC,GAAG,GAAG,EAAE,CAAC,GAAG,CAAC,CAAC,CAAC;qBAClC,IAAI,CAAC,aAAa,EAAE,QAAQ,CAAC,CAAC;gBAEnC,EAAE,CAAC,CAAC,UAAU,CAAC,CAAC,CAAC;oBACb,EAAE,CAAC,CAAC,UAAU,CAAC,OAAO,EAAE,CAAC;wBAAC,IAAI,CAAC,IAAI,CAAC,aAAa,EAAE,UAAU,CAAC,OAAO,CAAC,CAAC;oBACvE,EAAE,CAAC,CAAC,UAAU,CAAC,WAAW,EAAE,IAAI,IAAI,CAAC;wBAAC,IAAI,CAAC,IAAI,CAAC,WAAW,EAAE,UAAU,CAAC,WAAW,EAAE,GAAG,IAAI,CAAC,CAAC;oBAC9F,EAAE,CAAC,CAAC,UAAU,CAAC,YAAY,EAAE,IAAI,IAAI,CAAC;wBAAC,IAAI,CAAC,KAAK,CAAC,iBAAiB,EAAE,WAAW,CAAC,CAAC;oBAClF,EAAE,CAAC,CAAC,UAAU,CAAC,QAAQ,EAAE,CAAC;wBAAC,IAAI,CAAC,KAAK,CAAC,MAAM,EAAE,UAAU,CAAC,QAAQ,CAAC,CAAC;oBACnE,IAAI;wBAAC,IAAI,CAAC,KAAK,CAAC,MAAM,EAAE,cAAc,CAAC,mBAAmB,CAAC,CAAC;gBAChE,CAAC;gBAAC,IAAI,CAAC,CAAC;oBACJ,IAAI,CAAC,KAAK,CAAC,iBAAiB,EAAE,WAAW,CAAC,CAAC;oBAC3C,IAAI,CAAC,KAAK,CAAC,MAAM,EAAE,cAAc,CAAC,mBAAmB,CAAC,CAAC;gBAC3D,CAAC;YACL,CAAC;QACL,CAAC,CAAC;QAGF,gBAAW,GAAG;YACV,IAAI,QAAQ,GAAO,IAAI,CAAC;YAExB,IAAI,MAAM,GAAY,IAAI,CAAC,KAAK,CAAC,MAAM,EAAE,CAAC;YAC1C,IAAI,SAAS,GAAU,MAAM,CAAC,CAAC,CAAC,CAAC;YACjC,IAAI,SAAS,GAAU,MAAM,CAAC,CAAC,CAAC,CAAC;YAEjC,IAAI,YAAY,GAAO,IAAI,CAAC,QAAQ,CAAC,MAAM,CAAC,UAAU,CAAC;gBACnD,MAAM,CAAC,CAAC,CAAC,KAAK,GAAG,SAAS,IAAI,CAAC,CAAC,GAAG,GAAG,SAAS,CAAA;YACnD,CAAC,CAAC,CAAC;YAEH,IAAI,CAAC,QAAQ,CAAC,MAAM,CAAC,QAAQ,CAAC,CAAC,IAAI,CAAC,IAAI,CAAC,KAAK,CAAC,MAAM,CAAC,CAAC,SAAS,EAAE,SAAS,CAAC,CAAC,CAAC,CAAC;YAE/E,IAAI,CAAC,EAAE,CAAC,MAAM,CAAC,CAAC,SAAS,EAAE,SAAS,CAAC,CAAC,CAAC;YAGvC,IAAI,KAAK,GAAG,SAAS,GAAG,SAAS,CAAC;YAClC,EAAE,CAAC,CAAC,KAAK,GAAG,CAAC,GAAG,aAAa,CAAC,iBAAiB,CAAC,CAAC,CAAC;gBAC9C,IAAI,CAAC,SAAS,CAAC,KAAK,CAAC,EAAE,CAAC,IAAI,CAAC,OAAO,EAAE,CAAC,CAAC,CAAC,UAAU,CAAC,EAAE,CAAC,IAAI,CAAC,MAAM,CAAC,OAAO,CAAC,CAAC,CAAC;YACjF,CAAC;YAAC,IAAI,CAAC,EAAE,CAAC,CAAC,KAAK,GAAG,CAAC,GAAG,aAAa,CAAC,gBAAgB,CAAC,CAAC,CAAC;gBACpD,IAAI,CAAC,SAAS,CAAC,KAAK,CAAC,EAAE,CAAC,IAAI,CAAC,IAAI,EAAE,CAAC,CAAC,CAAC,UAAU,CAAC,EAAE,CAAC,IAAI,CAAC,MAAM,CAAC,OAAO,CAAC,CAAC,CAAC;YAC9E,CAAC;YAAC,IAAI,CAAC,EAAE,CAAC,CAAC,KAAK,GAAG,CAAC,GAAG,aAAa,CAAC,iBAAiB,CAAC,CAAC,CAAC;gBACrD,IAA
I,CAAC,SAAS,CAAC,KAAK,CAAC,EAAE,CAAC,IAAI,CAAC,KAAK,EAAE,CAAC,CAAC,CAAC,UAAU,CAAC,EAAE,CAAC,IAAI,CAAC,MAAM,CAAC,OAAO,CAAC,CAAC,CAAC;YAC/E,CAAC;YAAC,IAAI,CAAC,EAAE,CAAC,CAAC,KAAK,GAAG,CAAC,GAAG,aAAa,CAAC,mBAAmB,CAAC,CAAC,CAAC;gBACvD,IAAI,CAAC,SAAS,CAAC,KAAK,CAAC,EAAE,CAAC,IAAI,CAAC,OAAO,EAAE,CAAC,CAAC,CAAC,UAAU,CAAC,EAAE,CAAC,IAAI,CAAC,MAAM,CAAC,OAAO,CAAC,CAAC,CAAC;YACjF,CAAC;YAAC,IAAI,CAAC,EAAE,CAAC,CAAC,KAAK,IAAI,KAAK,CAAC,CAAC,CAAC;gBACxB,IAAI,CAAC,SAAS,CAAC,KAAK,CAAC,EAAE,CAAC,IAAI,CAAC,OAAO,EAAE,EAAE,CAAC,CAAC,UAAU,CAAC,EAAE,CAAC,IAAI,CAAC,MAAM,CAAC,UAAU,CAAC,CAAC,CAAC;YACrF,CAAC;YAAC,IAAI,CAAC,CAAC;gBACJ,IAAI,CAAC,SAAS,CAAC,KAAK,CAAC,EAAE,CAAC,IAAI,CAAC,OAAO,EAAE,CAAC,CAAC,CAAC,UAAU,CAAC,EAAE,CAAC,IAAI,CAAC,MAAM,CAAC,UAAU,CAAC,CAAC,CAAC;YACpF,CAAC;YAGD,IAAI,CAAC,QAAQ,CAAC,MAAM,CAAC,WAAW,CAAC,CAAC,IAAI,CAAC,IAAI,CAAC,SAAS,CAAC,CAAC;YAGvD,IAAI,KAAK,GAAO,IAAI,CAAC,SAAS,CAAC,SAAS,CAAC,MAAM,CAAC;iBACvC,IAAI,CAAC,YAAY,EAAE,UAAU,CAAC,IAAI,MAAM,CAAC,CAAC,CAAC,EAAE,CAAC,CAAC,CAAC,CAAC;iBACjD,IAAI,CAAC,GAAG,EAAE,UAAU,CAAC,IAAI,MAAM,CAAC,QAAQ,CAAC,EAAE,CAAC,CAAC,CAAC,KAAK,CAAC,CAAC,CAAC,CAAC,CAAC;iBACxD,IAAI,CAAC,OAAO,EAAE,UAAU,CAAC,IAAI,MAAM,CAAC,QAAQ,CAAC,EAAE,CAAC,CAAC,CAAC,GAAG,CAAC,GAAG,QAAQ,CAAC,EAAE,CAAC,CAAC,CAAC,KAAK,CAAC,CAAC,CAAC,CAAC,CAAC,CAAC;YAG3F,KAAK,CAAC,KAAK,EAAE,CAAC,MAAM,CAAC,MAAM,CAAC;iBACvB,IAAI,CAAC,GAAG,EAAE,UAAU,CAAC,IAAI,MAAM,CAAC,QAAQ,CAAC,EAAE,CAAC,CAAC,CAAC,KAAK,CAAC,CAAC,CAAC,CAAC,CAAC;iBACxD,IAAI,CAAC,GAAG,EAAE,UAAU,CAAC,IAAI,MAAM,CAAC,QAAQ,CAAC,EAAE,CAAC,CAAC,CAAC,IAAI,CAAC,GAAG,aAAa,CAAC,iCAAiC,GAAG,QAAQ,CAAC,EAAE,CAAC,CAAC,CAAC,GAAG,GAAG,CAAC,CAAC,CAAC,CAAC;iBAChI,IAAI,CAAC,OAAO,EAAE,UAAU,CAAC,IAAI,MAAM,CAAC,QAAQ,CAAC,EAAE,CAAC,CAAC,CAAC,GAAG,CAAC,GAAG,QAAQ,CAAC,EAAE,CAAC,CAAC,CAAC,KAAK,CAAC,CAAC,CAAC,CAAC,CAAC;iBACjF,IAAI,CAAC,QAAQ,EAAE,UAAU,CAAC,IAAI,MAAM,CAAC,aAAa,CAAC,gCAAgC,GAAG,QAAQ,CAAC,EAAE,CAAC,CAAC,CAAC,CAAC,CAAC,CAAC,CAAC;iBACxG,IAAI,CAAC,QAAQ,EAAE,OAAO,CAAC;iBACvB,IAAI,CAAC,MAAM,EAAE,UAAS,CAAC;gBACpB,EAAE,CAAA,CAAC,CAAC,CAAC,K
AAK,CAAC;oBAAC,MAAM,CAAC,CAAC,CAAC,KAAK,CAAC;gBAC3B,MAAM,CAAC,aAAa,CAAC,aAAa,CAAC;YACvC,CAAC,CAAC;iBACD,IAAI,CAAC,cAAc,EAAE,CAAC,CAAC,CAAC;YAC7B,KAAK,CAAC,IAAI,EAAE,CAAC,MAAM,EAAE,CAAC;YAGtB,IAAI,MAAM,GAAO,IAAI,CAAC,SAAS,CAAC,SAAS,CAAC,MAAM,CAAC;iBAC5C,IAAI,CAAC,YAAY,EAAE,UAAU,CAAC;gBAC3B,MAAM,CAAC,CAAC,CAAC,EAAE,CAAC;YAChB,CAAC,CAAC;iBACD,IAAI,CAAC,GAAG,EAAE,UAAU,CAAC;gBAClB,MAAM,CAAC,QAAQ,CAAC,EAAE,CAAC,IAAI,CAAC,GAAG,CAAC,CAAC,CAAC,KAAK,EAAE,SAAS,CAAC,CAAC,GAAG,CAAC,CAAC;YACzD,CAAC,CAAC;iBACD,IAAI,CAAC,MAAM,EAAE,OAAO,CAAC,CAAC;YAE3B,MAAM,CAAC,KAAK,EAAE,CAAC,MAAM,CAAC,MAAM,CAAC;iBACxB,IAAI,CAAC,UAAU,CAAC;gBACb,EAAE,CAAA,CAAC,QAAQ,CAAC,EAAE,CAAC,CAAC,CAAC,GAAG,CAAC,GAAG,QAAQ,CAAC,EAAE,CAAC,CAAC,CAAC,KAAK,CAAC,IAAI,EAAE,CAAC;oBAAC,MAAM,CAAC,EAAE,CAAC;gBAC9D,EAAE,CAAA,CAAC,CAAC,CAAC,KAAK,CAAC;oBAAC,MAAM,CAAC,CAAC,CAAC,KAAK,CAAC;gBAC3B,MAAM,CAAC,EAAE,CAAC;YACd,CAAC,CAAC;iBACD,IAAI,CAAC,GAAG,EAAE,UAAU,CAAC;gBAClB,MAAM,CAAC,QAAQ,CAAC,EAAE,CAAC,IAAI,CAAC,GAAG,CAAC,CAAC,CAAC,KAAK,EAAE,SAAS,CAAC,CAAC,GAAG,CAAC,CAAC;YACzD,CAAC,CAAC;iBACD,IAAI,CAAC,GAAG,EAAE,UAAU,CAAC;gBAClB,MAAM,CAAC,QAAQ,CAAC,EAAE,CAAC,CAAC,CAAC,IAAI,CAAC,GAAG,EAAE,GAAG,QAAQ,CAAC,EAAE,CAAC,CAAC,CAAC,GAAG,GAAG,CAAC;YAC3D,CAAC,CAAC;iBACD,IAAI,CAAC,aAAa,EAAE,OAAO,CAAC;iBAC5B,IAAI,CAAC,OAAO,EAAE,WAAW,CAAC;iBAC1B,IAAI,CAAC,MAAM,EAAE,OAAO,CAAC,CAAC;YAE3B,MAAM,CAAC,IAAI,EAAE,CAAC,MAAM,EAAE,CAAC;QAC3B,CAAC,CAAC;QAEF,cAAS,GAAG;YACR,IAAI,MAAM,GAAO,EAAE,CAAC,KAAK,CAAC,IAAI,CAAC,IAAI,CAAC,CAAC,CAAC,CAAC,CAAC;YACxC,IAAI,IAAI,GAAQ,IAAI,CAAC,CAAC,CAAC,MAAM,CAAC,MAAM,CAAC,CAAC,CAAC,CAAC,CAAC,OAAO,EAAE,CAAC;YACnD,IAAI,UAAU,GAAW,CAAC,IAAI,CAAC,KAAK,CAAC,MAAM,EAAE,CAAC,CAAC,CAAC,CAAC,OAAO,EAAE,GAAG,IAAI,CAAC,KAAK,CAAC,MAAM,EAAE,CAAC,CAAC,CAAC,CAAC,OAAO,EAAE,CAAC,GAAG,CAAC,CAAC;YAEnG,IAAI,CAAC,KAAK,CAAC,MAAM,CAAC,CAAC,IAAI,IAAI,CAAC,IAAI,GAAG,UAAU,CAAC,EAAE,IAAI,IAAI,CAAC,IAAI,GAAG,UAAU,CAAC,CAAC,CAAC,CAAC;YAC9E,IAAI,CAAC,WAAW,EAAE,CAAC;QACvB,CAAC,CAAC;QAEF,qBAAgB,GAAG,UAAC,KAAS;YACzB,IAAI,KAAK,GAAG,EAAE,EAAE,CAAC,EAAE,MAAM,G
AAG,EAAE,GAAG,IAAI,CAAC,EAAE,CAAC,CAAC,CAAC,GAAG,GAAG,EAAE,MAAM,GAAG,EAAE,CAAC;YAC/D,GAAG,CAAC,CAAC,IAAI,CAAC,GAAG,CAAC,EAAE,CAAC,GAAG,KAAK,CAAC,MAAM,EAAE,CAAC,EAAE,EAAE,CAAC;gBACpC,CAAC,GAAG,KAAK,CAAC,CAAC,CAAC,CAAC;gBACb,EAAE,CAAC,CAAC,CAAC,KAAK,CAAC,CAAC,CAAC,KAAK,CAAC,CAAC;oBAAC,KAAK,CAAC,CAAC,CAAC,KAAK,CAAC,GAAG,EAAE,CAAC;gBACzC,KAAK,CAAC,CAAC,CAAC,KAAK,CAAC,IAAI,CAAC,GAAG,EAAE,IAAI,CAAC,CAAC,CAAC,CAAC,CAAC,KAAK,CAAC,EAAE,CAAC,IAAI,CAAC,EAAE,CAAC,CAAC,CAAC,IAAI,CAAC,GAAG,MAAM,CAAC,EAAE,GAAG,EAAE,IAAI,CAAC,CAAC,CAAC,CAAC,CAAC,GAAG,CAAC,CAAC,CAAC,IAAI,CAAC,GAAG,CAAC,CAAC;YACvG,CAAC;YAED,GAAG,CAAC,CAAC,IAAI,SAAS,IAAI,KAAK,CAAC,CAAC,CAAC;gBAC1B,MAAM,CAAC,IAAI,CAAC,EAAC,KAAK,EAAE,SAAS,EAAE,IAAI,EAAE,KAAK,CAAC,SAAS,CAAC,EAAC,CAAC,CAAC;YAC5D,CAAC;YACD,MAAM,CAAC,MAAM,CAAC;QAClB,CAAC,CAAA;QA3UG,IAAI,IAAI,GAAG,IAAI,CAAC,KAAK,CAAC,OAAO,CAAC,CAAC;QAC/B,EAAE,CAAC,CAAC,CAAC,IAAI,CAAC,eAAe,CAAC,CAAC;YAAC,IAAI,GAAG,IAAI,CAAC,aAAa,CAAC,aAAa,CAAC,aAAa,CAAC,CAAC,CAAC;QAEpF,IAAI,CAAC,SAAS,GAAG,IAAI,CAAC,WAAW,CAAC,CAAC;QACnC,IAAI,CAAC,QAAQ,GAAG,IAAI,CAAC,UAAU,CAAC,CAAC;IACrC,CAAC;IApBc,iCAAmB,GAAG,EAAE,CAAC;IACzB,+CAAiC,GAAU,IAAI,CAAC;IAChD,8CAAgC,GAAU,IAAI,CAAC;IAE/C,iCAAmB,GAAU,EAAE,GAAG,IAAI,CAAC;IACvC,+BAAiB,GAAU,EAAE,GAAG,aAAa,CAAC,mBAAmB,CAAC;IAClE,8BAAgB,GAAU,EAAE,GAAG,aAAa,CAAC,iBAAiB,CAAC;IAC/D,+BAAiB,GAAU,CAAC,GAAG,aAAa,CAAC,gBAAgB,CAAC;IAE9D,2BAAa,GAAG,WAAW,CAAC;IAkV/C,oBAAC;AAAD,CAAC,AA/WD,CAA4B,KAAK,GA+WhC;ACpXD;IAAyB,8BAAK;IAQ1B,oBAAa,OAAY;QAR7B,iBAgCC;QAvBO,kBAAM,OAAO,CAAC,YAAY,CAAC,CAAC,CAAC;QAYjC,mBAAc,GAAG,cAAM,OAAA,KAAI,CAAC,WAAW,EAAhB,CAAgB,CAAC;QACxC,iBAAY,GAAG,cAAM,OAAA,KAAI,CAAC,SAAS,EAAd,CAAc,CAAC;QACpC,oBAAe,GAAG,cAAM,OAAA,KAAI,CAAC,YAAY,EAAjB,CAAiB,CAAC;QAE1C,mBAAc,GAAG,UAAC,GAAW;YACzB,EAAE,CAAA,CAAC,CAAC,IAAI,CAAC,YAAY,IAAI,GAAG,GAAG,CAAC,IAAI,GAAG,GAAG,IAAI,CAAC,YAAY,CAAC,MAAM,CAAC;gBAAC,MAAM,CAAC,IAAI,CAAC;YAChF,MAAM,CAAC,KAAI,CAAC,YAAY,CAAC,GAAG,CAAC,CAAC;QAClC,CAAC,CAAC;QAEF,uBAAkB,GAAG,cAAM,OAAA,KAAI,CAAC,eAAe,EAApB,CAAoB,CAAC;QAChD,kBAAa,GAAG,cAAM,O
AAA,KAAI,CAAC,UAAU,EAAf,CAAe,CAAC;QApBlC,IAAI,KAAK,GAAQ,OAAO,CAAC,YAAY,CAAC,CAAC;QAEvC,EAAE,CAAA,CAAC,KAAK,CAAC,CAAA,CAAC;YACN,IAAI,CAAC,WAAW,GAAG,KAAK,CAAC,aAAa,CAAC,CAAC;YACxC,IAAI,CAAC,SAAS,GAAG,KAAK,CAAC,WAAW,CAAC,CAAC;YACpC,IAAI,CAAC,YAAY,GAAG,KAAK,CAAC,cAAc,CAAC,CAAC;YAC1C,EAAE,CAAA,CAAC,KAAK,CAAC,YAAY,CAAC,CAAC;gBAAC,IAAI,CAAC,UAAU,GAAG,IAAI,SAAS,CAAC,KAAK,CAAC,YAAY,CAAC,CAAC,CAAC;QACjF,CAAC;IACL,CAAC;IAaL,iBAAC;AAAD,CAAC,AAhCD,CAAyB,KAAK,GAgC7B;AC/BD;IAA2B,gCAAS;IAKhC,sBAAY,OAAe;QACvB,kBAAM,aAAa,CAAC,YAAY,CAAC,CAAC;QAoBtC,WAAM,GAAG,UAAC,cAAsB;YAE5B,IAAI,MAAM,GAAW,CAAC,CAAC,aAAa,CAAC,CAAC;YACtC,MAAM,CAAC,QAAQ,EAAE,CAAC;YAElB,EAAE,CAAA,CAAC,IAAI,CAAC,KAAK,CAAC,CAAA,CAAC;gBAEX,EAAE,CAAA,CAAC,IAAI,CAAC,KAAK,CAAC,QAAQ,EAAE,CAAC,CAAA,CAAC;oBACtB,IAAI,IAAI,GAAW,IAAI,CAAC,KAAK,CAAC,YAAY,EAAE,CAAC;oBAC7C,MAAM,CAAC,KAAK,CAAC,IAAI,CAAC,KAAK,CAAC,QAAQ,EAAE,GAAG,CAAC,IAAI,GAAG,IAAI,GAAG,EAAE,CAAC,CAAC,CAAC;gBAC7D,CAAC;gBACD,EAAE,CAAA,CAAC,IAAI,CAAC,KAAK,CAAC,SAAS,EAAE,CAAC,CAAA,CAAC;oBACvB,IAAI,IAAI,GAAW,IAAI,CAAC,KAAK,CAAC,aAAa,EAAE,CAAC;oBAC9C,MAAM,CAAC,MAAM,CAAC,IAAI,CAAC,KAAK,CAAC,SAAS,EAAE,GAAG,CAAC,IAAI,GAAG,IAAI,GAAG,EAAE,CAAC,CAAC,CAAC;gBAC/D,CAAC;gBACD,EAAE,CAAA,CAAC,IAAI,CAAC,KAAK,CAAC,kBAAkB,EAAE,CAAC;oBAAC,MAAM,CAAC,GAAG,CAAC,kBAAkB,EAAC,IAAI,CAAC,KAAK,CAAC,kBAAkB,EAAE,CAAC,CAAC;gBACnG,EAAE,CAAA,CAAC,IAAI,CAAC,KAAK,CAAC,aAAa,EAAE,CAAC;oBAAC,MAAM,CAAC,GAAG,CAAC,OAAO,EAAE,IAAI,CAAC,KAAK,CAAC,aAAa,EAAE,CAAC,CAAC;YACnF,CAAC;YAGD,cAAc,CAAC,MAAM,CAAC,MAAM,CAAC,CAAC;YAG9B,EAAE,CAAA,CAAC,IAAI,CAAC,UAAU,CAAC,CAAA,CAAC;gBAChB,GAAG,CAAA,CAAE,IAAI,CAAC,GAAC,CAAC,EAAE,CAAC,GAAC,IAAI,CAAC,UAAU,CAAC,MAAM,EAAE,CAAC,EAAE,EAAE,CAAC;oBAC1C,IAAI,CAAC,UAAU,CAAC,CAAC,CAAC,CAAC,MAAM,CAAC,MAAM,CAAC,CAAC;gBACtC,CAAC;YACL,CAAC;QACL,CAAC,CAAA;QA9CG,IAAI,IAAI,GAAG,IAAI,CAAC,KAAK,CAAC,OAAO,CAAC,CAAC;QAC/B,EAAE,CAAA,CAAC,CAAC,IAAI,CAAC,eAAe,CAAC,CAAC;YAAC,IAAI,GAAG,IAAI,CAAC,aAAa,CAAC,aAAa,CAAC,YAAY,CAAC,CAAC,CAAC;QAElF,IAAI,UAAU,GAAU,IAAI,CAAC,YAAY,CAAC,CAAC;QAE3C,EAAE,CAAA,CAAC,UAAU,CAA
C,CAAA,CAAC;YACX,IAAI,CAAC,UAAU,GAAG,EAAE,CAAC;YACrB,GAAG,CAAA,CAAE,IAAI,CAAC,GAAC,CAAC,EAAE,CAAC,GAAC,UAAU,CAAC,MAAM,EAAE,CAAC,EAAE,EAAE,CAAC;gBACrC,IAAI,KAAK,GAAW,IAAI,CAAC,SAAS,CAAC,UAAU,CAAC,CAAC,CAAC,CAAC,CAAC;gBAClD,IAAI,CAAC,UAAU,CAAC,IAAI,CAAC,SAAS,CAAC,YAAY,CAAC,KAAK,CAAC,CAAC,CAAC;YACxD,CAAC;QACL,CAAC;QAED,EAAE,CAAA,CAAC,IAAI,CAAC,OAAO,CAAC,CAAC;YAAC,IAAI,CAAC,KAAK,GAAG,IAAI,QAAQ,CAAC,IAAI,CAAC,OAAO,CAAC,CAAC,CAAC;IAG/D,CAAC;IAgCL,mBAAC;AAAD,CAAC,AAxDD,CAA2B,SAAS,GAwDnC;ACxDD;IAAuB,4BAAK;IAIxB,kBAAa,OAAY;QAJ7B,iBAcC;QATO,kBAAM,OAAO,CAAC,UAAU,CAAC,CAAC,CAAC;QAM/B,kBAAa,GAAG,cAAM,OAAA,KAAI,CAAC,UAAU,EAAf,CAAe,CAAC;QAJlC,EAAE,CAAA,CAAC,OAAO,IAAI,OAAO,CAAC,UAAU,CAAC,CAAC;YAAC,IAAI,CAAC,UAAU,GAAG,OAAO,CAAC,UAAU,CAAC,CAAC,YAAY,CAAC,CAAC;IAE3F,CAAC;IAKL,eAAC;AAAD,CAAC,AAdD,CAAuB,KAAK,GAc3B;ACRD;IAAiC,sCAAS;IAOtC,4BAAY,OAAe;QACvB,kBAAM,aAAa,CAAC,kBAAkB,CAAC,CAAC;QAqB5C,WAAM,GAAG,UAAC,cAAsB;YAE5B,IAAI,CAAC,GAAkB,IAAI,CAAC,KAAK,CAAC;YAElC,IAAI,QAAQ,GAAW,CAAC,CAAC,aAAa,CAAC,CAAC;YACxC,QAAQ,CAAC,QAAQ,EAAE,CAAC;YAEpB,IAAI,QAAgB,CAAC;YACrB,EAAE,CAAA,CAAC,IAAI,CAAC,KAAK,CAAC;gBAAC,QAAQ,GAAG,CAAC,CAAC,OAAO,GAAG,IAAI,CAAC,KAAK,GAAG,QAAQ,CAAC,CAAC;YAC7D,IAAI;gBAAC,QAAQ,GAAG,CAAC,CAAC,aAAa,CAAC,CAAC;YACjC,QAAQ,CAAC,QAAQ,EAAE,CAAC;YACpB,QAAQ,CAAC,MAAM,CAAC,QAAQ,CAAC,CAAC;YAE1B,IAAI,QAAQ,GAAW,CAAC,CAAC,aAAa,CAAC,CAAC;YACxC,QAAQ,CAAC,QAAQ,EAAE,CAAC;YACpB,QAAQ,CAAC,MAAM,CAAC,QAAQ,CAAC,CAAC;YAG1B,EAAE,CAAC,CAAC,IAAI,CAAC,eAAe,CAAC,CAAC,CAAC;gBACvB,GAAG,CAAC,CAAC,IAAI,CAAC,GAAG,CAAC,EAAE,CAAC,GAAG,IAAI,CAAC,eAAe,CAAC,MAAM,EAAE,CAAC,EAAE,EAAE,CAAC;oBAEnD,IAAI,CAAC,eAAe,CAAC,CAAC,CAAC,CAAC,MAAM,CAAC,QAAQ,CAAC,CAAC;gBAC7C,CAAC;YACL,CAAC;YAED,cAAc,CAAC,MAAM,CAAC,QAAQ,CAAC,CAAC;YAEhC,EAAE,CAAA,CAAC,IAAI,CAAC,gBAAgB,CAAC;gBAAC,QAAQ,CAAC,SAAS,CAAC,EAAC,WAAW,EAAE,IAAI,EAAE,WAAW,EAAE,SAAS,EAAE,MAAM,EAAE,KAAK,EAAC,CAAC,CAAC;YACzG,IAAI;gBAAC,QAAQ,CAAC,SAAS,CAAC,EAAC,WAAW,EAAE,IAAI,EAAE,WAAW,EAAE,SAAS,EAAC,CAAC,CAAC;QASzE,CAAC,CAAA;QAxDG,IAAI,IAAI,GAAG,IAAI,CAAC,KAAK,CAAC,OAAO,CAAC,CAAC;QAC/B
,EAAE,CAAA,CAAC,CAAC,IAAI,CAAC,eAAe,CAAC,CAAC;YAAC,IAAI,GAAG,IAAI,CAAC,aAAa,CAAC,aAAa,CAAC,kBAAkB,CAAC,CAAC,CAAC;QAExF,IAAI,CAAC,KAAK,GAAG,IAAI,CAAC,OAAO,CAAC,CAAC;QAC3B,IAAI,CAAC,gBAAgB,GAAG,IAAI,CAAC,kBAAkB,CAAC,CAAC;QAEjD,IAAI,OAAO,GAAU,IAAI,CAAC,iBAAiB,CAAC,CAAC;QAE7C,EAAE,CAAA,CAAC,OAAO,CAAC,CAAA,CAAC;YACR,IAAI,CAAC,eAAe,GAAG,EAAE,CAAC;YAC1B,GAAG,CAAA,CAAE,IAAI,CAAC,GAAC,CAAC,EAAE,CAAC,GAAC,OAAO,CAAC,MAAM,EAAE,CAAC,EAAE,EAAE,CAAC;gBAClC,IAAI,KAAK,GAAW,IAAI,CAAC,SAAS,CAAC,OAAO,CAAC,CAAC,CAAC,CAAC,CAAC;gBAC/C,IAAI,CAAC,eAAe,CAAC,IAAI,CAAC,SAAS,CAAC,YAAY,CAAC,KAAK,CAAC,CAAC,CAAC;YAC7D,CAAC;QACL,CAAC;QAED,EAAE,CAAA,CAAC,IAAI,CAAC,OAAO,CAAC,CAAC;YAAC,IAAI,CAAC,KAAK,GAAG,IAAI,cAAc,CAAC,IAAI,CAAC,OAAO,CAAC,CAAC,CAAC;IACrE,CAAC;IA0CL,yBAAC;AAAD,CAAC,AArED,CAAiC,SAAS,GAqEzC;AC3ED;IAA6B,kCAAK;IAE9B,wBAAa,OAAY;QACrB,kBAAM,OAAO,CAAC,gBAAgB,CAAC,CAAC,CAAC;IAGrC,CAAC;IAEL,qBAAC;AAAD,CAAC,AARD,CAA6B,KAAK,GAQjC;ACLD;IAA6B,kCAAS;IAOlC,wBAAY,OAAe;QACvB,kBAAM,aAAa,CAAC,cAAc,CAAC,CAAC;QAUxC,WAAM,GAAG,UAAC,cAAsB;YAE5B,IAAI,CAAC,GAAe,IAAI,CAAC,KAAK,CAAC;YAC/B,IAAI,MAAM,GAAW,KAAK,CAAC,UAAU,CAAC,CAAC,CAAC,CAAC;YAEzC,IAAI,GAAG,GAAG,QAAQ,CAAC,aAAa,CAAC,OAAO,CAAC,CAAC;YAE1C,GAAG,CAAC,KAAK,CAAC,KAAK,GAAG,MAAM,CAAC;YACzB,EAAE,CAAA,CAAC,CAAC,IAAI,CAAC,CAAC,gBAAgB,EAAE,IAAI,IAAK,CAAC;gBAAC,GAAG,CAAC,YAAY,CAAC,QAAQ,EAAE,MAAM,CAAC,CAAC,CAAC,gBAAgB,EAAE,CAAC,CAAC,CAAC;YAChG,EAAE,CAAA,CAAC,CAAC,IAAI,CAAC,CAAC,kBAAkB,EAAE,CAAC;gBAAC,GAAG,CAAC,KAAK,CAAC,eAAe,GAAG,CAAC,CAAC,kBAAkB,EAAE,CAAC;YACnF,EAAE,CAAA,CAAC,CAAC,IAAI,CAAC,CAAC,iBAAiB,EAAE,CAAC;gBAAC,GAAG,CAAC,KAAK,CAAC,UAAU,GAAG,CAAC,CAAC,iBAAiB,EAAE,CAAC;YAE5E,EAAE,CAAC,CAAC,CAAC,IAAI,CAAC,CAAC,eAAe,EAAE,CAAC,CAAC,CAAC;gBAE3B,IAAI,SAAS,GAAa,CAAC,CAAC,eAAe,EAAE,CAAC;gBAC9C,IAAI,IAAI,GAAW,OAAO,CAAC,mBAAmB,CAAC,CAAC,CAAC,kBAAkB,EAAE,CAAC,CAAC;gBACvE,GAAG,CAAC,CAAC,IAAI,CAAC,GAAG,CAAC,EAAE,CAAC,GAAG,SAAS,CAAC,MAAM,EAAE,CAAC,EAAE,EAAE,CAAC;oBACxC,IAAI,GAAG,GAAG,QAAQ,CAAC,aAAa,CAAC,KAAK,CAAC,CAAC;oBACxC,GAAG,CAAC,YAAY,CAAC,OAAO,EAAE,SAAS,CAAC,CAAC,C
AAC,GAAG,IAAI,CAAC,CAAC;oBAC/C,GAAG,CAAC,WAAW,CAAC,GAAG,CAAC,CAAC;gBACzB,CAAC;YACL,CAAC;YAGD,IAAI,MAAM,GAAG,CAAC,CAAC;YACf,IAAI,QAAQ,GAAG,CAAC,CAAC;YACjB,IAAI,SAAS,GAAG,CAAC,CAAC;YAClB,IAAI,OAAO,GAAG,CAAC,CAAC;YAEhB,EAAE,CAAC,CAAC,IAAI,CAAC,MAAM,CAAC,CAAC,CAAC;gBACd,IAAI,OAAO,GAAG,QAAQ,CAAC,aAAa,CAAC,OAAO,CAAC,CAAC;gBAC9C,IAAI,SAAS,GAAG,QAAQ,CAAC,aAAa,CAAC,IAAI,CAAC,CAAC;gBAE7C,EAAE,CAAA,CAAC,CAAC,IAAI,CAAC,CAAC,cAAc,EAAE,CAAC;oBAAC,SAAS,CAAC,KAAK,CAAC,eAAe,GAAG,CAAC,CAAC,cAAc,EAAE,CAAC;gBAEjF,GAAG,CAAC,CAAC,IAAI,CAAC,GAAG,CAAC,EAAE,CAAC,GAAG,IAAI,CAAC,MAAM,CAAC,MAAM,EAAE,CAAC,EAAE,EAAE,CAAC;oBAC1C,IAAI,OAAO,GAAG,QAAQ,CAAC,aAAa,CAAC,IAAI,CAAC,CAAC;oBAC3C,OAAO,CAAC,KAAK,CAAC,OAAO,GAAG,MAAM,GAAG,KAAK,GAAG,QAAQ,GAAG,KAAK,GAAG,SAAS,GAAG,KAAK,GAAG,OAAO,GAAG,IAAI,CAAC;oBAC/F,OAAO,CAAC,WAAW,CAAC,QAAQ,CAAC,cAAc,CAAC,IAAI,CAAC,MAAM,CAAC,CAAC,CAAC,CAAC,CAAC,CAAC;oBAC7D,SAAS,CAAC,WAAW,CAAC,OAAO,CAAC,CAAC;gBACnC,CAAC;gBACD,GAAG,CAAC,WAAW,CAAC,SAAS,CAAC,CAAC;YAC/B,CAAC;YAGD,EAAE,CAAC,CAAC,IAAI,CAAC,OAAO,CAAC,CAAC,CAAC;gBAEf,IAAI,IAAI,GAAG,QAAQ,CAAC,aAAa,CAAC,OAAO,CAAC,CAAC;gBAC3C,GAAG,CAAC,CAAC,IAAI,CAAC,GAAG,CAAC,EAAE,CAAC,GAAG,IAAI,CAAC,OAAO,CAAC,MAAM,EAAE,CAAC,EAAE,EAAE,CAAC;oBAC3C,IAAI,EAAE,GAAG,QAAQ,CAAC,aAAa,CAAC,IAAI,CAAC,CAAC;oBAEtC,GAAG,CAAC,CAAC,IAAI,CAAC,GAAG,CAAC,EAAE,CAAC,GAAG,IAAI,CAAC,OAAO,CAAC,CAAC,CAAC,CAAC,MAAM,EAAE,CAAC,EAAE,EAAE,CAAC;wBAC9C,IAAI,EAAE,GAAG,QAAQ,CAAC,aAAa,CAAC,IAAI,CAAC,CAAC;wBACtC,EAAE,CAAC,KAAK,CAAC,OAAO,GAAG,MAAM,GAAG,KAAK,GAAG,QAAQ,GAAG,KAAK,GAAG,SAAS,GAAG,KAAK,GAAG,OAAO,GAAG,IAAI,CAAC;wBAC1F,EAAE,CAAC,WAAW,CAAC,QAAQ,CAAC,cAAc,CAAC,IAAI,CAAC,OAAO,CAAC,CAAC,CAAC,CAAC,CAAC,CAAC,CAAC,CAAC,CAAC;wBAC5D,EAAE,CAAC,WAAW,CAAC,EAAE,CAAC,CAAC;oBACvB,CAAC;oBAED,IAAI,CAAC,WAAW,CAAC,EAAE,CAAC,CAAC;gBACzB,CAAC;gBACD,GAAG,CAAC,WAAW,CAAC,IAAI,CAAC,CAAC;YAC1B,CAAC;YAED,cAAc,CAAC,MAAM,CAAC,GAAG,CAAC,CAAC;QAC/B,CAAC,CAAA;QAxEG,IAAI,IAAI,GAAG,IAAI,CAAC,KAAK,CAAC,OAAO,CAAC,CAAC;QAC/B,EAAE,CAAA,CAAC,CAAC,IAAI,CAAC,eAAe,CAAC,CAAC;YAAC,IAAI,GAAG,IAAI,CAAC,aAAa,CAAC
,aAAa,CAAC,cAAc,CAAC,CAAC,CAAC;QAEpF,IAAI,CAAC,MAAM,GAAG,IAAI,CAAC,QAAQ,CAAC,CAAC;QAC7B,IAAI,CAAC,OAAO,GAAG,IAAI,CAAC,SAAS,CAAC,CAAC;QAC/B,EAAE,CAAA,CAAC,IAAI,CAAC,OAAO,CAAC,CAAC;YAAC,IAAI,CAAC,KAAK,GAAG,IAAI,UAAU,CAAC,IAAI,CAAC,OAAO,CAAC,CAAC,CAAC;IACjE,CAAC;IAqEL,qBAAC;AAAD,CAAC,AArFD,CAA6B,SAAS,GAqFrC;ACzFD;IAAyB,8BAAK;IAQ1B,oBAAa,OAAY;QAR7B,iBA0BC;QAjBO,kBAAM,OAAO,CAAC,YAAY,CAAC,CAAC,CAAC;QAYjC,oBAAe,GAAG,cAAM,OAAA,KAAI,CAAC,YAAY,EAAjB,CAAiB,CAAC;QAC1C,uBAAkB,GAAG,cAAM,OAAA,KAAI,CAAC,eAAe,EAApB,CAAoB,CAAC;QAChD,qBAAgB,GAAG,cAAM,OAAA,KAAI,CAAC,aAAa,EAAlB,CAAkB,CAAC;QAC5C,mBAAc,GAAG,cAAM,OAAA,KAAI,CAAC,WAAW,EAAhB,CAAgB,CAAC;QACxC,sBAAiB,GAAG,cAAM,OAAA,KAAI,CAAC,cAAc,EAAnB,CAAmB,CAAC;QAd1C,IAAI,KAAK,GAAQ,OAAO,CAAC,YAAY,CAAC,CAAC;QACvC,EAAE,CAAA,CAAC,KAAK,CAAC,CAAA,CAAC;YACN,IAAI,CAAC,YAAY,GAAG,OAAO,CAAC,YAAY,CAAC,CAAC,cAAc,CAAC,CAAC;YAC1D,IAAI,CAAC,aAAa,GAAG,OAAO,CAAC,YAAY,CAAC,CAAC,eAAe,CAAC,CAAC;YAC5D,IAAI,CAAC,WAAW,GAAG,OAAO,CAAC,YAAY,CAAC,CAAC,aAAa,CAAC,CAAC;YACxD,IAAI,CAAC,eAAe,GAAG,OAAO,CAAC,YAAY,CAAC,CAAC,iBAAiB,CAAC,CAAC;YAChE,IAAI,CAAC,cAAc,GAAG,OAAO,CAAC,YAAY,CAAC,CAAC,gBAAgB,CAAC,CAAC;QAClE,CAAC;IACL,CAAC;IAOL,iBAAC;AAAD,CAAC,AA1BD,CAAyB,KAAK,GA0B7B;ACtBD;IAA4B,iCAAS;IAKjC,uBAAY,OAAe;QAL/B,iBAwCC;QAlCO,kBAAM,aAAa,CAAC,aAAa,CAAC,CAAC;QASvC,WAAM,GAAG,UAAC,cAAsB;YAE5B,IAAI,QAAQ,GAAS,QAAQ,CAAC,cAAc,CAAC,KAAI,CAAC,IAAI,CAAC,CAAC;YACxD,EAAE,CAAA,CAAC,KAAI,CAAC,KAAK,CAAC,CAAA,CAAC;gBACX,IAAI,OAAO,GAAoB,QAAQ,CAAC,aAAa,CAAC,MAAM,CAAC,CAAC;gBAC9D,EAAE,CAAA,CAAC,KAAI,CAAC,KAAK,CAAC,OAAO,EAAE,CAAC;oBAAC,OAAO,CAAC,KAAK,CAAC,IAAI,GAAG,KAAI,CAAC,KAAK,CAAC,OAAO,EAAE,CAAC;gBACnE,EAAE,CAAA,CAAC,KAAI,CAAC,KAAK,CAAC,WAAW,EAAE,IAAI,IAAI,CAAC;oBAAC,OAAO,CAAC,KAAK,CAAC,QAAQ,GAAG,KAAI,CAAC,KAAK,CAAC,WAAW,EAAE,GAAG,IAAI,CAAC;gBAC9F,EAAE,CAAA,CAAC,KAAI,CAAC,KAAK,CAAC,YAAY,EAAE,IAAI,IAAI,CAAC;oBAAC,OAAO,CAAC,KAAK,CAAC,cAAc,GAAC,WAAW,CAAC;gBAC/E,EAAE,CAAA,CAAC,KAAI,CAAC,KAAK,CAAC,QAAQ,EAAE,CAAC;oBAAC,OAAO,CAAC,KAAK,CAAC,KAAK,GAAG,KAAI,CAAC,KAAK,CAAC,QAAQ,EAAE,CAAC;gBACtE,EAAE,CAAA,
CAAC,KAAI,CAAC,KAAK,CAAC,YAAY,EAAE,CAAC;oBAAC,OAAO,CAAC,KAAK,CAAC,SAAS,GAAG,KAAI,CAAC,KAAK,CAAC,YAAY,EAAE,GAAG,IAAI,CAAC;gBACzF,EAAE,CAAA,CAAC,KAAI,CAAC,KAAK,CAAC,eAAe,EAAE,CAAC;oBAAC,OAAO,CAAC,KAAK,CAAC,YAAY,GAAG,KAAI,CAAC,KAAK,CAAC,eAAe,EAAE,GAAG,IAAI,CAAC;gBAClG,EAAE,CAAA,CAAC,KAAI,CAAC,KAAK,CAAC,aAAa,EAAE,CAAC;oBAAC,OAAO,CAAC,KAAK,CAAC,UAAU,GAAG,KAAI,CAAC,KAAK,CAAC,aAAa,EAAE,GAAG,IAAI,CAAC;gBAC5F,EAAE,CAAA,CAAC,KAAI,CAAC,KAAK,CAAC,cAAc,EAAE,CAAC;oBAAC,OAAO,CAAC,KAAK,CAAC,WAAW,GAAG,KAAI,CAAC,KAAK,CAAC,cAAc,EAAE,GAAG,IAAI,CAAC;gBAC/F,EAAE,CAAA,CAAC,KAAI,CAAC,KAAK,CAAC,gBAAgB,EAAE,CAAC;oBAAC,OAAO,CAAC,KAAK,CAAC,UAAU,GAAG,KAAK,CAAC;gBAEnE,OAAO,CAAC,WAAW,CAAC,QAAQ,CAAC,CAAC;gBAC9B,cAAc,CAAC,MAAM,CAAC,OAAO,CAAC,CAAC;YACnC,CAAC;YAAC,IAAI,CAAC,CAAC;gBACJ,IAAI,OAAO,GAAoB,QAAQ,CAAC,aAAa,CAAC,MAAM,CAAC,CAAC;gBAE9D,OAAO,CAAC,WAAW,CAAC,QAAQ,CAAC,CAAC;gBAC9B,cAAc,CAAC,MAAM,CAAC,OAAO,CAAC,CAAC;YACnC,CAAC;QACL,CAAC,CAAA;QA/BG,IAAI,IAAI,GAAG,IAAI,CAAC,KAAK,CAAC,OAAO,CAAC,CAAC;QAC/B,EAAE,CAAA,CAAC,CAAC,IAAI,CAAC,eAAe,CAAC,CAAC;YAAC,IAAI,GAAG,IAAI,CAAC,aAAa,CAAC,aAAa,CAAC,aAAa,CAAC,CAAC,CAAC;QAEnF,IAAI,CAAC,IAAI,GAAG,IAAI,CAAC,MAAM,CAAC,CAAC;QAEzB,EAAE,CAAA,CAAC,IAAI,CAAC,OAAO,CAAC,CAAC;YAAC,IAAI,CAAC,KAAK,GAAG,IAAI,SAAS,CAAC,IAAI,CAAC,OAAO,CAAC,CAAC,CAAC;IAChE,CAAC;IA2BL,oBAAC;AAAD,CAAC,AAxCD,CAA4B,SAAS,GAwCpC;AC5CD;IAAwB,6BAAK;IAQzB,mBAAa,OAAY;QAR7B,iBA0BC;QAjBO,kBAAM,OAAO,CAAC,WAAW,CAAC,CAAC,CAAC;QAYhC,YAAO,GAAG,cAAM,OAAA,KAAI,CAAC,IAAI,EAAT,CAAS,CAAC;QAC1B,gBAAW,GAAG,cAAM,OAAA,KAAI,CAAC,QAAQ,EAAb,CAAa,CAAC;QAClC,iBAAY,GAAG,cAAM,OAAA,KAAI,CAAC,SAAS,EAAd,CAAc,CAAC;QACpC,aAAQ,GAAG,cAAM,OAAA,KAAI,CAAC,KAAK,EAAV,CAAU,CAAC;QAC5B,qBAAgB,GAAG,cAAM,OAAA,KAAI,CAAC,aAAa,EAAlB,CAAkB,CAAC;QAdxC,IAAI,KAAK,GAAQ,OAAO,CAAC,WAAW,CAAC,CAAC;QACtC,EAAE,CAAA,CAAC,KAAK,CAAC,CAAA,CAAC;YACN,IAAI,CAAC,IAAI,GAAG,KAAK,CAAC,MAAM,CAAC,CAAC;YAC1B,IAAI,CAAC,QAAQ,GAAG,KAAK,CAAC,UAAU,CAAC,CAAC;YAClC,IAAI,CAAC,SAAS,GAAG,KAAK,CAAC,WAAW,CAAC,CAAC;YACpC,IAAI,CAAC,KAAK,GAAG,KAAK,CAAC,OAAO,CAAC,CAAC;YAC5B,IAAI,CAAC,
aAAa,GAAG,KAAK,CAAC,eAAe,CAAC,CAAC;QAChD,CAAC;IACL,CAAC;IAOL,gBAAC;AAAD,CAAC,AA1BD,CAAwB,KAAK,GA0B5B"} \ No newline at end of file diff --git a/contrib/attic/arbiter/arbiter-ui/src/main/resources/templates/ArbiterUI.html b/contrib/attic/arbiter/arbiter-ui/src/main/resources/templates/ArbiterUI.html deleted file mode 100644 index 00cdc7b60..000000000 --- a/contrib/attic/arbiter/arbiter-ui/src/main/resources/templates/ArbiterUI.html +++ /dev/null @@ -1,641 +0,0 @@ - - - - - - DL4J - Arbiter UI - - - - - - - - - - - - - - - - - - - - -
-
[stripped HTML template body: page header "Deeplearning4J - Arbiter UI" with panels titled "Summary", "Optimization Settings", "Results", and "Selected Result"]
-
- - \ No newline at end of file diff --git a/contrib/attic/arbiter/arbiter-ui/src/test/java/org/deeplearning4j/arbiter/optimize/AssertTestsExtendBaseClass.java b/contrib/attic/arbiter/arbiter-ui/src/test/java/org/deeplearning4j/arbiter/optimize/AssertTestsExtendBaseClass.java deleted file mode 100644 index e78a99079..000000000 --- a/contrib/attic/arbiter/arbiter-ui/src/test/java/org/deeplearning4j/arbiter/optimize/AssertTestsExtendBaseClass.java +++ /dev/null @@ -1,52 +0,0 @@ -/* - * ****************************************************************************** - * * Copyright (c) 2021 Deeplearning4j Contributors - * * - * * This program and the accompanying materials are made available under the - * * terms of the Apache License, Version 2.0 which is available at - * * https://www.apache.org/licenses/LICENSE-2.0. - * * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * * License for the specific language governing permissions and limitations - * * under the License. - * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ -package org.deeplearning4j.arbiter.optimize; - -import lombok.extern.slf4j.Slf4j; -import org.nd4j.common.tests.AbstractAssertTestsClass; -import org.deeplearning4j.BaseDL4JTest; - -import java.util.*; - -/** - * This class checks that all test classes (i.e., anything with one or more methods annotated with @Test) - * extends BaseDl4jTest - either directly or indirectly. 
- * Other than a small set of exceptions, all tests must extend this class - * - * @author Alex Black - */ - -@Slf4j -public class AssertTestsExtendBaseClass extends AbstractAssertTestsClass { - - @Override - protected Set<Class<?>> getExclusions() { - //Set of classes that are exclusions to the rule (either run manually or have their own logging + timeouts) - return new HashSet<>(); - } - - @Override - protected String getPackageName() { - return "org.deeplearning4j.arbiter.optimize"; - } - - @Override - protected Class<?> getBaseClass() { - return BaseDL4JTest.class; - } -} diff --git a/contrib/attic/arbiter/arbiter-ui/src/test/java/org/deeplearning4j/arbiter/optimize/TestBasic.java b/contrib/attic/arbiter/arbiter-ui/src/test/java/org/deeplearning4j/arbiter/optimize/TestBasic.java deleted file mode 100644 index 82b06d7f4..000000000 --- a/contrib/attic/arbiter/arbiter-ui/src/test/java/org/deeplearning4j/arbiter/optimize/TestBasic.java +++ /dev/null @@ -1,793 +0,0 @@ -/* - * ****************************************************************************** - * * Copyright (c) 2021 Deeplearning4j Contributors - * * - * * This program and the accompanying materials are made available under the - * * terms of the Apache License, Version 2.0 which is available at - * * https://www.apache.org/licenses/LICENSE-2.0. - * * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * * License for the specific language governing permissions and limitations - * * under the License. 
- * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ - -package org.deeplearning4j.arbiter.optimize; - -import io.netty.handler.codec.http.HttpResponseStatus; -import lombok.NonNull; -import lombok.extern.slf4j.Slf4j; -import org.deeplearning4j.BaseDL4JTest; -import org.deeplearning4j.core.storage.StatsStorage; -import org.deeplearning4j.arbiter.ComputationGraphSpace; -import org.deeplearning4j.arbiter.MultiLayerSpace; -import org.deeplearning4j.arbiter.conf.updater.SgdSpace; -import org.deeplearning4j.arbiter.layers.ConvolutionLayerSpace; -import org.deeplearning4j.arbiter.layers.DenseLayerSpace; -import org.deeplearning4j.arbiter.layers.OutputLayerSpace; -import org.deeplearning4j.arbiter.optimize.api.CandidateGenerator; -import org.deeplearning4j.arbiter.optimize.api.ParameterSpace; -import org.deeplearning4j.arbiter.optimize.api.data.DataProvider; -import org.deeplearning4j.arbiter.optimize.api.data.DataSource; -import org.deeplearning4j.arbiter.optimize.api.score.ScoreFunction; -import org.deeplearning4j.arbiter.optimize.api.termination.MaxCandidatesCondition; -import org.deeplearning4j.arbiter.optimize.api.termination.MaxTimeCondition; -import org.deeplearning4j.arbiter.optimize.api.termination.TerminationCondition; -import org.deeplearning4j.arbiter.optimize.config.OptimizationConfiguration; -import org.deeplearning4j.arbiter.optimize.generator.RandomSearchGenerator; -import org.deeplearning4j.arbiter.optimize.parameter.continuous.ContinuousParameterSpace; -import org.deeplearning4j.arbiter.optimize.parameter.discrete.DiscreteParameterSpace; -import org.deeplearning4j.arbiter.optimize.parameter.integer.IntegerParameterSpace; -import org.deeplearning4j.arbiter.optimize.runner.IOptimizationRunner; -import org.deeplearning4j.arbiter.optimize.runner.LocalOptimizationRunner; -import org.deeplearning4j.arbiter.optimize.runner.listener.StatusListener; -import 
org.deeplearning4j.arbiter.saver.local.FileModelSaver; -import org.deeplearning4j.arbiter.scoring.impl.EvaluationScoreFunction; -import org.deeplearning4j.arbiter.scoring.impl.TestSetLossScoreFunction; -import org.deeplearning4j.arbiter.task.ComputationGraphTaskCreator; -import org.deeplearning4j.arbiter.task.MultiLayerNetworkTaskCreator; -import org.deeplearning4j.arbiter.ui.listener.ArbiterStatusListener; -import org.deeplearning4j.datasets.iterator.EarlyTerminationDataSetIterator; -import org.deeplearning4j.datasets.iterator.impl.MnistDataSetIterator; -import org.deeplearning4j.nn.conf.inputs.InputType; -import org.deeplearning4j.nn.weights.WeightInit; -import org.deeplearning4j.ui.api.UIServer; -import org.deeplearning4j.ui.model.storage.InMemoryStatsStorage; -import org.junit.Ignore; -import org.junit.Test; -import org.nd4j.common.function.Function; -import org.nd4j.evaluation.classification.Evaluation; -import org.nd4j.linalg.activations.Activation; -import org.nd4j.linalg.api.buffer.DataType; -import org.nd4j.linalg.dataset.api.iterator.DataSetIterator; -import org.nd4j.linalg.factory.Nd4j; -import org.nd4j.linalg.lossfunctions.LossFunctions; - -import java.io.File; -import java.io.IOException; -import java.io.UnsupportedEncodingException; -import java.net.HttpURLConnection; -import java.net.URL; -import java.net.URLEncoder; -import java.util.*; -import java.util.concurrent.TimeUnit; - -import static org.junit.Assert.assertEquals; -import static org.junit.Assert.assertTrue; - -/** - * Created by Alex on 19/07/2017. 
- */ -@Slf4j -public class TestBasic extends BaseDL4JTest { - - @Override - public long getTimeoutMilliseconds() { - return 3600_000L; - } - - @Test - @Ignore - public void testBasicUiOnly() throws Exception { - - UIServer.getInstance(); - - Thread.sleep(1000_000); - } - - @Test - @Ignore - public void testBasicMnist() throws Exception { - Nd4j.setDefaultDataTypes(DataType.FLOAT, DataType.FLOAT); - - MultiLayerSpace mls = getMultiLayerSpaceMnist(); - Map commands = new HashMap<>(); -// commands.put(DataSetIteratorFactoryProvider.FACTORY_KEY, TestDataFactoryProviderMnist.class.getCanonicalName()); - - //Define configuration: - CandidateGenerator candidateGenerator = new RandomSearchGenerator(mls, commands); - DataProvider dataProvider = new MnistDataSetProvider(); - - - String modelSavePath = new File(System.getProperty("java.io.tmpdir"), "ArbiterUiTestBasicMnist\\").getAbsolutePath(); - - File f = new File(modelSavePath); - if (f.exists()) - f.delete(); - f.mkdir(); - if (!f.exists()) - throw new RuntimeException(); - - OptimizationConfiguration configuration = - new OptimizationConfiguration.Builder() - .candidateGenerator(candidateGenerator).dataProvider(dataProvider) - .modelSaver(new FileModelSaver(modelSavePath)) - .scoreFunction(new TestSetLossScoreFunction(true)) - .terminationConditions(new MaxTimeCondition(120, TimeUnit.MINUTES), - new MaxCandidatesCondition(100)) - .build(); - - IOptimizationRunner runner = - new LocalOptimizationRunner(configuration, new MultiLayerNetworkTaskCreator()); - - StatsStorage ss = new InMemoryStatsStorage(); - StatusListener sl = new ArbiterStatusListener(ss); - runner.addListeners(sl); - - UIServer.getInstance().attach(ss); - - runner.execute(); - Thread.sleep(1000_000); - } - - private static MultiLayerSpace getMultiLayerSpaceMnist() { - return new MultiLayerSpace.Builder() - .updater(new SgdSpace(new ContinuousParameterSpace(0.0001, 0.2))) - .l2(new ContinuousParameterSpace(0.0001, 0.05)) - .addLayer( - new 
ConvolutionLayerSpace.Builder().nIn(1) - .nOut(new IntegerParameterSpace(5, 30)) - .kernelSize(new DiscreteParameterSpace<>(new int[]{3, 3}, - new int[]{4, 4}, new int[]{5, 5})) - .stride(new DiscreteParameterSpace<>(new int[]{1, 1}, - new int[]{2, 2})) - .activation(new DiscreteParameterSpace<>(Activation.RELU, - Activation.SOFTPLUS, Activation.LEAKYRELU)) - .build()) - .addLayer(new DenseLayerSpace.Builder().nOut(new IntegerParameterSpace(32, 128)) - .activation(new DiscreteParameterSpace<>(Activation.RELU, Activation.TANH)) - .build(), new IntegerParameterSpace(0, 1), true) //0 to 1 layers - .addLayer(new OutputLayerSpace.Builder().nOut(10).activation(Activation.SOFTMAX) - .lossFunction(LossFunctions.LossFunction.MCXENT).build()) - .setInputType(InputType.convolutionalFlat(28, 28, 1)) - .build(); - } - - @Test - @Ignore - public void testBasicMnistDataSource() throws InterruptedException { - ParameterSpace learningRateHyperparam = new ContinuousParameterSpace(0.0001, 0.1); - ParameterSpace layerSizeHyperparam = new IntegerParameterSpace(16, 256); - - MultiLayerSpace hyperparameterSpace = new MultiLayerSpace.Builder() - .weightInit(WeightInit.XAVIER) - .l2(0.0001) - .updater(new SgdSpace(learningRateHyperparam)) - .addLayer(new DenseLayerSpace.Builder() - .nIn(784) - .activation(Activation.LEAKYRELU) - .nOut(layerSizeHyperparam) - .build()) - .addLayer(new OutputLayerSpace.Builder() - .nOut(10) - .activation(Activation.SOFTMAX) - .lossFunction(LossFunctions.LossFunction.MCXENT) - .build()) - .build(); - CandidateGenerator candidateGenerator = new RandomSearchGenerator(hyperparameterSpace, null); - ScoreFunction scoreFunction = new EvaluationScoreFunction(Evaluation.Metric.ACCURACY); - TerminationCondition[] terminationConditions = { - new MaxTimeCondition(5, TimeUnit.MINUTES), - new MaxCandidatesCondition(2)}; - - String modelSavePath = new File(System.getProperty("java.io.tmpdir"), "ArbiterUiTestBasicMnist\\").getAbsolutePath(); - - File f = new 
File(modelSavePath); - if (f.exists()) - f.delete(); - f.mkdir(); - if (!f.exists()) - throw new RuntimeException(); - Class ds = MnistDataSource.class; - Properties dsp = new Properties(); - dsp.setProperty("minibatch", "8"); - OptimizationConfiguration configuration = new OptimizationConfiguration.Builder() - .candidateGenerator(candidateGenerator).dataSource(ds, dsp) - .modelSaver(new FileModelSaver(modelSavePath)) - .scoreFunction(scoreFunction) - .terminationConditions(terminationConditions) - .build(); - - IOptimizationRunner runner = new LocalOptimizationRunner(configuration, new MultiLayerNetworkTaskCreator()); - - StatsStorage ss = new InMemoryStatsStorage(); - StatusListener sl = new ArbiterStatusListener(ss); - runner.addListeners(sl); - - UIServer.getInstance().attach(ss); - - runner.execute(); - Thread.sleep(90000); - } - - - @Test - @Ignore - public void testBasicMnistCompGraph() throws Exception { - - ComputationGraphSpace cgs = new ComputationGraphSpace.Builder() - .updater(new SgdSpace(new ContinuousParameterSpace(0.0001, 0.2))) - .l2(new ContinuousParameterSpace(0.0001, 0.05)) - .addInputs("in") - .addLayer("0", - new ConvolutionLayerSpace.Builder().nIn(1) - .nOut(new IntegerParameterSpace(5, 30)) - .kernelSize(new DiscreteParameterSpace<>(new int[]{3, 3}, - new int[]{4, 4}, new int[]{5, 5})) - .stride(new DiscreteParameterSpace<>(new int[]{1, 1}, - new int[]{2, 2})) - .activation(new DiscreteParameterSpace<>(Activation.RELU, - Activation.SOFTPLUS, Activation.LEAKYRELU)) - .build(), "in") - .addLayer("1", new DenseLayerSpace.Builder().nOut(new IntegerParameterSpace(32, 128)) - .activation(new DiscreteParameterSpace<>(Activation.RELU, Activation.TANH)) - .build(), "0") - .addLayer("out", new OutputLayerSpace.Builder().nOut(10).activation(Activation.SOFTMAX) - .lossFunction(LossFunctions.LossFunction.MCXENT).build(), "1") - .setOutputs("out") - .setInputTypes(InputType.convolutionalFlat(28, 28, 1)) - .build(); - - //Define configuration: - 
CandidateGenerator candidateGenerator = new RandomSearchGenerator(cgs); - DataProvider dataProvider = new MnistDataSetProvider(); - - - String modelSavePath = new File(System.getProperty("java.io.tmpdir"), "ArbiterUiTestBasicMnistCG\\").getAbsolutePath(); - - File f = new File(modelSavePath); - if (f.exists()) - f.delete(); - f.mkdir(); - if (!f.exists()) - throw new RuntimeException(); - - OptimizationConfiguration configuration = - new OptimizationConfiguration.Builder() - .candidateGenerator(candidateGenerator).dataProvider(dataProvider) - .modelSaver(new FileModelSaver(modelSavePath)) - .scoreFunction(new TestSetLossScoreFunction(true)) - .terminationConditions(new MaxTimeCondition(120, TimeUnit.MINUTES), - new MaxCandidatesCondition(100)) - .build(); - - IOptimizationRunner runner = - new LocalOptimizationRunner(configuration, new ComputationGraphTaskCreator()); - - StatsStorage ss = new InMemoryStatsStorage(); - StatusListener sl = new ArbiterStatusListener(ss); - runner.addListeners(sl); - - UIServer.getInstance().attach(ss); - - runner.execute(); - Thread.sleep(100000); - } - - - @Test - @Ignore - public void testCandidateGenerationExceptionsMnist() throws Exception { - - //Idea: Create a configuration that is not physically realizable, which should throw an exception - // during the candidate generation phase - //This exception should be visible in UI, but training should continue otherwise - - MultiLayerSpace mls = new MultiLayerSpace.Builder() - .updater(new SgdSpace(new ContinuousParameterSpace(0.0001, 0.2))) - .l2(new ContinuousParameterSpace(0.0001, 0.05)) - .dropOut(new ContinuousParameterSpace(0.2, 0.7)) - .addLayer( - new ConvolutionLayerSpace.Builder().nIn(1) - .nOut(new IntegerParameterSpace(5, 5)) - .kernelSize(new DiscreteParameterSpace<>(new int[]{14, 14}, new int[]{30, 30})) - .stride(2, 2) - .activation(new DiscreteParameterSpace<>(Activation.RELU, Activation.SOFTPLUS, Activation.LEAKYRELU)) - .build()) - .addLayer(new 
DenseLayerSpace.Builder().nOut(new IntegerParameterSpace(32, 128)) - .activation(new DiscreteParameterSpace<>(Activation.RELU, Activation.TANH)) - .build(), new IntegerParameterSpace(0, 1), true) //0 to 1 layers - .addLayer(new OutputLayerSpace.Builder().nOut(10).activation(Activation.SOFTMAX) - .lossFunction(LossFunctions.LossFunction.MCXENT).build()) - .setInputType(InputType.convolutionalFlat(28, 28, 1)) - .build(); - Map commands = new HashMap<>(); -// commands.put(DataSetIteratorFactoryProvider.FACTORY_KEY, TestDataFactoryProviderMnist.class.getCanonicalName()); - - //Define configuration: - CandidateGenerator candidateGenerator = new RandomSearchGenerator(mls, commands); - DataProvider dataProvider = new MnistDataSetProvider(); - - - String modelSavePath = new File(System.getProperty("java.io.tmpdir"), "ArbiterUiTestBasicMnist\\").getAbsolutePath(); - - File f = new File(modelSavePath); - if (f.exists()) - f.delete(); - f.mkdir(); - if (!f.exists()) - throw new RuntimeException(); - - OptimizationConfiguration configuration = - new OptimizationConfiguration.Builder() - .candidateGenerator(candidateGenerator).dataProvider(dataProvider) - .modelSaver(new FileModelSaver(modelSavePath)) - .scoreFunction(new TestSetLossScoreFunction(true)) - .terminationConditions(new MaxTimeCondition(120, TimeUnit.MINUTES), - new MaxCandidatesCondition(100)) - .build(); - - IOptimizationRunner runner = - new LocalOptimizationRunner(configuration, new MultiLayerNetworkTaskCreator()); - - StatsStorage ss = new InMemoryStatsStorage(); - StatusListener sl = new ArbiterStatusListener(ss); - runner.addListeners(sl); - - UIServer.getInstance().attach(ss); - - runner.execute(); - Thread.sleep(1000_000); - } - - - @Test - @Ignore - public void testCandidateExecutionExceptionsMnist() throws Exception { - //Idea: Create a configuration that will throw an exception in the *execution* stage - // How? 
let's set wrong nOut - //This exception should be visible in UI, but training should continue otherwise - - MultiLayerSpace mls = new MultiLayerSpace.Builder() - .updater(new SgdSpace(new ContinuousParameterSpace(0.0001, 0.2))) - .l2(new ContinuousParameterSpace(0.0001, 0.05)) - .dropOut(new ContinuousParameterSpace(0.2, 0.7)) - .addLayer( - new ConvolutionLayerSpace.Builder().nIn(1) - .nOut(new IntegerParameterSpace(5, 5)) - .kernelSize(new DiscreteParameterSpace<>(new int[]{3, 3}, new int[]{4, 4})) - .stride(2, 2) - .activation(new DiscreteParameterSpace<>(Activation.RELU, Activation.SOFTPLUS, Activation.LEAKYRELU)) - .build()) - .addLayer(new DenseLayerSpace.Builder().nOut(new IntegerParameterSpace(32, 64)) - .activation(new DiscreteParameterSpace<>(Activation.RELU, Activation.TANH)) - .build(), new IntegerParameterSpace(0, 1), true) //0 to 1 layers - .addLayer(new OutputLayerSpace.Builder().nOut(99).activation(Activation.SOFTMAX) - .lossFunction(LossFunctions.LossFunction.MCXENT).build()) - .setInputType(InputType.convolutionalFlat(28, 28, 1)) - .build(); - Map commands = new HashMap<>(); -// commands.put(DataSetIteratorFactoryProvider.FACTORY_KEY, TestDataFactoryProviderMnist.class.getCanonicalName()); - - //Define configuration: - CandidateGenerator candidateGenerator = new RandomSearchGenerator(mls, commands); - DataProvider dataProvider = new MnistDataSetProvider(); - - - String modelSavePath = new File(System.getProperty("java.io.tmpdir"), "ArbiterUiTestBasicMnist\\").getAbsolutePath(); - - File f = new File(modelSavePath); - if (f.exists()) - f.delete(); - f.mkdir(); - if (!f.exists()) - throw new RuntimeException(); - - OptimizationConfiguration configuration = - new OptimizationConfiguration.Builder() - .candidateGenerator(candidateGenerator).dataProvider(dataProvider) - .modelSaver(new FileModelSaver(modelSavePath)) - .scoreFunction(new TestSetLossScoreFunction(true)) - .terminationConditions(new MaxTimeCondition(120, TimeUnit.MINUTES), - new 
MaxCandidatesCondition(100)) - .build(); - - IOptimizationRunner runner = - new LocalOptimizationRunner(configuration, new MultiLayerNetworkTaskCreator()); - - StatsStorage ss = new InMemoryStatsStorage(); - StatusListener sl = new ArbiterStatusListener(ss); - runner.addListeners(sl); - - UIServer.getInstance().attach(ss); - - runner.execute(); - Thread.sleep(1000_000); - } - - - @Test - @Ignore - public void testExecutionExceptionMnistCompGraph() throws Exception { - - //Idea: Create a configuration that will throw an exception in the *execution* stage - // How? let's set wrong nOut - //This exception should be visible in UI, but training should continue otherwise - - ComputationGraphSpace cgs = new ComputationGraphSpace.Builder() - .updater(new SgdSpace(new ContinuousParameterSpace(0.0001, 0.2))) - .l2(new ContinuousParameterSpace(0.0001, 0.05)) - .dropOut(new ContinuousParameterSpace(0.2, 0.7)) - .addInputs("in") - .addLayer("0", - new ConvolutionLayerSpace.Builder().nIn(1) - .nOut(new IntegerParameterSpace(5, 30)) - .kernelSize(new DiscreteParameterSpace<>(new int[]{3, 3}, - new int[]{4, 4}, new int[]{5, 5})) - .stride(new DiscreteParameterSpace<>(new int[]{1, 1}, - new int[]{2, 2})) - .activation(new DiscreteParameterSpace<>(Activation.RELU, - Activation.SOFTPLUS, Activation.LEAKYRELU)) - .build(), "in") - .addLayer("1", new DenseLayerSpace.Builder().nOut(new IntegerParameterSpace(32, 64)) - .activation(new DiscreteParameterSpace<>(Activation.RELU, Activation.TANH)) - .build(), "0") - .addLayer("out", new OutputLayerSpace.Builder().nIn(99).nOut(10).activation(Activation.SOFTMAX) - .lossFunction(LossFunctions.LossFunction.MCXENT).build(), "1") - .setOutputs("out") - .setInputTypes(InputType.convolutionalFlat(28, 28, 1)) - .build(); - - //Define configuration: - CandidateGenerator candidateGenerator = new RandomSearchGenerator(cgs); - DataProvider dataProvider = new MnistDataSetProvider(); - - - String modelSavePath = new 
File(System.getProperty("java.io.tmpdir"), "ArbiterUiTestBasicMnistCG\\").getAbsolutePath(); - - File f = new File(modelSavePath); - if (f.exists()) - f.delete(); - f.mkdir(); - if (!f.exists()) - throw new RuntimeException(); - - OptimizationConfiguration configuration = - new OptimizationConfiguration.Builder() - .candidateGenerator(candidateGenerator).dataProvider(dataProvider) - .modelSaver(new FileModelSaver(modelSavePath)) - .scoreFunction(new TestSetLossScoreFunction(true)) - .terminationConditions(new MaxTimeCondition(120, TimeUnit.MINUTES), - new MaxCandidatesCondition(100)) - .build(); - - IOptimizationRunner runner = - new LocalOptimizationRunner(configuration, new ComputationGraphTaskCreator()); - - StatsStorage ss = new InMemoryStatsStorage(); - StatusListener sl = new ArbiterStatusListener(ss); - runner.addListeners(sl); - - UIServer.getInstance().attach(ss); - - runner.execute(); - Thread.sleep(1000_000); - } - - - /** - * Visualize multiple optimization sessions run one after another on single-session mode UI - * @throws InterruptedException if current thread has been interrupted - */ - @Test - @Ignore - public void testBasicMnistMultipleSessions() throws InterruptedException { - - MultiLayerSpace mls = new MultiLayerSpace.Builder() - .updater(new SgdSpace(new ContinuousParameterSpace(0.0001, 0.2))) - .l2(new ContinuousParameterSpace(0.0001, 0.05)) - .dropOut(new ContinuousParameterSpace(0.2, 0.7)) - .addLayer( - new ConvolutionLayerSpace.Builder().nIn(1) - .nOut(new IntegerParameterSpace(5, 30)) - .kernelSize(new DiscreteParameterSpace<>(new int[]{3, 3}, - new int[]{4, 4}, new int[]{5, 5})) - .stride(new DiscreteParameterSpace<>(new int[]{1, 1}, - new int[]{2, 2})) - .activation(new DiscreteParameterSpace<>(Activation.RELU, - Activation.SOFTPLUS, Activation.LEAKYRELU)) - .build()) - .addLayer(new DenseLayerSpace.Builder().nOut(new IntegerParameterSpace(32, 128)) - .activation(new DiscreteParameterSpace<>(Activation.RELU, Activation.TANH)) - 
.build(), new IntegerParameterSpace(0, 1), true) //0 to 1 layers - .addLayer(new OutputLayerSpace.Builder().nOut(10).activation(Activation.SOFTMAX) - .lossFunction(LossFunctions.LossFunction.MCXENT).build()) - .setInputType(InputType.convolutionalFlat(28, 28, 1)) - .build(); - Map commands = new HashMap<>(); -// commands.put(DataSetIteratorFactoryProvider.FACTORY_KEY, TestDataFactoryProviderMnist.class.getCanonicalName()); - - //Define configuration: - CandidateGenerator candidateGenerator = new RandomSearchGenerator(mls, commands); - - Class ds = MnistDataSource.class; - Properties dsp = new Properties(); - dsp.setProperty("minibatch", "8"); - - String modelSavePath = new File(System.getProperty("java.io.tmpdir"), "ArbiterUiTestBasicMnist\\").getAbsolutePath(); - - File f = new File(modelSavePath); - if (f.exists()) - f.delete(); - f.mkdir(); - if (!f.exists()) - throw new RuntimeException(); - - OptimizationConfiguration configuration = - new OptimizationConfiguration.Builder() - .candidateGenerator(candidateGenerator).dataSource(ds, dsp) - .modelSaver(new FileModelSaver(modelSavePath)) - .scoreFunction(new TestSetLossScoreFunction(true)) - .terminationConditions(new MaxTimeCondition(1, TimeUnit.MINUTES), - new MaxCandidatesCondition(3)) - .build(); - - IOptimizationRunner runner = - new LocalOptimizationRunner(configuration, new MultiLayerNetworkTaskCreator()); - - StatsStorage ss = new InMemoryStatsStorage(); - - - StatusListener sl = new ArbiterStatusListener(ss); - runner.addListeners(sl); - - UIServer.getInstance().attach(ss); - runner.execute(); - - - candidateGenerator = new RandomSearchGenerator(mls, commands); - configuration = new OptimizationConfiguration.Builder() - .candidateGenerator(candidateGenerator).dataSource(ds, dsp) - .modelSaver(new FileModelSaver(modelSavePath)) - .scoreFunction(new TestSetLossScoreFunction(true)) - .terminationConditions(new MaxTimeCondition(1, TimeUnit.MINUTES), - new MaxCandidatesCondition(3)) - .build(); - - runner = 
new LocalOptimizationRunner(configuration, new MultiLayerNetworkTaskCreator()); - sl = new ArbiterStatusListener(ss); - runner.addListeners(sl); - - UIServer.getInstance().attach(ss); - - runner.execute(); - - Thread.sleep(1000_000); - } - - /** - * Auto-attach multiple optimization sessions to multi-session mode UI - * @throws IOException if could not connect to the server - */ - @Test - public void testUiMultiSessionAutoAttach() throws IOException { - - //Define configuration: - MultiLayerSpace mls = getMultiLayerSpaceMnist(); - CandidateGenerator candidateGenerator = new RandomSearchGenerator(mls); - - Class ds = MnistDataSource.class; - Properties dsp = new Properties(); - dsp.setProperty("minibatch", "8"); - - String modelSavePath = new File(System.getProperty("java.io.tmpdir"), "ArbiterUiTestMultiSessionAutoAttach\\") - .getAbsolutePath(); - - File f = new File(modelSavePath); - if (f.exists()) - f.delete(); - f.mkdir(); - if (!f.exists()) - throw new RuntimeException(); - - OptimizationConfiguration configuration = - new OptimizationConfiguration.Builder() - .candidateGenerator(candidateGenerator).dataSource(ds, dsp) - .modelSaver(new FileModelSaver(modelSavePath)) - .scoreFunction(new TestSetLossScoreFunction(true)) - .terminationConditions(new MaxTimeCondition(10, TimeUnit.SECONDS), - new MaxCandidatesCondition(1)) - .build(); - - IOptimizationRunner runner = - new LocalOptimizationRunner(configuration, new MultiLayerNetworkTaskCreator()); - - // add 3 different sessions to the same execution - HashMap statsStorageForSession = new HashMap<>(); - for (int i = 0; i < 3; i++) { - StatsStorage ss = new InMemoryStatsStorage(); - @NonNull String sessionId = "sid" + i; - statsStorageForSession.put(sessionId, ss); - StatusListener sl = new ArbiterStatusListener(sessionId, ss); - runner.addListeners(sl); - } - - Function statsStorageProvider = statsStorageForSession::get; - UIServer uIServer = UIServer.getInstance(true, statsStorageProvider); - String serverAddress 
= uIServer.getAddress(); - - runner.execute(); - - for (String sessionId : statsStorageForSession.keySet()) { - /* - * Visiting /arbiter/:sessionId to auto-attach StatsStorage - */ - String sessionUrl = sessionUrl(uIServer.getAddress(), sessionId); - HttpURLConnection conn = (HttpURLConnection) new URL(sessionUrl).openConnection(); - conn.connect(); - - log.info("Checking auto-attaching Arbiter session at {}", sessionUrl(serverAddress, sessionId)); - assertEquals(HttpResponseStatus.OK.code(), conn.getResponseCode()); - assertTrue(uIServer.isAttached(statsStorageForSession.get(sessionId))); - } - } - - /** - * Attach multiple optimization sessions to multi-session mode UI by manually visiting session URL - * @throws Exception if an error occurred - */ - @Test - @Ignore - public void testUiMultiSessionManualAttach() throws Exception { - Nd4j.setDefaultDataTypes(DataType.FLOAT, DataType.FLOAT); - - //Define configuration: - MultiLayerSpace mls = getMultiLayerSpaceMnist(); - CandidateGenerator candidateGenerator = new RandomSearchGenerator(mls); - - Class ds = MnistDataSource.class; - Properties dsp = new Properties(); - dsp.setProperty("minibatch", "8"); - - String modelSavePath = new File(System.getProperty("java.io.tmpdir"), "ArbiterUiTestBasicMnist\\") - .getAbsolutePath(); - - File f = new File(modelSavePath); - if (f.exists()) - f.delete(); - f.mkdir(); - if (!f.exists()) - throw new RuntimeException(); - - OptimizationConfiguration configuration = - new OptimizationConfiguration.Builder() - .candidateGenerator(candidateGenerator).dataSource(ds, dsp) - .modelSaver(new FileModelSaver(modelSavePath)) - .scoreFunction(new TestSetLossScoreFunction(true)) - .terminationConditions(new MaxTimeCondition(10, TimeUnit.MINUTES), - new MaxCandidatesCondition(10)) - .build(); - - - // parallel execution of multiple optimization sessions - HashMap statsStorageForSession = new HashMap<>(); - for (int i = 0; i < 3; i++) { - String sessionId = "sid" + i; - IOptimizationRunner 
runner = - new LocalOptimizationRunner(configuration, new MultiLayerNetworkTaskCreator()); - StatsStorage ss = new InMemoryStatsStorage(); - statsStorageForSession.put(sessionId, ss); - StatusListener sl = new ArbiterStatusListener(sessionId, ss); - runner.addListeners(sl); - // Asynchronous execution - new Thread(runner::execute).start(); - } - - Function statsStorageProvider = statsStorageForSession::get; - UIServer uIServer = UIServer.getInstance(true, statsStorageProvider); - String serverAddress = uIServer.getAddress(); - - for (String sessionId : statsStorageForSession.keySet()) { - log.info("Arbiter session can be attached at {}", sessionUrl(serverAddress, sessionId)); - } - - Thread.sleep(1000_000); - } - - - /** - * Get URL for arbiter session on given server address - * @param serverAddress server address, e.g.: http://localhost:9000 - * @param sessionId session ID (will be URL-encoded) - * @return URL - * @throws UnsupportedEncodingException if the character encoding is not supported - */ - private static String sessionUrl(String serverAddress, String sessionId) throws UnsupportedEncodingException { - return String.format("%s/arbiter/%s", serverAddress, URLEncoder.encode(sessionId, "UTF-8")); - } - - private static class MnistDataSetProvider implements DataProvider { - - @Override - public DataSetIterator trainData(Map dataParameters) { - try { - if (dataParameters == null || dataParameters.isEmpty()) { - return new MnistDataSetIterator(64, 10000, false, true, true, 123); - } - if (dataParameters.containsKey("batchsize")) { - int b = (Integer) dataParameters.get("batchsize"); - return new MnistDataSetIterator(b, 10000, false, true, true, 123); - } - return new MnistDataSetIterator(64, 10000, false, true, true, 123); - } catch (Exception e) { - throw new RuntimeException(e); - } - } - - @Override - public DataSetIterator testData(Map dataParameters) { - return trainData(dataParameters); - } - - @Override - public Class getDataType() { - return 
DataSetIterator.class; - } - - @Override - public String toString() { - return "MnistDataSetProvider()"; - } - } - - public static class MnistDataSource implements DataSource { - private int minibatch; - - public MnistDataSource() { - - } - - @Override - public void configure(Properties properties) { - this.minibatch = Integer.parseInt(properties.getProperty("minibatch", "16")); - } - - @Override - public Object trainData() { - try { - return new EarlyTerminationDataSetIterator(new MnistDataSetIterator(minibatch, true, 12345), 3); - } catch (Exception e) { - throw new RuntimeException(e); - } - } - - @Override - public Object testData() { - try { - return new EarlyTerminationDataSetIterator(new MnistDataSetIterator(minibatch, true, 12345), 3); - } catch (Exception e) { - throw new RuntimeException(e); - } - } - - @Override - public Class getDataType() { - return DataSetIterator.class; - } - } - -} diff --git a/contrib/attic/arbiter/arbiter-ui/src/test/resources/logback.xml b/contrib/attic/arbiter/arbiter-ui/src/test/resources/logback.xml deleted file mode 100644 index bc7ffbbb5..000000000 --- a/contrib/attic/arbiter/arbiter-ui/src/test/resources/logback.xml +++ /dev/null @@ -1,55 +0,0 @@ - - - - - - logs/application.log - - %date - [%level] - from %logger in %thread - %n%message%n%xException%n - - - - - - %logger{15} - %message%n%xException{5} - - - - - - - - - - - - - - - - - - - - - - \ No newline at end of file diff --git a/contrib/attic/arbiter/buildmultiplescalaversions.sh b/contrib/attic/arbiter/buildmultiplescalaversions.sh deleted file mode 100644 index 87482f719..000000000 --- a/contrib/attic/arbiter/buildmultiplescalaversions.sh +++ /dev/null @@ -1,57 +0,0 @@ -#! 
/bin/bash -# -# /* ****************************************************************************** -# * -# * -# * This program and the accompanying materials are made available under the -# * terms of the Apache License, Version 2.0 which is available at -# * https://www.apache.org/licenses/LICENSE-2.0. -# * -# * See the NOTICE file distributed with this work for additional -# * information regarding copyright ownership. -# * Unless required by applicable law or agreed to in writing, software -# * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT -# * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the -# * License for the specific language governing permissions and limitations -# * under the License. -# * -# * SPDX-License-Identifier: Apache-2.0 -# ******************************************************************************/ -# - -BASEDIR="$( cd "$( dirname "${BASH_SOURCE[0]}" )" && pwd )" - -function echoError() { - (>&2 echo "$1") -} - -function scalaError() { - echoError "Changing Scala major version to 2.10 in the build did not change the state of your working copy, is Scala 2.11 still the default ?" - exit 2 -} - -function whatchanged() { - cd "$BASEDIR" - for i in $(git status -s --porcelain -- $(find ./ -mindepth 2 -name pom.xml)|awk '{print $2}'); do - echo "$(dirname $i)" - cd "$BASEDIR" - done -} - -set -eu -./change-scala-versions.sh 2.11 # should be idempotent, this is the default -mvn "$@" -./change-scala-versions.sh 2.10 -if [ -z "$(whatchanged)" ]; then - scalaError; -else - if [[ "${@#-pl}" = "$@" ]]; then - mvn -Dmaven.clean.skip=true -pl $(whatchanged| tr '\n' ',') -amd "$@" - else - # the arguments already tweak the project list ! 
don't tweak them more - # as this can lead to conflicts (excluding a project that's not part of - # the reactor) - mvn "$@" - fi -fi -./change-scala-versions.sh 2.11 # back to the default diff --git a/contrib/attic/arbiter/pom.xml b/contrib/attic/arbiter/pom.xml deleted file mode 100644 index 0c0a54801..000000000 --- a/contrib/attic/arbiter/pom.xml +++ /dev/null @@ -1,288 +0,0 @@ - - - - - - 4.0.0 - - - org.deeplearning4j - deeplearning4j - 1.0.0-SNAPSHOT - - - org.deeplearning4j - arbiter - pom - - Arbiter - Model Evaluation and Testing - - - arbiter-deeplearning4j - arbiter-core - arbiter-server - arbiter-ui - - - - - - org.apache.commons - commons-lang3 - ${commons.lang.version} - - - org.apache.commons - commons-math3 - ${commons.math.version} - - - org.slf4j - slf4j-api - ${slf4j.version} - - - joda-time - joda-time - ${jodatime.version} - - - com.beust - jcommander - ${jcommander.version} - - - com.google.code.gson - gson - ${gson.version} - - - org.nd4j - nd4j-api - ${nd4j.version} - - - org.deeplearning4j - deeplearning4j-core - ${dl4j.version} - - - org.deeplearning4j - deeplearning4j-ui - ${dl4j.version} - - - - org.nd4j - jackson - ${nd4j.version} - - - org.nd4j - guava - ${nd4j.version} - - - - - - - org.projectlombok - lombok - ${lombok.version} - provided - - - - junit - junit - ${junit.version} - test - - - ch.qos.logback - logback-classic - ${logback.version} - test - - - org.deeplearning4j - deeplearning4j-common-tests - ${dl4j.version} - test - - - - - - - org.apache.maven.wagon - wagon-http - 2.9 - - - - - - net.revelc.code.formatter - formatter-maven-plugin - 2.12.1 - - - arbiter-deeplearning4j - arbiter-core - arbiter-server - arbiter-ui - - - - - pl.project13.maven - git-commit-id-plugin - - - org.codehaus.mojo - build-helper-maven-plugin - - - - - - - org.apache.maven.plugins - maven-enforcer-plugin - ${maven-enforcer-plugin.version} - - - test - enforce-test-resources - - enforce - - - ${skipTestResourceEnforcement} - - - 
test-nd4j-native,test-nd4j-cuda-11.0 - false - - - true - - - - - - org.apache.maven.plugins - maven-surefire-plugin - ${maven-surefire-plugin.version} - - -Ddtype=double -Dfile.encoding=UTF-8 -Xmx3024m -Xms3024m - - true - false - - - - net.alchim31.maven - scala-maven-plugin - ${maven-scala-plugin.version} - - - -deprecation - -explaintypes - -nobootcp - - - - - scala-compile-first - process-resources - - add-source - compile - - - - scala-test-compile - process-test-resources - - add-source - testCompile - - - - - - org.eclipse.m2e - lifecycle-mapping - - - - - - - - test-nd4j-native - - - org.nd4j - nd4j-native - ${nd4j.version} - test - - - org.deeplearning4j - dl4j-test-resources - ${nd4j.version} - test - - - - - test-nd4j-cuda-11.0 - - - org.nd4j - nd4j-cuda-11.0 - ${nd4j.version} - test - - - org.deeplearning4j - dl4j-test-resources - ${nd4j.version} - test - - - - - diff --git a/contrib/attic/datavec-hadoop/pom.xml b/contrib/attic/datavec-hadoop/pom.xml deleted file mode 100644 index f8539b55c..000000000 --- a/contrib/attic/datavec-hadoop/pom.xml +++ /dev/null @@ -1,76 +0,0 @@ - - - - - - 4.0.0 - - - org.datavec - datavec-data - 1.0.0-SNAPSHOT - - - datavec-hadoop - - - - org.datavec - datavec-api - - - io.netty - netty - ${netty.version} - - - org.apache.hadoop - hadoop-common - ${hadoop.version} - provided - - - com.google.code.findbugs - jsr305 - - - jdk.tools - jdk.tools - - - org.slf4j - slf4j-log4j12 - - - - - - - - test-nd4j-native - - - test-nd4j-cuda-11.0 - - - diff --git a/contrib/attic/datavec-hadoop/src/main/java/org/datavec/hadoop/conf/ConfigurationUtil.java b/contrib/attic/datavec-hadoop/src/main/java/org/datavec/hadoop/conf/ConfigurationUtil.java deleted file mode 100644 index b48dcb14c..000000000 --- a/contrib/attic/datavec-hadoop/src/main/java/org/datavec/hadoop/conf/ConfigurationUtil.java +++ /dev/null @@ -1,60 +0,0 @@ -/* - * ****************************************************************************** - * * Copyright (c) 2021 
Deeplearning4j Contributors - * * - * * This program and the accompanying materials are made available under the - * * terms of the Apache License, Version 2.0 which is available at - * * https://www.apache.org/licenses/LICENSE-2.0. - * * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * * License for the specific language governing permissions and limitations - * * under the License. - * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ - -package org.datavec.hadoop.conf; - -import org.apache.hadoop.conf.Configuration; -import org.apache.hadoop.fs.Path; - -/** - * Notes - * - * https://linuxjunkies.wordpress.com/2011/11/21/a-hdfsclient-for-hadoop-using-the-native-java-api-a-tutorial/ - * - * Design Ideas - * - * - Need a DataVec Conf entry: - * - hadoop.configuration.path - * - example: hadoop.configuration.path=/home/hadoop/hadoop/conf/ - * - * - * @author josh - * - */ -public class ConfigurationUtil { - - public static Configuration generateConfig(String baseConfPath) { - - String baseConfPathTrimmed = baseConfPath.trim(); - - // Append a trailing slash if the path lacks one. (The previous check, - // false == "/".equals(baseConfPathTrimmed.endsWith("/")), compared a String - // to a boxed Boolean and therefore always evaluated to true.) - if (!baseConfPathTrimmed.endsWith("/")) { - - baseConfPathTrimmed += "/"; - - } - - Configuration conf = new Configuration(); - conf.addResource(new Path(baseConfPathTrimmed + "core-site.xml")); - conf.addResource(new Path(baseConfPathTrimmed + "hdfs-site.xml")); - conf.addResource(new Path(baseConfPathTrimmed + "mapred-site.xml")); - - return conf; - - } - -} diff --git a/contrib/attic/datavec-hadoop/src/main/java/org/datavec/hadoop/records/reader/mapfile/IndexToKey.java b/contrib/attic/datavec-hadoop/src/main/java/org/datavec/hadoop/records/reader/mapfile/IndexToKey.java deleted file mode 100644 index f197599d3..000000000 ---
a/contrib/attic/datavec-hadoop/src/main/java/org/datavec/hadoop/records/reader/mapfile/IndexToKey.java +++ /dev/null @@ -1,59 +0,0 @@ -/* - * ****************************************************************************** - * * Copyright (c) 2021 Deeplearning4j Contributors - * * - * * This program and the accompanying materials are made available under the - * * terms of the Apache License, Version 2.0 which is available at - * * https://www.apache.org/licenses/LICENSE-2.0. - * * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * * License for the specific language governing permissions and limitations - * * under the License. - * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ - -package org.datavec.hadoop.records.reader.mapfile; - -import org.apache.hadoop.io.MapFile; -import org.apache.hadoop.io.Writable; -import org.apache.hadoop.io.WritableComparable; -import org.nd4j.common.primitives.Pair; - -import java.io.IOException; -import java.util.List; - -/** - * An interface to handle Index to key conversion, for use in {@link MapFileReader} - * - * @author Alex Black - */ -public interface IndexToKey { - - /** - * Initialise the instance, and return the first and last record indexes (inclusive) for each reader - * - * @param readers The underlying map file readers - */ - List> initialize(MapFile.Reader[] readers, Class valueClass) - throws IOException; - - /** - * Get the key for the given index - * - * @param index 0 to getNumRecords(reader) - * @return The key for the given index - */ - WritableComparable getKeyForIndex(long index); - - /** - * Get (or infer) the number of records in the given map file(s) - * - * @return Number of records in the map file(s) - */ - long getNumRecords() throws
IOException; - -} diff --git a/contrib/attic/datavec-hadoop/src/main/java/org/datavec/hadoop/records/reader/mapfile/MapFileReader.java b/contrib/attic/datavec-hadoop/src/main/java/org/datavec/hadoop/records/reader/mapfile/MapFileReader.java deleted file mode 100644 index ed2db1629..000000000 --- a/contrib/attic/datavec-hadoop/src/main/java/org/datavec/hadoop/records/reader/mapfile/MapFileReader.java +++ /dev/null @@ -1,140 +0,0 @@ -/* - * ****************************************************************************** - * * Copyright (c) 2021 Deeplearning4j Contributors - * * - * * This program and the accompanying materials are made available under the - * * terms of the Apache License, Version 2.0 which is available at - * * https://www.apache.org/licenses/LICENSE-2.0. - * * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * * License for the specific language governing permissions and limitations - * * under the License. 
- * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ - -package org.datavec.hadoop.records.reader.mapfile; - -import org.apache.hadoop.conf.Configuration; -import org.apache.hadoop.fs.Path; -import org.apache.hadoop.io.MapFile; -import org.apache.hadoop.io.SequenceFile; -import org.apache.hadoop.io.Writable; -import org.apache.hadoop.io.WritableComparable; -import org.apache.hadoop.util.ReflectionUtils; -import org.datavec.hadoop.records.reader.mapfile.index.LongIndexToKey; -import org.datavec.hadoop.records.reader.mapfile.record.RecordWritable; -import org.nd4j.common.primitives.Pair; - -import java.io.Closeable; -import java.io.IOException; -import java.util.Collections; -import java.util.List; - -/** - * A wrapper around a Hadoop {@link MapFile.Reader}, used in {@link MapFileRecordReader} and {@link MapFileSequenceRecordReader} - * - * Note: This also handles multiple map files, such as the output from Spark, which gives a set of map files - * in directories like /part-r-00000, /part-r-00001 - * - * @author Alex Black - */ -public class MapFileReader implements Closeable { - - private MapFile.Reader[] readers; - private IndexToKey indexToKey; - private Class recordClass; - private List> recordIndexesEachReader; - private Long numRecords; - - - public MapFileReader(String path) throws Exception { - this(path, new LongIndexToKey(), RecordWritable.class); - } - - /** - * @param path Path (directory) of the MapFile - * @param indexToKey Instance used to convert long indices to key values. 
This allows for lookup by key - * @param recordClass Class of the records in the MapFile - * @throws IOException If an error occurs during opening or initialisation - */ - public MapFileReader(String path, IndexToKey indexToKey, Class recordClass) throws IOException { - this(Collections.singletonList(path), indexToKey, recordClass); - } - - public MapFileReader(List paths, IndexToKey indexToKey, Class recordClass) - throws IOException { - - this.indexToKey = indexToKey; - this.recordClass = recordClass; - this.readers = new MapFile.Reader[paths.size()]; - - SequenceFile.Reader.Option[] opts = new SequenceFile.Reader.Option[0]; - - Configuration config = new Configuration(); - for (int i = 0; i < paths.size(); i++) { - readers[i] = new MapFile.Reader(new Path(paths.get(i)), config, opts); - if (readers[i].getValueClass() != recordClass) { - throw new UnsupportedOperationException("MapFile value class is: " + readers[i].getValueClass() - + ", but expected class " + recordClass + ", path = " + paths.get(i)); - } - } - - recordIndexesEachReader = indexToKey.initialize(readers, recordClass); - } - - /** - * Determine the total number of records in the map file, using the {@link IndexToKey} instance - * - * @return Total number of records in the map file - */ - public long numRecords() { - if (numRecords == null) { - try { - numRecords = indexToKey.getNumRecords(); - } catch (IOException e) { - throw new RuntimeException(e); - } - } - return numRecords; - } - - /** - * Get a single record from the map file for the given index - * - * @param index Index, between 0 and numRecords()-1 - * @return Value from the MapFile - * @throws IOException If an error occurs during reading - */ - public V getRecord(long index) throws IOException { - //First: determine which reader to read from...
- int readerIdx = -1; - for (int i = 0; i < recordIndexesEachReader.size(); i++) { - Pair p = recordIndexesEachReader.get(i); - if (index >= p.getFirst() && index <= p.getSecond()) { - readerIdx = i; - break; - } - } - if (readerIdx == -1) { - throw new IllegalStateException("Index not found in any reader: " + index); - } - - WritableComparable key = indexToKey.getKeyForIndex(index); - Writable value = ReflectionUtils.newInstance(recordClass, null); - - V v = (V) readers[readerIdx].get(key, value); - return v; - } - - - @Override - public void close() throws IOException { - for (MapFile.Reader r : readers) { - r.close(); - } - } -} diff --git a/contrib/attic/datavec-hadoop/src/main/java/org/datavec/hadoop/records/reader/mapfile/MapFileRecordReader.java b/contrib/attic/datavec-hadoop/src/main/java/org/datavec/hadoop/records/reader/mapfile/MapFileRecordReader.java deleted file mode 100644 index df17f5e68..000000000 --- a/contrib/attic/datavec-hadoop/src/main/java/org/datavec/hadoop/records/reader/mapfile/MapFileRecordReader.java +++ /dev/null @@ -1,299 +0,0 @@ -/* - * ****************************************************************************** - * * Copyright (c) 2021 Deeplearning4j Contributors - * * - * * This program and the accompanying materials are made available under the - * * terms of the Apache License, Version 2.0 which is available at - * * https://www.apache.org/licenses/LICENSE-2.0. - * * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * * License for the specific language governing permissions and limitations - * * under the License. 
- * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ - -package org.datavec.hadoop.records.reader.mapfile; - -import org.datavec.api.conf.Configuration; -import org.datavec.api.records.Record; -import org.datavec.api.records.listener.RecordListener; -import org.datavec.api.records.metadata.RecordMetaData; -import org.datavec.api.records.metadata.RecordMetaDataIndex; -import org.datavec.api.records.reader.RecordReader; -import org.datavec.api.split.InputSplit; -import org.datavec.api.writable.Writable; -import org.datavec.hadoop.records.reader.mapfile.index.LongIndexToKey; -import org.datavec.hadoop.records.reader.mapfile.record.RecordWritable; -import org.nd4j.common.util.MathUtils; - -import java.io.DataInputStream; -import java.io.File; -import java.io.IOException; -import java.net.URI; -import java.util.*; - -/** - * A {@link RecordReader} implementation for reading from a Hadoop {@link org.apache.hadoop.io.MapFile}
- *

- * A typical use case is with {@link org.datavec.api.transform.TransformProcess} executed on Spark (perhaps Spark - * local), followed by non-distributed training on a single machine. For example: - *

- *  {@code
- *  JavaRDD> myRDD = ...;
- *  String mapFilePath = ...;
- *  SparkStorageUtils.saveMapFile( mapFilePath, myRDD );
- *
- *  RecordReader rr = new MapFileRecordReader();
- *  rr.initialize( new FileSplit( new File( mapFilePath ) ) );
- *  //Pass to DataSetIterator or similar
- *  }
- * 
- * - * Alternatively, use {@link org.datavec.hadoop.records.writer.mapfile.MapFileRecordWriter}.
- * Note that this record reader supports optional randomisation of order. - * - * @author Alex Black - */ -public class MapFileRecordReader implements RecordReader { - private static final Class recordClass = RecordWritable.class; - - private final IndexToKey indexToKey; - private MapFileReader mapFileReader; - private URI baseDirUri; - private List listeners; - - private long numRecords; - private long position; - private Random rng; - private int[] order; - - /** - * Create a MapFileRecordReader with no randomisation, and assuming MapFile keys are {@link org.apache.hadoop.io.LongWritable} - * values - */ - public MapFileRecordReader() throws Exception { - this(new LongIndexToKey(), null); - } - - /** - * Create a MapFileRecordReader with optional randomisation, and assuming MapFile keys are - * {@link org.apache.hadoop.io.LongWritable} values - * - * @param rng If non-null, will be used to randomize the order of examples - * - */ - public MapFileRecordReader(Random rng) { - this(new LongIndexToKey(), rng); - } - - /** - * Create a MapFileRecordReader with optional randomisation, with a custom {@link IndexToKey} instance to - * handle MapFile keys - * - * @param indexToKey Handles conversion between long indices and key values (see for example {@link LongIndexToKey} - * @param rng If non-null, will be used to randomize the order of examples - * - */ - public MapFileRecordReader(IndexToKey indexToKey, Random rng) { - this.indexToKey = indexToKey; - this.rng = rng; - } - - @Override - public void initialize(InputSplit split) throws IOException, InterruptedException { - initialize(null, split); - } - - @Override - public void initialize(Configuration conf, InputSplit split) throws IOException, InterruptedException { - URI[] uris = split.locations(); - - //First: work out whether we have a single MapFile or multiple parts - int dataCount = 0; - int indexCount = 0; - List dataUris = new ArrayList<>(); - for (URI u : uris) { - String p = u.getPath(); - if 
(p.endsWith("data")) { - dataCount++; - dataUris.add(u); - } else if (p.endsWith("index")) { - indexCount++; - } - } - - //Check URIs are correct: we expect one or more /data and /index files... - if (dataCount == 0 || indexCount == 0) { - throw new IllegalStateException("Cannot initialize MapFileSequenceRecordReader: could not find data and " - + "index files in input split"); - } - if (dataCount != indexCount) { - throw new IllegalStateException("Invalid input: found " + dataCount + " data files but " + indexCount - + " index files. Expect equal number of both for map files"); - } - - List mapFilePartRootDirectories = new ArrayList<>(dataUris.size()); - for (URI u : dataUris) { - File partRootDir = new File(u).getParentFile(); - mapFilePartRootDirectories.add(partRootDir.getAbsolutePath()); - } - - //Sort the paths so we iterate over multi-part MapFiles like part-r-00000, part-r-00001, etc when not randomized - Collections.sort(mapFilePartRootDirectories); - - - if (dataUris.size() == 1) { - //Just parent of /data - baseDirUri = new File(dataUris.get(0)).getParentFile().toURI(); - } else { - //Multiple parts -> up 2 levels from data - //so, /baseDir/part-r-00000/data -> /baseDir - baseDirUri = new File(dataUris.get(0)).getParentFile().getParentFile().toURI(); - } - - if (mapFileReader != null) { - mapFileReader.close(); - } - - this.mapFileReader = new MapFileReader<>(mapFilePartRootDirectories, indexToKey, recordClass); - this.numRecords = mapFileReader.numRecords(); - - if (rng != null) { - order = new int[(int) numRecords]; - for (int i = 0; i < order.length; i++) { - order[i] = i; - } - MathUtils.shuffleArray(order, rng); - } - } - - @Override - public void setConf(Configuration conf) { - - } - - @Override - public Configuration getConf() { - return null; - } - - @Override - public boolean batchesSupported() { - return false; - } - - @Override - public List> next(int num) { - throw new UnsupportedOperationException(); - } - - @Override - public List next() { 
- return next(false).getRecord(); - } - - @Override - public boolean hasNext() { - return position < numRecords; - } - - @Override - public List getLabels() { - return null; - } - - @Override - public void reset() { - position = 0; - if (order != null) { - MathUtils.shuffleArray(order, rng); - } - } - - @Override - public boolean resetSupported() { - return true; - } - - @Override - public List record(URI uri, DataInputStream dataInputStream) throws IOException { - throw new UnsupportedOperationException(); - } - - @Override - public Record nextRecord() { - return next(true); - } - - @Override - public Record loadFromMetaData(RecordMetaData recordMetaData) throws IOException { - throw new UnsupportedOperationException(); - } - - @Override - public List loadFromMetaData(List recordMetaDatas) throws IOException { - throw new UnsupportedOperationException(); - } - - @Override - public List getListeners() { - return listeners; - } - - @Override - public void setListeners(RecordListener... listeners) { - this.listeners = Arrays.asList(listeners); - } - - @Override - public void setListeners(Collection listeners) { - this.listeners = new ArrayList<>(listeners); - } - - @Override - public void close() throws IOException { - if (mapFileReader != null) { - mapFileReader.close(); - } - } - - - private Record next(boolean withMetadata) { - if (!hasNext()) { - throw new NoSuchElementException(); - } - - RecordWritable rec; - long currIdx; - if (order != null) { - currIdx = order[(int) position++]; - } else { - currIdx = position++; - } - - try { - rec = mapFileReader.getRecord(currIdx); - } catch (IOException e) { - throw new RuntimeException(e); - } - - RecordMetaData meta; - if (withMetadata) { - meta = new RecordMetaDataIndex(currIdx, baseDirUri, MapFileRecordReader.class); - } else { - meta = null; - } - - if (listeners != null && !listeners.isEmpty()) { - for (RecordListener l : listeners) { - l.recordRead(this, rec); - } - } - - return new 
org.datavec.api.records.impl.Record(rec.getRecord(), meta); - } -} diff --git a/contrib/attic/datavec-hadoop/src/main/java/org/datavec/hadoop/records/reader/mapfile/MapFileSequenceRecordReader.java b/contrib/attic/datavec-hadoop/src/main/java/org/datavec/hadoop/records/reader/mapfile/MapFileSequenceRecordReader.java deleted file mode 100644 index 61f7ac2b1..000000000 --- a/contrib/attic/datavec-hadoop/src/main/java/org/datavec/hadoop/records/reader/mapfile/MapFileSequenceRecordReader.java +++ /dev/null @@ -1,332 +0,0 @@ -/* - * ****************************************************************************** - * * Copyright (c) 2021 Deeplearning4j Contributors - * * - * * This program and the accompanying materials are made available under the - * * terms of the Apache License, Version 2.0 which is available at - * * https://www.apache.org/licenses/LICENSE-2.0. - * * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * * License for the specific language governing permissions and limitations - * * under the License. 
- * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ - -package org.datavec.hadoop.records.reader.mapfile; - -import lombok.NonNull; -import org.datavec.api.conf.Configuration; -import org.datavec.api.records.Record; -import org.datavec.api.records.SequenceRecord; -import org.datavec.api.records.listener.RecordListener; -import org.datavec.api.records.metadata.RecordMetaData; -import org.datavec.api.records.metadata.RecordMetaDataIndex; -import org.datavec.api.records.reader.SequenceRecordReader; -import org.datavec.api.split.InputSplit; -import org.datavec.api.writable.Writable; -import org.datavec.hadoop.records.reader.mapfile.index.LongIndexToKey; -import org.datavec.hadoop.records.reader.mapfile.record.SequenceRecordWritable; -import org.nd4j.common.util.MathUtils; - -import java.io.DataInputStream; -import java.io.File; -import java.io.IOException; -import java.net.URI; -import java.util.*; - -/** - * A {@link SequenceRecordReader} implementation for reading from a Hadoop {@link org.apache.hadoop.io.MapFile}
- *

- * A typical use case is with {@link org.datavec.api.transform.TransformProcess} executed on Spark (perhaps Spark - * local), followed by non-distributed training on a single machine. For example: - *

- *  {@code
- *  JavaRDD<List<List<Writable>>> myRDD = ...;
- *  String mapFilePath = ...;
- *  SparkStorageUtils.saveMapFileSequences( mapFilePath, myRDD );
- *
- *  SequenceRecordReader rr = new MapFileSequenceRecordReader();
- *  rr.initialize( new FileSplit( new File( mapFilePath ) ) );
- *  //Pass to DataSetIterator or similar
- *  }
- * 
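The reader shown in the Javadoc example above supports optional randomisation: when a `Random` is supplied, the reader builds an index array over all records, shuffles it once at initialize time, and re-shuffles it on every `reset()` so each epoch sees a new order. Below is a standalone sketch of just that ordering logic, with a plain Fisher-Yates shuffle standing in for `MathUtils.shuffleArray`; the class and method names here are illustrative, not part of the DataVec API:

```java
import java.util.Arrays;
import java.util.Random;

public class ShuffledRecordOrder {
    private final int[] order;
    private final Random rng;
    private int position = 0;

    public ShuffledRecordOrder(int numRecords, Random rng) {
        this.rng = rng;
        this.order = new int[numRecords];
        for (int i = 0; i < numRecords; i++) {
            order[i] = i;           // identity order: record i at position i
        }
        shuffle();                  // shuffled once up front, as in initialize()
    }

    // Fisher-Yates shuffle; a stand-in for MathUtils.shuffleArray(order, rng)
    private void shuffle() {
        for (int i = order.length - 1; i > 0; i--) {
            int j = rng.nextInt(i + 1);
            int tmp = order[i];
            order[i] = order[j];
            order[j] = tmp;
        }
    }

    // Index of the next record to load, read through the shuffled indirection
    public int next() {
        return order[position++];
    }

    public boolean hasNext() {
        return position < order.length;
    }

    // Rewind and re-shuffle, so the next epoch sees a different order
    public void reset() {
        position = 0;
        shuffle();
    }

    public static void main(String[] args) {
        ShuffledRecordOrder o = new ShuffledRecordOrder(5, new Random(42));
        int[] seen = new int[5];
        int n = 0;
        while (o.hasNext()) {
            seen[n++] = o.next();
        }
        // A shuffle is a permutation: sorted, the visited indices are 0..4
        Arrays.sort(seen);
        System.out.println(Arrays.equals(seen, new int[]{0, 1, 2, 3, 4})); // prints "true"
    }
}
```

Reading through an index array rather than shuffling the data itself is what lets the reader keep the underlying MapFile untouched (and its keys contiguous) while still presenting records in random order.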
- * - * Alternatively, use {@link org.datavec.hadoop.records.writer.mapfile.MapFileSequenceRecordWriter}.
- * Note that this sequence record reader supports optional randomisation of order.
- *
- * @author Alex Black
- */
-public class MapFileSequenceRecordReader implements SequenceRecordReader {
-    private static final Class<SequenceRecordWritable> recordClass = SequenceRecordWritable.class;
-
-    private final IndexToKey indexToKey;
-    private MapFileReader<SequenceRecordWritable> mapFileReader;
-    private URI baseDirUri;
-    private List<RecordListener> listeners;
-
-    private long numSequences;
-    private long position;
-    private Random rng;
-    private int[] order;
-
-    /**
-     * Create a MapFileSequenceRecordReader with no randomisation, and assuming MapFile keys are
-     * {@link org.apache.hadoop.io.LongWritable} values
-     */
-    public MapFileSequenceRecordReader() {
-        this(new LongIndexToKey(), null);
-    }
-
-    /**
-     * Create a MapFileSequenceRecordReader with optional randomisation, and assuming MapFile keys are
-     * {@link org.apache.hadoop.io.LongWritable} values
-     *
-     * @param rng If non-null, will be used to randomize the order of examples
-     */
-    public MapFileSequenceRecordReader(Random rng) {
-        this(new LongIndexToKey(), rng);
-    }
-
-    /**
-     * Create a MapFileSequenceRecordReader with optional randomisation, with a custom {@link IndexToKey} instance to
-     * handle MapFile keys
-     *
-     * @param indexToKey Handles conversion between long indices and key values (see for example {@link LongIndexToKey})
-     * @param rng        If non-null, will be used to randomize the order of examples
-     */
-    public MapFileSequenceRecordReader(IndexToKey indexToKey, Random rng) {
-        this.indexToKey = indexToKey;
-        this.rng = rng;
-    }
-
-    @Override
-    public void initialize(InputSplit split) throws IOException, InterruptedException {
-        initialize(null, split);
-    }
-
-    @Override
-    public void initialize(Configuration conf, InputSplit split) throws IOException, InterruptedException {
-        URI[] uris = split.locations();
-
-        //First: work out whether we have a single MapFile or multiple parts
-        int dataCount = 0;
-        int indexCount = 0;
-        List<URI> dataUris = new ArrayList<>();
-
-
for (URI u : uris) { - String p = u.getPath(); - if (p.endsWith("data")) { - dataCount++; - dataUris.add(u); - } else if (p.endsWith("index")) { - indexCount++; - } - } - - //Check URIs are correct: we expect one or more /data and /index files... - if (dataCount == 0 || indexCount == 0) { - throw new IllegalStateException("Cannot initialize MapFileSequenceRecordReader: could not find data and " - + "index files in input split"); - } - if (dataCount != indexCount) { - throw new IllegalStateException("Invalid input: found " + dataCount + " data files but " + indexCount - + " index files. Expect equal number of both for map files"); - } - - List mapFilePartRootDirectories = new ArrayList<>(dataUris.size()); - for (URI u : dataUris) { - File partRootDir = new File(u).getParentFile(); - mapFilePartRootDirectories.add(partRootDir.getAbsolutePath()); - } - - //Sort the paths so we iterate over multi-part MapFiles like part-r-00000, part-r-00001, etc when not randomized - Collections.sort(mapFilePartRootDirectories); - - - if (dataUris.size() == 1) { - //Just parent of /data - baseDirUri = new File(dataUris.get(0)).getParentFile().toURI(); - } else { - //Multiple parts -> up 2 levels from data - //so, /baseDir/part-r-00000/data -> /baseDir - baseDirUri = new File(dataUris.get(0)).getParentFile().getParentFile().toURI(); - } - - if (mapFileReader != null) { - mapFileReader.close(); - } - - this.mapFileReader = new MapFileReader<>(mapFilePartRootDirectories, indexToKey, recordClass); - this.numSequences = mapFileReader.numRecords(); - - if (rng != null) { - order = new int[(int) numSequences]; - for (int i = 0; i < order.length; i++) { - order[i] = i; - } - MathUtils.shuffleArray(order, rng); - } - } - - @Override - public void setConf(Configuration conf) { - - } - - @Override - public Configuration getConf() { - return null; - } - - @Override - public List> sequenceRecord() { - return nextSequence(false).getSequenceRecord(); - } - - @Override - public List> 
sequenceRecord(URI uri, DataInputStream dataInputStream) throws IOException { - throw new UnsupportedOperationException("MapFileSequenceRecordReader: does not support reading from streams"); - } - - @Override - public SequenceRecord nextSequence() { - return nextSequence(true); - } - - private SequenceRecord nextSequence(boolean withMetadata) { - if (!hasNext()) { - throw new NoSuchElementException(); - } - - SequenceRecordWritable seq; - long currIdx; - if (order != null) { - currIdx = order[(int) position++]; - } else { - currIdx = position++; - } - - try { - seq = mapFileReader.getRecord(currIdx); - } catch (IOException e) { - throw new RuntimeException(e); - } - - RecordMetaData meta; - if (withMetadata) { - meta = new RecordMetaDataIndex(currIdx, baseDirUri, MapFileSequenceRecordReader.class); - } else { - meta = null; - } - - if (listeners != null && !listeners.isEmpty()) { - for (RecordListener l : listeners) { - l.recordRead(this, seq); - } - } - - return new org.datavec.api.records.impl.SequenceRecord(seq.getSequenceRecord(), meta); - } - - @Override - public SequenceRecord loadSequenceFromMetaData(@NonNull RecordMetaData recordMetaData) throws IOException { - long idx = ((RecordMetaDataIndex) recordMetaData).getIndex(); - return new org.datavec.api.records.impl.SequenceRecord(mapFileReader.getRecord(idx).getSequenceRecord(), - recordMetaData); - } - - @Override - public List loadSequenceFromMetaData(@NonNull List recordMetaDatas) - throws IOException { - List out = new ArrayList<>(recordMetaDatas.size()); - for (RecordMetaData r : recordMetaDatas) { - out.add(loadSequenceFromMetaData(r)); - } - return out; - } - - @Override - public boolean batchesSupported() { - return false; - } - - @Override - public List> next(int num) { - throw new UnsupportedOperationException(); - } - - @Override - public List next() { - throw new UnsupportedOperationException(); - } - - @Override - public boolean hasNext() { - return position < numSequences; - } - - @Override - 
public List getLabels() { - return null; - } - - @Override - public void reset() { - position = 0; - if (order != null) { - MathUtils.shuffleArray(order, rng); - } - } - - @Override - public boolean resetSupported() { - return true; - } - - @Override - public List record(URI uri, DataInputStream dataInputStream) throws IOException { - throw new UnsupportedOperationException(); - } - - @Override - public Record nextRecord() { - throw new UnsupportedOperationException(); - } - - @Override - public Record loadFromMetaData(RecordMetaData recordMetaData) throws IOException { - throw new UnsupportedOperationException(); - } - - @Override - public List loadFromMetaData(List recordMetaDatas) throws IOException { - throw new UnsupportedOperationException(); - } - - @Override - public List getListeners() { - return listeners; - } - - @Override - public void setListeners(RecordListener... listeners) { - this.listeners = Arrays.asList(listeners); - } - - @Override - public void setListeners(Collection listeners) { - this.listeners = new ArrayList<>(listeners); - } - - @Override - public void close() throws IOException { - if (mapFileReader != null) { - mapFileReader.close(); - } - } -} diff --git a/contrib/attic/datavec-hadoop/src/main/java/org/datavec/hadoop/records/reader/mapfile/index/LongIndexToKey.java b/contrib/attic/datavec-hadoop/src/main/java/org/datavec/hadoop/records/reader/mapfile/index/LongIndexToKey.java deleted file mode 100644 index 33508fd89..000000000 --- a/contrib/attic/datavec-hadoop/src/main/java/org/datavec/hadoop/records/reader/mapfile/index/LongIndexToKey.java +++ /dev/null @@ -1,134 +0,0 @@ -/* - * ****************************************************************************** - * * Copyright (c) 2021 Deeplearning4j Contributors - * * - * * This program and the accompanying materials are made available under the - * * terms of the Apache License, Version 2.0 which is available at - * * https://www.apache.org/licenses/LICENSE-2.0. 
- * * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * * License for the specific language governing permissions and limitations - * * under the License. - * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ - -package org.datavec.hadoop.records.reader.mapfile.index; - -import org.apache.hadoop.io.LongWritable; -import org.apache.hadoop.io.MapFile; -import org.apache.hadoop.io.Writable; -import org.apache.hadoop.util.ReflectionUtils; -import org.datavec.hadoop.records.reader.mapfile.IndexToKey; -import org.nd4j.common.primitives.Pair; - -import java.io.IOException; -import java.util.ArrayList; -import java.util.Collections; -import java.util.Comparator; -import java.util.List; - -/** - * A default implementation of {@link IndexToKey} that assumes (strictly requires) keys that are - * {@link LongWritable} values, where all values are both unique and contiguous (0 to numRecords()-1)
- * This allows for easy inference of the number of records, and identify mapping between indexes and keys. - * - * @author Alex Black - */ -public class LongIndexToKey implements IndexToKey { - - private List> readerIndices; - - @Override - public List> initialize(MapFile.Reader[] readers, Class valueClass) - throws IOException { - - List> l = new ArrayList<>(readers.length); - for (MapFile.Reader r : readers) { - //Get the first and last keys: - long first = -1; - long last = -1; - - //First key: no method for this for some inexplicable reason :/ - LongWritable k = new LongWritable(); - Writable v = ReflectionUtils.newInstance(valueClass, null); - boolean hasNext = r.next(k, v); - if(!hasNext){ - //This map file is empty - no data - l.add(new Pair<>(-1L, -1L)); - continue; - } - first = k.get(); - - //Last key: easy - r.reset(); - r.finalKey(k); - last = k.get(); - - l.add(new Pair<>(first, last)); - } - - //Check that things are actually contiguous: - List> sorted = new ArrayList<>(l.size()); - for(Pair p : l){ - if(p.getLeft() >= 0){ - sorted.add(p); - } - } - Collections.sort(sorted, new Comparator>() { - @Override - public int compare(Pair o1, Pair o2) { - return Long.compare(o1.getFirst(), o2.getFirst()); - } - }); - - if (sorted.size() == 0){ - throw new IllegalStateException("Map file is empty - no data available"); - } - if (sorted.get(0).getFirst() != 0L) { - throw new UnsupportedOperationException("Minimum key value is not 0: got " + sorted.get(0).getFirst()); - } - - for (int i = 0; i < sorted.size() - 1; i++) { - long currLast = sorted.get(i).getSecond(); - long nextFirst = sorted.get(i + 1).getFirst(); - - if(nextFirst == -1){ - //Skip empty map file - continue; - } - - if (currLast + 1 != nextFirst) { - throw new IllegalStateException( - "Keys are not contiguous between readers: first/last indices (inclusive) " + "are " - + sorted - + ".\n LongIndexKey assumes unique and contiguous LongWritable keys"); - } - } - - readerIndices = l; - return 
readerIndices; - } - - @Override - public LongWritable getKeyForIndex(long index) { - return new LongWritable(index); - } - - @Override - public long getNumRecords() throws IOException { - long max = -1; - for (Pair p : readerIndices) { - max = Math.max(max, p.getSecond()); - } - - if (max <= 0) { - throw new IllegalStateException("Invalid number of keys found: " + max); - } - - return max + 1; //Zero indexed - } -} diff --git a/contrib/attic/datavec-hadoop/src/main/java/org/datavec/hadoop/records/reader/mapfile/record/RecordWritable.java b/contrib/attic/datavec-hadoop/src/main/java/org/datavec/hadoop/records/reader/mapfile/record/RecordWritable.java deleted file mode 100644 index 448b6504a..000000000 --- a/contrib/attic/datavec-hadoop/src/main/java/org/datavec/hadoop/records/reader/mapfile/record/RecordWritable.java +++ /dev/null @@ -1,61 +0,0 @@ -/* - * ****************************************************************************** - * * Copyright (c) 2021 Deeplearning4j Contributors - * * - * * This program and the accompanying materials are made available under the - * * terms of the Apache License, Version 2.0 which is available at - * * https://www.apache.org/licenses/LICENSE-2.0. - * * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * * License for the specific language governing permissions and limitations - * * under the License. 
- * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ - -package org.datavec.hadoop.records.reader.mapfile.record; - -import lombok.AllArgsConstructor; -import lombok.Data; -import lombok.NoArgsConstructor; -import org.apache.hadoop.io.Writable; -import org.datavec.api.writable.WritableFactory; - -import java.io.DataInput; -import java.io.DataOutput; -import java.io.IOException; -import java.util.ArrayList; -import java.util.List; - -/** - * Created by Alex on 29/05/2017. - */ -@AllArgsConstructor -@NoArgsConstructor -@Data -public class RecordWritable implements Writable { - private List record; - - @Override - public void write(DataOutput out) throws IOException { - WritableFactory wf = WritableFactory.getInstance(); - out.writeInt(record.size()); - for (org.datavec.api.writable.Writable w : record) { - wf.writeWithType(w, out); - } - } - - @Override - public void readFields(DataInput in) throws IOException { - WritableFactory wf = WritableFactory.getInstance(); - int numRecords = in.readInt(); - - record = new ArrayList<>(numRecords); - for (int i = 0; i < numRecords; i++) { - record.add(wf.readWithType(in)); - } - } -} diff --git a/contrib/attic/datavec-hadoop/src/main/java/org/datavec/hadoop/records/reader/mapfile/record/SequenceRecordWritable.java b/contrib/attic/datavec-hadoop/src/main/java/org/datavec/hadoop/records/reader/mapfile/record/SequenceRecordWritable.java deleted file mode 100644 index 4a8428cd3..000000000 --- a/contrib/attic/datavec-hadoop/src/main/java/org/datavec/hadoop/records/reader/mapfile/record/SequenceRecordWritable.java +++ /dev/null @@ -1,84 +0,0 @@ -/* - * ****************************************************************************** - * * Copyright (c) 2021 Deeplearning4j Contributors - * * - * * This program and the accompanying materials are made available under the - * * terms of the Apache License, Version 2.0 which is available at - * * 
https://www.apache.org/licenses/LICENSE-2.0. - * * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * * License for the specific language governing permissions and limitations - * * under the License. - * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ - -package org.datavec.hadoop.records.reader.mapfile.record; - -import lombok.AllArgsConstructor; -import lombok.Data; -import lombok.NoArgsConstructor; -import org.apache.hadoop.io.Writable; -import org.datavec.api.writable.WritableFactory; - -import java.io.DataInput; -import java.io.DataOutput; -import java.io.IOException; -import java.util.ArrayList; -import java.util.Collections; -import java.util.List; - -/** - * Created by Alex on 29/05/2017. - */ -@AllArgsConstructor -@NoArgsConstructor -@Data -public class SequenceRecordWritable implements Writable { - private List> sequenceRecord; - - @Override - public void write(DataOutput out) throws IOException { - WritableFactory wf = WritableFactory.getInstance(); - //Assumption: each step in each record is the same size - out.writeInt(sequenceRecord.size()); - if (sequenceRecord.size() > 0) { - int valuesPerStep = sequenceRecord.get(0).size(); - out.writeInt(valuesPerStep); - - for (List step : sequenceRecord) { - if (step.size() != valuesPerStep) { - throw new IllegalStateException( - "Number of values per time step vary: " + valuesPerStep + " vs. 
" + step.size()); - } - for (org.datavec.api.writable.Writable w : step) { - wf.writeWithType(w, out); - } - } - } - } - - @Override - public void readFields(DataInput in) throws IOException { - WritableFactory wf = WritableFactory.getInstance(); - int numSteps = in.readInt(); - if (numSteps > 0) { - int valuesPerStep = in.readInt(); - List> out = new ArrayList<>(numSteps); - - for (int i = 0; i < numSteps; i++) { - List currStep = new ArrayList<>(valuesPerStep); - for (int j = 0; j < valuesPerStep; j++) { - currStep.add(wf.readWithType(in)); - } - out.add(currStep); - } - sequenceRecord = out; - } else { - sequenceRecord = Collections.emptyList(); - } - } -} diff --git a/contrib/attic/datavec-hadoop/src/main/java/org/datavec/hadoop/records/writer/mapfile/AbstractMapFileWriter.java b/contrib/attic/datavec-hadoop/src/main/java/org/datavec/hadoop/records/writer/mapfile/AbstractMapFileWriter.java deleted file mode 100644 index 1b3db33fa..000000000 --- a/contrib/attic/datavec-hadoop/src/main/java/org/datavec/hadoop/records/writer/mapfile/AbstractMapFileWriter.java +++ /dev/null @@ -1,282 +0,0 @@ -/* - * ****************************************************************************** - * * Copyright (c) 2021 Deeplearning4j Contributors - * * - * * This program and the accompanying materials are made available under the - * * terms of the Apache License, Version 2.0 which is available at - * * https://www.apache.org/licenses/LICENSE-2.0. - * * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * * License for the specific language governing permissions and limitations - * * under the License. 
- * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ - -package org.datavec.hadoop.records.writer.mapfile; - -import lombok.NonNull; -import org.apache.hadoop.fs.Path; -import org.apache.hadoop.io.MapFile; -import org.apache.hadoop.io.SequenceFile; -import org.apache.hadoop.io.WritableComparable; -import org.datavec.api.conf.Configuration; -import org.datavec.api.split.partition.PartitionMetaData; -import org.datavec.api.writable.*; - -import java.io.File; -import java.io.IOException; -import java.util.ArrayList; -import java.util.List; -import java.util.concurrent.atomic.AtomicBoolean; -import java.util.concurrent.atomic.AtomicLong; - -/** - * An abstract class For creating Hadoop map files, that underlies {@link MapFileRecordWriter} and - * {@link MapFileSequenceRecordWriter}. - * - * @author Alex Black - */ -public abstract class AbstractMapFileWriter { - - public static final String DEFAULT_FILENAME_PATTERN = "part-r-%1$05d"; - public static final Class KEY_CLASS = org.apache.hadoop.io.LongWritable.class; - - /** - * Configuration key for the map file interval. - * This is defined in MapFile.Writer.INDEX_INTERVAL but unfortunately that field is private, hence cannot be - * referenced here. 
- */ - public static final String MAP_FILE_INDEX_INTERVAL_KEY = "io.map.index.interval"; - - public static final int DEFAULT_MAP_FILE_SPLIT_SIZE = -1; - public static final int DEFAULT_INDEX_INTERVAL = 1; - - protected final File outputDir; - protected final int mapFileSplitSize; - protected final WritableType convertTextTo; - protected final int indexInterval; - protected final String filenamePattern; - protected org.apache.hadoop.conf.Configuration hadoopConfiguration; - - protected final AtomicLong counter = new AtomicLong(); - protected final AtomicBoolean isClosed = new AtomicBoolean(); - - protected List outputFiles = new ArrayList<>(); - protected List writers = new ArrayList<>(); - - - - protected SequenceFile.Writer.Option[] opts; - - - /** - * Constructor for all default values. Single output MapFile, no text writable conversion, default index - * interval (1), default naming pattern. - * - * @param outputDir Output directory for the map file(s) - */ - public AbstractMapFileWriter(File outputDir) { - this(outputDir, DEFAULT_MAP_FILE_SPLIT_SIZE); - } - - /** - * - * Constructor for most default values. Specified number of output MapFile s, no text writable conversion, default - * index interval (1), default naming pattern. - * - * @param outputDir Output directory for the map file(s) - * @param mapFileSplitSize Split size for the map file: if 0, use a single map file for all output. If > 0, - * multiple map files will be used: each will contain a maximum of mapFileSplitSize. - * This can be used to avoid having a single multi gigabyte map file, which may be - * undesirable in some cases (transfer across the network, for example) - */ - public AbstractMapFileWriter(@NonNull File outputDir, int mapFileSplitSize) { - this(outputDir, mapFileSplitSize, null); - } - - /** - * - * @param outputDir Output directory for the map file(s) - * @param convertTextTo If null: Make no changes to Text writable objects. 
If non-null, Text writable instances - * will be converted to this type. This is useful, when would rather store numerical values - * even if the original record reader produces strings/text. - */ - public AbstractMapFileWriter(@NonNull File outputDir, WritableType convertTextTo) { - this(outputDir, DEFAULT_MAP_FILE_SPLIT_SIZE, convertTextTo); - } - - /** - * - * @param outputDir Output directory for the map file(s) - * @param mapFileSplitSize Split size for the map file: if 0, use a single map file for all output. If > 0, - * multiple map files will be used: each will contain a maximum of mapFileSplitSize. - * This can be used to avoid having a single multi gigabyte map file, which may be - * undesirable in some cases (transfer across the network, for example) - * @param convertTextTo If null: Make no changes to Text writable objects. If non-null, Text writable instances - * will be converted to this type. This is useful, when would rather store numerical values - * even if the original record reader produces strings/text. - */ - public AbstractMapFileWriter(@NonNull File outputDir, int mapFileSplitSize, WritableType convertTextTo) { - this(outputDir, mapFileSplitSize, convertTextTo, DEFAULT_INDEX_INTERVAL, new org.apache.hadoop.conf.Configuration()); - } - - /** - * - * @param outputDir Output directory for the map file(s) - * @param mapFileSplitSize Split size for the map file: if 0, use a single map file for all output. If > 0, - * multiple map files will be used: each will contain a maximum of mapFileSplitSize. - * This can be used to avoid having a single multi gigabyte map file, which may be - * undesirable in some cases (transfer across the network, for example) - * @param convertTextTo If null: Make no changes to Text writable objects. If non-null, Text writable instances - * will be converted to this type. This is useful, when would rather store numerical values - * even if the original record reader produces strings/text. 
- * @param indexInterval Index interval for the Map file. Defaults to 1, which is suitable for most cases - * @param hadoopConfiguration Hadoop configuration. - */ - public AbstractMapFileWriter(@NonNull File outputDir, int mapFileSplitSize, WritableType convertTextTo, - int indexInterval, org.apache.hadoop.conf.Configuration hadoopConfiguration) { - this(outputDir, mapFileSplitSize, convertTextTo, indexInterval, DEFAULT_FILENAME_PATTERN, hadoopConfiguration); - } - - /** - * - * @param outputDir Output directory for the map file(s) - * @param mapFileSplitSize Split size for the map file: if 0, use a single map file for all output. If > 0, - * multiple map files will be used: each will contain a maximum of mapFileSplitSize. - * This can be used to avoid having a single multi gigabyte map file, which may be - * undesirable in some cases (transfer across the network, for example) - * @param convertTextTo If null: Make no changes to Text writable objects. If non-null, Text writable instances - * will be converted to this type. This is useful, when would rather store numerical values - * even if the original record reader produces strings/text. - * @param indexInterval Index interval for the Map file. Defaults to 1, which is suitable for most cases - * @param filenamePattern The naming pattern for the map files. Used with String.format(pattern, int) - * @param hadoopConfiguration Hadoop configuration. 
- */ - public AbstractMapFileWriter(@NonNull File outputDir, int mapFileSplitSize, WritableType convertTextTo, - int indexInterval, String filenamePattern, - org.apache.hadoop.conf.Configuration hadoopConfiguration) { - if(indexInterval <= 0){ - throw new UnsupportedOperationException("Index interval: must be >= 0 (got: " + indexInterval + ")"); - } - this.outputDir = outputDir; - this.mapFileSplitSize = mapFileSplitSize; - if (convertTextTo == WritableType.Text) { - convertTextTo = null; - } - this.convertTextTo = convertTextTo; - this.indexInterval = indexInterval; - this.filenamePattern = filenamePattern; - - this.hadoopConfiguration = hadoopConfiguration; - if(this.hadoopConfiguration.get(MAP_FILE_INDEX_INTERVAL_KEY) != null){ - this.hadoopConfiguration.set(MAP_FILE_INDEX_INTERVAL_KEY, String.valueOf(indexInterval)); - } - - opts = new SequenceFile.Writer.Option[]{MapFile.Writer.keyClass(KEY_CLASS), - SequenceFile.Writer.valueClass(getValueClass())}; - - } - - protected abstract Class getValueClass(); - - - public void setConf(Configuration conf) { - - } - - - public Configuration getConf() { - return null; - } - - protected abstract org.apache.hadoop.io.Writable getHadoopWritable(T input); - - protected List convertTextWritables(List record) { - List newList; - if (convertTextTo != null) { - newList = new ArrayList<>(record.size()); - for (Writable writable : record) { - Writable newWritable; - if (writable.getType() == WritableType.Text) { - switch (convertTextTo) { - case Byte: - newWritable = new ByteWritable((byte) writable.toInt()); - break; - case Double: - newWritable = new DoubleWritable(writable.toDouble()); - break; - case Float: - newWritable = new FloatWritable(writable.toFloat()); - break; - case Int: - newWritable = new IntWritable(writable.toInt()); - break; - case Long: - newWritable = new org.datavec.api.writable.LongWritable(writable.toLong()); - break; - default: - throw new UnsupportedOperationException("Cannot convert text to: " + 
convertTextTo); - } - } else { - newWritable = writable; - } - newList.add(newWritable); - } - } else { - newList = record; - } - - return newList; - } - - public PartitionMetaData write(T record) throws IOException { - if (isClosed.get()) { - throw new UnsupportedOperationException("Cannot write to MapFileRecordReader that has already been closed"); - } - - if (counter.get() == 0) { - //Initialize first writer - String filename = String.format(DEFAULT_FILENAME_PATTERN, 0); - outputFiles.add(new File(outputDir, filename)); - writers.add(new MapFile.Writer(hadoopConfiguration, new Path(outputFiles.get(0).getAbsolutePath()), opts)); - } - - long key = counter.getAndIncrement(); - MapFile.Writer w; - if (mapFileSplitSize <= 0) { - w = writers.get(0); - } else { - int splitIdx = (int) (key / mapFileSplitSize); - if (writers.size() <= splitIdx) { - //Initialize new writer - next split - String filename = String.format(DEFAULT_FILENAME_PATTERN, splitIdx); - outputFiles.add(new File(outputDir, filename)); - writers.add(new MapFile.Writer(hadoopConfiguration, new Path(outputFiles.get(splitIdx).getAbsolutePath()), opts)); - } - w = writers.get(splitIdx); - } - - org.apache.hadoop.io.Writable hadoopWritable = getHadoopWritable(record); - - w.append(new org.apache.hadoop.io.LongWritable(key), hadoopWritable); - - return PartitionMetaData.builder().numRecordsUpdated(1).build(); - } - - - public void close() { - try { - for (MapFile.Writer w : writers) { - w.close(); - } - } catch (Exception e) { - throw new RuntimeException(e); - } finally { - isClosed.set(true); - } - } -} diff --git a/contrib/attic/datavec-hadoop/src/main/java/org/datavec/hadoop/records/writer/mapfile/MapFileRecordWriter.java b/contrib/attic/datavec-hadoop/src/main/java/org/datavec/hadoop/records/writer/mapfile/MapFileRecordWriter.java deleted file mode 100644 index 49bc6a143..000000000 --- a/contrib/attic/datavec-hadoop/src/main/java/org/datavec/hadoop/records/writer/mapfile/MapFileRecordWriter.java +++ 
/dev/null @@ -1,186 +0,0 @@ -/* - * ****************************************************************************** - * * Copyright (c) 2021 Deeplearning4j Contributors - * * - * * This program and the accompanying materials are made available under the - * * terms of the Apache License, Version 2.0 which is available at - * * https://www.apache.org/licenses/LICENSE-2.0. - * * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * * License for the specific language governing permissions and limitations - * * under the License. - * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ - -package org.datavec.hadoop.records.writer.mapfile; - -import lombok.NonNull; -import org.datavec.api.conf.Configuration; -import org.datavec.api.records.writer.RecordWriter; -import org.datavec.api.split.InputSplit; -import org.datavec.api.split.partition.PartitionMetaData; -import org.datavec.api.split.partition.Partitioner; -import org.datavec.api.writable.Writable; -import org.datavec.api.writable.WritableType; -import org.datavec.hadoop.records.reader.mapfile.record.RecordWritable; - -import java.io.File; -import java.io.IOException; -import java.util.List; - -/** - * MapFileRecordWriter is used to write values to a Hadoop MapFile, that can then be read by: - * {@link org.datavec.hadoop.records.reader.mapfile.MapFileRecordReader} - * - * @author Alex Black - * @see org.datavec.hadoop.records.reader.mapfile.MapFileRecordReader - */ -public class MapFileRecordWriter extends AbstractMapFileWriter> implements RecordWriter { - - /** - * Constructor for all default values. Single output MapFile, no text writable conversion, default index - * interval (1), default naming pattern. 
 - * - * @param outputDir Output directory for the map file(s) - */ - public MapFileRecordWriter(File outputDir) { - super(outputDir); - } - - /** - * - * Constructor for most default values. Specified number of output MapFiles, no text writable conversion, default - * index interval (1), default naming pattern. - * - * @param outputDir Output directory for the map file(s) - * @param mapFileSplitSize Split size for the map file: if 0, use a single map file for all output. If > 0, - * multiple map files will be used: each will contain a maximum of mapFileSplitSize - * examples. This can be used to avoid having a single multi-gigabyte map file, which may - * be undesirable in some cases (transfer across the network, for example). - */ - public MapFileRecordWriter(@NonNull File outputDir, int mapFileSplitSize){ - this(outputDir, mapFileSplitSize, null); - } - - /** - * - * @param outputDir Output directory for the map file(s) - * @param convertTextTo If null: Make no changes to Text writable objects. If non-null, Text writable instances - * will be converted to this type. This is useful when you would rather store numerical values - * even if the original record reader produces strings/text. - */ - public MapFileRecordWriter(@NonNull File outputDir, WritableType convertTextTo) { - this(outputDir, DEFAULT_MAP_FILE_SPLIT_SIZE, convertTextTo); - } - - /** - * - * @param outputDir Output directory for the map file(s) - * @param mapFileSplitSize Split size for the map file: if 0, use a single map file for all output. If > 0, - * multiple map files will be used: each will contain a maximum of mapFileSplitSize - * examples. This can be used to avoid having a single multi-gigabyte map file, which may - * be undesirable in some cases (transfer across the network, for example). - * @param convertTextTo If null: Make no changes to Text writable objects. If non-null, Text writable instances - * will be converted to this type. This is useful when you would rather store numerical values - * even if the original record reader produces strings/text. - */ - public MapFileRecordWriter(@NonNull File outputDir, int mapFileSplitSize, WritableType convertTextTo) { - super(outputDir, mapFileSplitSize, convertTextTo); - } - - /** - * - * @param outputDir Output directory for the map file(s) - * @param mapFileSplitSize Split size for the map file: if 0, use a single map file for all output. If > 0, - * multiple map files will be used: each will contain a maximum of mapFileSplitSize - * examples. This can be used to avoid having a single multi-gigabyte map file, which may - * be undesirable in some cases (transfer across the network, for example). - * @param convertTextTo If null: Make no changes to Text writable objects. If non-null, Text writable instances - * will be converted to this type. This is useful when you would rather store numerical values - * even if the original record reader produces strings/text. - * @param hadoopConfiguration Hadoop configuration. - */ - public MapFileRecordWriter(@NonNull File outputDir, int mapFileSplitSize, WritableType convertTextTo, - org.apache.hadoop.conf.Configuration hadoopConfiguration) { - super(outputDir, mapFileSplitSize, convertTextTo, DEFAULT_INDEX_INTERVAL, hadoopConfiguration); - } - - /** - * - * @param outputDir Output directory for the map file(s) - * @param mapFileSplitSize Split size for the map file: if 0, use a single map file for all output. If > 0, - * multiple map files will be used: each will contain a maximum of mapFileSplitSize - * examples. This can be used to avoid having a single multi-gigabyte map file, which may - * be undesirable in some cases (transfer across the network, for example). - * @param convertTextTo If null: Make no changes to Text writable objects. If non-null, Text writable instances - * will be converted to this type. This is useful when you would rather store numerical values - * even if the original record reader produces strings/text. - * @param indexInterval Index interval for the Map file. Defaults to 1, which is suitable for most cases - * @param hadoopConfiguration Hadoop configuration. - */ - public MapFileRecordWriter(@NonNull File outputDir, int mapFileSplitSize, WritableType convertTextTo, - int indexInterval, org.apache.hadoop.conf.Configuration hadoopConfiguration) { - super(outputDir, mapFileSplitSize, convertTextTo, indexInterval, hadoopConfiguration); - } - - /** - * - * @param outputDir Output directory for the map file(s) - * @param mapFileSplitSize Split size for the map file: if 0, use a single map file for all output. If > 0, - * multiple map files will be used: each will contain a maximum of mapFileSplitSize - * examples. This can be used to avoid having a single multi-gigabyte map file, which may - * be undesirable in some cases (transfer across the network, for example). - * @param convertTextTo If null: Make no changes to Text writable objects. If non-null, Text writable instances - * will be converted to this type. This is useful when you would rather store numerical values - * even if the original record reader produces strings/text. - * @param indexInterval Index interval for the Map file. Defaults to 1, which is suitable for most cases - * @param filenamePattern The naming pattern for the map files. Used with String.format(pattern, int) - * @param hadoopConfiguration Hadoop configuration. 
 - */ - public MapFileRecordWriter(@NonNull File outputDir, int mapFileSplitSize, WritableType convertTextTo, - int indexInterval, String filenamePattern, - org.apache.hadoop.conf.Configuration hadoopConfiguration) { - super(outputDir, mapFileSplitSize, convertTextTo, indexInterval, filenamePattern, hadoopConfiguration); - } - - @Override - protected Class<? extends org.apache.hadoop.io.Writable> getValueClass() { - return RecordWritable.class; - } - - @Override - protected org.apache.hadoop.io.Writable getHadoopWritable(List<Writable> input) { - if(convertTextTo != null){ - input = convertTextWritables(input); - } - - return new RecordWritable(input); - } - - @Override - public boolean supportsBatch() { - return true; - } - - @Override - public void initialize(InputSplit inputSplit, Partitioner partitioner) throws Exception { - - } - - @Override - public void initialize(Configuration configuration, InputSplit split, Partitioner partitioner) throws Exception { - - } - - @Override - public PartitionMetaData writeBatch(List<List<Writable>> batch) throws IOException { - for (List<Writable> record : batch) { - write(record); - } - return PartitionMetaData.builder().numRecordsUpdated(batch.size()).build(); - } -} diff --git a/contrib/attic/datavec-hadoop/src/main/java/org/datavec/hadoop/records/writer/mapfile/MapFileSequenceRecordWriter.java b/contrib/attic/datavec-hadoop/src/main/java/org/datavec/hadoop/records/writer/mapfile/MapFileSequenceRecordWriter.java deleted file mode 100644 index bd6079e5c..000000000 --- a/contrib/attic/datavec-hadoop/src/main/java/org/datavec/hadoop/records/writer/mapfile/MapFileSequenceRecordWriter.java +++ /dev/null @@ -1,163 +0,0 @@ -/* - * ****************************************************************************** - * * Copyright (c) 2021 Deeplearning4j Contributors - * * - * * This program and the accompanying materials are made available under the - * * terms of the Apache License, Version 2.0 which is available at - * * https://www.apache.org/licenses/LICENSE-2.0. 
 - * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * * License for the specific language governing permissions and limitations - * * under the License. - * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ - -package org.datavec.hadoop.records.writer.mapfile; - -import lombok.NonNull; -import org.datavec.api.records.writer.SequenceRecordWriter; -import org.datavec.api.writable.Writable; -import org.datavec.api.writable.WritableType; -import org.datavec.hadoop.records.reader.mapfile.record.SequenceRecordWritable; - -import java.io.File; -import java.util.ArrayList; -import java.util.List; - -/** - * MapFileSequenceRecordWriter is used to write sequence values to a Hadoop MapFile that can then be read by: - * {@link org.datavec.hadoop.records.reader.mapfile.MapFileSequenceRecordReader} - * - * @author Alex Black - * @see org.datavec.hadoop.records.reader.mapfile.MapFileSequenceRecordReader - */ -public class MapFileSequenceRecordWriter extends AbstractMapFileWriter<List<List<Writable>>> implements SequenceRecordWriter { - - /** - * Constructor for all default values. Single output MapFile, no text writable conversion, default index - * interval (1), default naming pattern. - * - * @param outputDir Output directory for the map file(s) - */ - public MapFileSequenceRecordWriter(File outputDir) { - super(outputDir); - } - - /** - * - * Constructor for most default values. Specified number of output MapFiles, no text writable conversion, default - * index interval (1), default naming pattern. - * - * @param outputDir Output directory for the map file(s) - * @param mapFileSplitSize Split size for the map file: if 0, use a single map file for all output. 
If > 0, - * multiple map files will be used: each will contain a maximum of mapFileSplitSize - * examples. This can be used to avoid having a single multi-gigabyte map file, which may - * be undesirable in some cases (transfer across the network, for example). - */ - public MapFileSequenceRecordWriter(@NonNull File outputDir, int mapFileSplitSize){ - this(outputDir, mapFileSplitSize, null); - } - - /** - * - * @param outputDir Output directory for the map file(s) - * @param convertTextTo If null: Make no changes to Text writable objects. If non-null, Text writable instances - * will be converted to this type. This is useful when you would rather store numerical values - * even if the original record reader produces strings/text. - */ - public MapFileSequenceRecordWriter(@NonNull File outputDir, WritableType convertTextTo) { - this(outputDir, DEFAULT_MAP_FILE_SPLIT_SIZE, convertTextTo); - } - - /** - * - * @param outputDir Output directory for the map file(s) - * @param mapFileSplitSize Split size for the map file: if 0, use a single map file for all output. If > 0, - * multiple map files will be used: each will contain a maximum of mapFileSplitSize - * examples. This can be used to avoid having a single multi-gigabyte map file, which may - * be undesirable in some cases (transfer across the network, for example). - * @param convertTextTo If null: Make no changes to Text writable objects. If non-null, Text writable instances - * will be converted to this type. This is useful when you would rather store numerical values - * even if the original record reader produces strings/text. - */ - public MapFileSequenceRecordWriter(@NonNull File outputDir, int mapFileSplitSize, WritableType convertTextTo) { - super(outputDir, mapFileSplitSize, convertTextTo); - } - - /** - * - * @param outputDir Output directory for the map file(s) - * @param mapFileSplitSize Split size for the map file: if 0, use a single map file for all output. If > 0, - * multiple map files will be used: each will contain a maximum of mapFileSplitSize - * examples. This can be used to avoid having a single multi-gigabyte map file, which may - * be undesirable in some cases (transfer across the network, for example). - * @param convertTextTo If null: Make no changes to Text writable objects. If non-null, Text writable instances - * will be converted to this type. This is useful when you would rather store numerical values - * even if the original record reader produces strings/text. - * @param hadoopConfiguration Hadoop configuration. - */ - public MapFileSequenceRecordWriter(@NonNull File outputDir, int mapFileSplitSize, WritableType convertTextTo, - org.apache.hadoop.conf.Configuration hadoopConfiguration) { - super(outputDir, mapFileSplitSize, convertTextTo, DEFAULT_INDEX_INTERVAL, hadoopConfiguration); - } - - /** - * - * @param outputDir Output directory for the map file(s) - * @param mapFileSplitSize Split size for the map file: if 0, use a single map file for all output. If > 0, - * multiple map files will be used: each will contain a maximum of mapFileSplitSize - * examples. This can be used to avoid having a single multi-gigabyte map file, which may - * be undesirable in some cases (transfer across the network, for example). - * @param convertTextTo If null: Make no changes to Text writable objects. If non-null, Text writable instances - * will be converted to this type. This is useful when you would rather store numerical values - * even if the original record reader produces strings/text. - * @param indexInterval Index interval for the Map file. Defaults to 1, which is suitable for most cases - * @param hadoopConfiguration Hadoop configuration. - */ - public MapFileSequenceRecordWriter(@NonNull File outputDir, int mapFileSplitSize, WritableType convertTextTo, - int indexInterval, org.apache.hadoop.conf.Configuration hadoopConfiguration) { - super(outputDir, mapFileSplitSize, convertTextTo, indexInterval, hadoopConfiguration); - } - - /** - * - * @param outputDir Output directory for the map file(s) - * @param mapFileSplitSize Split size for the map file: if 0, use a single map file for all output. If > 0, - * multiple map files will be used: each will contain a maximum of mapFileSplitSize - * examples. This can be used to avoid having a single multi-gigabyte map file, which may - * be undesirable in some cases (transfer across the network, for example). - * @param convertTextTo If null: Make no changes to Text writable objects. If non-null, Text writable instances - * will be converted to this type. This is useful when you would rather store numerical values - * even if the original record reader produces strings/text. - * @param indexInterval Index interval for the Map file. Defaults to 1, which is suitable for most cases - * @param filenamePattern The naming pattern for the map files. Used with String.format(pattern, int) - * @param hadoopConfiguration Hadoop configuration. 
 - */ - public MapFileSequenceRecordWriter(@NonNull File outputDir, int mapFileSplitSize, WritableType convertTextTo, - int indexInterval, String filenamePattern, - org.apache.hadoop.conf.Configuration hadoopConfiguration) { - super(outputDir, mapFileSplitSize, convertTextTo, indexInterval, filenamePattern, hadoopConfiguration); - } - - @Override - protected Class<? extends org.apache.hadoop.io.Writable> getValueClass() { - return SequenceRecordWritable.class; - } - - @Override - protected org.apache.hadoop.io.Writable getHadoopWritable(List<List<Writable>> input) { - if(convertTextTo != null){ - List<List<Writable>> newSeq = new ArrayList<>(input.size()); - for(List<Writable> l : input){ - newSeq.add(convertTextWritables(l)); - } - input = newSeq; - } - - return new SequenceRecordWritable(input); - } -} diff --git a/contrib/attic/datavec-hadoop/src/test/java/org/datavec/hadoop/AssertTestsExtendBaseClass.java b/contrib/attic/datavec-hadoop/src/test/java/org/datavec/hadoop/AssertTestsExtendBaseClass.java deleted file mode 100644 index d130d07fa..000000000 --- a/contrib/attic/datavec-hadoop/src/test/java/org/datavec/hadoop/AssertTestsExtendBaseClass.java +++ /dev/null @@ -1,51 +0,0 @@ -/* - * ****************************************************************************** - * * Copyright (c) 2021 Deeplearning4j Contributors - * * - * * This program and the accompanying materials are made available under the - * * terms of the Apache License, Version 2.0 which is available at - * * https://www.apache.org/licenses/LICENSE-2.0. - * * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * * License for the specific language governing permissions and limitations - * * under the License. 
 - * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ -package org.datavec.hadoop; - -import lombok.extern.slf4j.Slf4j; -import org.nd4j.common.tests.AbstractAssertTestsClass; -import org.nd4j.common.tests.BaseND4JTest; - -import java.util.*; -/** - * This class checks that all test classes (i.e., anything with one or more methods annotated with @Test) - * extend BaseND4JTest, either directly or indirectly. - * Other than a small set of exceptions, all tests must extend this class. - * - * @author Alex Black - */ - -@Slf4j -public class AssertTestsExtendBaseClass extends AbstractAssertTestsClass { - - @Override - protected Set<Class<?>> getExclusions() { - //Set of classes that are exclusions to the rule (either run manually or have their own logging + timeouts) - return new HashSet<>(); - } - - @Override - protected String getPackageName() { - return "org.datavec.hadoop"; - } - - @Override - protected Class<?> getBaseClass() { - return BaseND4JTest.class; - } -} diff --git a/contrib/attic/datavec-hadoop/src/test/java/org/datavec/hadoop/conf/TestConfigurationUtil.java b/contrib/attic/datavec-hadoop/src/test/java/org/datavec/hadoop/conf/TestConfigurationUtil.java deleted file mode 100644 index 8b0dade01..000000000 --- a/contrib/attic/datavec-hadoop/src/test/java/org/datavec/hadoop/conf/TestConfigurationUtil.java +++ /dev/null @@ -1,39 +0,0 @@ -/* - * ****************************************************************************** - * * Copyright (c) 2021 Deeplearning4j Contributors - * * - * * This program and the accompanying materials are made available under the - * * terms of the Apache License, Version 2.0 which is available at - * * https://www.apache.org/licenses/LICENSE-2.0. 
- * * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * * License for the specific language governing permissions and limitations - * * under the License. - * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ - -package org.datavec.hadoop.conf; - -import org.apache.hadoop.conf.Configuration; -import org.junit.Test; - -public class TestConfigurationUtil { - - @Test - public void testLoadHadoopConfFiles() { - - // this would come from the properties file - String confPath = "src/test/resources/conf/example_conf/"; - - Configuration conf = ConfigurationUtil.generateConfig(confPath); - - System.out.println(" works? " + conf.get("fs.default.name")); - - - } - -} diff --git a/contrib/attic/datavec-hadoop/src/test/java/org/datavec/hadoop/records/reader/TestMapFileRecordReader.java b/contrib/attic/datavec-hadoop/src/test/java/org/datavec/hadoop/records/reader/TestMapFileRecordReader.java deleted file mode 100644 index 16d3b3714..000000000 --- a/contrib/attic/datavec-hadoop/src/test/java/org/datavec/hadoop/records/reader/TestMapFileRecordReader.java +++ /dev/null @@ -1,249 +0,0 @@ -/* - * ****************************************************************************** - * * Copyright (c) 2021 Deeplearning4j Contributors - * * - * * This program and the accompanying materials are made available under the - * * terms of the Apache License, Version 2.0 which is available at - * * https://www.apache.org/licenses/LICENSE-2.0. - * * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. 
See the - * * License for the specific language governing permissions and limitations - * * under the License. - * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ - -package org.datavec.hadoop.records.reader; - -import org.nd4j.common.util.MathUtils; -import org.nd4j.shade.guava.io.Files; -import org.apache.hadoop.conf.Configuration; -import org.apache.hadoop.fs.Path; -import org.apache.hadoop.io.*; -import org.datavec.api.records.reader.RecordReader; -import org.datavec.api.records.reader.SequenceRecordReader; -import org.datavec.api.split.FileSplit; -import org.datavec.api.split.InputSplit; -import org.datavec.api.writable.DoubleWritable; -import org.datavec.api.writable.IntWritable; -import org.datavec.api.writable.NDArrayWritable; -import org.datavec.api.writable.Text; -import org.datavec.hadoop.records.reader.mapfile.MapFileRecordReader; -import org.datavec.hadoop.records.reader.mapfile.MapFileSequenceRecordReader; -import org.datavec.hadoop.records.reader.mapfile.record.RecordWritable; -import org.datavec.hadoop.records.reader.mapfile.record.SequenceRecordWritable; -import org.junit.AfterClass; -import org.junit.BeforeClass; -import org.junit.Test; -import org.nd4j.linalg.factory.Nd4j; - -import java.io.File; -import java.io.IOException; -import java.lang.reflect.Field; -import java.net.URI; -import java.util.*; - -import static org.junit.Assert.*; - -/** - * Created by Alex on 29/05/2017. 
- */ -public class TestMapFileRecordReader { - - private static File tempDirSeq; - private static File tempDir; - private static Path seqMapFilePath; - private static Path mapFilePath; - private static Map seqMap; - private static Map recordMap; - - @BeforeClass - public static void buildMapFiles() throws IOException { - - //----- Sequence RR setup ----- - - Configuration c = new Configuration(); - Class keyClass = LongWritable.class; - Class valueClass = SequenceRecordWritable.class; - - SequenceFile.Writer.Option[] opts = new SequenceFile.Writer.Option[] {MapFile.Writer.keyClass(keyClass), - SequenceFile.Writer.valueClass(valueClass)}; - - tempDirSeq = Files.createTempDir(); - seqMapFilePath = new Path("file:///" + tempDirSeq.getAbsolutePath()); - - MapFile.Writer writer = new MapFile.Writer(c, seqMapFilePath, opts); - - seqMap = new HashMap<>(); - seqMap.put(new LongWritable(0), new SequenceRecordWritable(Arrays.asList( - Arrays.asList(new Text("zero"), new IntWritable(0), - new DoubleWritable(0), new NDArrayWritable(Nd4j.valueArrayOf(10, 0.0))), - Arrays.asList(new Text("one"), new IntWritable(1), - new DoubleWritable(1.0), new NDArrayWritable(Nd4j.valueArrayOf(10, 1.0))), - Arrays.asList(new Text("two"), new IntWritable(2), - new DoubleWritable(2.0), new NDArrayWritable(Nd4j.valueArrayOf(10, 2.0)))))); - - seqMap.put(new LongWritable(1), new SequenceRecordWritable(Arrays.asList( - Arrays.asList(new Text("Bzero"), new IntWritable(10), - new DoubleWritable(10), new NDArrayWritable(Nd4j.valueArrayOf(10, 10.0))), - Arrays.asList(new Text("Bone"), new IntWritable(11), - new DoubleWritable(11.0), new NDArrayWritable(Nd4j.valueArrayOf(10, 11.0))), - Arrays.asList(new Text("Btwo"), new IntWritable(12), - new DoubleWritable(12.0), new NDArrayWritable(Nd4j.valueArrayOf(10, 12.0)))))); - - seqMap.put(new LongWritable(2), new SequenceRecordWritable(Arrays.asList( - Arrays.asList(new Text("Czero"), new IntWritable(20), - new DoubleWritable(20), new 
NDArrayWritable(Nd4j.valueArrayOf(10, 20.0))), - Arrays.asList(new Text("Cone"), new IntWritable(21), - new DoubleWritable(21.0), new NDArrayWritable(Nd4j.valueArrayOf(10, 21.0))), - Arrays.asList(new Text("Ctwo"), new IntWritable(22), - new DoubleWritable(22.0), new NDArrayWritable(Nd4j.valueArrayOf(10, 22.0)))))); - - - //Need to write in order - for (int i = 0; i <= 2; i++) { - LongWritable key = new LongWritable(i); - SequenceRecordWritable value = seqMap.get(key); - - writer.append(key, value); - } - writer.close(); - - - //----- Standard RR setup ----- - - valueClass = RecordWritable.class; - - opts = new SequenceFile.Writer.Option[] {MapFile.Writer.keyClass(keyClass), - SequenceFile.Writer.valueClass(valueClass)}; - - tempDir = Files.createTempDir(); - mapFilePath = new Path("file:///" + tempDir.getAbsolutePath()); - - writer = new MapFile.Writer(c, mapFilePath, opts); - - recordMap = new HashMap<>(); - recordMap.put(new LongWritable(0), - new RecordWritable(Arrays.asList(new Text("zero"), - new IntWritable(0), new DoubleWritable(0), - new NDArrayWritable(Nd4j.valueArrayOf(10, 0.0))))); - - recordMap.put(new LongWritable(1), - new RecordWritable(Arrays.asList(new Text("one"), - new IntWritable(11), new DoubleWritable(11.0), - new NDArrayWritable(Nd4j.valueArrayOf(10, 11.0))))); - - recordMap.put(new LongWritable(2), - new RecordWritable(Arrays.asList(new Text("two"), - new IntWritable(22), new DoubleWritable(22.0), - new NDArrayWritable(Nd4j.valueArrayOf(10, 22.0))))); - - - //Need to write in order - for (int i = 0; i <= 2; i++) { - LongWritable key = new LongWritable(i); - RecordWritable value = recordMap.get(key); - - writer.append(key, value); - } - writer.close(); - - } - - @AfterClass - public static void destroyMapFiles() { - tempDirSeq.delete(); - tempDirSeq = null; - seqMapFilePath = null; - seqMap = null; - - tempDir.delete(); - tempDir = null; - mapFilePath = null; - recordMap = null; - } - - @Test - public void testSequenceRecordReader() throws 
Exception { - SequenceRecordReader seqRR = new MapFileSequenceRecordReader(); - URI uri = seqMapFilePath.toUri(); - InputSplit is = new FileSplit(new File(uri)); - seqRR.initialize(is); - - assertTrue(seqRR.hasNext()); - int count = 0; - while (seqRR.hasNext()) { - List<List<Writable>> l = seqRR.sequenceRecord(); - - assertEquals(seqMap.get(new LongWritable(count)).getSequenceRecord(), l); - - count++; - } - assertEquals(seqMap.size(), count); - - seqRR.close(); - - //Try the same thing, but with random order - seqRR = new MapFileSequenceRecordReader(new Random(12345)); - seqRR.initialize(is); - - Field f = MapFileSequenceRecordReader.class.getDeclaredField("order"); - f.setAccessible(true); - int[] order = (int[]) f.get(seqRR); - assertNotNull(order); - int[] expOrder = new int[]{0,1,2}; - MathUtils.shuffleArray(expOrder, new Random(12345)); - assertArrayEquals(expOrder, order); - - count = 0; - while (seqRR.hasNext()) { - List<List<Writable>> l = seqRR.sequenceRecord(); - assertEquals(seqMap.get(new LongWritable(expOrder[count])).getSequenceRecord(), l); - count++; - } - } - - @Test - public void testRecordReader() throws Exception { - RecordReader rr = new MapFileRecordReader(); - URI uri = mapFilePath.toUri(); - InputSplit is = new FileSplit(new File(uri)); - rr.initialize(is); - - assertTrue(rr.hasNext()); - int count = 0; - while (rr.hasNext()) { - List<Writable> l = rr.next(); - - assertEquals(recordMap.get(new LongWritable(count)).getRecord(), l); - - count++; - } - assertEquals(recordMap.size(), count); - - rr.close(); - - //Try the same thing, but with random order - rr = new MapFileRecordReader(new Random(12345)); - rr.initialize(is); - - Field f = MapFileRecordReader.class.getDeclaredField("order"); - f.setAccessible(true); - int[] order = (int[]) f.get(rr); - assertNotNull(order); - - int[] expOrder = new int[]{0,1,2}; - MathUtils.shuffleArray(expOrder, new Random(12345)); - assertArrayEquals(expOrder, order); - - count = 0; - while (rr.hasNext()) { - List<Writable> l = rr.next(); - 
assertEquals(recordMap.get(new LongWritable(expOrder[count])).getRecord(), l); - count++; - } - } -} diff --git a/contrib/attic/datavec-hadoop/src/test/java/org/datavec/hadoop/records/reader/TestMapFileRecordReaderMultipleParts.java b/contrib/attic/datavec-hadoop/src/test/java/org/datavec/hadoop/records/reader/TestMapFileRecordReaderMultipleParts.java deleted file mode 100644 index 758711b96..000000000 --- a/contrib/attic/datavec-hadoop/src/test/java/org/datavec/hadoop/records/reader/TestMapFileRecordReaderMultipleParts.java +++ /dev/null @@ -1,300 +0,0 @@ -/* - * ****************************************************************************** - * * Copyright (c) 2021 Deeplearning4j Contributors - * * - * * This program and the accompanying materials are made available under the - * * terms of the Apache License, Version 2.0 which is available at - * * https://www.apache.org/licenses/LICENSE-2.0. - * * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * * License for the specific language governing permissions and limitations - * * under the License. 
- * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ - -package org.datavec.hadoop.records.reader; - -import org.nd4j.common.primitives.Pair; -import org.nd4j.common.util.MathUtils; -import org.nd4j.shade.guava.io.Files; -import org.apache.hadoop.conf.Configuration; -import org.apache.hadoop.fs.Path; -import org.apache.hadoop.io.*; -import org.datavec.api.records.reader.RecordReader; -import org.datavec.api.records.reader.SequenceRecordReader; -import org.datavec.api.split.FileSplit; -import org.datavec.api.split.InputSplit; -import org.datavec.api.writable.DoubleWritable; -import org.datavec.api.writable.IntWritable; -import org.datavec.api.writable.Text; -import org.datavec.hadoop.records.reader.mapfile.IndexToKey; -import org.datavec.hadoop.records.reader.mapfile.MapFileRecordReader; -import org.datavec.hadoop.records.reader.mapfile.MapFileSequenceRecordReader; -import org.datavec.hadoop.records.reader.mapfile.index.LongIndexToKey; -import org.datavec.hadoop.records.reader.mapfile.record.RecordWritable; -import org.datavec.hadoop.records.reader.mapfile.record.SequenceRecordWritable; -import org.junit.AfterClass; -import org.junit.BeforeClass; -import org.junit.Test; - -import java.io.File; -import java.io.IOException; -import java.lang.reflect.Field; -import java.net.URI; -import java.util.*; - -import static org.junit.Assert.*; - -/** - * Basically the same as TestMapfileRecordReader, but we have multiple parts as per say a Spark save operation - * Paths are like - * /part-r-00000/data - * /part-r-00000/index - * /part-r-00001/data - * /part-r-00001/index - * /part-r-00002/data - * /part-r-00002/index - */ -public class TestMapFileRecordReaderMultipleParts { - - private static File tempDirSeq; - private static File tempDir; - private static Path seqMapFilePath; - private static Path mapFilePath; - private static Map seqMap; - private static Map recordMap; - - @BeforeClass - public 
static void buildMapFiles() throws IOException { - - //----- Sequence RR setup ----- - - Configuration c = new Configuration(); - Class keyClass = LongWritable.class; - Class valueClass = SequenceRecordWritable.class; - - SequenceFile.Writer.Option[] opts = new SequenceFile.Writer.Option[] {MapFile.Writer.keyClass(keyClass), - SequenceFile.Writer.valueClass(valueClass)}; - - tempDirSeq = Files.createTempDir(); - File[] subdirs = new File[3]; - Path[] paths = new Path[subdirs.length]; - MapFile.Writer[] writers = new MapFile.Writer[subdirs.length]; - for (int i = 0; i < subdirs.length; i++) { - subdirs[i] = new File(tempDirSeq, "part-r-0000" + i); - subdirs[i].mkdir(); - paths[i] = new Path("file:///" + subdirs[i].getAbsolutePath()); - writers[i] = new MapFile.Writer(c, paths[i], opts); - } - seqMapFilePath = new Path("file:///" + tempDirSeq.getAbsolutePath()); - - - - seqMap = new HashMap<>(); - - for (int i = 0; i < 9; i++) { - seqMap.put(new LongWritable(i), new SequenceRecordWritable(Arrays.asList( - Arrays.asList(new Text(i + "-0"), new IntWritable(3 * i), - new DoubleWritable(3 * i)), - Arrays.asList(new Text(i + "-1"), - new IntWritable(3 * i + 1), new DoubleWritable(3 * i + 1.0)), - Arrays.asList(new Text(i + "-2"), - new IntWritable(3 * i + 2), new DoubleWritable(3 * i + 2.0))))); - } - - - //Need to write in order, to different map files separately - for (int i = 0; i < seqMap.size(); i++) { - int mapFileIdx = i / writers.length; - - LongWritable key = new LongWritable(i); - SequenceRecordWritable value = seqMap.get(key); - - writers[mapFileIdx].append(key, value); - } - - for (MapFile.Writer m : writers) { - m.close(); - } - - - //----- Standard RR setup ----- - - valueClass = RecordWritable.class; - - opts = new SequenceFile.Writer.Option[] {MapFile.Writer.keyClass(keyClass), - SequenceFile.Writer.valueClass(valueClass)}; - - tempDir = Files.createTempDir(); - subdirs = new File[3]; - paths = new Path[subdirs.length]; - writers = new 
MapFile.Writer[subdirs.length]; - for (int i = 0; i < subdirs.length; i++) { - subdirs[i] = new File(tempDir, "part-r-0000" + i); - subdirs[i].mkdir(); - paths[i] = new Path("file:///" + subdirs[i].getAbsolutePath()); - writers[i] = new MapFile.Writer(c, paths[i], opts); - } - mapFilePath = new Path("file:///" + tempDir.getAbsolutePath()); - - recordMap = new HashMap<>(); - for (int i = 0; i < 9; i++) { - recordMap.put(new LongWritable(i), new RecordWritable(Arrays.asList( - new Text(String.valueOf(i)), new IntWritable(i), new DoubleWritable(i)))); - } - - - //Need to write in order - for (int i = 0; i < recordMap.size(); i++) { - int mapFileIdx = i / writers.length; - LongWritable key = new LongWritable(i); - RecordWritable value = recordMap.get(key); - - writers[mapFileIdx].append(key, value); - } - - for (MapFile.Writer m : writers) { - m.close(); - } - - } - - @AfterClass - public static void destroyMapFiles() { - tempDirSeq.delete(); - tempDirSeq = null; - seqMapFilePath = null; - seqMap = null; - - tempDir.delete(); - tempDir = null; - mapFilePath = null; - recordMap = null; - } - - @Test - public void testSequenceRecordReader() throws Exception { - SequenceRecordReader seqRR = new MapFileSequenceRecordReader(); - URI uri = seqMapFilePath.toUri(); - InputSplit is = new FileSplit(new File(uri)); - seqRR.initialize(is); - - //Check number of records calculation - Field f = MapFileSequenceRecordReader.class.getDeclaredField("indexToKey"); - f.setAccessible(true); - IndexToKey itk = (IndexToKey) f.get(seqRR); - assertEquals(seqMap.size(), itk.getNumRecords()); - - //Check indices for each map file - List<Pair<Long, Long>> expReaderExampleIdxs = new ArrayList<>(); - expReaderExampleIdxs.add(new Pair<>(0L, 2L)); - expReaderExampleIdxs.add(new Pair<>(3L, 5L)); - expReaderExampleIdxs.add(new Pair<>(6L, 8L)); - - f = LongIndexToKey.class.getDeclaredField("readerIndices"); - f.setAccessible(true); - assertEquals(expReaderExampleIdxs, f.get(itk)); - // System.out.println(f.get(itk)); - - 
//Check standard iteration order (no randomization) - assertTrue(seqRR.hasNext()); - int count = 0; - while (seqRR.hasNext()) { - List<List<Writable>> l = seqRR.sequenceRecord(); - - assertEquals(seqMap.get(new LongWritable(count)).getSequenceRecord(), l); - - count++; - } - assertFalse(seqRR.hasNext()); - assertEquals(seqMap.size(), count); - - seqRR.close(); - - //Try the same thing, but with random order - seqRR = new MapFileSequenceRecordReader(new Random(12345)); - seqRR.initialize(is); - - //Check order is defined and as expected - f = MapFileSequenceRecordReader.class.getDeclaredField("order"); - f.setAccessible(true); - int[] order = (int[]) f.get(seqRR); - assertNotNull(order); - - int[] expOrder = new int[9]; - for (int i = 0; i < expOrder.length; i++) { - expOrder[i] = i; - } - MathUtils.shuffleArray(expOrder, new Random(12345)); - assertArrayEquals(expOrder, order); - // System.out.println(Arrays.toString(expOrder)); - - count = 0; - while (seqRR.hasNext()) { - List<List<Writable>> l = seqRR.sequenceRecord(); - assertEquals(seqMap.get(new LongWritable(expOrder[count])).getSequenceRecord(), l); - count++; - } - } - - @Test - public void testRecordReaderMultipleParts() throws Exception { - RecordReader rr = new MapFileRecordReader(); - URI uri = mapFilePath.toUri(); - InputSplit is = new FileSplit(new File(uri)); - rr.initialize(is); - - //Check number of records calculation - Field f = MapFileRecordReader.class.getDeclaredField("indexToKey"); - f.setAccessible(true); - IndexToKey itk = (IndexToKey) f.get(rr); - assertEquals(seqMap.size(), itk.getNumRecords()); - - //Check indices for each map file - List<Pair<Long, Long>> expReaderExampleIdxs = new ArrayList<>(); - expReaderExampleIdxs.add(new Pair<>(0L, 2L)); - expReaderExampleIdxs.add(new Pair<>(3L, 5L)); - expReaderExampleIdxs.add(new Pair<>(6L, 8L)); - - f = LongIndexToKey.class.getDeclaredField("readerIndices"); - f.setAccessible(true); - assertEquals(expReaderExampleIdxs, f.get(itk)); - - assertTrue(rr.hasNext()); - int count = 0; - while 
(rr.hasNext()) { - List l = rr.next(); - assertEquals(recordMap.get(new LongWritable(count)).getRecord(), l); - count++; - } - assertEquals(recordMap.size(), count); - - rr.close(); - - //Try the same thing, but with random order - rr = new MapFileRecordReader(new Random(12345)); - rr.initialize(is); - - f = MapFileRecordReader.class.getDeclaredField("order"); - f.setAccessible(true); - int[] order = (int[]) f.get(rr); - assertNotNull(order); - int[] expOrder = new int[9]; - for (int i = 0; i < expOrder.length; i++) { - expOrder[i] = i; - } - MathUtils.shuffleArray(expOrder, new Random(12345)); - assertArrayEquals(expOrder, order); - - count = 0; - while (rr.hasNext()) { - List l = rr.next(); - assertEquals(recordMap.get(new LongWritable(expOrder[count])).getRecord(), l); - count++; - } - } -} diff --git a/contrib/attic/datavec-hadoop/src/test/java/org/datavec/hadoop/records/reader/TestMapFileRecordReaderMultiplePartsSomeEmpty.java b/contrib/attic/datavec-hadoop/src/test/java/org/datavec/hadoop/records/reader/TestMapFileRecordReaderMultiplePartsSomeEmpty.java deleted file mode 100644 index f5d6765f9..000000000 --- a/contrib/attic/datavec-hadoop/src/test/java/org/datavec/hadoop/records/reader/TestMapFileRecordReaderMultiplePartsSomeEmpty.java +++ /dev/null @@ -1,311 +0,0 @@ -/* - * ****************************************************************************** - * * Copyright (c) 2021 Deeplearning4j Contributors - * * - * * This program and the accompanying materials are made available under the - * * terms of the Apache License, Version 2.0 which is available at - * * https://www.apache.org/licenses/LICENSE-2.0. - * * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * * License for the specific language governing permissions and limitations - * * under the License. 
- * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ - -package org.datavec.hadoop.records.reader; - -import org.nd4j.common.primitives.Pair; -import org.nd4j.common.util.MathUtils; -import org.nd4j.shade.guava.io.Files; -import org.apache.hadoop.conf.Configuration; -import org.apache.hadoop.fs.Path; -import org.apache.hadoop.io.*; -import org.datavec.api.records.reader.RecordReader; -import org.datavec.api.records.reader.SequenceRecordReader; -import org.datavec.api.split.FileSplit; -import org.datavec.api.split.InputSplit; -import org.datavec.api.writable.DoubleWritable; -import org.datavec.api.writable.IntWritable; -import org.datavec.api.writable.Text; -import org.datavec.api.writable.Writable; -import org.datavec.hadoop.records.reader.mapfile.IndexToKey; -import org.datavec.hadoop.records.reader.mapfile.MapFileRecordReader; -import org.datavec.hadoop.records.reader.mapfile.MapFileSequenceRecordReader; -import org.datavec.hadoop.records.reader.mapfile.index.LongIndexToKey; -import org.datavec.hadoop.records.reader.mapfile.record.RecordWritable; -import org.datavec.hadoop.records.reader.mapfile.record.SequenceRecordWritable; -import org.junit.AfterClass; -import org.junit.BeforeClass; -import org.junit.Test; - -import java.io.File; -import java.io.IOException; -import java.lang.reflect.Field; -import java.net.URI; -import java.util.*; - -import static org.junit.Assert.*; - -/** - * Basically the same as TestMapfileRecordReader, but with multiple parts (some of them empty), as produced by, say, a Spark save operation - * Paths are like - * /part-r-00000/data - * /part-r-00000/index - * /part-r-00001/data - * /part-r-00001/index - * /part-r-00002/data - * /part-r-00002/index - */ -public class TestMapFileRecordReaderMultiplePartsSomeEmpty { - - private static File tempDirSeq; - private static File tempDir; - private static Path seqMapFilePath; - private static Path mapFilePath; - private static Map<LongWritable, SequenceRecordWritable> seqMap; - private static Map<LongWritable, RecordWritable> recordMap; - - @BeforeClass 
- public static void buildMapFiles() throws IOException { - - //----- Sequence RR setup ----- - - Configuration c = new Configuration(); - Class keyClass = LongWritable.class; - Class valueClass = SequenceRecordWritable.class; - - SequenceFile.Writer.Option[] opts = new SequenceFile.Writer.Option[] {MapFile.Writer.keyClass(keyClass), - SequenceFile.Writer.valueClass(valueClass)}; - - tempDirSeq = Files.createTempDir(); - File[] subdirs = new File[3]; - Path[] paths = new Path[subdirs.length]; - MapFile.Writer[] writers = new MapFile.Writer[subdirs.length]; - for (int i = 0; i < subdirs.length; i++) { - subdirs[i] = new File(tempDirSeq, "part-r-0000" + i); - subdirs[i].mkdir(); - paths[i] = new Path("file:///" + subdirs[i].getAbsolutePath()); - writers[i] = new MapFile.Writer(c, paths[i], opts); - } - seqMapFilePath = new Path("file:///" + tempDirSeq.getAbsolutePath()); - - - - seqMap = new HashMap<>(); - - for (int i = 0; i < 6; i++) { - seqMap.put(new LongWritable(i), new SequenceRecordWritable(Arrays.asList( - Arrays.asList(new Text(i + "-0"), new IntWritable(3 * i), - new DoubleWritable(3 * i)), - Arrays.asList(new Text(i + "-1"), - new IntWritable(3 * i + 1), new DoubleWritable(3 * i + 1.0)), - Arrays.asList(new Text(i + "-2"), - new IntWritable(3 * i + 2), new DoubleWritable(3 * i + 2.0))))); - } - - - //Need to write in order, to different map files separately - for (int i = 0; i < seqMap.size(); i++) { - int mapFileIdx; - if(i < 3){ - mapFileIdx = 0; - } else { - mapFileIdx = 2; - } - - LongWritable key = new LongWritable(i); - SequenceRecordWritable value = seqMap.get(key); - - writers[mapFileIdx].append(key, value); - } - - for (MapFile.Writer m : writers) { - m.close(); - } - - - //----- Standard RR setup ----- - - valueClass = RecordWritable.class; - - opts = new SequenceFile.Writer.Option[] {MapFile.Writer.keyClass(keyClass), - SequenceFile.Writer.valueClass(valueClass)}; - - tempDir = Files.createTempDir(); - subdirs = new File[3]; - paths = new 
Path[subdirs.length]; - writers = new MapFile.Writer[subdirs.length]; - for (int i = 0; i < subdirs.length; i++) { - subdirs[i] = new File(tempDir, "part-r-0000" + i); - subdirs[i].mkdir(); - paths[i] = new Path("file:///" + subdirs[i].getAbsolutePath()); - writers[i] = new MapFile.Writer(c, paths[i], opts); - } - mapFilePath = new Path("file:///" + tempDir.getAbsolutePath()); - - recordMap = new HashMap<>(); - for (int i = 0; i < 6; i++) { - recordMap.put(new LongWritable(i), new RecordWritable(Arrays.asList( - new Text(String.valueOf(i)), new IntWritable(i), new DoubleWritable(i)))); - } - - - //Need to write in order - for (int i = 0; i < recordMap.size(); i++) { - int mapFileIdx; - if(i < 3){ - mapFileIdx = 0; - } else { - mapFileIdx = 2; - } - - LongWritable key = new LongWritable(i); - RecordWritable value = recordMap.get(key); - - writers[mapFileIdx].append(key, value); - } - - for (MapFile.Writer m : writers) { - m.close(); - } - - } - - @AfterClass - public static void destroyMapFiles() { - tempDirSeq.delete(); - tempDirSeq = null; - seqMapFilePath = null; - seqMap = null; - -// tempDir.delete(); -// tempDir = null; -// mapFilePath = null; -// seqMap = null; - } - - @Test - public void testSequenceRecordReader() throws Exception { - SequenceRecordReader seqRR = new MapFileSequenceRecordReader(); - URI uri = seqMapFilePath.toUri(); - InputSplit is = new FileSplit(new File(uri)); - seqRR.initialize(is); - - //Check number of records calculation - Field f = MapFileSequenceRecordReader.class.getDeclaredField("indexToKey"); - f.setAccessible(true); - IndexToKey itk = (IndexToKey) f.get(seqRR); - assertEquals(seqMap.size(), itk.getNumRecords()); - - //Check indices for each map file - List<Pair<Long, Long>> expReaderExampleIdxs = new ArrayList<>(); - expReaderExampleIdxs.add(new Pair<>(0L, 2L)); - expReaderExampleIdxs.add(new Pair<>(-1L, -1L)); - expReaderExampleIdxs.add(new Pair<>(3L, 5L)); - - f = LongIndexToKey.class.getDeclaredField("readerIndices"); - f.setAccessible(true); 
- assertEquals(expReaderExampleIdxs, f.get(itk)); - // System.out.println(f.get(itk)); - - //Check standard iteration order (no randomization) - assertTrue(seqRR.hasNext()); - int count = 0; - while (seqRR.hasNext()) { - List<List<Writable>> l = seqRR.sequenceRecord(); - - assertEquals(seqMap.get(new LongWritable(count)).getSequenceRecord(), l); - - count++; - } - assertFalse(seqRR.hasNext()); - assertEquals(seqMap.size(), count); - - seqRR.close(); - - //Try the same thing, but with random order - seqRR = new MapFileSequenceRecordReader(new Random(12345)); - seqRR.initialize(is); - - //Check order is defined and as expected - f = MapFileSequenceRecordReader.class.getDeclaredField("order"); - f.setAccessible(true); - int[] order = (int[]) f.get(seqRR); - assertNotNull(order); - - int[] expOrder = new int[6]; - for (int i = 0; i < expOrder.length; i++) { - expOrder[i] = i; - } - MathUtils.shuffleArray(expOrder, new Random(12345)); - assertArrayEquals(expOrder, order); - // System.out.println(Arrays.toString(expOrder)); - - count = 0; - while (seqRR.hasNext()) { - List<List<Writable>> l = seqRR.sequenceRecord(); - assertEquals(seqMap.get(new LongWritable(expOrder[count])).getSequenceRecord(), l); - count++; - } - } - - @Test - public void testRecordReaderMultipleParts() throws Exception { - RecordReader rr = new MapFileRecordReader(); - URI uri = mapFilePath.toUri(); - InputSplit is = new FileSplit(new File(uri)); - rr.initialize(is); - - //Check number of records calculation - Field f = MapFileRecordReader.class.getDeclaredField("indexToKey"); - f.setAccessible(true); - IndexToKey itk = (IndexToKey) f.get(rr); - assertEquals(recordMap.size(), itk.getNumRecords()); - - //Check indices for each map file - List<Pair<Long, Long>> expReaderExampleIdxs = new ArrayList<>(); - expReaderExampleIdxs.add(new Pair<>(0L, 2L)); - expReaderExampleIdxs.add(new Pair<>(-1L, -1L)); //Empty - expReaderExampleIdxs.add(new Pair<>(3L, 5L)); - - f = LongIndexToKey.class.getDeclaredField("readerIndices"); - f.setAccessible(true); - 
assertEquals(expReaderExampleIdxs, f.get(itk)); - - assertTrue(rr.hasNext()); - int count = 0; - while (rr.hasNext()) { - List l = rr.next(); - assertEquals(recordMap.get(new LongWritable(count)).getRecord(), l); - count++; - } - assertEquals(recordMap.size(), count); - - rr.close(); - - //Try the same thing, but with random order - rr = new MapFileRecordReader(new Random(12345)); - rr.initialize(is); - - f = MapFileRecordReader.class.getDeclaredField("order"); - f.setAccessible(true); - int[] order = (int[]) f.get(rr); - assertNotNull(order); - int[] expOrder = new int[recordMap.size()]; - for (int i = 0; i < expOrder.length; i++) { - expOrder[i] = i; - } - MathUtils.shuffleArray(expOrder, new Random(12345)); - assertArrayEquals(expOrder, order); - - count = 0; - while (rr.hasNext()) { - List l = rr.next(); - assertEquals(recordMap.get(new LongWritable(expOrder[count])).getRecord(), l); - count++; - } - } -} diff --git a/contrib/attic/datavec-hadoop/src/test/java/org/datavec/hadoop/records/writer/TestMapFileRecordWriter.java b/contrib/attic/datavec-hadoop/src/test/java/org/datavec/hadoop/records/writer/TestMapFileRecordWriter.java deleted file mode 100644 index c21b44cbe..000000000 --- a/contrib/attic/datavec-hadoop/src/test/java/org/datavec/hadoop/records/writer/TestMapFileRecordWriter.java +++ /dev/null @@ -1,237 +0,0 @@ -/* - * ****************************************************************************** - * * Copyright (c) 2021 Deeplearning4j Contributors - * * - * * This program and the accompanying materials are made available under the - * * terms of the Apache License, Version 2.0 which is available at - * * https://www.apache.org/licenses/LICENSE-2.0. - * * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. 
See the - * * License for the specific language governing permissions and limitations - * * under the License. - * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ - -package org.datavec.hadoop.records.writer; - -import org.nd4j.shade.guava.io.Files; -import org.datavec.api.records.converter.RecordReaderConverter; -import org.datavec.api.records.reader.RecordReader; -import org.datavec.api.records.reader.SequenceRecordReader; -import org.datavec.api.records.reader.impl.csv.CSVNLinesSequenceRecordReader; -import org.datavec.api.records.reader.impl.csv.CSVRecordReader; -import org.datavec.api.records.writer.RecordWriter; -import org.datavec.api.records.writer.SequenceRecordWriter; -import org.datavec.api.split.FileSplit; -import org.datavec.api.writable.FloatWritable; -import org.datavec.api.writable.Writable; -import org.datavec.api.writable.WritableType; -import org.datavec.hadoop.records.reader.mapfile.MapFileRecordReader; -import org.datavec.hadoop.records.reader.mapfile.MapFileSequenceRecordReader; -import org.datavec.hadoop.records.writer.mapfile.MapFileRecordWriter; -import org.datavec.hadoop.records.writer.mapfile.MapFileSequenceRecordWriter; -import org.junit.Test; -import org.nd4j.common.io.ClassPathResource; - -import java.io.File; -import java.util.ArrayList; -import java.util.List; - -import static org.junit.Assert.assertEquals; - -/** - * Created by Alex on 07/07/2017. - */ -public class TestMapFileRecordWriter { - - @Test - public void testWriter() throws Exception { - - for(boolean convertWritables : new boolean[]{false, true}) { - - File tempDirSingle = Files.createTempDir(); - File tempDirMultiple = Files.createTempDir(); - File tempDirBatch = Files.createTempDir(); - - tempDirSingle.deleteOnExit(); - tempDirMultiple.deleteOnExit(); - tempDirBatch.deleteOnExit(); - - WritableType textWritablesTo = convertWritables ? 
WritableType.Float : null; - - RecordWriter singlePartWriter = new MapFileRecordWriter(tempDirSingle, -1, textWritablesTo); - RecordWriter multiPartWriter = new MapFileRecordWriter(tempDirMultiple, 30, textWritablesTo); - RecordWriter multiPartBatch = new MapFileRecordWriter(tempDirBatch, 30, textWritablesTo); - - RecordReader rr = new CSVRecordReader(); - ClassPathResource cpr = new ClassPathResource("iris.dat"); - rr.initialize(new FileSplit(cpr.getFile())); - - RecordReaderConverter.convert(rr, singlePartWriter); - rr.reset(); - RecordReaderConverter.convert(rr, multiPartWriter); - - rr.reset(); - List<List<Writable>> allLines = new ArrayList<>(); - while(rr.hasNext()){allLines.add(rr.next());} - multiPartBatch.writeBatch(allLines); - - singlePartWriter.close(); - multiPartWriter.close(); - multiPartBatch.close(); - - RecordReader rr1 = new MapFileRecordReader(); - RecordReader rr2 = new MapFileRecordReader(); - RecordReader rr3 = new MapFileRecordReader(); - rr1.initialize(new FileSplit(tempDirSingle)); - rr2.initialize(new FileSplit(tempDirMultiple)); - rr3.initialize(new FileSplit(tempDirBatch)); - - List<List<Writable>> exp = new ArrayList<>(); - List<List<Writable>> s1 = new ArrayList<>(); - List<List<Writable>> s2 = new ArrayList<>(); - List<List<Writable>> s3 = new ArrayList<>(); - - rr.reset(); - while (rr.hasNext()) { - exp.add(rr.next()); - } - - while (rr1.hasNext()) { - s1.add(rr1.next()); - } - - while (rr2.hasNext()) { - s2.add(rr2.next()); - } - - while (rr3.hasNext()) { - s3.add(rr3.next()); - } - - assertEquals(150, exp.size()); - - if(convertWritables){ - List<List<Writable>> asFloat = new ArrayList<>(); - for(List<Writable> l : exp ){ - List<Writable> newList = new ArrayList<>(); - for(Writable w : l){ - newList.add(new FloatWritable(w.toFloat())); - } - asFloat.add(newList); - } - - exp = asFloat; - } - - assertEquals(exp, s1); - assertEquals(exp, s2); - assertEquals(exp, s3); - - - //By default: we won't be doing any conversion of text types. 
CsvRecordReader outputs Text writables - for (List<Writable> l : s1) { - for (Writable w : l) { - if(convertWritables){ - assertEquals(WritableType.Float, w.getType()); - } else { - assertEquals(WritableType.Text, w.getType()); - } - } - } - } - } - - - @Test - public void testSequenceWriter() throws Exception { - - for(boolean convertWritables : new boolean[]{false, true}) { - - File tempDirSingle = Files.createTempDir(); - File tempDirMultiple = Files.createTempDir(); - - tempDirSingle.deleteOnExit(); - tempDirMultiple.deleteOnExit(); - - WritableType textWritablesTo = convertWritables ? WritableType.Float : null; - - SequenceRecordWriter singlePartWriter = new MapFileSequenceRecordWriter(tempDirSingle, -1, textWritablesTo); - SequenceRecordWriter multiPartWriter = new MapFileSequenceRecordWriter(tempDirMultiple, 10, textWritablesTo); - - SequenceRecordReader rr = new CSVNLinesSequenceRecordReader(5); - ClassPathResource cpr = new ClassPathResource("iris.dat"); - rr.initialize(new FileSplit(cpr.getFile())); - - RecordReaderConverter.convert(rr, singlePartWriter); - rr.reset(); - RecordReaderConverter.convert(rr, multiPartWriter); - - singlePartWriter.close(); - multiPartWriter.close(); - - SequenceRecordReader rr1 = new MapFileSequenceRecordReader(); - SequenceRecordReader rr2 = new MapFileSequenceRecordReader(); - rr1.initialize(new FileSplit(tempDirSingle)); - rr2.initialize(new FileSplit(tempDirMultiple)); - - List<List<List<Writable>>> exp = new ArrayList<>(); - List<List<List<Writable>>> s1 = new ArrayList<>(); - List<List<List<Writable>>> s2 = new ArrayList<>(); - - rr.reset(); - while (rr.hasNext()) { - exp.add(rr.sequenceRecord()); - } - - while (rr1.hasNext()) { - s1.add(rr1.sequenceRecord()); - } - - while (rr2.hasNext()) { - s2.add(rr2.sequenceRecord()); - } - - assertEquals(150/5, exp.size()); - - if(convertWritables){ - List<List<List<Writable>>> asFloat = new ArrayList<>(); - for(List<List<Writable>> sequence : exp ){ - List<List<Writable>> newSeq = new ArrayList<>(); - for(List<Writable> step : sequence ){ - List<Writable> newStep = new ArrayList<>(); - for(Writable w : step){ - 
newStep.add(new FloatWritable(w.toFloat())); - } - newSeq.add(newStep); - } - asFloat.add(newSeq); - } - exp = asFloat; - } - - assertEquals(exp, s1); - assertEquals(exp, s2); - - - //By default: we won't be doing any conversion of text types. CsvRecordReader outputs Text writables - for(List> seq : s1) { - for (List l : seq) { - for (Writable w : l) { - if (convertWritables) { - assertEquals(WritableType.Float, w.getType()); - } else { - assertEquals(WritableType.Text, w.getType()); - } - } - } - } - } - } - - -} diff --git a/contrib/attic/datavec-hadoop/src/test/resources/log4j.properties b/contrib/attic/datavec-hadoop/src/test/resources/log4j.properties deleted file mode 100644 index c5d6b5f4c..000000000 --- a/contrib/attic/datavec-hadoop/src/test/resources/log4j.properties +++ /dev/null @@ -1,44 +0,0 @@ -# -# /* ****************************************************************************** -# * -# * -# * This program and the accompanying materials are made available under the -# * terms of the Apache License, Version 2.0 which is available at -# * https://www.apache.org/licenses/LICENSE-2.0. -# * -# * See the NOTICE file distributed with this work for additional -# * information regarding copyright ownership. -# * Unless required by applicable law or agreed to in writing, software -# * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT -# * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the -# * License for the specific language governing permissions and limitations -# * under the License. 
-# * -# * SPDX-License-Identifier: Apache-2.0 -# ******************************************************************************/ -# - -log4j.rootLogger=ERROR, Console -log4j.logger.play=DEBUG -log4j.appender.Console=org.apache.log4j.ConsoleAppender -log4j.appender.Console.layout=org.apache.log4j.PatternLayout -log4j.appender.Console.layout.ConversionPattern=%d{ABSOLUTE} %-5p ~ %m%n - -log4j.appender.org.springframework=DEBUG -log4j.appender.org.nd4j=INFO -log4j.appender.org.canova=INFO -log4j.appender.org.datavec=INFO -log4j.appender.org.deeplearning4j=INFO -log4j.appender.opennlp.uima=OFF -log4j.appender.org.apache.uima=OFF -log4j.appender.org.cleartk=OFF - -log4j.logger.org.springframework=INFO -log4j.logger.org.nd4j=INFO -log4j.logger.org.canova=INFO -log4j.logger.org.datavec=INFO -log4j.logger.org.apache.spark=WARN -log4j.logger.org.deeplearning4j=INFO -log4j.logger.opennlp.uima.util=OFF -log4j.logger.org.apache.uima=OFF -log4j.logger.org.cleartk=OFF \ No newline at end of file diff --git a/contrib/attic/datavec-hadoop/src/test/resources/logback.xml b/contrib/attic/datavec-hadoop/src/test/resources/logback.xml deleted file mode 100644 index abb9912c7..000000000 --- a/contrib/attic/datavec-hadoop/src/test/resources/logback.xml +++ /dev/null @@ -1,53 +0,0 @@ - - - - - - logs/application.log - - %date - [%level] - from %logger in %thread - %n%message%n%xException%n - - - - - - %logger{15} - %message%n%xException{5} - - - - - - - - - - - - - - - - - - - - \ No newline at end of file diff --git a/contrib/attic/datavec-python/pom.xml b/contrib/attic/datavec-python/pom.xml deleted file mode 100644 index dbee86fee..000000000 --- a/contrib/attic/datavec-python/pom.xml +++ /dev/null @@ -1,78 +0,0 @@ - - - - - - 4.0.0 - - - org.datavec - datavec-parent - 1.0.0-SNAPSHOT - - - datavec-python - - - - org.json - json - 20190722 - - - org.bytedeco - cpython-platform - ${cpython-platform.version} - - - org.bytedeco - numpy-platform - ${numpy.javacpp.version} - - - 
com.google.code.findbugs - jsr305 - 3.0.2 - - - org.datavec - datavec-api - ${project.version} - - - - org.nd4j - nd4j-native-api - ${project.version} - - - - - - test-nd4j-native - - - test-nd4j-cuda-11.0 - - - diff --git a/contrib/attic/datavec-python/src/main/java/org/datavec/python/NumpyArray.java b/contrib/attic/datavec-python/src/main/java/org/datavec/python/NumpyArray.java deleted file mode 100644 index f7442c38a..000000000 --- a/contrib/attic/datavec-python/src/main/java/org/datavec/python/NumpyArray.java +++ /dev/null @@ -1,149 +0,0 @@ -/* - * ****************************************************************************** - * * Copyright (c) 2021 Deeplearning4j Contributors - * * - * * This program and the accompanying materials are made available under the - * * terms of the Apache License, Version 2.0 which is available at - * * https://www.apache.org/licenses/LICENSE-2.0. - * * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * * License for the specific language governing permissions and limitations - * * under the License. 
- * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ - -package org.datavec.python; - -import lombok.Builder; -import lombok.Getter; -import lombok.NoArgsConstructor; -import org.apache.commons.lang3.ArrayUtils; -import org.bytedeco.javacpp.Pointer; -import org.nd4j.linalg.api.buffer.DataBuffer; -import org.nd4j.linalg.api.concurrency.AffinityManager; -import org.nd4j.linalg.api.ndarray.INDArray; -import org.nd4j.linalg.api.shape.Shape; -import org.nd4j.linalg.factory.Nd4j; -import org.nd4j.nativeblas.NativeOps; -import org.nd4j.nativeblas.NativeOpsHolder; -import org.nd4j.linalg.api.buffer.DataType; - -import java.util.Arrays; -import java.util.HashMap; -import java.util.Map; - -import static org.nd4j.linalg.api.buffer.DataType.FLOAT; - - -/** - * Wrapper around INDArray for initializing from numpy array - * - * @author Fariz Rahman - */ -@Getter -@NoArgsConstructor -public class NumpyArray { - - private static NativeOps nativeOps; - private static Map arrayCache; // Avoids re-allocation of device buffer - private long address; - private long[] shape; - private long[] strides; - private DataType dtype; - private INDArray nd4jArray; - - static { - //initialize - Nd4j.scalar(1.0); - nativeOps = NativeOpsHolder.getInstance().getDeviceNativeOps(); - arrayCache = new HashMap<>(); - } - - @Builder - public NumpyArray(long address, long[] shape, long strides[], DataType dtype, boolean copy) { - this.address = address; - this.shape = shape; - this.strides = strides; - this.dtype = dtype; - setND4JArray(); - if (copy) { - nd4jArray = nd4jArray.dup(); - Nd4j.getAffinityManager().ensureLocation(nd4jArray, AffinityManager.Location.HOST); - this.address = nd4jArray.data().address(); - } - } - - - - public NumpyArray copy() { - return new NumpyArray(nd4jArray.dup()); - } - - public NumpyArray(long address, long[] shape, long strides[]) { - this(address, shape, strides, FLOAT, false); - } - - 
public NumpyArray(long address, long[] shape, long strides[], DataType dtype) { - this(address, shape, strides, dtype, false); - } - - - private void setND4JArray() { - - long size = 1; - for (long d : shape) { - size *= d; - } - - String cacheKey = address + "_" + size + "_" + dtype + "_" + ArrayUtils.toString(strides); - nd4jArray = arrayCache.get(cacheKey); - if (nd4jArray == null) { - Pointer ptr = nativeOps.pointerForAddress(address); - ptr = ptr.limit(size); - ptr = ptr.capacity(size); - DataBuffer buff = Nd4j.createBuffer(ptr, size, dtype); - - int elemSize = buff.getElementSize(); - long[] nd4jStrides = new long[strides.length]; - for (int i = 0; i < strides.length; i++) { - nd4jStrides[i] = strides[i] / elemSize; - } - - nd4jArray = Nd4j.create(buff, shape, nd4jStrides, 0, Shape.getOrder(shape, nd4jStrides, 1), dtype); - arrayCache.put(cacheKey, nd4jArray); - } - else{ - if (!Arrays.equals(nd4jArray.shape(), shape)){ - nd4jArray = nd4jArray.reshape(shape); - } - } - Nd4j.getAffinityManager().ensureLocation(nd4jArray, AffinityManager.Location.HOST); - } - - public INDArray getNd4jArray(){ - Nd4j.getAffinityManager().tagLocation(nd4jArray, AffinityManager.Location.HOST); - return nd4jArray; - } - - public NumpyArray(INDArray nd4jArray) { - Nd4j.getAffinityManager().ensureLocation(nd4jArray, AffinityManager.Location.HOST); - DataBuffer buff = nd4jArray.data(); - address = buff.pointer().address(); - shape = nd4jArray.shape(); - long[] nd4jStrides = nd4jArray.stride(); - strides = new long[nd4jStrides.length]; - int elemSize = buff.getElementSize(); - for (int i = 0; i < strides.length; i++) { - strides[i] = nd4jStrides[i] * elemSize; - } - dtype = nd4jArray.dataType(); - this.nd4jArray = nd4jArray; - String cacheKey = address + "_" + nd4jArray.length() + "_" + dtype + "_" + ArrayUtils.toString(strides); - arrayCache.put(cacheKey, nd4jArray); - } - -} \ No newline at end of file diff --git 
a/contrib/attic/datavec-python/src/main/java/org/datavec/python/Python.java b/contrib/attic/datavec-python/src/main/java/org/datavec/python/Python.java deleted file mode 100644 index 09e502c99..000000000 --- a/contrib/attic/datavec-python/src/main/java/org/datavec/python/Python.java +++ /dev/null @@ -1,277 +0,0 @@ - -/* - * ****************************************************************************** - * * Copyright (c) 2021 Deeplearning4j Contributors - * * - * * This program and the accompanying materials are made available under the - * * terms of the Apache License, Version 2.0 which is available at - * * https://www.apache.org/licenses/LICENSE-2.0. - * * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * * License for the specific language governing permissions and limitations - * * under the License. - * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ - -package org.datavec.python; - - -import org.bytedeco.cpython.PyObject; - -import static org.bytedeco.cpython.global.python.*; -import static org.bytedeco.numpy.global.numpy.PyArray_EnsureArray; - -/** - * Swift like python wrapper for Java - * - * @author Fariz Rahman - */ - -public class Python { - - /** - * Imports a python module, similar to python import statement. 
- * @param moduleName name of the module to be imported - * @return reference to the module object - * @throws PythonException - */ - public static PythonObject importModule(String moduleName) throws PythonException{ - PythonObject module = new PythonObject(PyImport_ImportModule(moduleName)); - if (module.isNone()) { - throw new PythonException("Error importing module: " + moduleName); - } - return module; - } - - public static PythonObject attr(String attrName) { - return builtins().attr(attrName); - } - - public static PythonObject len(PythonObject pythonObject) { - return attr("len").call(pythonObject); - } - - public static PythonObject str(PythonObject pythonObject) { - return attr("str").call(pythonObject); - } - - public static PythonObject str() { - return attr("str").call(); - } - - public static PythonObject strType() { - return attr("str"); - } - - public static PythonObject float_(PythonObject pythonObject) { - return attr("float").call(pythonObject); - } - - public static PythonObject float_() { - return attr("float").call(); - } - - public static PythonObject floatType() { - return attr("float"); - } - - public static PythonObject bool(PythonObject pythonObject) { - return attr("bool").call(pythonObject); - } - - public static PythonObject bool() { - return attr("bool").call(); - } - - public static PythonObject boolType() { - return attr("bool"); - } - - public static PythonObject int_(PythonObject pythonObject) { - return attr("int").call(pythonObject); - } - - public static PythonObject int_() { - return attr("int").call(); - } - - public static PythonObject intType() { - return attr("int"); - } - - public static PythonObject list(PythonObject pythonObject) { - return attr("list").call(pythonObject); - } - - public static PythonObject list() { - return attr("list").call(); - } - - public static PythonObject listType() { - return attr("list"); - } - - public static PythonObject dict(PythonObject pythonObject) { - return 
attr("dict").call(pythonObject); - } - - public static PythonObject dict() { - return attr("dict").call(); - } - - public static PythonObject dictType() { - return attr("dict"); - } - - public static PythonObject set(PythonObject pythonObject) { - return attr("set").call(pythonObject); - } - - public static PythonObject set() { - return attr("set").call(); - } - - public static PythonObject bytearray(PythonObject pythonObject) { - return attr("bytearray").call(pythonObject); - } - - public static PythonObject bytearray() { - return attr("bytearray").call(); - } - - public static PythonObject bytearrayType() { - return attr("bytearray"); - } - - public static PythonObject memoryview(PythonObject pythonObject) { - return attr("memoryview").call(pythonObject); - } - - public static PythonObject memoryviewType() { - return attr("memoryview"); - } - - public static PythonObject bytes(PythonObject pythonObject) { - return attr("bytes").call(pythonObject); - } - - public static PythonObject bytes() { - return attr("bytes").call(); - } - - public static PythonObject bytesType() { - return attr("bytes"); - } - - public static PythonObject tuple(PythonObject pythonObject) { - return attr("tuple").call(pythonObject); - } - - public static PythonObject tuple() { - return attr("tuple").call(); - } - - - public static PythonObject Exception(PythonObject pythonObject) { - return attr("Exception").call(pythonObject); - } - - public static PythonObject Exception() { - return attr("Exception").call(); - } - - public static PythonObject ExceptionType() { - return attr("Exception"); - } - - - public static PythonObject tupleType() { - return attr("tuple"); - } - public static PythonObject globals() { - return new PythonObject(PyModule_GetDict(PyImport_ImportModule("__main__"))); - } - - public static PythonObject type(PythonObject obj) { - return attr("type").call(obj); - } - - public static boolean isinstance(PythonObject obj, PythonObject... 
type) { - return PyObject_IsInstance(obj.getNativePythonObject(), - PyList_AsTuple(new PythonObject(type).getNativePythonObject())) != 0; - } - - public static PythonObject eval(String code) { - PyObject compiledCode = Py_CompileString(code, "", Py_eval_input); - PyObject globals = globals().getNativePythonObject(); - PyObject locals = Python.dict().getNativePythonObject(); - return new PythonObject(PyEval_EvalCode(compiledCode, globals, locals)); - } - - - public static PythonObject builtins(){ - try{ - return importModule("builtins"); - }catch (PythonException pe){ - throw new IllegalStateException("Unable to import builtins: " + pe); // this should never happen - } - - } - - public static PythonObject None() { - return dict().attr("get").call(0); - } - - public static PythonObject True() { - return boolType().call(1); - } - - public static PythonObject False() { - return boolType().call(0); - - } - - public static PythonObject ndarray(PythonObject pythonObject){ - return new PythonObject(PyArray_EnsureArray(pythonObject.getNativePythonObject())); - } - - public static boolean callable(PythonObject pythonObject) { - return PyCallable_Check(pythonObject.getNativePythonObject()) == 1; - } - - - public static void setContext(String context) throws PythonException{ - PythonContextManager.setContext(context); - } - - public static String getCurrentContext(){ - return PythonContextManager.getCurrentContext(); - } - - public static void deleteContext(String context) throws PythonException{ - PythonContextManager.deleteContext(context); - } - - public static void deleteNonMainContexts(){ - PythonContextManager.deleteNonMainContexts(); - } - - public static void setMainContext(){PythonContextManager.setMainContext();} - - public static void exec(String code)throws PythonException{ - PythonExecutioner.exec(code); - } - public static void exec(String code, PythonVariables inputs, PythonVariables outputs) throws PythonException{ - PythonExecutioner.exec(code, inputs, 
outputs); - } - - public static PythonGIL lock(){ - return PythonGIL.lock(); - } - - -} diff --git a/contrib/attic/datavec-python/src/main/java/org/datavec/python/PythonCondition.java b/contrib/attic/datavec-python/src/main/java/org/datavec/python/PythonCondition.java deleted file mode 100644 index c7b32a280..000000000 --- a/contrib/attic/datavec-python/src/main/java/org/datavec/python/PythonCondition.java +++ /dev/null @@ -1,164 +0,0 @@ -/* - * ****************************************************************************** - * * Copyright (c) 2021 Deeplearning4j Contributors - * * - * * This program and the accompanying materials are made available under the - * * terms of the Apache License, Version 2.0 which is available at - * * https://www.apache.org/licenses/LICENSE-2.0. - * * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * * License for the specific language governing permissions and limitations - * * under the License. - * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ - -package org.datavec.python; - -import org.datavec.api.transform.condition.Condition; -import org.datavec.api.transform.schema.Schema; -import org.datavec.api.writable.*; -import java.util.List; - -import static org.datavec.python.PythonUtils.schemaToPythonVariables; -import static org.nd4j.common.base.Preconditions.checkNotNull; -import static org.nd4j.common.base.Preconditions.checkState; - -/** - * Lets a condition be defined as a python method f that takes no arguments - * and returns a boolean indicating whether or not to filter a row. - * The values of all columns in current row are available as global variables to f. 
- * - * @author Fariz Rahman - */ -public class PythonCondition implements Condition { - - private Schema inputSchema; - private PythonVariables pyInputs; - private PythonTransform pythonTransform; - private String code; - - - public PythonCondition(String pythonCode) { - checkNotNull(pythonCode, "Python code must not be null!"); - checkState(!pythonCode.isEmpty(), "Python code must not be empty!"); - code = pythonCode; - } - - - @Override - public void setInputSchema(Schema inputSchema) { - this.inputSchema = inputSchema; - try { - pyInputs = schemaToPythonVariables(inputSchema); - PythonVariables pyOuts = new PythonVariables(); - pyOuts.addInt("out"); - pythonTransform = PythonTransform.builder() - .code(code + "\n\nout=f()\nout=0 if out is None else int(out)") - .inputs(pyInputs) - .outputs(pyOuts) - .build(); - - } catch (Exception e) { - throw new RuntimeException(e); - } - - - } - - @Override - public Schema getInputSchema() { - return inputSchema; - } - - @Override - public String[] outputColumnNames() { - String[] columnNames = new String[inputSchema.numColumns()]; - inputSchema.getColumnNames().toArray(columnNames); - return columnNames; - } - - @Override - public String outputColumnName() { - return outputColumnNames()[0]; - } - - @Override - public String[] columnNames() { - return outputColumnNames(); - } - - @Override - public String columnName() { - return outputColumnName(); - } - - @Override - public Schema transform(Schema inputSchema) { - return inputSchema; - } - - @Override - public boolean condition(List<Writable> list) { - PythonVariables inputs = getPyInputsFromWritables(list); - try { - pythonTransform.getPythonJob().exec(inputs, pythonTransform.getOutputs()); - boolean ret = pythonTransform.getOutputs().getIntValue("out") != 0; - return ret; - } catch (Exception e) { - throw new RuntimeException(e); - } - } - - @Override - public boolean condition(Object input) { - return condition((List<Writable>) input); - } - - @Override - public boolean conditionSequence(List<List<Writable>>
list) { - throw new UnsupportedOperationException("not supported"); - } - - - @Override - public boolean conditionSequence(Object input) { - throw new UnsupportedOperationException("not supported"); - } - - private PythonVariables getPyInputsFromWritables(List writables) { - PythonVariables ret = new PythonVariables(); - - for (int i = 0; i < inputSchema.numColumns(); i++) { - String name = inputSchema.getName(i); - Writable w = writables.get(i); - PythonType pyType = pyInputs.getType(inputSchema.getName(i)); - switch (pyType.getName()) { - case INT: - if (w instanceof LongWritable) { - ret.addInt(name, ((LongWritable) w).get()); - } else { - ret.addInt(name, ((IntWritable) w).get()); - } - - break; - case FLOAT: - ret.addFloat(name, ((DoubleWritable) w).get()); - break; - case STR: - ret.addStr(name, w.toString()); - break; - case NDARRAY: - ret.addNDArray(name, ((NDArrayWritable) w).get()); - break; - } - } - - return ret; - } - - -} \ No newline at end of file diff --git a/contrib/attic/datavec-python/src/main/java/org/datavec/python/PythonContextManager.java b/contrib/attic/datavec-python/src/main/java/org/datavec/python/PythonContextManager.java deleted file mode 100644 index aee85c659..000000000 --- a/contrib/attic/datavec-python/src/main/java/org/datavec/python/PythonContextManager.java +++ /dev/null @@ -1,190 +0,0 @@ - -/* - * ****************************************************************************** - * * Copyright (c) 2021 Deeplearning4j Contributors - * * - * * This program and the accompanying materials are made available under the - * * terms of the Apache License, Version 2.0 which is available at - * * https://www.apache.org/licenses/LICENSE-2.0. - * * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. 
See the - * * License for the specific language governing permissions and limitations - * * under the License. - * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ - - -package org.datavec.python; - - -import java.util.HashSet; -import java.util.Set; -import java.util.concurrent.atomic.AtomicBoolean; - -/** - * Emulates multiple interpreters within a single interpreter. - * This works by simply obfuscating/de-obfuscating variable names - * such that only the required subset of the global namespace is "visible" - * at any given time. - * By default, there exists a "main" context emulating the default interpreter, - * which cannot be deleted. - * @author Fariz Rahman - */ - - -public class PythonContextManager { - - private static Set<String> contexts = new HashSet<>(); - private static AtomicBoolean init = new AtomicBoolean(false); - private static String currentContext; - private static final String MAIN_CONTEXT = "main"; - static { - init(); - } - - private static void init() { - if (init.get()) return; - new PythonExecutioner(); - init.set(true); - currentContext = MAIN_CONTEXT; - contexts.add(currentContext); - } - - - public static void addContext(String contextName) throws PythonException { - if (!validateContextName(contextName)) { - throw new PythonException("Invalid context name: " + contextName); - } - contexts.add(contextName); - } - - public static boolean hasContext(String contextName) { - return contexts.contains(contextName); - } - - - public static boolean validateContextName(String s) { - if (s.length() == 0) return false; - if (!Character.isJavaIdentifierStart(s.charAt(0))) return false; - for (int i = 1; i < s.length(); i++) - if (!Character.isJavaIdentifierPart(s.charAt(i))) - return false; - return true; - } - - private static String getContextPrefix(String contextName) { - return "__collapsed__" + contextName + "__"; - } - - private static String
getCollapsedVarNameForContext(String varName, String contextName) { - return getContextPrefix(contextName) + varName; - } - - private static String expandCollapsedVarName(String varName, String contextName) { - String prefix = "__collapsed__" + contextName + "__"; - return varName.substring(prefix.length()); - - } - - private static void collapseContext(String contextName) { - PythonObject globals = Python.globals(); - PythonObject keysList = Python.list(globals.attr("keys").call()); - int numKeys = Python.len(keysList).toInt(); - for (int i = 0; i < numKeys; i++) { - PythonObject key = keysList.get(i); - String keyStr = key.toString(); - if (!((keyStr.startsWith("__") && keyStr.endsWith("__")) || keyStr.startsWith("__collapsed_"))) { - String collapsedKey = getCollapsedVarNameForContext(keyStr, contextName); - PythonObject val = globals.attr("pop").call(key); - globals.set(new PythonObject(collapsedKey), val); - } - } - } - - private static void expandContext(String contextName) { - String prefix = getContextPrefix(contextName); - PythonObject globals = Python.globals(); - PythonObject keysList = Python.list(globals.attr("keys").call()); - int numKeys = Python.len(keysList).toInt(); - for (int i = 0; i < numKeys; i++) { - PythonObject key = keysList.get(i); - String keyStr = key.toString(); - if (keyStr.startsWith(prefix)) { - String expandedKey = expandCollapsedVarName(keyStr, contextName); - PythonObject val = globals.attr("pop").call(key); - globals.set(new PythonObject(expandedKey), val); - } - } - - } - - public static void setContext(String contextName) throws PythonException{ - if (contextName.equals(currentContext)) { - return; - } - if (!hasContext(contextName)) { - addContext(contextName); - } - collapseContext(currentContext); - expandContext(contextName); - currentContext = contextName; - - } - - public static void setMainContext() { - try{ - setContext(MAIN_CONTEXT); - } - catch (PythonException pe){ - throw new RuntimeException(pe); - } - - } - - 
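The collapse/expand renaming above is the whole trick behind context emulation. It can be sketched in isolation with a plain `Map` standing in for the CPython globals dict — a hedged mock for illustration, not the deleted class's actual implementation (all names here are hypothetical):

```java
import java.util.Map;

// Standalone sketch of the variable-name collapsing used to emulate multiple
// contexts in one interpreter. A Map stands in for the CPython globals dict.
class ContextSketch {
    static String prefix(String ctx) { return "__collapsed__" + ctx + "__"; }

    // Hide every user variable under a context-specific prefix.
    static void collapse(Map<String, Object> globals, String ctx) {
        for (String key : globals.keySet().toArray(new String[0])) {
            boolean dunder = key.startsWith("__") && key.endsWith("__");
            if (!dunder && !key.startsWith("__collapsed__")) {
                globals.put(prefix(ctx) + key, globals.remove(key));
            }
        }
    }

    // Restore the variables belonging to one context.
    static void expand(Map<String, Object> globals, String ctx) {
        String p = prefix(ctx);
        for (String key : globals.keySet().toArray(new String[0])) {
            if (key.startsWith(p)) {
                globals.put(key.substring(p.length()), globals.remove(key));
            }
        }
    }
}
```

A context switch is then just `collapse(currentContext)` followed by `expand(newContext)`; dunder names like `__name__` stay visible in every context.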
public static String getCurrentContext() { - return currentContext; - } - - public static void deleteContext(String contextName) throws PythonException { - if (contextName.equals(MAIN_CONTEXT)) { - throw new PythonException("Can not delete main context!"); - } - if (contextName.equals(currentContext)) { - throw new PythonException("Can not delete current context!"); - } - String prefix = getContextPrefix(contextName); - PythonObject globals = Python.globals(); - PythonObject keysList = Python.list(globals.attr("keys").call()); - int numKeys = Python.len(keysList).toInt(); - for (int i = 0; i < numKeys; i++) { - PythonObject key = keysList.get(i); - String keyStr = key.toString(); - if (keyStr.startsWith(prefix)) { - globals.attr("__delitem__").call(key); - } - } - contexts.remove(contextName); - } - - public static void deleteNonMainContexts() { - try{ - setContext(MAIN_CONTEXT); // will never fail - for (String c : contexts.toArray(new String[0])) { - if (!c.equals(MAIN_CONTEXT)) { - deleteContext(c); // will never fail - } - } - }catch(Exception e){ - throw new RuntimeException(e); - } - } - - public String[] getContexts() { - return contexts.toArray(new String[0]); - } - -} diff --git a/contrib/attic/datavec-python/src/main/java/org/datavec/python/PythonException.java b/contrib/attic/datavec-python/src/main/java/org/datavec/python/PythonException.java deleted file mode 100644 index 2ae4d78a8..000000000 --- a/contrib/attic/datavec-python/src/main/java/org/datavec/python/PythonException.java +++ /dev/null @@ -1,46 +0,0 @@ - -/* - * ****************************************************************************** - * * Copyright (c) 2021 Deeplearning4j Contributors - * * - * * This program and the accompanying materials are made available under the - * * terms of the Apache License, Version 2.0 which is available at - * * https://www.apache.org/licenses/LICENSE-2.0. 
- * * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * * License for the specific language governing permissions and limitations - * * under the License. - * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ - -package org.datavec.python; - -/** - * Thrown when an exception occurs in python land - */ -public class PythonException extends Exception { - public PythonException(String message){ - super(message); - } - private static String getExceptionString(PythonObject exception){ - if (Python.isinstance(exception, Python.ExceptionType())){ - String exceptionClass = Python.type(exception).attr("__name__").toString(); - String message = exception.toString(); - return exceptionClass + ": " + message; - } - return exception.toString(); - } - public PythonException(PythonObject exception){ - this(getExceptionString(exception)); - } - public PythonException(String message, Throwable cause){ - super(message, cause); - } - public PythonException(Throwable cause){ - super(cause); - } -} diff --git a/contrib/attic/datavec-python/src/main/java/org/datavec/python/PythonExecutioner.java b/contrib/attic/datavec-python/src/main/java/org/datavec/python/PythonExecutioner.java deleted file mode 100644 index 8bd2ab40c..000000000 --- a/contrib/attic/datavec-python/src/main/java/org/datavec/python/PythonExecutioner.java +++ /dev/null @@ -1,402 +0,0 @@ - -/* - * ****************************************************************************** - * * Copyright (c) 2021 Deeplearning4j Contributors - * * - * * This program and the accompanying materials are made available under the - * * terms of the Apache License, Version 2.0 which is available at - * * https://www.apache.org/licenses/LICENSE-2.0. 
- * * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * * License for the specific language governing permissions and limitations - * * under the License. - * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ - - -package org.datavec.python; - -import lombok.extern.slf4j.Slf4j; -import org.apache.commons.io.IOUtils; -import org.bytedeco.cpython.global.python; -import org.bytedeco.numpy.global.numpy; -import org.nd4j.common.io.ClassPathResource; - -import java.io.File; -import java.io.IOException; -import java.io.InputStream; -import java.nio.charset.Charset; -import java.util.concurrent.atomic.AtomicBoolean; - -import static org.bytedeco.cpython.global.python.*; -import static org.datavec.python.Python.*; - -/** - * Allows execution of python scripts managed by - * an internal interpreter. - * An end user may specify a python script to run - * via any of the execution methods available in this class. - * - * At static initialization time (when the class is first initialized) - * a number of components are set up: - * 1. The python path. A user may override this with the system property {@link #DEFAULT_PYTHON_PATH_PROPERTY} - * - * 2. Since this executioner uses javacpp to manage and run python interpreters underneath the covers, - * a user may also override the system property {@link #JAVACPP_PYTHON_APPEND_TYPE} with one of the {@link JavaCppPathType} - * values. This will allow the user to determine whether the javacpp default python path is used at all, and if so - * whether it is appended, prepended, or not used. This behavior is useful when you need to use an external - * python distribution such as anaconda. - * - * 3.
A proper numpy import for use with javacpp: We call numpy import ourselves to ensure the - * native libraries needed by numpy are loaded in the proper order. If we don't do this, - * it causes a variety of issues with running numpy. (User must still include "import numpy as np" in their scripts). - * - * 4. Various python scripts predefined on the classpath, included right alongside the java code. - * These are auxiliary python scripts used for loading classes, predefining certain kinds of behavior - * in order for us to manipulate values within the python memory, as well as pulling them out of memory - * for integration within the internal python executioner. - * - * For more information on how this works, please take a look at the {@link #init()} - * method. - * - * Generally, a user defining a python script for use by the python executioner - * will have a set of defined target input values and output values. - * These values should not be present when actually running the script, but just referenced. - * In order to test your python script for execution outside the engine, - * we recommend commenting out a few default values as dummy input values. - * This will allow an end user to test their script before trying to use the server. - * - * In order to get output values out of a python script, all a user has to do - * is define the output variables they want used in the final output in the actual pipeline. - * For example, if a user wants to return a dictionary, they just have to create a dictionary with that name - * and based on the configured {@link PythonVariables} passed as outputs - * to one of the execution methods, we can pull the values out automatically. - * - * For input definitions, it is similar. You just define the values you want used in - * {@link PythonVariables} and we will automatically generate code for defining those values - * as desired for running.
This allows the user to customize values dynamically - * at runtime but reference them by name in a python script. - * - * - * @author Fariz Rahman - * @author Adam Gibson - */ - - -@Slf4j -public class PythonExecutioner { - - - private static AtomicBoolean init = new AtomicBoolean(false); - public final static String DEFAULT_PYTHON_PATH_PROPERTY = "org.datavec.python.path"; - public final static String JAVACPP_PYTHON_APPEND_TYPE = "org.datavec.python.javacpp.path.append"; - public final static String DEFAULT_APPEND_TYPE = "before"; - private final static String PYTHON_EXCEPTION_KEY = "__python_exception__"; - - static { - init(); - } - - - private static synchronized void init() { - if (init.get()) { - return; - } - initPythonPath(); - init.set(true); - log.info("CPython: PyEval_InitThreads()"); - PyEval_InitThreads(); - log.info("CPython: Py_InitializeEx()"); - Py_InitializeEx(0); - numpy._import_array(); - } - - private static synchronized void simpleExec(String code) throws PythonException{ - log.debug(code); - log.info("CPython: PyRun_SimpleStringFlags()"); - - int result = PyRun_SimpleStringFlags(code, null); - if (result != 0) { - throw new PythonException("Execution failed, unable to retrieve python exception."); - } - } - - public static boolean validateVariableName(String s) { - if (s.isEmpty()) return false; - if (!Character.isJavaIdentifierStart(s.charAt(0))) return false; - for (int i = 1; i < s.length(); i++) - if (!Character.isJavaIdentifierPart(s.charAt(i))) - return false; - return true; - } - - - /** - * Sets a variable in the global scope of the current context (see {@link PythonContextManager}). - * This is equivalent to `exec("a = b");` where a is the variable name - * and b is the variable value. - * @param varName Name of the python variable being set.
Should be a valid python identifier string - * @param pythonObject Value for the python variable - * @throws PythonException - */ - public static void setVariable(String varName, PythonObject pythonObject) throws PythonException{ - if (!validateVariableName(varName)){ - throw new PythonException("Invalid variable name: " + varName); - } - Python.globals().set(new PythonObject(varName), pythonObject); - } - - public static void setVariable(String varName, PythonType varType, Object value) throws PythonException { - PythonObject pythonObject; - switch (varType.getName()) { - case STR: - pythonObject = new PythonObject(PythonType.STR.convert(value)); - break; - case INT: - pythonObject = new PythonObject(PythonType.INT.convert(value)); - break; - case FLOAT: - pythonObject = new PythonObject(PythonType.FLOAT.convert(value)); - break; - case BOOL: - pythonObject = new PythonObject(PythonType.BOOL.convert(value)); - break; - case NDARRAY: - pythonObject = new PythonObject(PythonType.NDARRAY.convert(value)); - break; - case LIST: - pythonObject = new PythonObject(PythonType.LIST.convert(value)); - break; - case DICT: - pythonObject = new PythonObject(PythonType.DICT.convert(value)); - break; - case BYTES: - pythonObject = new PythonObject(PythonType.BYTES.convert(value)); - break; - default: - throw new PythonException("Unsupported type: " + varType); - - } - setVariable(varName, pythonObject); - } - - public static void setVariables(PythonVariables pyVars) throws PythonException{ - if (pyVars == null) return; - for (String varName : pyVars.getVariables()) { - setVariable(varName, pyVars.getType(varName), pyVars.getValue(varName)); - } - } - - public static PythonObject getVariable(String varName) { - return Python.globals().attr("get").call(varName); - } - - public static <T> T getVariable(String varName, PythonType<T> varType) throws PythonException{ - PythonObject pythonObject = getVariable(varName); - return varType.toJava(pythonObject); - } - - public static void
getVariables(PythonVariables pyVars) throws PythonException { - if (pyVars == null){ - return; - } - for (String varName : pyVars.getVariables()) { - pyVars.setValue(varName, getVariable(varName, pyVars.getType(varName))); - } - } - - - private static String getWrappedCode(String code) { - try (InputStream is = new ClassPathResource("pythonexec/pythonexec.py").getInputStream()) { - String base = IOUtils.toString(is, Charset.defaultCharset()); - StringBuffer indentedCode = new StringBuffer(); - for (String split : code.split("\n")) { - indentedCode.append(" " + split + "\n"); - - } - - String out = base.replace(" pass", indentedCode); - return out; - } catch (IOException e) { - throw new IllegalStateException("Unable to read python code!", e); - } - - } - - private static void throwIfExecutionFailed() throws PythonException{ - PythonObject ex = getVariable(PYTHON_EXCEPTION_KEY); - if (ex != null && !ex.isNone() && !ex.toString().isEmpty()) { - setVariable(PYTHON_EXCEPTION_KEY, new PythonObject("")); - throw new PythonException(ex); - } - } - - public static void exec(String code) throws PythonException { - simpleExec(getWrappedCode(code)); - throwIfExecutionFailed(); - } - - public static void exec(String code, PythonVariables inputVariables, PythonVariables outputVariables) throws PythonException { - setVariables(inputVariables); - simpleExec(getWrappedCode(code)); - throwIfExecutionFailed(); - getVariables(outputVariables); - } - - public static PythonVariables execAndReturnAllVariables(String code) throws PythonException { - simpleExec(getWrappedCode(code)); - throwIfExecutionFailed(); - PythonVariables out = new PythonVariables(); - PythonObject globals = Python.globals(); - PythonObject keysList = Python.list(globals.attr("keys").call()); - int numKeys = Python.len(keysList).toInt(); - for (int i = 0; i < numKeys; i++) { - PythonObject key = keysList.get(i); - String keyStr = key.toString(); - if (!keyStr.startsWith("_")) { - PythonObject val = globals.get(key); - if
(Python.isinstance(val, intType())) { - out.addInt(keyStr, val.toInt()); - } else if (Python.isinstance(val, floatType())) { - out.addFloat(keyStr, val.toDouble()); - } else if (Python.isinstance(val, strType())) { - out.addStr(keyStr, val.toString()); - } else if (Python.isinstance(val, boolType())) { - out.addBool(keyStr, val.toBoolean()); - } else if (Python.isinstance(val, listType())) { - out.addList(keyStr, val.toList().toArray(new Object[0])); - } else if (Python.isinstance(val, dictType())) { - out.addDict(keyStr, val.toMap()); - } - } - } - return out; - - } - - public static PythonVariables getAllVariables() throws PythonException{ - PythonVariables out = new PythonVariables(); - PythonObject globals = Python.globals(); - PythonObject keysList = Python.list(globals.attr("keys").call()); - int numKeys = Python.len(keysList).toInt(); - for (int i = 0; i < numKeys; i++) { - PythonObject key = keysList.get(i); - String keyStr = key.toString(); - if (!keyStr.startsWith("_")) { - PythonObject val = globals.get(key); - if (Python.isinstance(val, intType())) { - out.addInt(keyStr, val.toInt()); - } else if (Python.isinstance(val, floatType())) { - out.addFloat(keyStr, val.toDouble()); - } else if (Python.isinstance(val, strType())) { - out.addStr(keyStr, val.toString()); - } else if (Python.isinstance(val, boolType())) { - out.addBool(keyStr, val.toBoolean()); - } else if (Python.isinstance(val, listType())) { - out.addList(keyStr, val.toList().toArray(new Object[0])); - } else if (Python.isinstance(val, dictType())) { - out.addDict(keyStr, val.toMap()); - } else { - PythonObject np = importModule("numpy"); - if (Python.isinstance(val, np.attr("ndarray"), np.attr("generic"))) { - out.addNDArray(keyStr, val.toNumpy()); - } - } - - } - } - return out; - } - - public static PythonVariables execAndReturnAllVariables(String code, PythonVariables inputs) throws Exception{ - setVariables(inputs); - simpleExec(getWrappedCode(code)); - return getAllVariables(); - } - - 
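The wrapping step in `getWrappedCode` can be illustrated standalone: the user's script is indented one level and spliced into a wrapper template in place of a `pass` placeholder. The template string below is a hypothetical stand-in for the real `pythonexec/pythonexec.py` classpath resource, whose contents are not shown in this diff:

```java
class WrapSketch {
    // Indent each line of the user script and splice it into a wrapper
    // template in place of a "    pass" placeholder. The template here is a
    // stand-in for the real pythonexec/pythonexec.py classpath resource.
    static String wrap(String template, String code) {
        StringBuilder indented = new StringBuilder();
        for (String line : code.split("\n")) {
            indented.append("    ").append(line).append("\n");
        }
        return template.replace("    pass", indented);
    }
}
```

The same indent-and-splice idea is what lets the executioner run arbitrary user code inside a try/except wrapper that captures exceptions into `__python_exception__`.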
/** - * One of a few desired values - * for how we should handle - * using javacpp's python path. - * BEFORE: Prepend the python path alongside a defined one - * AFTER: Append the javacpp python path alongside the defined one - * NONE: Don't use javacpp's python path at all - */ - public enum JavaCppPathType { - BEFORE, AFTER, NONE - } - - /** - * Set the python path. - * Generally you can just use the PYTHONPATH environment variable, - * but if you need to set it from code, this can work as well. - */ - - public static synchronized void initPythonPath() { - if (!init.get()) { - try { - String path = System.getProperty(DEFAULT_PYTHON_PATH_PROPERTY); - if (path == null) { - log.info("Setting python default path"); - File[] packages = numpy.cachePackages(); - - //// TODO: fix in javacpp - File sitePackagesWindows = new File(python.cachePackage(), "site-packages"); - File[] packages2 = new File[packages.length + 1]; - System.arraycopy(packages, 0, packages2, 0, packages.length); - packages2[packages.length] = sitePackagesWindows; - //System.out.println(sitePackagesWindows.getAbsolutePath()); - packages = packages2; - ////////// - - Py_SetPath(packages); - } else { - log.info("Setting python path " + path); - StringBuffer sb = new StringBuffer(); - File[] packages = numpy.cachePackages(); - JavaCppPathType pathAppendValue = JavaCppPathType.valueOf(System.getProperty(JAVACPP_PYTHON_APPEND_TYPE, DEFAULT_APPEND_TYPE).toUpperCase()); - switch (pathAppendValue) { - case BEFORE: - for (File cacheDir : packages) { - sb.append(cacheDir); - sb.append(java.io.File.pathSeparator); - } - - sb.append(path); - - log.info("Prepending javacpp python path: {}", sb.toString()); - break; - case AFTER: - sb.append(path); - - for (File cacheDir : packages) { - sb.append(cacheDir); - sb.append(java.io.File.pathSeparator); - } - - log.info("Appending javacpp python path " + sb.toString()); - break; - case NONE: - log.info("Not appending javacpp path"); - sb.append(path); - break; - } - - 
//prepend the javacpp packages - log.info("Final python path: {}", sb.toString()); - - Py_SetPath(sb.toString()); - } - } catch (IOException e) { - log.error("Failed to set python path.", e); - } - } else { - throw new IllegalStateException("Unable to reset python path. Already initialized."); - } - } - -} diff --git a/contrib/attic/datavec-python/src/main/java/org/datavec/python/PythonGIL.java b/contrib/attic/datavec-python/src/main/java/org/datavec/python/PythonGIL.java deleted file mode 100644 index ac50d99c6..000000000 --- a/contrib/attic/datavec-python/src/main/java/org/datavec/python/PythonGIL.java +++ /dev/null @@ -1,68 +0,0 @@ -/* - * ****************************************************************************** - * * Copyright (c) 2021 Deeplearning4j Contributors - * * - * * This program and the accompanying materials are made available under the - * * terms of the Apache License, Version 2.0 which is available at - * * https://www.apache.org/licenses/LICENSE-2.0. - * * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * * License for the specific language governing permissions and limitations - * * under the License. 
- * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ - - -package org.datavec.python; - -import lombok.extern.slf4j.Slf4j; -import org.bytedeco.cpython.PyThreadState; - -import static org.bytedeco.cpython.global.python.*; - - -@Slf4j -public class PythonGIL implements AutoCloseable { - private static PyThreadState mainThreadState; - - static { - log.debug("CPython: PyThreadState_Get()"); - mainThreadState = PyThreadState_Get(); - } - - private PythonGIL() { - acquire(); - } - - @Override - public void close() { - release(); - } - - public static PythonGIL lock() { - return new PythonGIL(); - } - - private static synchronized void acquire() { - log.debug("acquireGIL()"); - log.debug("CPython: PyEval_SaveThread()"); - mainThreadState = PyEval_SaveThread(); - log.debug("CPython: PyThreadState_New()"); - PyThreadState ts = PyThreadState_New(mainThreadState.interp()); - log.debug("CPython: PyEval_RestoreThread()"); - PyEval_RestoreThread(ts); - log.debug("CPython: PyThreadState_Swap()"); - PyThreadState_Swap(ts); - } - - private static synchronized void release() { - log.debug("CPython: PyEval_SaveThread()"); - PyEval_SaveThread(); - log.debug("CPython: PyEval_RestoreThread()"); - PyEval_RestoreThread(mainThreadState); - } -} diff --git a/contrib/attic/datavec-python/src/main/java/org/datavec/python/PythonJob.java b/contrib/attic/datavec-python/src/main/java/org/datavec/python/PythonJob.java deleted file mode 100644 index 637604200..000000000 --- a/contrib/attic/datavec-python/src/main/java/org/datavec/python/PythonJob.java +++ /dev/null @@ -1,171 +0,0 @@ -/* - * ****************************************************************************** - * * Copyright (c) 2021 Deeplearning4j Contributors - * * - * * This program and the accompanying materials are made available under the - * * terms of the Apache License, Version 2.0 which is available at - * * 
https://www.apache.org/licenses/LICENSE-2.0. - * * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * * License for the specific language governing permissions and limitations - * * under the License. - * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ - - -package org.datavec.python; - -import lombok.Builder; -import lombok.Data; -import lombok.NoArgsConstructor; - -import javax.annotation.Nonnull; - - -/** - * PythonJob is an abstraction for executing multiple python scripts - * in a multithreaded, stateful environment. The setup-and-run mode allows your - * "setup" code (imports, model loading, etc.) to be executed only once. - */ -@Data -@NoArgsConstructor -public class PythonJob { - - private String code; - private String name; - private String context; - private boolean setupRunMode; - private PythonObject runF; - - static { - new PythonExecutioner(); - } - - @Builder - /** - * @param name Name for the python job. - * @param code Python code. - * @param setupRunMode If true, the python code is expected to have two methods: setup(), which takes no arguments, - * and run(), which takes some or no arguments. The setup() method is executed once, - * and the run() method is called with the inputs (if any) per transaction, and is expected to return a dictionary - * mapping from output variable names (str) to output values. - * If false, the full script is run on each transaction and the output variables are obtained from the global namespace - * after execution.
- */ - public PythonJob(@Nonnull String name, @Nonnull String code, boolean setupRunMode) throws Exception { - this.name = name; - this.code = code; - this.setupRunMode = setupRunMode; - context = "__job_" + name; - if (PythonContextManager.hasContext(context)) { - throw new PythonException("Unable to create python job " + name + ". Context " + context + " already exists!"); - } - if (setupRunMode) setup(); - } - - - /** - * Clears all variables in current context and calls setup() - */ - public void clearState() throws Exception { - String context = this.context; - PythonContextManager.setContext("main"); - PythonContextManager.deleteContext(context); - this.context = context; - setup(); - } - - public void setup() throws Exception { - try (PythonGIL gil = PythonGIL.lock()) { - PythonContextManager.setContext(context); - PythonObject runF = PythonExecutioner.getVariable("run"); - if (runF.isNone() || !Python.callable(runF)) { - PythonExecutioner.exec(code); - runF = PythonExecutioner.getVariable("run"); - } - if (runF.isNone() || !Python.callable(runF)) { - throw new PythonException("run() method not found! 
" + - "If a PythonJob is created with 'setup and run' " + - "mode enabled, the associated python code is " + - "expected to contain a run() method " + - "(with or without arguments)."); - } - this.runF = runF; - PythonObject setupF = PythonExecutioner.getVariable("setup"); - if (!setupF.isNone()) { - setupF.call(); - } - } - } - - public void exec(PythonVariables inputs, PythonVariables outputs) throws Exception { - try (PythonGIL gil = PythonGIL.lock()) { - PythonContextManager.setContext(context); - if (!setupRunMode) { - PythonExecutioner.exec(code, inputs, outputs); - return; - } - PythonExecutioner.setVariables(inputs); - - PythonObject inspect = Python.importModule("inspect"); - PythonObject getfullargspec = inspect.attr("getfullargspec"); - PythonObject argspec = getfullargspec.call(runF); - PythonObject argsList = argspec.attr("args"); - PythonObject runargs = Python.dict(); - int argsCount = Python.len(argsList).toInt(); - for (int i = 0; i < argsCount; i++) { - PythonObject arg = argsList.get(i); - PythonObject val = Python.globals().get(arg); - if (val.isNone()) { - throw new PythonException("Input value not received for run() argument: " + arg.toString()); - } - runargs.set(arg, val); - } - PythonObject outDict = runF.callWithKwargs(runargs); - Python.globals().attr("update").call(outDict); - - PythonExecutioner.getVariables(outputs); - inspect.del(); - getfullargspec.del(); - argspec.del(); - runargs.del(); - } - } - - public PythonVariables execAndReturnAllVariables(PythonVariables inputs) throws Exception { - try (PythonGIL gil = PythonGIL.lock()) { - PythonContextManager.setContext(context); - if (!setupRunMode) { - return PythonExecutioner.execAndReturnAllVariables(code, inputs); - } - PythonExecutioner.setVariables(inputs); - PythonObject inspect = Python.importModule("inspect"); - PythonObject getfullargspec = inspect.attr("getfullargspec"); - PythonObject argspec = getfullargspec.call(runF); - PythonObject argsList = argspec.attr("args"); - 
PythonObject runargs = Python.dict(); - int argsCount = Python.len(argsList).toInt(); - for (int i = 0; i < argsCount; i++) { - PythonObject arg = argsList.get(i); - PythonObject val = Python.globals().get(arg); - if (val.isNone()) { - throw new PythonException("Input value not received for run() argument: " + arg.toString()); - } - runargs.set(arg, val); - } - PythonObject outDict = runF.callWithKwargs(runargs); - Python.globals().attr("update").call(outDict); - inspect.del(); - getfullargspec.del(); - argspec.del(); - runargs.del(); - return PythonExecutioner.getAllVariables(); - } - } - - -} diff --git a/contrib/attic/datavec-python/src/main/java/org/datavec/python/PythonObject.java b/contrib/attic/datavec-python/src/main/java/org/datavec/python/PythonObject.java deleted file mode 100644 index 47b5bb2e8..000000000 --- a/contrib/attic/datavec-python/src/main/java/org/datavec/python/PythonObject.java +++ /dev/null @@ -1,590 +0,0 @@ -/* - * ****************************************************************************** - * * Copyright (c) 2021 Deeplearning4j Contributors - * * - * * This program and the accompanying materials are made available under the - * * terms of the Apache License, Version 2.0 which is available at - * * https://www.apache.org/licenses/LICENSE-2.0. - * * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * * License for the specific language governing permissions and limitations - * * under the License. 
- * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ - - -package org.datavec.python; - - -import lombok.extern.slf4j.Slf4j; -import org.bytedeco.cpython.PyObject; -import org.bytedeco.javacpp.BytePointer; -import org.bytedeco.javacpp.Pointer; -import org.bytedeco.javacpp.SizeTPointer; -import org.bytedeco.numpy.PyArrayObject; -import org.json.JSONArray; -import org.json.JSONObject; -import org.nd4j.linalg.api.buffer.BaseDataBuffer; -import org.nd4j.linalg.api.buffer.DataType; -import org.nd4j.linalg.api.ndarray.INDArray; -import org.nd4j.nativeblas.NativeOpsHolder; - -import java.util.*; - -import static org.bytedeco.cpython.global.python.*; -import static org.bytedeco.numpy.global.numpy.*; - -/** - * Swift-like python wrapper for Java - * - * @author Fariz Rahman - */ - -@Slf4j -public class PythonObject { - private PyObject nativePythonObject; - - static { - new PythonExecutioner(); - } - - private static Map<String, PythonObject> _getNDArraySerializer() { - Map<String, PythonObject> ndarraySerializer = new HashMap<>(); - PythonObject lambda = Python.eval( - "lambda x: " + - "{'address':" + - "x.__array_interface__['data'][0]," + - "'shape':x.shape,'strides':x.strides," + - "'dtype': str(x.dtype),'_is_numpy_array': True}" + - " if str(type(x))== \"<class 'numpy.ndarray'>\" else x"); - ndarraySerializer.put("default", - lambda); - return ndarraySerializer; - - } - - public PythonObject(PyObject pyObject) { - nativePythonObject = pyObject; - } - - public PythonObject(INDArray npArray) { - this(new NumpyArray(npArray)); - } - - public PythonObject(BytePointer bp){ - - long address = bp.address(); - long size = bp.capacity(); - NumpyArray npArr = NumpyArray.builder().address(address).shape(new long[]{size}).strides(new long[]{1}).dtype(DataType.INT8).build(); - nativePythonObject = Python.memoryview(new PythonObject(npArr)).nativePythonObject; - } - - public PythonObject(NumpyArray npArray) { - int numpyType; - INDArray indArray = npArray.getNd4jArray();
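The constructor that follows maps ND4J data types onto numpy type codes. The correspondence can be captured as a small two-way table; note the integer codes below are illustrative placeholders, not the real `NPY_*` constants from `org.bytedeco.numpy.global.numpy`, and the real forward mapping is not bijective (e.g. BFLOAT16 is cast to FLOAT):

```java
import java.util.HashMap;
import java.util.Map;

public class DtypeTable {
    // Illustrative stand-ins for a few NPY_* type codes (not the real values).
    public static final int NPY_DOUBLE = 12, NPY_FLOAT = 11, NPY_INT = 5, NPY_BOOL = 0;

    private static final Map<String, Integer> TO_NUMPY = new HashMap<>();
    private static final Map<Integer, String> FROM_NUMPY = new HashMap<>();
    static {
        put("DOUBLE", NPY_DOUBLE);
        put("FLOAT", NPY_FLOAT);
        put("INT", NPY_INT);
        put("BOOL", NPY_BOOL);
    }
    private static void put(String nd4j, int npy) {
        TO_NUMPY.put(nd4j, npy);
        FROM_NUMPY.put(npy, nd4j);
    }

    public static int toNumpy(String nd4jType) {
        Integer code = TO_NUMPY.get(nd4jType);
        if (code == null) throw new IllegalArgumentException("Unsupported dtype: " + nd4jType);
        return code;
    }

    public static String fromNumpy(int npyCode) {
        String t = FROM_NUMPY.get(npyCode);
        if (t == null) throw new IllegalArgumentException("Unsupported array data type: " + npyCode);
        return t;
    }
}
```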
- DataType dataType = indArray.dataType(); - - switch (dataType) { - case DOUBLE: - numpyType = NPY_DOUBLE; - break; - case FLOAT: - case BFLOAT16: - numpyType = NPY_FLOAT; - break; - case SHORT: - numpyType = NPY_SHORT; - break; - case INT: - numpyType = NPY_INT; - break; - case LONG: - numpyType = NPY_INT64; - break; - case UINT16: - numpyType = NPY_USHORT; - break; - case UINT32: - numpyType = NPY_UINT; - break; - case UINT64: - numpyType = NPY_UINT64; - break; - case BOOL: - numpyType = NPY_BOOL; - break; - case BYTE: - numpyType = NPY_BYTE; - break; - case UBYTE: - numpyType = NPY_UBYTE; - break; - case HALF: - numpyType = NPY_HALF; - break; - default: - throw new RuntimeException("Unsupported dtype: " + npArray.getDtype()); - } - - long[] shape = indArray.shape(); - INDArray inputArray = indArray; - if(dataType == DataType.BFLOAT16) { - log.warn("\n\nThe given nd4j array \n\n{}\n\n is of BFLOAT16 datatype. " + - "Casting a copy of it to FLOAT and creating the respective numpy array from it.\n", indArray); - inputArray = indArray.castTo(DataType.FLOAT); - } - - //Sync to host memory in the case of CUDA, before passing the host memory pointer to Python - if(inputArray.data() instanceof BaseDataBuffer){ - ((BaseDataBuffer)inputArray.data()).syncToPrimary(); - } - - nativePythonObject = PyArray_New(PyArray_Type(), shape.length, new SizeTPointer(shape), - numpyType, null, - inputArray.data().addressPointer(), - 0, NPY_ARRAY_CARRAY, null); - - } - - /*---primitive constructors---*/ - public PyObject getNativePythonObject() { - return nativePythonObject; - } - - public PythonObject(String data) { - nativePythonObject = PyUnicode_FromString(data); - } - - public PythonObject(int data) { - nativePythonObject = PyLong_FromLong((long) data); - } - - public PythonObject(long data) { - nativePythonObject = PyLong_FromLong(data); - } - - public PythonObject(double data) { - nativePythonObject = PyFloat_FromDouble(data); - } - - public PythonObject(boolean data) { -
nativePythonObject = PyBool_FromLong(data ? 1 : 0); - } - - private static PythonObject j2pyObject(Object item) { - if (item instanceof PythonObject) { - return (PythonObject) item; - } else if (item instanceof PyObject) { - return new PythonObject((PyObject) item); - } else if (item instanceof INDArray) { - return new PythonObject((INDArray) item); - } else if (item instanceof NumpyArray) { - return new PythonObject((NumpyArray) item); - } else if (item instanceof List) { - return new PythonObject((List) item); - } else if (item instanceof Object[]) { - return new PythonObject((Object[]) item); - } else if (item instanceof Map) { - return new PythonObject((Map) item); - } else if (item instanceof String) { - return new PythonObject((String) item); - } else if (item instanceof Double) { - return new PythonObject((Double) item); - } else if (item instanceof Float) { - return new PythonObject((Float) item); - } else if (item instanceof Long) { - return new PythonObject((Long) item); - } else if (item instanceof Integer) { - return new PythonObject((Integer) item); - } else if (item instanceof Boolean) { - return new PythonObject((Boolean) item); - } else if (item instanceof Pointer){ - return new PythonObject(new BytePointer((Pointer)item)); - } else { - throw new RuntimeException("Unsupported item in list: " + item); - } - } - - public PythonObject(Object[] data) { - PyObject pyList = PyList_New((long) data.length); - for (int i = 0; i < data.length; i++) { - PyList_SetItem(pyList, i, j2pyObject(data[i]).nativePythonObject); - } - nativePythonObject = pyList; - } - - public PythonObject(List data) { - PyObject pyList = PyList_New((long) data.size()); - for (int i = 0; i < data.size(); i++) { - PyList_SetItem(pyList, i, j2pyObject(data.get(i)).nativePythonObject); - } - nativePythonObject = pyList; - } - - public PythonObject(Map data) { - PyObject pyDict = PyDict_New(); - for (Object k : data.keySet()) { - PythonObject pyKey; - if (k instanceof PythonObject) { - 
pyKey = (PythonObject) k; - } else if (k instanceof String) { - pyKey = new PythonObject((String) k); - } else if (k instanceof Double) { - pyKey = new PythonObject((Double) k); - } else if (k instanceof Float) { - pyKey = new PythonObject((Float) k); - } else if (k instanceof Long) { - pyKey = new PythonObject((Long) k); - } else if (k instanceof Integer) { - pyKey = new PythonObject((Integer) k); - } else if (k instanceof Boolean) { - pyKey = new PythonObject((Boolean) k); - } else { - throw new RuntimeException("Unsupported key in map: " + k.getClass()); - } - Object v = data.get(k); - PythonObject pyVal; - if (v instanceof PythonObject) { - pyVal = (PythonObject) v; - } else if (v instanceof PyObject) { - pyVal = new PythonObject((PyObject) v); - } else if (v instanceof INDArray) { - pyVal = new PythonObject((INDArray) v); - } else if (v instanceof NumpyArray) { - pyVal = new PythonObject((NumpyArray) v); - } else if (v instanceof Map) { - pyVal = new PythonObject((Map) v); - } else if (v instanceof List) { - pyVal = new PythonObject((List) v); - } else if (v instanceof String) { - pyVal = new PythonObject((String) v); - } else if (v instanceof Double) { - pyVal = new PythonObject((Double) v); - } else if (v instanceof Float) { - pyVal = new PythonObject((Float) v); - } else if (v instanceof Long) { - pyVal = new PythonObject((Long) v); - } else if (v instanceof Integer) { - pyVal = new PythonObject((Integer) v); - } else if (v instanceof Boolean) { - pyVal = new PythonObject((Boolean) v); - } else { - throw new RuntimeException("Unsupported value in map: " + v.getClass()); - } - - PyDict_SetItem(pyDict, pyKey.nativePythonObject, pyVal.nativePythonObject); - - } - nativePythonObject = pyDict; - } - - - /*------*/ - - private static String pyObjectToString(PyObject pyObject) { - PyObject repr = PyObject_Str(pyObject); - PyObject str = PyUnicode_AsEncodedString(repr, "utf-8", "~E~"); - String jstr = PyBytes_AsString(str).getString(); - Py_DecRef(repr); -
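The key/value dispatch above is one long instanceof ladder: each supported Java type maps to exactly one Python-side representation. Its shape is easier to see in a minimal stand-alone analogue (`PyTagger` and `tagOf` are hypothetical names used only for illustration):

```java
import java.util.List;
import java.util.Map;

public class PyTagger {
    // Mirrors the instanceof ladder used by j2pyObject and the map constructor:
    // each supported Java type is routed to one Python-side representation.
    public static String tagOf(Object item) {
        if (item instanceof String) return "str";
        if (item instanceof Boolean) return "bool";
        if (item instanceof Integer || item instanceof Long) return "int";
        if (item instanceof Float || item instanceof Double) return "float";
        if (item instanceof List) return "list";
        if (item instanceof Map) return "dict";
        throw new IllegalArgumentException("Unsupported item: " + item.getClass());
    }
}
```

As in the original, order matters: more specific checks must come before broader ones, and anything unrecognized fails fast with the offending class in the message.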
Py_DecRef(str); - return jstr; - } - - public String toString() { - return pyObjectToString(nativePythonObject); - } - - public double toDouble() { - return PyFloat_AsDouble(nativePythonObject); - } - - public float toFloat() { - return (float) PyFloat_AsDouble(nativePythonObject); - } - - public int toInt() { - return (int) PyLong_AsLong(nativePythonObject); - } - - public long toLong() { - return PyLong_AsLong(nativePythonObject); - } - - public boolean toBoolean() { - if (isNone()) return false; - return toInt() != 0; - } - - public NumpyArray toNumpy() throws PythonException{ - PyObject np = PyImport_ImportModule("numpy"); - PyObject ndarray = PyObject_GetAttrString(np, "ndarray"); - if (PyObject_IsInstance(nativePythonObject, ndarray) != 1){ - throw new PythonException("Object is not a numpy array! Use Python.ndarray() to convert object to a numpy array."); - } - Py_DecRef(ndarray); - Py_DecRef(np); - - Pointer objPtr = new Pointer(nativePythonObject); - PyArrayObject npArr = new PyArrayObject(objPtr); - Pointer ptr = PyArray_DATA(npArr); - long[] shape = new long[PyArray_NDIM(npArr)]; - SizeTPointer shapePtr = PyArray_SHAPE(npArr); - if (shapePtr != null) - shapePtr.get(shape, 0, shape.length); - long[] strides = new long[shape.length]; - SizeTPointer stridesPtr = PyArray_STRIDES(npArr); - if (stridesPtr != null) - stridesPtr.get(strides, 0, strides.length); - int npdtype = PyArray_TYPE(npArr); - - DataType dtype; - switch (npdtype){ - case NPY_DOUBLE: - dtype = DataType.DOUBLE; break; - case NPY_FLOAT: - dtype = DataType.FLOAT; break; - case NPY_SHORT: - dtype = DataType.SHORT; break; - case NPY_INT: - dtype = DataType.INT32; break; - case NPY_LONG: - dtype = DataType.LONG; break; - case NPY_UINT: - dtype = DataType.UINT32; break; - case NPY_BYTE: - dtype = DataType.INT8; break; - case NPY_UBYTE: - dtype = DataType.UINT8; break; - case NPY_BOOL: - dtype = DataType.BOOL; break; - case NPY_HALF: - dtype = DataType.FLOAT16; break; - case NPY_LONGLONG: - dtype = 
DataType.INT64; break; - case NPY_USHORT: - dtype = DataType.UINT16; break; - case NPY_ULONG: - case NPY_ULONGLONG: - dtype = DataType.UINT64; break; - default: - throw new PythonException("Unsupported array data type: " + npdtype); - } - - return new NumpyArray(ptr.address(), shape, strides, dtype); - - } - - public PythonObject attr(String attr) { - - return new PythonObject(PyObject_GetAttrString(nativePythonObject, attr)); - } - - public PythonObject call(Object... args) { - if (args.length > 0 && args[args.length - 1] instanceof Map) { - List args2 = new ArrayList<>(); - for (int i = 0; i < args.length - 1; i++) { - args2.add(args[i]); - } - return call(args2, (Map) args[args.length - 1]); - } - if (args.length == 0) { - return new PythonObject(PyObject_CallObject(nativePythonObject, null)); - } - PyObject tuple = PyTuple_New(args.length); // leaky; tuple may contain borrowed references, so can not be de-allocated. - for (int i = 0; i < args.length; i++) { - PyTuple_SetItem(tuple, i, j2pyObject(args[i]).nativePythonObject); - } - PythonObject ret = new PythonObject(PyObject_Call(nativePythonObject, tuple, null)); - return ret; - } - - public PythonObject callWithArgs(PythonObject args) { - PyObject tuple = PyList_AsTuple(args.nativePythonObject); - return new PythonObject(PyObject_Call(nativePythonObject, tuple, null)); - } - - public PythonObject callWithKwargs(PythonObject kwargs) { - PyObject tuple = PyTuple_New(0); - return new PythonObject(PyObject_Call(nativePythonObject, tuple, kwargs.nativePythonObject)); - } - - public PythonObject callWithArgsAndKwargs(PythonObject args, PythonObject kwargs) { - PyObject tuple = PyList_AsTuple(args.nativePythonObject); - PyObject dict = kwargs.nativePythonObject; - return new PythonObject(PyObject_Call(nativePythonObject, tuple, dict)); - } - - public PythonObject call(Map kwargs) { - PyObject dict = new PythonObject(kwargs).nativePythonObject; - PyObject tuple = PyTuple_New(0); - return new 
PythonObject(PyObject_Call(nativePythonObject, tuple, dict)); - } - - public PythonObject call(List args) { - PyObject tuple = PyList_AsTuple(new PythonObject(args).nativePythonObject); - return new PythonObject(PyObject_Call(nativePythonObject, tuple, null)); - } - - public PythonObject call(List args, Map kwargs) { - PyObject tuple = PyList_AsTuple(new PythonObject(args).nativePythonObject); - PyObject dict = new PythonObject(kwargs).nativePythonObject; - return new PythonObject(PyObject_Call(nativePythonObject, tuple, dict)); - } - - private PythonObject get(PyObject key) { - return new PythonObject( - PyObject_GetItem(nativePythonObject, key) - ); - } - - public PythonObject get(PythonObject key) { - return get(key.nativePythonObject); - } - - public PythonObject get(int key) { - return get(PyLong_FromLong((long) key)); - } - - public PythonObject get(long key) { - return new PythonObject( - PyObject_GetItem(nativePythonObject, PyLong_FromLong(key)) - ); - } - - public PythonObject get(double key) { - return new PythonObject( - PyObject_GetItem(nativePythonObject, PyFloat_FromDouble(key)) - ); - } - - public PythonObject get(String key) { - return get(new PythonObject(key)); - } - - public void set(PythonObject key, PythonObject value) { - PyObject_SetItem(nativePythonObject, key.nativePythonObject, value.nativePythonObject); - } - - public void del() { - Py_DecRef(nativePythonObject); - nativePythonObject = null; - } - - public JSONArray toJSONArray() throws PythonException { - PythonObject json = Python.importModule("json"); - PythonObject serialized = json.attr("dumps").call(this, _getNDArraySerializer()); - String jsonString = serialized.toString(); - return new JSONArray(jsonString); - } - - public JSONObject toJSONObject() throws PythonException { - PythonObject json = Python.importModule("json"); - PythonObject serialized = json.attr("dumps").call(this, _getNDArraySerializer()); - String jsonString = serialized.toString(); - return new 
JSONObject(jsonString); - } - - public List toList() throws PythonException{ - List list = new ArrayList(); - int n = Python.len(this).toInt(); - for (int i = 0; i < n; i++) { - PythonObject o = get(i); - if (Python.isinstance(o, Python.strType())) { - list.add(o.toString()); - } else if (Python.isinstance(o, Python.intType())) { - list.add(o.toLong()); - } else if (Python.isinstance(o, Python.floatType())) { - list.add(o.toDouble()); - } else if (Python.isinstance(o, Python.boolType())) { - list.add(o); - } else if (Python.isinstance(o, Python.listType(), Python.tupleType())) { - list.add(o.toList()); - } else if (Python.isinstance(o, Python.importModule("numpy").attr("ndarray"))) { - list.add(o.toNumpy().getNd4jArray()); - } else if (Python.isinstance(o, Python.dictType())) { - list.add(o.toMap()); - } else { - throw new RuntimeException("Error while converting python" + - " list to java List: Unable to serialize python " + - "object of type " + Python.type(this).toString()); - } - } - - return list; - } - - public Map toMap() throws PythonException{ - Map map = new HashMap(); - List keys = Python.list(attr("keys").call()).toList(); - List values = Python.list(attr("values").call()).toList(); - for (int i = 0; i < keys.size(); i++) { - map.put(keys.get(i), values.get(i)); - } - return map; - } - - public BytePointer toBytePointer() throws PythonException{ - if (Python.isinstance(this, Python.bytesType())){ - PyObject byteArray = PyByteArray_FromObject(nativePythonObject); - return PyByteArray_AsString(byteArray); - - } - else if (Python.isinstance(this, Python.bytearrayType())){ - return PyByteArray_AsString(nativePythonObject); - } - else if (Python.isinstance(this, Python.memoryviewType())){ - -// PyObject np = PyImport_ImportModule("numpy"); -// PyObject array = PyObject_GetAttrString(np, "asarray"); -// PyObject npArr = PyObject_CallObject(array, nativePythonObject); // Doesn't work - // Invoke interpreter: - String tempContext = "temp" + 
UUID.randomUUID().toString().replace('-', '_'); - String originalContext = Python.getCurrentContext(); - Python.setContext(tempContext); - PythonExecutioner.setVariable("memview", this); - PythonExecutioner.exec("import numpy as np\narr = np.frombuffer(memview, dtype='int8')"); - INDArray arr = PythonExecutioner.getVariable("arr").toNumpy().getNd4jArray(); - if(arr.data() instanceof BaseDataBuffer){ - ((BaseDataBuffer)arr.data()).syncToPrimary(); - } - BytePointer ret = new BytePointer(arr.data().pointer()); - Python.setContext(originalContext); - Python.deleteContext(tempContext); - return ret; - } else { - PyObject ctypes = PyImport_ImportModule("ctypes"); - PyObject cArrType = PyObject_GetAttrString(ctypes, "Array"); - if (PyObject_IsInstance(nativePythonObject, cArrType) != 0){ - PyObject cVoidP = PyObject_GetAttrString(ctypes, "c_void_p"); - PyObject cast = PyObject_GetAttrString(ctypes, "cast"); - PyObject argsTuple = PyTuple_New(2); - PyTuple_SetItem(argsTuple, 0, nativePythonObject); - PyTuple_SetItem(argsTuple, 1, cVoidP); - PyObject voidPtr = PyObject_Call(cast, argsTuple, null); - PyObject pyAddress = PyObject_GetAttrString(voidPtr, "value"); - long address = PyLong_AsLong(pyAddress); - long size = PyObject_Size(nativePythonObject); - Py_DecRef(ctypes); - Py_DecRef(cArrType); - Py_DecRef(argsTuple); - Py_DecRef(voidPtr); - Py_DecRef(pyAddress); - Pointer ptr = NativeOpsHolder.getInstance().getDeviceNativeOps().pointerForAddress(address); - ptr = ptr.limit(size); - ptr = ptr.capacity(size); - return new BytePointer(ptr); - } - else{ - throw new PythonException("Expected bytes, bytearray, memoryview or ctypesArray. 
Received " + Python.type(this).toString()); - } - } - } - public boolean isNone() { - return (nativePythonObject == null)|| - (toString().equals("None") && Python.type(this).toString().equals("")); - } -} diff --git a/contrib/attic/datavec-python/src/main/java/org/datavec/python/PythonProcess.java b/contrib/attic/datavec-python/src/main/java/org/datavec/python/PythonProcess.java deleted file mode 100644 index 54036243b..000000000 --- a/contrib/attic/datavec-python/src/main/java/org/datavec/python/PythonProcess.java +++ /dev/null @@ -1,130 +0,0 @@ -/* - * ****************************************************************************** - * * Copyright (c) 2021 Deeplearning4j Contributors - * * - * * This program and the accompanying materials are made available under the - * * terms of the Apache License, Version 2.0 which is available at - * * https://www.apache.org/licenses/LICENSE-2.0. - * * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * * License for the specific language governing permissions and limitations - * * under the License. - * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ - - -package org.datavec.python; - -import lombok.extern.slf4j.Slf4j; -import org.apache.commons.io.IOUtils; -import org.bytedeco.javacpp.Loader; - -import java.io.IOException; -import java.nio.charset.StandardCharsets; -import java.util.Arrays; - -@Slf4j -public class PythonProcess { - private static String pythonExecutable = Loader.load(org.bytedeco.cpython.python.class); - public static String runAndReturn(String... 
arguments) throws IOException, InterruptedException{ - String[] allArgs = new String[arguments.length + 1]; - System.arraycopy(arguments, 0, allArgs, 1, arguments.length); - allArgs[0] = pythonExecutable; - log.info("Executing command: " + Arrays.toString(allArgs)); - ProcessBuilder pb = new ProcessBuilder(allArgs); - Process process = pb.start(); - String out = IOUtils.toString(process.getInputStream(), StandardCharsets.UTF_8); - process.waitFor(); - return out; - - } - - public static void run(String... arguments) throws IOException, InterruptedException{ - String[] allArgs = new String[arguments.length + 1]; - System.arraycopy(arguments, 0, allArgs, 1, arguments.length); - allArgs[0] = pythonExecutable; - log.info("Executing command: " + Arrays.toString(allArgs)); - ProcessBuilder pb = new ProcessBuilder(allArgs); - pb.inheritIO().start().waitFor(); - } - public static void pipInstall(String packageName) throws PythonException{ - try{ - run("-m", "pip", "install", packageName); - }catch(Exception e){ - throw new PythonException("Error installing package " + packageName, e); - } - - } - - public static void pipInstall(String packageName, String version) throws PythonException{ - pipInstall(packageName + "==" + version); - } - - public static void pipUninstall(String packageName) throws PythonException{ - try{ - run("-m", "pip", "uninstall", "-y", packageName); - }catch(Exception e){ - throw new PythonException("Error uninstalling package " + packageName, e); - } - - } - public static void pipInstallFromGit(String gitRepoUrl) throws PythonException{ - if (!gitRepoUrl.contains("://")){ - gitRepoUrl = "git://" + gitRepoUrl; - } - try{ - run("-m", "pip", "install", "git+" + gitRepoUrl); - }catch(Exception e){ - throw new PythonException("Error installing package from " + gitRepoUrl, e); - } - - } - - public static String getPackageVersion(String packageName) throws PythonException{ - String out; - try{ - out = runAndReturn("-m", "pip", "show", packageName); - } catch
(Exception e){ - throw new PythonException("Error finding version for package " + packageName, e); - } - - if (!out.contains("Version: ")){ - throw new PythonException("Can't find package " + packageName); - } - String pkgVersion = out.split("Version: ")[1].split(System.lineSeparator())[0]; - return pkgVersion; - } - - public static boolean isPackageInstalled(String packageName)throws PythonException{ - try{ - String out = runAndReturn("-m", "pip", "show", packageName); - return !out.isEmpty(); - }catch (Exception e){ - throw new PythonException("Error checking if package is installed: " +packageName, e); - } - - } - - public static void pipInstallFromRequirementsTxt(String path) throws PythonException{ - try{ - run("-m", "pip", "install","-r", path); - }catch (Exception e){ - throw new PythonException("Error installing packages from " + path, e); - } - } - - public static void pipInstallFromSetupScript(String path, boolean inplace) throws PythonException{ - - try{ - run(path, inplace?"develop":"install"); - }catch (Exception e){ - throw new PythonException("Error installing package from " + path, e); - } - - } - -} \ No newline at end of file diff --git a/contrib/attic/datavec-python/src/main/java/org/datavec/python/PythonTransform.java b/contrib/attic/datavec-python/src/main/java/org/datavec/python/PythonTransform.java deleted file mode 100644 index d35ee501e..000000000 --- a/contrib/attic/datavec-python/src/main/java/org/datavec/python/PythonTransform.java +++ /dev/null @@ -1,332 +0,0 @@ -/* - * ****************************************************************************** - * * Copyright (c) 2021 Deeplearning4j Contributors - * * - * * This program and the accompanying materials are made available under the - * * terms of the Apache License, Version 2.0 which is available at - * * https://www.apache.org/licenses/LICENSE-2.0. 
- * * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * * License for the specific language governing permissions and limitations - * * under the License. - * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ - -package org.datavec.python; - -import lombok.Builder; -import lombok.Data; -import lombok.NoArgsConstructor; -import org.datavec.api.transform.Transform; -import org.datavec.api.transform.schema.Schema; -import org.datavec.api.writable.*; -import org.nd4j.common.base.Preconditions; -import org.nd4j.common.holder.ObjectMapperHolder; -import org.nd4j.linalg.api.ndarray.INDArray; -import org.nd4j.shade.jackson.core.JsonProcessingException; - -import java.util.ArrayList; -import java.util.List; -import java.util.Map; -import java.util.UUID; - -import static org.datavec.python.PythonUtils.schemaToPythonVariables; - -/** - * Row-wise Transform that applies arbitrary python code on each row - * - * @author Fariz Rahman - */ - -@NoArgsConstructor -@Data -public class PythonTransform implements Transform { - - private String code; - private PythonVariables inputs; - private PythonVariables outputs; - private String name = UUID.randomUUID().toString(); - private Schema inputSchema; - private Schema outputSchema; - private String outputDict; - private boolean returnAllVariables; - private boolean setupAndRun = false; - private PythonJob pythonJob; - - - @Builder - public PythonTransform(String code, - PythonVariables inputs, - PythonVariables outputs, - String name, - Schema inputSchema, - Schema outputSchema, - String outputDict, - boolean returnAllInputs, - boolean setupAndRun) { - Preconditions.checkNotNull(code, "No code found to run!"); - this.code = code; - this.returnAllVariables = returnAllInputs; 
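The PythonTransform constructor resolves its output variables by precedence: an explicit outputDict wins, then explicitly passed outputs, then variables derived from the output schema. That resolution rule can be sketched in isolation (`OutputResolver` and its String-based stand-ins are hypothetical, not part of the deleted class):

```java
import java.util.List;

public class OutputResolver {
    // Precedence: explicit dict variable > explicitly passed variables > schema-derived.
    public static List<String> resolve(String outputDict, List<String> outputs, List<String> fromSchema) {
        if (outputDict != null) return List.of(outputDict); // a single dict variable wraps all outputs
        if (outputs != null && !outputs.isEmpty()) return outputs;
        return fromSchema;
    }
}
```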
- this.setupAndRun = setupAndRun; - if (inputs != null) - this.inputs = inputs; - if (outputs != null) - this.outputs = outputs; - if (name != null) - this.name = name; - if (outputDict != null) { - this.outputDict = outputDict; - this.outputs = new PythonVariables(); - this.outputs.addDict(outputDict); - } - - try { - if (inputSchema != null) { - this.inputSchema = inputSchema; - if (inputs == null || inputs.isEmpty()) { - this.inputs = schemaToPythonVariables(inputSchema); - } - } - - if (outputSchema != null) { - this.outputSchema = outputSchema; - if (outputs == null || outputs.isEmpty()) { - this.outputs = schemaToPythonVariables(outputSchema); - } - } - } catch (Exception e) { - throw new IllegalStateException(e); - } - try{ - pythonJob = PythonJob.builder() - .name("a" + UUID.randomUUID().toString().replace("-", "_")) - .code(code) - .setupRunMode(setupAndRun) - .build(); - } - catch(Exception e){ - throw new IllegalStateException("Error creating python job: " + e); - } - - } - - - @Override - public void setInputSchema(Schema inputSchema) { - Preconditions.checkNotNull(inputSchema, "No input schema found!"); - this.inputSchema = inputSchema; - try { - inputs = schemaToPythonVariables(inputSchema); - } catch (Exception e) { - throw new RuntimeException(e); - } - if (outputSchema == null && outputDict == null) { - outputSchema = inputSchema; - } - - } - - @Override - public Schema getInputSchema() { - return inputSchema; - } - - @Override - public List<List<Writable>> mapSequence(List<List<Writable>> sequence) { - List<List<Writable>> out = new ArrayList<>(); - for (List<Writable> l : sequence) { - out.add(map(l)); - } - return out; - } - - @Override - public Object map(Object input) { - throw new UnsupportedOperationException("Not yet implemented"); - } - - @Override - public Object mapSequence(Object sequence) { - throw new UnsupportedOperationException("Not yet implemented"); - } - - - @Override - public List<Writable> map(List<Writable> writables) { - PythonVariables pyInputs = getPyInputsFromWritables(writables); -
Preconditions.checkNotNull(pyInputs, "Inputs must not be null!"); - try { - if (returnAllVariables) { - return getWritablesFromPyOutputs(pythonJob.execAndReturnAllVariables(pyInputs)); - } - - if (outputDict != null) { - pythonJob.exec(pyInputs, outputs); - PythonVariables out = PythonUtils.expandInnerDict(outputs, outputDict); - return getWritablesFromPyOutputs(out); - } else { - pythonJob.exec(pyInputs, outputs); - - return getWritablesFromPyOutputs(outputs); - } - - } catch (Exception e) { - throw new RuntimeException(e); - } - } - - @Override - public String[] outputColumnNames() { - return outputs.getVariables(); - } - - @Override - public String outputColumnName() { - return outputColumnNames()[0]; - } - - @Override - public String[] columnNames() { - return outputs.getVariables(); - } - - @Override - public String columnName() { - return columnNames()[0]; - } - - public Schema transform(Schema inputSchema) { - return outputSchema; - } - - - private PythonVariables getPyInputsFromWritables(List<Writable> writables) { - PythonVariables ret = new PythonVariables(); - - for (String name : inputs.getVariables()) { - int colIdx = inputSchema.getIndexOfColumn(name); - Writable w = writables.get(colIdx); - PythonType pyType = inputs.getType(name); - switch (pyType.getName()) { - case INT: - if (w instanceof LongWritable) { - ret.addInt(name, ((LongWritable) w).get()); - } else { - ret.addInt(name, ((IntWritable) w).get()); - } - break; - case FLOAT: - if (w instanceof DoubleWritable) { - ret.addFloat(name, ((DoubleWritable) w).get()); - } else { - ret.addFloat(name, ((FloatWritable) w).get()); - } - break; - case STR: - ret.addStr(name, w.toString()); - break; - case NDARRAY: - ret.addNDArray(name, ((NDArrayWritable) w).get()); - break; - case BOOL: - ret.addBool(name, ((BooleanWritable) w).get()); - break; - default: - throw new RuntimeException("Unsupported input type: " + pyType); - } - - } - return ret; - } - - private List<Writable> getWritablesFromPyOutputs(PythonVariables pyOuts)
{ - List out = new ArrayList<>(); - String[] varNames; - varNames = pyOuts.getVariables(); - Schema.Builder schemaBuilder = new Schema.Builder(); - for (int i = 0; i < varNames.length; i++) { - String name = varNames[i]; - PythonType pyType = pyOuts.getType(name); - switch (pyType.getName()) { - case INT: - schemaBuilder.addColumnLong(name); - break; - case FLOAT: - schemaBuilder.addColumnDouble(name); - break; - case STR: - case DICT: - case LIST: - schemaBuilder.addColumnString(name); - break; - case NDARRAY: - INDArray arr = pyOuts.getNDArrayValue(name); - schemaBuilder.addColumnNDArray(name, arr.shape()); - break; - case BOOL: - schemaBuilder.addColumnBoolean(name); - break; - default: - throw new IllegalStateException("Unable to support type " + pyType.getName()); - } - } - this.outputSchema = schemaBuilder.build(); - - - for (int i = 0; i < varNames.length; i++) { - String name = varNames[i]; - PythonType pyType = pyOuts.getType(name); - - switch (pyType.getName()) { - case INT: - out.add(new LongWritable(pyOuts.getIntValue(name))); - break; - case FLOAT: - out.add(new DoubleWritable(pyOuts.getFloatValue(name))); - break; - case STR: - out.add(new Text(pyOuts.getStrValue(name))); - break; - case NDARRAY: - INDArray arr = pyOuts.getNDArrayValue(name); - out.add(new NDArrayWritable(arr)); - break; - case DICT: - Map dictValue = pyOuts.getDictValue(name); - Map noNullValues = new java.util.HashMap<>(); - for (Map.Entry entry : dictValue.entrySet()) { - if (entry.getValue() != org.json.JSONObject.NULL) { - noNullValues.put(entry.getKey(), entry.getValue()); - } - } - - try { - out.add(new Text(ObjectMapperHolder.getJsonMapper().writeValueAsString(noNullValues))); - } catch (JsonProcessingException e) { - throw new IllegalStateException("Unable to serialize dictionary " + name + " to json!"); - } - break; - case LIST: - Object[] listValue = pyOuts.getListValue(name).toArray(); - try { - out.add(new 
Text(ObjectMapperHolder.getJsonMapper().writeValueAsString(listValue))); - } catch (JsonProcessingException e) { - throw new IllegalStateException("Unable to serialize list value " + name + " to json!"); - } - break; - case BOOL: - out.add(new BooleanWritable(pyOuts.getBooleanValue(name))); - break; - default: - throw new IllegalStateException("Unable to support type " + pyType.getName()); - } - } - return out; - } - - -} \ No newline at end of file diff --git a/contrib/attic/datavec-python/src/main/java/org/datavec/python/PythonType.java b/contrib/attic/datavec-python/src/main/java/org/datavec/python/PythonType.java deleted file mode 100644 index 7d3d25e5c..000000000 --- a/contrib/attic/datavec-python/src/main/java/org/datavec/python/PythonType.java +++ /dev/null @@ -1,240 +0,0 @@ -/* - * ****************************************************************************** - * * Copyright (c) 2021 Deeplearning4j Contributors - * * - * * This program and the accompanying materials are made available under the - * * terms of the Apache License, Version 2.0 which is available at - * * https://www.apache.org/licenses/LICENSE-2.0. - * * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * * License for the specific language governing permissions and limitations - * * under the License.
- * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ - -package org.datavec.python; - -import org.bytedeco.javacpp.BytePointer; -import org.bytedeco.javacpp.Pointer; -import org.nd4j.linalg.api.ndarray.INDArray; - -import java.util.Arrays; -import java.util.List; -import java.util.Map; - -import static org.datavec.python.Python.importModule; - - -/** - * - * @param <T> Corresponding Java type for the Python type - */ -public abstract class PythonType<T> { - - public abstract T toJava(PythonObject pythonObject) throws PythonException; - private final TypeName typeName; - - public enum TypeName{ - STR, - INT, - FLOAT, - BOOL, - LIST, - DICT, - NDARRAY, - BYTES - } - private PythonType(TypeName typeName){ - this.typeName = typeName; - } - public TypeName getName(){return typeName;} - public String toString(){ - return getName().name(); - } - public static PythonType valueOf(String typeName) throws PythonException{ - try{ - TypeName.valueOf(typeName); - } catch (IllegalArgumentException iae){ - throw new PythonException("Invalid python type: " + typeName, iae); - } - try{ - return (PythonType)PythonType.class.getField(typeName).get(null); // shouldn't fail - } catch (Exception e){ - throw new RuntimeException(e); - } - - } - public static PythonType valueOf(TypeName typeName){ - try{ - return valueOf(typeName.name()); // shouldn't fail - }catch (PythonException pe){ - throw new RuntimeException(pe); - } - } - - /** - * Since multiple java types can map to the same python type, - * this method "normalizes" all supported incoming objects to T - * - * @param object object to be converted to type T - * @return the converted value as type T - */ - public T convert(Object object) throws PythonException { - return (T) object; - } - - public static final PythonType<String> STR = new PythonType<String>(TypeName.STR) { - @Override - public String toJava(PythonObject pythonObject) throws PythonException { - if (!Python.isinstance(pythonObject,
Python.strType())) { - throw new PythonException("Expected variable to be str, but was " + Python.type(pythonObject)); - } - return pythonObject.toString(); - } - - @Override - public String convert(Object object) { - return object.toString(); - } - }; - - public static final PythonType<Long> INT = new PythonType<Long>(TypeName.INT) { - @Override - public Long toJava(PythonObject pythonObject) throws PythonException { - if (!Python.isinstance(pythonObject, Python.intType())) { - throw new PythonException("Expected variable to be int, but was " + Python.type(pythonObject)); - } - return pythonObject.toLong(); - } - - @Override - public Long convert(Object object) throws PythonException { - if (object instanceof Number) { - return ((Number) object).longValue(); - } - throw new PythonException("Unable to cast " + object + " to Long."); - } - }; - - public static final PythonType<Double> FLOAT = new PythonType<Double>(TypeName.FLOAT) { - @Override - public Double toJava(PythonObject pythonObject) throws PythonException { - if (!Python.isinstance(pythonObject, Python.floatType())) { - throw new PythonException("Expected variable to be float, but was " + Python.type(pythonObject)); - } - return pythonObject.toDouble(); - } - - @Override - public Double convert(Object object) throws PythonException { - if (object instanceof Number) { - return ((Number) object).doubleValue(); - } - throw new PythonException("Unable to cast " + object + " to Double."); - } - }; - - public static final PythonType<Boolean> BOOL = new PythonType<Boolean>(TypeName.BOOL) { - @Override - public Boolean toJava(PythonObject pythonObject) throws PythonException { - if (!Python.isinstance(pythonObject, Python.boolType())) { - throw new PythonException("Expected variable to be bool, but was " + Python.type(pythonObject)); - } - return pythonObject.toBoolean(); - } - - @Override - public Boolean convert(Object object) throws PythonException { - if (object instanceof Number) { - return ((Number) object).intValue() != 0; - } else if (object
instanceof Boolean) { - return (Boolean) object; - } - throw new PythonException("Unable to cast " + object + " to Boolean."); - } - }; - - public static final PythonType<List> LIST = new PythonType<List>(TypeName.LIST) { - @Override - public List toJava(PythonObject pythonObject) throws PythonException { - if (!Python.isinstance(pythonObject, Python.listType())) { - throw new PythonException("Expected variable to be list, but was " + Python.type(pythonObject)); - } - return pythonObject.toList(); - } - - @Override - public List convert(Object object) throws PythonException { - if (object instanceof java.util.List) { - return (List) object; - } else if (object instanceof org.json.JSONArray) { - org.json.JSONArray jsonArray = (org.json.JSONArray) object; - return jsonArray.toList(); - - } else if (object instanceof Object[]) { - return Arrays.asList((Object[]) object); - } - throw new PythonException("Unable to cast " + object + " to List."); - } - }; - - public static final PythonType<Map> DICT = new PythonType<Map>(TypeName.DICT) { - @Override - public Map toJava(PythonObject pythonObject) throws PythonException { - if (!Python.isinstance(pythonObject, Python.dictType())) { - throw new PythonException("Expected variable to be dict, but was " + Python.type(pythonObject)); - } - return pythonObject.toMap(); - } - - @Override - public Map convert(Object object) throws PythonException { - if (object instanceof Map) { - return (Map) object; - } - throw new PythonException("Unable to cast " + object + " to Map."); - } - }; - - public static final PythonType<INDArray> NDARRAY = new PythonType<INDArray>(TypeName.NDARRAY) { - @Override - public INDArray toJava(PythonObject pythonObject) throws PythonException { - PythonObject np = importModule("numpy"); - if (!Python.isinstance(pythonObject, np.attr("ndarray"), np.attr("generic"))) { - throw new PythonException("Expected variable to be numpy.ndarray, but was " + Python.type(pythonObject)); - } - return pythonObject.toNumpy().getNd4jArray(); - } - - @Override -
public INDArray convert(Object object) throws PythonException { - if (object instanceof INDArray) { - return (INDArray) object; - } else if (object instanceof NumpyArray) { - return ((NumpyArray) object).getNd4jArray(); - } - throw new PythonException("Unable to cast " + object + " to INDArray."); - } - }; - - public static final PythonType<BytePointer> BYTES = new PythonType<BytePointer>(TypeName.BYTES) { - @Override - public BytePointer toJava(PythonObject pythonObject) throws PythonException { - return pythonObject.toBytePointer(); - } - - @Override - public BytePointer convert(Object object) throws PythonException { - if (object instanceof BytePointer) { - return (BytePointer) object; - } else if (object instanceof Pointer) { - return new BytePointer((Pointer) object); - } - throw new PythonException("Unable to cast " + object + " to BytePointer."); - } - }; -} diff --git a/contrib/attic/datavec-python/src/main/java/org/datavec/python/PythonUtils.java b/contrib/attic/datavec-python/src/main/java/org/datavec/python/PythonUtils.java deleted file mode 100644 index 4191ea271..000000000 --- a/contrib/attic/datavec-python/src/main/java/org/datavec/python/PythonUtils.java +++ /dev/null @@ -1,315 +0,0 @@ -/* - * ****************************************************************************** - * * Copyright (c) 2021 Deeplearning4j Contributors - * * - * * This program and the accompanying materials are made available under the - * * terms of the Apache License, Version 2.0 which is available at - * * https://www.apache.org/licenses/LICENSE-2.0. - * * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * * License for the specific language governing permissions and limitations - * * under the License.
- * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ - -package org.datavec.python; - -import org.datavec.api.transform.ColumnType; -import org.datavec.api.transform.metadata.BooleanMetaData; -import org.datavec.api.transform.schema.Schema; -import org.json.JSONArray; -import org.json.JSONObject; -import org.nd4j.common.base.Preconditions; -import org.nd4j.linalg.api.buffer.DataType; - -import java.util.ArrayList; -import java.util.HashMap; -import java.util.List; -import java.util.Map; - -/** - * List of utilities for executing python transforms. - * - * @author Adam Gibson - */ -public class PythonUtils { - - /** - * Create a {@link Schema} - * from {@link PythonVariables}. - * Types are mapped to types of the same name. - * - * @param input the input {@link PythonVariables} - * @return the output {@link Schema} - */ - public static Schema fromPythonVariables(PythonVariables input) { - Schema.Builder schemaBuilder = new Schema.Builder(); - Preconditions.checkState(input.getVariables() != null && input.getVariables().length > 0, "Input must have variables. Found none."); - for (String varName: input.getVariables()) { - - switch (input.getType(varName).getName()) { - case INT: - schemaBuilder.addColumnInteger(varName); - break; - case STR: - schemaBuilder.addColumnString(varName); - break; - case FLOAT: - schemaBuilder.addColumnFloat(varName); - break; - case NDARRAY: - schemaBuilder.addColumnNDArray(varName, null); - break; - case BOOL: - schemaBuilder.addColumn(new BooleanMetaData(varName)); - break; - } - } - - return schemaBuilder.build(); - } - - /** - * Create a {@link PythonVariables} instance from an input - * {@link Schema}. - * Types are mapped to types of the same name. - * - * @param input the input schema - * @return the output python variables.
- */ - public static PythonVariables fromSchema(Schema input) { - PythonVariables ret = new PythonVariables(); - for (int i = 0; i < input.numColumns(); i++) { - String currColumnName = input.getName(i); - ColumnType columnType = input.getType(i); - switch (columnType) { - case NDArray: - ret.add(currColumnName, PythonType.NDARRAY); - break; - case Boolean: - ret.add(currColumnName, PythonType.BOOL); - break; - case Categorical: - case String: - ret.add(currColumnName, PythonType.STR); - break; - case Double: - case Float: - ret.add(currColumnName, PythonType.FLOAT); - break; - case Integer: - case Long: - ret.add(currColumnName, PythonType.INT); - break; - case Bytes: - ret.add(currColumnName, PythonType.BYTES); - break; - case Time: - throw new UnsupportedOperationException("Unable to process dates with python yet."); - } - } - - return ret; - } - - /** - * Convert a {@link Schema} - * to {@link PythonVariables} - * - * @param schema the input schema - * @return the output {@link PythonVariables} where each - * name in the map is associated with a column name in the schema. 
- * A proper type is also chosen based on the schema - * @throws Exception - */ - public static PythonVariables schemaToPythonVariables(Schema schema) throws Exception { - PythonVariables pyVars = new PythonVariables(); - int numCols = schema.numColumns(); - for (int i = 0; i < numCols; i++) { - String colName = schema.getName(i); - ColumnType colType = schema.getType(i); - switch (colType) { - case Long: - case Integer: - pyVars.addInt(colName); - break; - case Double: - case Float: - pyVars.addFloat(colName); - break; - case String: - pyVars.addStr(colName); - break; - case NDArray: - pyVars.addNDArray(colName); - break; - case Boolean: - pyVars.addBool(colName); - break; - default: - throw new Exception("Unsupported python input type: " + colType.toString()); - } - } - - return pyVars; - } - - - public static NumpyArray mapToNumpyArray(Map map) { - String dtypeName = (String) map.get("dtype"); - DataType dtype; - if (dtypeName.equals("float64")) { - dtype = DataType.DOUBLE; - } else if (dtypeName.equals("float32")) { - dtype = DataType.FLOAT; - } else if (dtypeName.equals("int16")) { - dtype = DataType.SHORT; - } else if (dtypeName.equals("int32")) { - dtype = DataType.INT; - } else if (dtypeName.equals("int64")) { - dtype = DataType.LONG; - } else { - throw new RuntimeException("Unsupported array type " + dtypeName + "."); - } - List shapeList = (List) map.get("shape"); - long[] shape = new long[shapeList.size()]; - for (int i = 0; i < shape.length; i++) { - shape[i] = (Long) shapeList.get(i); - } - - List strideList = (List) map.get("shape"); - long[] stride = new long[strideList.size()]; - for (int i = 0; i < stride.length; i++) { - stride[i] = (Long) strideList.get(i); - } - long address = (Long) map.get("address"); - NumpyArray numpyArray = new NumpyArray(address, shape, stride, dtype, true); - return numpyArray; - } - - public static PythonVariables expandInnerDict(PythonVariables pyvars, String key) { - Map dict = pyvars.getDictValue(key); - String[] keys 
= (String[]) dict.keySet().toArray(new String[dict.keySet().size()]); - PythonVariables pyvars2 = new PythonVariables(); - for (String subkey : keys) { - Object value = dict.get(subkey); - if (value instanceof Map) { - Map map = (Map) value; - if (map.containsKey("_is_numpy_array")) { - pyvars2.addNDArray(subkey, mapToNumpyArray(map)); - - } else { - pyvars2.addDict(subkey, (Map) value); - } - - } else if (value instanceof List) { - pyvars2.addList(subkey, ((List) value).toArray()); - } else if (value instanceof String) { - System.out.println((String) value); - pyvars2.addStr(subkey, (String) value); - } else if (value instanceof Integer || value instanceof Long) { - Number number = (Number) value; - pyvars2.addInt(subkey, number.intValue()); - } else if (value instanceof Float || value instanceof Double) { - Number number = (Number) value; - pyvars2.addFloat(subkey, number.doubleValue()); - } else if (value instanceof NumpyArray) { - pyvars2.addNDArray(subkey, (NumpyArray) value); - } else if (value == null) { - pyvars2.addStr(subkey, "None"); // FixMe - } else { - throw new RuntimeException("Unsupported type!" 
+ value); - } - } - return pyvars2; - } - - public static long[] jsonArrayToLongArray(JSONArray jsonArray) { - long[] longs = new long[jsonArray.length()]; - for (int i = 0; i < longs.length; i++) { - - longs[i] = jsonArray.getLong(i); - } - return longs; - } - - public static Map toMap(JSONObject jsonobj) { - Map map = new HashMap<>(); - String[] keys = (String[]) jsonobj.keySet().toArray(new String[jsonobj.keySet().size()]); - for (String key : keys) { - Object value = jsonobj.get(key); - if (value instanceof JSONArray) { - value = toList((JSONArray) value); - } else if (value instanceof JSONObject) { - JSONObject jsonobj2 = (JSONObject) value; - if (jsonobj2.has("_is_numpy_array")) { - value = jsonToNumpyArray(jsonobj2); - } else { - value = toMap(jsonobj2); - } - - } - - map.put(key, value); - } - return map; - } - - - public static List toList(JSONArray array) { - List list = new ArrayList<>(); - for (int i = 0; i < array.length(); i++) { - Object value = array.get(i); - if (value instanceof JSONArray) { - value = toList((JSONArray) value); - } else if (value instanceof JSONObject) { - JSONObject jsonobj2 = (JSONObject) value; - if (jsonobj2.has("_is_numpy_array")) { - value = jsonToNumpyArray(jsonobj2); - } else { - value = toMap(jsonobj2); - } - } - list.add(value); - } - return list; - } - - - private static NumpyArray jsonToNumpyArray(JSONObject map) { - String dtypeName = (String) map.get("dtype"); - DataType dtype; - if (dtypeName.equals("float64")) { - dtype = DataType.DOUBLE; - } else if (dtypeName.equals("float32")) { - dtype = DataType.FLOAT; - } else if (dtypeName.equals("int16")) { - dtype = DataType.SHORT; - } else if (dtypeName.equals("int32")) { - dtype = DataType.INT; - } else if (dtypeName.equals("int64")) { - dtype = DataType.LONG; - } else { - throw new RuntimeException("Unsupported array type " + dtypeName + "."); - } - List shapeList = map.getJSONArray("shape").toList(); - long[] shape = new long[shapeList.size()]; - for (int i = 0; i < 
shape.length; i++) { - shape[i] = ((Number) shapeList.get(i)).longValue(); - } - - List strideList = map.getJSONArray("strides").toList(); - long[] stride = new long[strideList.size()]; - for (int i = 0; i < stride.length; i++) { - stride[i] = ((Number) strideList.get(i)).longValue(); - } - long address = ((Number) map.get("address")).longValue(); - NumpyArray numpyArray = new NumpyArray(address, shape, stride, dtype, true); - return numpyArray; - } - - -} diff --git a/contrib/attic/datavec-python/src/main/java/org/datavec/python/PythonVariables.java b/contrib/attic/datavec-python/src/main/java/org/datavec/python/PythonVariables.java deleted file mode 100644 index 097fc2f76..000000000 --- a/contrib/attic/datavec-python/src/main/java/org/datavec/python/PythonVariables.java +++ /dev/null @@ -1,528 +0,0 @@ -/* - * ****************************************************************************** - * * Copyright (c) 2021 Deeplearning4j Contributors - * * - * * This program and the accompanying materials are made available under the - * * terms of the Apache License, Version 2.0 which is available at - * * https://www.apache.org/licenses/LICENSE-2.0. - * * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * * License for the specific language governing permissions and limitations - * * under the License. - * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ - -package org.datavec.python; - -import org.bytedeco.javacpp.BytePointer; -import org.nd4j.linalg.api.ndarray.INDArray; - -import java.util.*; - - - -/** - * Holds python variable names, types and values. - * Also handles mapping from java types to python types.
- * - * @author Fariz Rahman - */ - -@lombok.Data -public class PythonVariables implements java.io.Serializable { - - - private java.util.Map<String, String> strVariables = new java.util.LinkedHashMap<>(); - private java.util.Map<String, Long> intVariables = new java.util.LinkedHashMap<>(); - private java.util.Map<String, Double> floatVariables = new java.util.LinkedHashMap<>(); - private java.util.Map<String, Boolean> boolVariables = new java.util.LinkedHashMap<>(); - private java.util.Map<String, INDArray> ndVars = new java.util.LinkedHashMap<>(); - private java.util.Map<String, List> listVariables = new java.util.LinkedHashMap<>(); - private java.util.Map<String, BytePointer> bytesVariables = new java.util.LinkedHashMap<>(); - private java.util.Map<String, java.util.Map<Object, Object>> dictVariables = new java.util.LinkedHashMap<>(); - private java.util.Map<String, PythonType.TypeName> vars = new java.util.LinkedHashMap<>(); - private java.util.Map<PythonType.TypeName, java.util.Map> maps = new java.util.LinkedHashMap<>(); - - - /** - * Returns a copy of the variable - * schema in this array without the values - * - * @return an empty variables clone - * with no values - */ - public PythonVariables copySchema() { - PythonVariables ret = new PythonVariables(); - for (String varName : getVariables()) { - PythonType type = getType(varName); - ret.add(varName, type); - } - return ret; - } - - /** - * - */ - public PythonVariables() { - maps.put(PythonType.TypeName.BOOL, boolVariables); - maps.put(PythonType.TypeName.STR, strVariables); - maps.put(PythonType.TypeName.INT, intVariables); - maps.put(PythonType.TypeName.FLOAT, floatVariables); - maps.put(PythonType.TypeName.NDARRAY, ndVars); - maps.put(PythonType.TypeName.LIST, listVariables); - maps.put(PythonType.TypeName.DICT, dictVariables); - maps.put(PythonType.TypeName.BYTES, bytesVariables); - - } - - - /** - * @return true if there are no variables.
- */ - public boolean isEmpty() { - return getVariables().length < 1; - } - - - /** - * @param name Name of the variable - * @param type Type of the variable - */ - public void add(String name, PythonType type) { - switch (type.getName()) { - case BOOL: - addBool(name); - break; - case STR: - addStr(name); - break; - case INT: - addInt(name); - break; - case FLOAT: - addFloat(name); - break; - case NDARRAY: - addNDArray(name); - break; - case LIST: - addList(name); - break; - case DICT: - addDict(name); - break; - case BYTES: - addBytes(name); - break; - } - } - - /** - * @param name name of the variable - * @param type type of the variable - * @param value value of the variable (must be instance of expected type) - */ - public void add(String name, PythonType type, Object value) throws PythonException { - add(name, type); - setValue(name, value); - } - - - /** - * Add a null variable to - * the set of variables - * to describe the type but no value - * - * @param name the field to add - */ - public void addDict(String name) { - vars.put(name, PythonType.TypeName.DICT); - dictVariables.put(name, null); - } - - /** - * Add a null variable to - * the set of variables - * to describe the type but no value - * - * @param name the field to add - */ - public void addBool(String name) { - vars.put(name, PythonType.TypeName.BOOL); - boolVariables.put(name, null); - } - - /** - * Add a null variable to - * the set of variables - * to describe the type but no value - * - * @param name the field to add - */ - public void addStr(String name) { - vars.put(name, PythonType.TypeName.STR); - strVariables.put(name, null); - } - - /** - * Add a null variable to - * the set of variables - * to describe the type but no value - * - * @param name the field to add - */ - public void addInt(String name) { - vars.put(name, PythonType.TypeName.INT); - intVariables.put(name, null); - } - - /** - * Add a null variable to - * the set of variables - * to describe the type but no value - * - * 
@param name the field to add - */ - public void addFloat(String name) { - vars.put(name, PythonType.TypeName.FLOAT); - floatVariables.put(name, null); - } - - /** - * Add a null variable to - * the set of variables - * to describe the type but no value - * - * @param name the field to add - */ - public void addNDArray(String name) { - vars.put(name, PythonType.TypeName.NDARRAY); - ndVars.put(name, null); - } - - /** - * Add a null variable to - * the set of variables - * to describe the type but no value - * - * @param name the field to add - */ - public void addList(String name) { - vars.put(name, PythonType.TypeName.LIST); - listVariables.put(name, null); - } - - /** - * Add a boolean variable to - * the set of variables - * - * @param name the field to add - * @param value the value to add - */ - public void addBool(String name, boolean value) { - vars.put(name, PythonType.TypeName.BOOL); - boolVariables.put(name, value); - } - - /** - * Add a string variable to - * the set of variables - * - * @param name the field to add - * @param value the value to add - */ - public void addStr(String name, String value) { - vars.put(name, PythonType.TypeName.STR); - strVariables.put(name, value); - } - - /** - * Add an int variable to - * the set of variables - * - * @param name the field to add - * @param value the value to add - */ - public void addInt(String name, int value) { - vars.put(name, PythonType.TypeName.INT); - intVariables.put(name, (long) value); - } - - /** - * Add a long variable to - * the set of variables - * - * @param name the field to add - * @param value the value to add - */ - public void addInt(String name, long value) { - vars.put(name, PythonType.TypeName.INT); - intVariables.put(name, value); - } - - /** - * Add a double variable to - * the set of variables - * - * @param name the field to add - * @param value the value to add - */ - public void addFloat(String name, double value) { - vars.put(name, PythonType.TypeName.FLOAT); - 
floatVariables.put(name, value); - } - - /** - * Add a float variable to - * the set of variables - * - * @param name the field to add - * @param value the value to add - */ - public void addFloat(String name, float value) { - vars.put(name, PythonType.TypeName.FLOAT); - floatVariables.put(name, (double) value); - } - - /** - * Add a null variable to - * the set of variables - * to describe the type but no value - * - * @param name the field to add - * @param value the value to add - */ - public void addNDArray(String name, NumpyArray value) { - vars.put(name, PythonType.TypeName.NDARRAY); - ndVars.put(name, value.getNd4jArray()); - } - - /** - * Add a null variable to - * the set of variables - * to describe the type but no value - * - * @param name the field to add - * @param value the value to add - */ - public void addNDArray(String name, INDArray value) { - vars.put(name, PythonType.TypeName.NDARRAY); - ndVars.put(name, value); - } - - /** - * Add a null variable to - * the set of variables - * to describe the type but no value - * - * @param name the field to add - * @param value the value to add - */ - public void addList(String name, Object[] value) { - vars.put(name, PythonType.TypeName.LIST); - listVariables.put(name, Arrays.asList(value)); - } - - /** - * Add a null variable to - * the set of variables - * to describe the type but no value - * - * @param name the field to add - * @param value the value to add - */ - public void addDict(String name, java.util.Map value) { - vars.put(name, PythonType.TypeName.DICT); - dictVariables.put(name, value); - } - - - public void addBytes(String name){ - vars.put(name, PythonType.TypeName.BYTES); - bytesVariables.put(name, null); - } - - public void addBytes(String name, BytePointer value){ - vars.put(name, PythonType.TypeName.BYTES); - bytesVariables.put(name, value); - } - -// public void addBytes(String name, ByteBuffer value){ -// Pointer ptr = 
NativeOpsHolder.getInstance().getDeviceNativeOps().pointerForAddress(value.address()); -// BytePointer bp = new BytePointer(ptr); -// addBytes(name, bp); -// } - /** - * @param name name of the variable - * @param value new value for the variable - */ - public void setValue(String name, Object value) throws PythonException { - PythonType.TypeName type = vars.get(name); - maps.get(type).put(name, PythonType.valueOf(type).convert(value)); - } - - /** - * Do a general object lookup. - * The lookup happens in the map associated with the - * {@link PythonType} recorded for the variable. - * - * @param name the name of the variable to get - * @return the value for the variable with the given name - */ - public Object getValue(String name) { - PythonType.TypeName type = vars.get(name); - java.util.Map map = maps.get(type); - return map.get(name); - } - - - /** - * Returns a boolean variable with the given name. - * - * @param name the variable name to get the value for - * @return the retrieved boolean value - */ - public boolean getBooleanValue(String name) { - return boolVariables.get(name); - } - - /** - * @param name the variable name - * @return the dictionary value - */ - public java.util.Map getDictValue(String name) { - return dictVariables.get(name); - } - - /** - * @param name the variable name - * @return the string value - */ - public String getStrValue(String name) { - return strVariables.get(name); - } - - /** - * @param name the variable name - * @return the long value - */ - public Long getIntValue(String name) { - return intVariables.get(name); - } - - /** - * @param name the variable name - * @return the float value - */ - public Double getFloatValue(String name) { - return floatVariables.get(name); - } - - /** - * @param name the variable name - * @return the numpy array value - */ - public INDArray getNDArrayValue(String name) { - return ndVars.get(name); - } - - /** - * @param name the variable name - * @return the list value as an object 
array - */ - public List getListValue(String name) { - return listVariables.get(name); - } - - /** - * @param name the variable name - * @return the bytes value as a BytePointer - */ - public BytePointer getBytesValue(String name){return bytesVariables.get(name);} - /** - * Returns the type for the given variable name - * - * @param name the name of the variable to get the type for - * @return the type for the given variable - */ - public PythonType getType(String name){ - try{ - return PythonType.valueOf(vars.get(name)); // throws for unknown variable names - }catch (Exception e) - { - throw new RuntimeException(e); - } - } - - /** - * Get all the variables present as a string array - * - * @return the variable names for this variable set - */ - public String[] getVariables() { - String[] strArr = new String[vars.size()]; - return vars.keySet().toArray(strArr); - } - - - /** - * Returns this variable set as its JSON representation (an array of JSON objects) - * - * @return the json array output - */ - public org.json.JSONArray toJSON() { - org.json.JSONArray arr = new org.json.JSONArray(); - for (String varName : getVariables()) { - org.json.JSONObject var = new org.json.JSONObject(); - var.put("name", varName); - String varType = getType(varName).toString(); - var.put("type", varType); - arr.put(var); - } - return arr; - } - - /** - * Create a schema from a map. 
- * This is an empty PythonVariables - * that just contains names and types with no values - * - * @param inputTypes the input types to convert - * @return the schema from the given map - */ - public static PythonVariables schemaFromMap(java.util.Map inputTypes) throws Exception{ - PythonVariables ret = new PythonVariables(); - for (java.util.Map.Entry entry : inputTypes.entrySet()) { - ret.add(entry.getKey(), PythonType.valueOf(entry.getValue())); - } - - return ret; - } - - /** - * Get the python variable state relative to the - * input json array - * - * @param jsonArray the input json array - * @return the python variables based on the input json array - */ - public static PythonVariables fromJSON(org.json.JSONArray jsonArray) { - PythonVariables pyvars = new PythonVariables(); - for (int i = 0; i < jsonArray.length(); i++) { - org.json.JSONObject input = (org.json.JSONObject) jsonArray.get(i); - String varName = (String) input.get("name"); - String varType = (String) input.get("type"); - pyvars.maps.get(PythonType.TypeName.valueOf(varType)).put(varName, null); - } - - return pyvars; - } - - -} \ No newline at end of file diff --git a/contrib/attic/datavec-python/src/main/java/org/datavec/python/keras/Model.java b/contrib/attic/datavec-python/src/main/java/org/datavec/python/keras/Model.java deleted file mode 100644 index af6747bf4..000000000 --- a/contrib/attic/datavec-python/src/main/java/org/datavec/python/keras/Model.java +++ /dev/null @@ -1,162 +0,0 @@ -/* - * ****************************************************************************** - * * Copyright (c) 2021 Deeplearning4j Contributors - * * - * * This program and the accompanying materials are made available under the - * * terms of the Apache License, Version 2.0 which is available at - * * https://www.apache.org/licenses/LICENSE-2.0. 
- * * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * * License for the specific language governing permissions and limitations - * * under the License. - * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ - -package org.datavec.python.keras; - -import org.datavec.python.Python; -import org.datavec.python.PythonException; -import org.datavec.python.PythonObject; -import org.datavec.python.PythonProcess; -import org.nd4j.linalg.api.ndarray.INDArray; - -public class Model { - - private PythonObject pyModel; - - - private static PythonObject installAndImportTF() throws PythonException{ - if (!PythonProcess.isPackageInstalled("tensorflow")){ - PythonProcess.pipInstall("tensorflow"); - } - return Python.importModule("tensorflow"); - } - private static PythonObject getKerasModule() throws PythonException{ - PythonObject tf = installAndImportTF(); - PythonObject keras = tf.attr("keras"); - tf.del(); - return keras; - } - - private static PythonObject loadModel(String s) throws PythonException{ - PythonObject models = getKerasModule().attr("models"); - PythonObject loadModelF = models.attr("load_model"); - PythonObject model = loadModelF.call(s); - models.del(); - loadModelF.del(); - return model; - } - - public Model(String path) throws PythonException{ - pyModel = loadModel(path); - } - - public INDArray[] predict(INDArray... 
inputs) throws PythonException{ - PythonObject predictF = pyModel.attr("predict"); - PythonObject inputList = new PythonObject(inputs); - PythonObject pyOut = predictF.call(inputList); - INDArray[] out; - if (Python.isinstance(pyOut, Python.listType())){ - out = new INDArray[Python.len(pyOut).toInt()]; - for(int i = 0; i < out.length; i++){ - out[i] = pyOut.get(i).toNumpy().getNd4jArray(); - } - } - else{ - out = new INDArray[]{ - pyOut.toNumpy().getNd4jArray()}; - } - - predictF.del(); - inputList.del(); - pyOut.del(); - return out; - } - - public int numInputs(){ - PythonObject inputs = pyModel.attr("inputs"); - PythonObject pyNumInputs = Python.len(inputs); - int ret = pyNumInputs.toInt(); - inputs.del(); - pyNumInputs.del(); - return ret; - } - public int numOutputs(){ - PythonObject outputs = pyModel.attr("outputs"); - PythonObject pyNumOutputs = Python.len(outputs); - int ret = pyNumOutputs.toInt(); - outputs.del(); - pyNumOutputs.del(); - return ret; - } - - public long[][] inputShapes(){ - long[][] ret = new long[numInputs()][]; - for (int i = 0; i < ret.length; i++){ - ret[i] = inputShapeAt(i); - } - return ret; - } - - public long[][] outputShapes(){ - long[][] ret = new long[numOutputs()][]; - for (int i = 0; i < ret.length; i++){ - ret[i] = outputShapeAt(i); - } - return ret; - } - - public long[] inputShapeAt(int input){ - PythonObject inputs = pyModel.attr("inputs"); - PythonObject tensor = inputs.get(input); - PythonObject tensorShape = tensor.attr("shape"); - PythonObject shapeList = Python.list(tensorShape); - PythonObject pyNdim = Python.len(shapeList); - int ndim = pyNdim.toInt(); - long[] shape = new long[ndim]; - for(int i = 0; i < shape.length; i++){ - PythonObject pyDim = shapeList.get(i); - if (pyDim == null || !Python.isinstance(pyDim, Python.intType())){ - shape[i] = -1; - } - else{ - shape[i] = pyDim.toLong(); - } - } - pyNdim.del(); - shapeList.del(); - tensorShape.del(); - tensor.del(); - inputs.del(); - return shape; - } - - public 
long[] outputShapeAt(int output){ - PythonObject outputs = pyModel.attr("outputs"); - PythonObject tensor = outputs.get(output); - PythonObject tensorShape = tensor.attr("shape"); - PythonObject shapeList = Python.list(tensorShape); - PythonObject pyNdim = Python.len(shapeList); - int ndim = pyNdim.toInt(); - long[] shape = new long[ndim]; - for(int i = 0; i < shape.length; i++){ - PythonObject pyDim = shapeList.get(i); - if (pyDim == null || !Python.isinstance(pyDim, Python.intType())){ - shape[i] = -1; - } - else{ - shape[i] = pyDim.toLong(); - } - } - pyNdim.del(); - shapeList.del(); - tensorShape.del(); - tensor.del(); - outputs.del(); - return shape; - } -} diff --git a/contrib/attic/datavec-python/src/main/resources/pythonexec/pythonexec.py b/contrib/attic/datavec-python/src/main/resources/pythonexec/pythonexec.py deleted file mode 100644 index ebc1ca67d..000000000 --- a/contrib/attic/datavec-python/src/main/resources/pythonexec/pythonexec.py +++ /dev/null @@ -1,38 +0,0 @@ -# /* ****************************************************************************** -# * -# * -# * This program and the accompanying materials are made available under the -# * terms of the Apache License, Version 2.0 which is available at -# * https://www.apache.org/licenses/LICENSE-2.0. -# * -# * See the NOTICE file distributed with this work for additional -# * information regarding copyright ownership. -# * Unless required by applicable law or agreed to in writing, software -# * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT -# * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the -# * License for the specific language governing permissions and limitations -# * under the License. 
-# * -# * SPDX-License-Identifier: Apache-2.0 -# ******************************************************************************/ - -import sys -import traceback -import json -import inspect - -__python_exception__ = "" -try: - pass - sys.stdout.flush() - sys.stderr.flush() -except Exception as ex: - __python_exception__ = ex - try: - exc_info = sys.exc_info() - finally: - print(ex) - traceback.print_exception(*exc_info) - sys.stdout.flush() - sys.stderr.flush() - diff --git a/contrib/attic/datavec-python/src/test/java/org/datavec/python/AssertTestsExtendBaseClass.java b/contrib/attic/datavec-python/src/test/java/org/datavec/python/AssertTestsExtendBaseClass.java deleted file mode 100644 index e9c954f28..000000000 --- a/contrib/attic/datavec-python/src/test/java/org/datavec/python/AssertTestsExtendBaseClass.java +++ /dev/null @@ -1,52 +0,0 @@ -/* - * ****************************************************************************** - * * Copyright (c) 2021 Deeplearning4j Contributors - * * - * * This program and the accompanying materials are made available under the - * * terms of the Apache License, Version 2.0 which is available at - * * https://www.apache.org/licenses/LICENSE-2.0. - * * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * * License for the specific language governing permissions and limitations - * * under the License. 
- * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ -package org.datavec.python; - -import lombok.extern.slf4j.Slf4j; -import org.nd4j.common.tests.AbstractAssertTestsClass; -import org.nd4j.common.tests.BaseND4JTest; - -import java.util.*; - -/** - * This class checks that all test classes (i.e., anything with one or more methods annotated with @Test) - * extends BaseND4jTest - either directly or indirectly. - * Other than a small set of exceptions, all tests must extend this - * - * @author Alex Black - */ - -@Slf4j -public class AssertTestsExtendBaseClass extends AbstractAssertTestsClass { - - @Override - protected Set> getExclusions() { - //Set of classes that are exclusions to the rule (either run manually or have their own logging + timeouts) - return new HashSet<>(); - } - - @Override - protected String getPackageName() { - return "org.datavec.python"; - } - - @Override - protected Class getBaseClass() { - return BaseND4JTest.class; - } -} diff --git a/contrib/attic/datavec-python/src/test/java/org/datavec/python/PythonNumpyTest.java b/contrib/attic/datavec-python/src/test/java/org/datavec/python/PythonNumpyTest.java deleted file mode 100644 index bde4b70f8..000000000 --- a/contrib/attic/datavec-python/src/test/java/org/datavec/python/PythonNumpyTest.java +++ /dev/null @@ -1,78 +0,0 @@ -/* - * ****************************************************************************** - * * Copyright (c) 2021 Deeplearning4j Contributors - * * - * * This program and the accompanying materials are made available under the - * * terms of the Apache License, Version 2.0 which is available at - * * https://www.apache.org/licenses/LICENSE-2.0. - * * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. 
See the - * * License for the specific language governing permissions and limitations - * * under the License. - * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ - -package org.datavec.python; - -import org.junit.Test; -import org.junit.runner.RunWith; -import org.junit.runners.Parameterized; -import org.nd4j.linalg.api.buffer.DataType; -import org.nd4j.linalg.api.ndarray.INDArray; -import org.nd4j.linalg.factory.Nd4j; - -import static junit.framework.TestCase.assertEquals; - -@RunWith(Parameterized.class) -public class PythonNumpyTest { - - @Parameterized.Parameters(name = "{index}: Testing with DataType={0}") - public static DataType[] data() { - return new DataType[] { - DataType.BOOL, - DataType.FLOAT16, - DataType.BFLOAT16, - DataType.FLOAT, - DataType.DOUBLE, - DataType.INT8, - DataType.INT16, - DataType.INT32, - DataType.INT64, - DataType.UINT8, - DataType.UINT16, - DataType.UINT32, - DataType.UINT64 - }; - } - - private DataType dataType; - - public PythonNumpyTest(DataType dataType) { - this.dataType = dataType; - } - - @Test - public void numpyAndNd4jConversions() throws Exception { - INDArray input = Nd4j.ones(dataType, 2, 2, 2); - - PythonVariables inputs = new PythonVariables(); - inputs.addNDArray("x", input); - - PythonVariables outputs = new PythonVariables(); - outputs.addNDArray("y"); - - PythonJob pythonJob = new PythonJob(String.format("job_%s", dataType.name()), "y = x", false); - - pythonJob.exec(inputs, outputs); - - INDArray output = outputs.getNDArrayValue("y"); - - // As numpy doesn't support BFLOAT16 we'll convert it to FLOAT - assertEquals(dataType == DataType.BFLOAT16 ? 
input.castTo(DataType.FLOAT) : input, - output); - } -} diff --git a/contrib/attic/datavec-python/src/test/java/org/datavec/python/ScalarAndArrayTest.java b/contrib/attic/datavec-python/src/test/java/org/datavec/python/ScalarAndArrayTest.java deleted file mode 100644 index 22941890b..000000000 --- a/contrib/attic/datavec-python/src/test/java/org/datavec/python/ScalarAndArrayTest.java +++ /dev/null @@ -1,50 +0,0 @@ -/* - * ****************************************************************************** - * * Copyright (c) 2021 Deeplearning4j Contributors - * * - * * This program and the accompanying materials are made available under the - * * terms of the Apache License, Version 2.0 which is available at - * * https://www.apache.org/licenses/LICENSE-2.0. - * * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * * License for the specific language governing permissions and limitations - * * under the License. 
- * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ - -package org.datavec.python; - -import org.junit.Test; -import org.junit.runner.RunWith; -import org.junit.runners.Parameterized; -import org.nd4j.linalg.api.ndarray.INDArray; -import org.nd4j.linalg.factory.Nd4j; - -import static junit.framework.TestCase.assertEquals; - -@RunWith(Parameterized.class) -public class ScalarAndArrayTest { - - @Parameterized.Parameters(name = "{index}: Testing with INDArray={0}") - public static INDArray[] data() { - return new INDArray[]{ - Nd4j.scalar(10), - Nd4j.ones(10, 10, 10, 10) - }; - } - - private INDArray indArray; - - public ScalarAndArrayTest(INDArray indArray) { - this.indArray = indArray; - } - - @Test - public void testINDArray() throws PythonException { - assertEquals(indArray, new PythonObject(indArray).toNumpy().getNd4jArray()); - } -} diff --git a/contrib/attic/datavec-python/src/test/java/org/datavec/python/TestPythonContextManager.java b/contrib/attic/datavec-python/src/test/java/org/datavec/python/TestPythonContextManager.java deleted file mode 100644 index 2d74ad2a3..000000000 --- a/contrib/attic/datavec-python/src/test/java/org/datavec/python/TestPythonContextManager.java +++ /dev/null @@ -1,89 +0,0 @@ - -/* - * ****************************************************************************** - * * Copyright (c) 2021 Deeplearning4j Contributors - * * - * * This program and the accompanying materials are made available under the - * * terms of the Apache License, Version 2.0 which is available at - * * https://www.apache.org/licenses/LICENSE-2.0. - * * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * * License for the specific language governing permissions and limitations - * * under the License. 
- * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ - -package org.datavec.python; - - -import org.junit.Assert; -import org.junit.Test; -import org.nd4j.linalg.api.buffer.DataType; -import org.nd4j.linalg.api.ndarray.INDArray; -import org.nd4j.linalg.factory.Nd4j; -import javax.annotation.concurrent.NotThreadSafe; - -@NotThreadSafe -public class TestPythonContextManager { - - @Test - public void testInt() throws Exception{ - Python.setContext("context1"); - Python.exec("a = 1"); - Python.setContext("context2"); - Python.exec("a = 2"); - Python.setContext("context3"); - Python.exec("a = 3"); - - - Python.setContext("context1"); - Assert.assertEquals(1, PythonExecutioner.getVariable("a").toInt()); - - Python.setContext("context2"); - Assert.assertEquals(2, PythonExecutioner.getVariable("a").toInt()); - - Python.setContext("context3"); - Assert.assertEquals(3, PythonExecutioner.getVariable("a").toInt()); - - PythonContextManager.deleteNonMainContexts(); - } - - @Test - public void testNDArray() throws Exception{ - Python.setContext("context1"); - Python.exec("import numpy as np"); - Python.exec("a = np.zeros((3,2)) + 1"); - - Python.setContext("context2"); - Python.exec("import numpy as np"); - Python.exec("a = np.zeros((3,2)) + 2"); - - Python.setContext("context3"); - Python.exec("import numpy as np"); - Python.exec("a = np.zeros((3,2)) + 3"); - - Python.setContext("context1"); - Python.exec("a += 1"); - - Python.setContext("context2"); - Python.exec("a += 2"); - - Python.setContext("context3"); - Python.exec("a += 3"); - - INDArray arr = Nd4j.create(DataType.DOUBLE, 3, 2); - Python.setContext("context1"); - Assert.assertEquals(arr.add(2), PythonExecutioner.getVariable("a").toNumpy().getNd4jArray()); - - Python.setContext("context2"); - Assert.assertEquals(arr.add(4), PythonExecutioner.getVariable("a").toNumpy().getNd4jArray()); - - Python.setContext("context3"); - 
Assert.assertEquals(arr.add(6), PythonExecutioner.getVariable("a").toNumpy().getNd4jArray()); - } - -} diff --git a/contrib/attic/datavec-python/src/test/java/org/datavec/python/TestPythonDict.java b/contrib/attic/datavec-python/src/test/java/org/datavec/python/TestPythonDict.java deleted file mode 100644 index f1b5426b5..000000000 --- a/contrib/attic/datavec-python/src/test/java/org/datavec/python/TestPythonDict.java +++ /dev/null @@ -1,61 +0,0 @@ - -/* - * ****************************************************************************** - * * Copyright (c) 2021 Deeplearning4j Contributors - * * - * * This program and the accompanying materials are made available under the - * * terms of the Apache License, Version 2.0 which is available at - * * https://www.apache.org/licenses/LICENSE-2.0. - * * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * * License for the specific language governing permissions and limitations - * * under the License. 
- * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ - -package org.datavec.python; - - -import org.junit.Test; -import org.nd4j.linalg.factory.Nd4j; - -import java.util.Arrays; -import java.util.HashMap; -import java.util.Map; - -import static org.junit.Assert.assertEquals; - -@javax.annotation.concurrent.NotThreadSafe -public class TestPythonDict { - - @Test - public void testPythonDictFromMap() throws Exception{ - Map map = new HashMap<>(); - map.put("a", 1); - map.put("b", "a"); - map.put("1", Arrays.asList(1, 2, 3, "4", Arrays.asList("x", 2.3))); - Map innerMap = new HashMap<>(); - innerMap.put("k", 32); - map.put("inner", innerMap); - map.put("ndarray", Nd4j.linspace(1, 4, 4)); - innerMap.put("ndarray", Nd4j.linspace(5, 8, 4)); - PythonObject dict = new PythonObject(map); - assertEquals(map.size(), Python.len(dict).toInt()); - assertEquals("{'a': 1, '1': [1, 2, 3, '4', ['" + - "x', 2.3]], 'b': 'a', 'inner': {'k': 32," + - " 'ndarray': array([5., 6., 7., 8.], dty" + - "pe=float32)}, 'ndarray': array([1., 2., " + - "3., 4.], dtype=float32)}", - dict.toString()); - Map map2 = dict.toMap(); - PythonObject dict2 = new PythonObject(map2); - assertEquals(dict.toString(), dict2.toString()); - - - } - -} diff --git a/contrib/attic/datavec-python/src/test/java/org/datavec/python/TestPythonExecutioner.java b/contrib/attic/datavec-python/src/test/java/org/datavec/python/TestPythonExecutioner.java deleted file mode 100644 index 564ca4425..000000000 --- a/contrib/attic/datavec-python/src/test/java/org/datavec/python/TestPythonExecutioner.java +++ /dev/null @@ -1,416 +0,0 @@ -/* - * ****************************************************************************** - * * Copyright (c) 2021 Deeplearning4j Contributors - * * - * * This program and the accompanying materials are made available under the - * * terms of the Apache License, Version 2.0 which is available at - * * 
https://www.apache.org/licenses/LICENSE-2.0. - * * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * * License for the specific language governing permissions and limitations - * * under the License. - * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ - -package org.datavec.python; - -import org.bytedeco.javacpp.BytePointer; -import org.junit.Assert; -import org.junit.Ignore; -import org.junit.Test; -import org.nd4j.linalg.api.buffer.BaseDataBuffer; -import org.nd4j.linalg.api.buffer.DataType; -import org.nd4j.linalg.api.ndarray.INDArray; -import org.nd4j.linalg.factory.Nd4j; -import org.nd4j.nativeblas.OpaqueDataBuffer; - -import java.lang.reflect.Method; - -import static org.junit.Assert.assertEquals; -import static org.junit.Assert.fail; - - -@javax.annotation.concurrent.NotThreadSafe -public class TestPythonExecutioner { - - - @org.junit.Test - public void testPythonSysVersion() throws PythonException { - Python.exec("import sys; print(sys.version)"); - } - - @Test - public void testStr() throws Exception { - - PythonVariables pyInputs = new PythonVariables(); - PythonVariables pyOutputs = new PythonVariables(); - - pyInputs.addStr("x", "Hello"); - pyInputs.addStr("y", "World"); - - pyOutputs.addStr("z"); - - String code = "z = x + ' ' + y"; - - Python.exec(code, pyInputs, pyOutputs); - - String z = pyOutputs.getStrValue("z"); - - System.out.println(z); - - assertEquals("Hello World", z); - } - - @Test - public void testInt() throws Exception { - PythonVariables pyInputs = new PythonVariables(); - PythonVariables pyOutputs = new PythonVariables(); - - pyInputs.addInt("x", 10); - pyInputs.addInt("y", 20); - - String code = "z = x + y"; - - pyOutputs.addInt("z"); - - - Python.exec(code, pyInputs, 
pyOutputs); - - long z = pyOutputs.getIntValue("z"); - - Assert.assertEquals(30, z); - - } - - @Test - public void testList() throws Exception { - PythonVariables pyInputs = new PythonVariables(); - PythonVariables pyOutputs = new PythonVariables(); - - Object[] x = new Object[]{1L, 2L, 3L, "a", "b", "c"}; - Object[] y = new Object[]{4L, 5L, 6L, "d", "e", "f"}; - - pyInputs.addList("x", x); - pyInputs.addList("y", y); - - String code = "z = x + y"; - - pyOutputs.addList("z"); - - - Python.exec(code, pyInputs, pyOutputs); - - Object[] z = pyOutputs.getListValue("z").toArray(); - - Assert.assertEquals(z.length, x.length + y.length); - - for (int i = 0; i < x.length; i++) { - if (x[i] instanceof Number) { - Number xNum = (Number) x[i]; - Number zNum = (Number) z[i]; - Assert.assertEquals(xNum.intValue(), zNum.intValue()); - } else { - Assert.assertEquals(x[i], z[i]); - } - - } - for (int i = 0; i < y.length; i++) { - if (y[i] instanceof Number) { - Number yNum = (Number) y[i]; - Number zNum = (Number) z[x.length + i]; - Assert.assertEquals(yNum.intValue(), zNum.intValue()); - } else { - Assert.assertEquals(y[i], z[x.length + i]); - - } - - } - - } - - @Test - public void testNDArrayFloat() throws Exception { - PythonVariables pyInputs = new PythonVariables(); - PythonVariables pyOutputs = new PythonVariables(); - - pyInputs.addNDArray("x", Nd4j.zeros(DataType.FLOAT, 2, 3)); - pyInputs.addNDArray("y", Nd4j.ones(DataType.FLOAT, 2, 3)); - pyOutputs.addNDArray("z"); - - String code = "z = x + y"; - - Python.exec(code, pyInputs, pyOutputs); - INDArray z = pyOutputs.getNDArrayValue("z"); - - Assert.assertEquals(6.0, z.sum().getDouble(0), 1e-5); - - - } - - @Test - @Ignore - public void testTensorflowCustomAnaconda() throws PythonException { - Python.exec("import tensorflow as tf"); - } - - @Test - public void testNDArrayDouble() throws Exception { - PythonVariables pyInputs = new PythonVariables(); - PythonVariables pyOutputs = new PythonVariables(); - - 
pyInputs.addNDArray("x", Nd4j.zeros(DataType.DOUBLE, 2, 3)); - pyInputs.addNDArray("y", Nd4j.ones(DataType.DOUBLE, 2, 3)); - pyOutputs.addNDArray("z"); - - String code = "z = x + y"; - - Python.exec(code, pyInputs, pyOutputs); - INDArray z = pyOutputs.getNDArrayValue("z"); - - Assert.assertEquals(6.0, z.sum().getDouble(0), 1e-5); - } - - @Test - public void testNDArrayShort() throws Exception { - PythonVariables pyInputs = new PythonVariables(); - PythonVariables pyOutputs = new PythonVariables(); - - pyInputs.addNDArray("x", Nd4j.zeros(DataType.SHORT, 2, 3)); - pyInputs.addNDArray("y", Nd4j.ones(DataType.SHORT, 2, 3)); - pyOutputs.addNDArray("z"); - - String code = "z = x + y"; - - Python.exec(code, pyInputs, pyOutputs); - INDArray z = pyOutputs.getNDArrayValue("z"); - - Assert.assertEquals(6.0, z.sum().getDouble(0), 1e-5); - } - - - @Test - public void testNDArrayInt() throws Exception { - PythonVariables pyInputs = new PythonVariables(); - PythonVariables pyOutputs = new PythonVariables(); - - pyInputs.addNDArray("x", Nd4j.zeros(DataType.INT, 2, 3)); - pyInputs.addNDArray("y", Nd4j.ones(DataType.INT, 2, 3)); - pyOutputs.addNDArray("z"); - - String code = "z = x + y"; - - Python.exec(code, pyInputs, pyOutputs); - INDArray z = pyOutputs.getNDArrayValue("z"); - - Assert.assertEquals(6.0, z.sum().getDouble(0), 1e-5); - - } - - @Test - public void testNDArrayLong() throws Exception { - PythonVariables pyInputs = new PythonVariables(); - PythonVariables pyOutputs = new PythonVariables(); - - pyInputs.addNDArray("x", Nd4j.zeros(DataType.LONG, 2, 3)); - pyInputs.addNDArray("y", Nd4j.ones(DataType.LONG, 2, 3)); - pyOutputs.addNDArray("z"); - - String code = "z = x + y"; - - Python.exec(code, pyInputs, pyOutputs); - INDArray z = pyOutputs.getNDArrayValue("z"); - - Assert.assertEquals(6.0, z.sum().getDouble(0), 1e-5); - - } - - - @Test - public void testNDArrayNoCopy() throws Exception{ - PythonVariables pyInputs = new PythonVariables(); - PythonVariables pyOutputs = new 
PythonVariables(); - INDArray arr = Nd4j.rand(3, 2); - ((BaseDataBuffer)arr.data()).syncToPrimary(); - pyInputs.addNDArray("x", arr); - pyOutputs.addNDArray("x"); - INDArray expected = arr.mul(2.3); - String code = "x *= 2.3"; - Python.exec(code, pyInputs, pyOutputs); - Assert.assertEquals(pyInputs.getNDArrayValue("x"), pyOutputs.getNDArrayValue("x")); - Assert.assertEquals(expected, pyOutputs.getNDArrayValue("x")); - Assert.assertEquals(arr.data().address(), pyOutputs.getNDArrayValue("x").data().address()); - } - - @Test - public void testNDArrayInplace() throws Exception{ - PythonVariables pyInputs = new PythonVariables(); - INDArray arr = Nd4j.rand(3, 2); - ((BaseDataBuffer)arr.data()).syncToPrimary(); - pyInputs.addNDArray("x", arr); - INDArray expected = arr.mul(2.3); - String code = "x *= 2.3"; - Python.exec(code, pyInputs, null); - Assert.assertEquals(expected, arr); - } - - @Test - public void testByteBufferInput() throws Exception{ - //ByteBuffer buff = ByteBuffer.allocateDirect(3); - INDArray buff = Nd4j.zeros(new int[]{3}, DataType.BYTE); - buff.putScalar(0, 97); // a - buff.putScalar(1, 98); // b - buff.putScalar(2, 99); // c - ((BaseDataBuffer)buff.data()).syncToPrimary(); - PythonVariables pyInputs = new PythonVariables(); - pyInputs.addBytes("buff", new BytePointer(buff.data().pointer())); - - PythonVariables pyOutputs= new PythonVariables(); - pyOutputs.addStr("out"); - - String code = "out = bytes(buff).decode()"; - Python.exec(code, pyInputs, pyOutputs); - Assert.assertEquals("abc", pyOutputs.getStrValue("out")); - - } - - - @Test - public void testByteBufferOutputNoCopy() throws Exception{ - INDArray buff = Nd4j.zeros(new int[]{3}, DataType.BYTE); - buff.putScalar(0, 97); // a - buff.putScalar(1, 98); // b - buff.putScalar(2, 99); // c - ((BaseDataBuffer)buff.data()).syncToPrimary(); - - - PythonVariables pyInputs = new PythonVariables(); - pyInputs.addBytes("buff", new BytePointer(buff.data().pointer())); - - PythonVariables pyOutputs = new 
PythonVariables(); - pyOutputs.addBytes("buff"); // same name as input, because inplace update - - String code = "buff[0]=99\nbuff[1]=98\nbuff[2]=97"; - Python.exec(code, pyInputs, pyOutputs); - Assert.assertEquals("cba", pyOutputs.getBytesValue("buff").getString()); - Assert.assertEquals(buff.data().address(), pyOutputs.getBytesValue("buff").address()); - } - - @Test - public void testByteBufferInplace() throws Exception{ - INDArray buff = Nd4j.zeros(new int[]{3}, DataType.BYTE); - buff.putScalar(0, 97); // a - buff.putScalar(1, 98); // b - buff.putScalar(2, 99); // c - ((BaseDataBuffer)buff.data()).syncToPrimary(); - - PythonVariables pyInputs = new PythonVariables(); - pyInputs.addBytes("buff", new BytePointer(buff.data().pointer())); - String code = "buff[0]+=2\nbuff[2]-=2"; - Python.exec(code, pyInputs, null); - Assert.assertEquals("cba", pyInputs.getBytesValue("buff").getString()); - INDArray expected = buff.dup(); - expected.putScalar(0, 99); - expected.putScalar(2, 97); - Assert.assertEquals(buff, expected); - - } - - @Test - public void testByteBufferOutputWithCopy() throws Exception{ - INDArray buff = Nd4j.zeros(new int[]{3}, DataType.BYTE); - buff.putScalar(0, 97); // a - buff.putScalar(1, 98); // b - buff.putScalar(2, 99); // c - ((BaseDataBuffer)buff.data()).syncToPrimary(); - - - PythonVariables pyInputs = new PythonVariables(); - pyInputs.addBytes("buff", new BytePointer(buff.data().pointer())); - - PythonVariables pyOutputs = new PythonVariables(); - pyOutputs.addBytes("out"); - - String code = "buff[0]=99\nbuff[1]=98\nbuff[2]=97\nout=bytes(buff)"; - Python.exec(code, pyInputs, pyOutputs); - Assert.assertEquals("cba", pyOutputs.getBytesValue("out").getString()); - } - - @Test - public void testDoubleDeviceAllocation() throws Exception{ - if(!"CUDA".equalsIgnoreCase(Nd4j.getExecutioner().getEnvironmentInformation().getProperty("backend"))){ - return; - } - // Test to make sure that multiple device buffers are not allocated - // for the same host 
buffer - INDArray arr = Nd4j.rand(3, 2); - ((BaseDataBuffer)arr.data()).syncToPrimary(); - long deviceAddress1 = getDeviceAddress(arr); - PythonVariables pyInputs = new PythonVariables(); - pyInputs.addNDArray("arr", arr); - PythonVariables pyOutputs = new PythonVariables(); - pyOutputs.addNDArray("arr"); - String code = "arr += 2"; - Python.exec(code, pyInputs, pyOutputs); - INDArray arr2 = pyOutputs.getNDArrayValue("arr"); - long deviceAddress2 = getDeviceAddress(arr2); - Assert.assertEquals(deviceAddress1, deviceAddress2); - - - } - - @Test - public void testBadCode() throws Exception{ - Python.setContext("badcode"); - PythonVariables pyInputs = new PythonVariables(); - PythonVariables pyOutputs = new PythonVariables(); - - pyInputs.addNDArray("x", Nd4j.zeros(DataType.LONG, 2, 3)); - pyInputs.addNDArray("y", Nd4j.ones(DataType.LONG, 2, 3)); - pyOutputs.addNDArray("z"); - - String code = "z = x + a"; - - try{ - Python.exec(code, pyInputs, pyOutputs); - fail("No exception thrown"); - } catch (PythonException pe ){ - Assert.assertEquals("NameError: name 'a' is not defined", pe.getMessage()); - } - - Python.setMainContext(); - } - - @Test - public void testIsNone(){ - PythonObject d = Python.dict(); - PythonObject none = d.attr("get").call("x"); - Assert.assertTrue(none.isNone()); - d.set(new PythonObject("x"), new PythonObject("y")); - PythonObject notNone = d.attr("get").call("x"); - Assert.assertFalse(notNone.isNone()); - Assert.assertEquals("y", notNone.toString()); - } - - private static long getDeviceAddress(INDArray array){ - if(!"CUDA".equalsIgnoreCase(Nd4j.getExecutioner().getEnvironmentInformation().getProperty("backend"))){ - throw new IllegalStateException("Cannot get device pointer for non-CUDA device"); - } - - //Use reflection here as OpaqueDataBuffer is only available on BaseCudaDataBuffer and BaseCpuDataBuffer - not DataBuffer/BaseDataBuffer - // due to it being defined in nd4j-native-api, not nd4j-api - try { - Class c = 
Class.forName("org.nd4j.linalg.jcublas.buffer.BaseCudaDataBuffer"); - Method m = c.getMethod("getOpaqueDataBuffer"); - OpaqueDataBuffer db = (OpaqueDataBuffer) m.invoke(array.data()); - long address = db.specialBuffer().address(); - return address; - } catch (Throwable t){ - throw new RuntimeException("Error getting OpaqueDataBuffer", t); - } - } - -} diff --git a/contrib/attic/datavec-python/src/test/java/org/datavec/python/TestPythonJob.java b/contrib/attic/datavec-python/src/test/java/org/datavec/python/TestPythonJob.java deleted file mode 100644 index 967908944..000000000 --- a/contrib/attic/datavec-python/src/test/java/org/datavec/python/TestPythonJob.java +++ /dev/null @@ -1,325 +0,0 @@ -/* - * ****************************************************************************** - * * Copyright (c) 2021 Deeplearning4j Contributors - * * - * * This program and the accompanying materials are made available under the - * * terms of the Apache License, Version 2.0 which is available at - * * https://www.apache.org/licenses/LICENSE-2.0. - * * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * * License for the specific language governing permissions and limitations - * * under the License. 
- * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ - -package org.datavec.python; -import org.junit.Test; -import org.nd4j.linalg.factory.Nd4j; - -import static org.junit.Assert.assertEquals; - - -@javax.annotation.concurrent.NotThreadSafe -public class TestPythonJob { - - @Test - public void testPythonJobBasic() throws Exception{ - PythonContextManager.deleteNonMainContexts(); - - String code = "c = a + b"; - PythonJob job = new PythonJob("job1", code, false); - - PythonVariables inputs = new PythonVariables(); - inputs.addInt("a", 2); - inputs.addInt("b", 3); - - PythonVariables outputs = new PythonVariables(); - outputs.addInt("c"); - - job.exec(inputs, outputs); - - assertEquals(5L, (long)outputs.getIntValue("c")); - - inputs = new PythonVariables(); - inputs.addFloat("a", 3.0); - inputs.addFloat("b", 4.0); - - outputs = new PythonVariables(); - outputs.addFloat("c"); - - - job.exec(inputs, outputs); - - assertEquals(7.0, outputs.getFloatValue("c"), 1e-5); - - - inputs = new PythonVariables(); - inputs.addNDArray("a", Nd4j.zeros(3, 2).add(4)); - inputs.addNDArray("b", Nd4j.zeros(3, 2).add(5)); - - outputs = new PythonVariables(); - outputs.addNDArray("c"); - - - job.exec(inputs, outputs); - - assertEquals(Nd4j.zeros(3, 2).add(9), outputs.getNDArrayValue("c")); - } - - @Test - public void testPythonJobReturnAllVariables()throws Exception{ - PythonContextManager.deleteNonMainContexts(); - - String code = "c = a + b"; - PythonJob job = new PythonJob("job1", code, false); - - PythonVariables inputs = new PythonVariables(); - inputs.addInt("a", 2); - inputs.addInt("b", 3); - - - PythonVariables outputs = job.execAndReturnAllVariables(inputs); - - assertEquals(5L, (long)outputs.getIntValue("c")); - - inputs = new PythonVariables(); - inputs.addFloat("a", 3.0); - inputs.addFloat("b", 4.0); - - outputs = job.execAndReturnAllVariables(inputs); - - assertEquals(7.0, 
outputs.getFloatValue("c"), 1e-5); - - - inputs = new PythonVariables(); - inputs.addNDArray("a", Nd4j.zeros(3, 2).add(4)); - inputs.addNDArray("b", Nd4j.zeros(3, 2).add(5)); - - outputs = job.execAndReturnAllVariables(inputs); - - assertEquals(Nd4j.zeros(3, 2).add(9), outputs.getNDArrayValue("c")); - } - - @Test - public void testMultiplePythonJobsParallel()throws Exception{ - PythonContextManager.deleteNonMainContexts(); - - String code1 = "c = a + b"; - PythonJob job1 = new PythonJob("job1", code1, false); - - String code2 = "c = a - b"; - PythonJob job2 = new PythonJob("job2", code2, false); - - PythonVariables inputs = new PythonVariables(); - inputs.addInt("a", 2); - inputs.addInt("b", 3); - - PythonVariables outputs = new PythonVariables(); - outputs.addInt("c"); - - job1.exec(inputs, outputs); - - assertEquals(5L, (long)outputs.getIntValue("c")); - - job2.exec(inputs, outputs); - - assertEquals(-1L, (long)outputs.getIntValue("c")); - - inputs = new PythonVariables(); - inputs.addFloat("a", 3.0); - inputs.addFloat("b", 4.0); - - outputs = new PythonVariables(); - outputs.addFloat("c"); - - - job1.exec(inputs, outputs); - - assertEquals(7.0, outputs.getFloatValue("c"), 1e-5); - - job2.exec(inputs, outputs); - - assertEquals(-1L, outputs.getFloatValue("c"), 1e-5); - - - inputs = new PythonVariables(); - inputs.addNDArray("a", Nd4j.zeros(3, 2).add(4)); - inputs.addNDArray("b", Nd4j.zeros(3, 2).add(5)); - - outputs = new PythonVariables(); - outputs.addNDArray("c"); - - - job1.exec(inputs, outputs); - - assertEquals(Nd4j.zeros(3, 2).add(9), outputs.getNDArrayValue("c")); - - job2.exec(inputs, outputs); - - assertEquals(Nd4j.zeros(3, 2).sub(1), outputs.getNDArrayValue("c")); - } - @Test - public void testPythonJobSetupRun()throws Exception{ - PythonContextManager.deleteNonMainContexts(); - - String code = "five=None\n" + - "def setup():\n" + - " global five\n"+ - " five = 5\n\n" + - "def run(a, b):\n" + - " c = a + b + five\n"+ - " return {'c':c}\n\n"; - 
PythonJob job = new PythonJob("job1", code, true); - - PythonVariables inputs = new PythonVariables(); - inputs.addInt("a", 2); - inputs.addInt("b", 3); - - PythonVariables outputs = new PythonVariables(); - outputs.addInt("c"); - - job.exec(inputs, outputs); - - assertEquals(10L, (long)outputs.getIntValue("c")); - - inputs = new PythonVariables(); - inputs.addFloat("a", 3.0); - inputs.addFloat("b", 4.0); - - outputs = new PythonVariables(); - outputs.addFloat("c"); - - - job.exec(inputs, outputs); - - assertEquals(12.0, outputs.getFloatValue("c"), 1e-5); - - - inputs = new PythonVariables(); - inputs.addNDArray("a", Nd4j.zeros(3, 2).add(4)); - inputs.addNDArray("b", Nd4j.zeros(3, 2).add(5)); - - outputs = new PythonVariables(); - outputs.addNDArray("c"); - - - job.exec(inputs, outputs); - - assertEquals(Nd4j.zeros(3, 2).add(14), outputs.getNDArrayValue("c")); - } - @Test - public void testPythonJobSetupRunAndReturnAllVariables()throws Exception{ - PythonContextManager.deleteNonMainContexts(); - - String code = "five=None\n" + - "def setup():\n" + - " global five\n"+ - " five = 5\n\n" + - "def run(a, b):\n" + - " c = a + b + five\n"+ - " return {'c':c}\n\n"; - PythonJob job = new PythonJob("job1", code, true); - - PythonVariables inputs = new PythonVariables(); - inputs.addInt("a", 2); - inputs.addInt("b", 3); - - - PythonVariables outputs = job.execAndReturnAllVariables(inputs); - - assertEquals(10L, (long)outputs.getIntValue("c")); - - inputs = new PythonVariables(); - inputs.addFloat("a", 3.0); - inputs.addFloat("b", 4.0); - - outputs = job.execAndReturnAllVariables(inputs); - - assertEquals(12.0, outputs.getFloatValue("c"), 1e-5); - - - inputs = new PythonVariables(); - inputs.addNDArray("a", Nd4j.zeros(3, 2).add(4)); - inputs.addNDArray("b", Nd4j.zeros(3, 2).add(5)); - - outputs = job.execAndReturnAllVariables(inputs); - - assertEquals(Nd4j.zeros(3, 2).add(14), outputs.getNDArrayValue("c")); - } - - @Test - public void 
testMultiplePythonJobsSetupRunParallel()throws Exception{ - PythonContextManager.deleteNonMainContexts(); - - String code1 = "five=None\n" + - "def setup():\n" + - " global five\n"+ - " five = 5\n\n" + - "def run(a, b):\n" + - " c = a + b + five\n"+ - " return {'c':c}\n\n"; - PythonJob job1 = new PythonJob("job1", code1, true); - - String code2 = "five=None\n" + - "def setup():\n" + - " global five\n"+ - " five = 5\n\n" + - "def run(a, b):\n" + - " c = a + b - five\n"+ - " return {'c':c}\n\n"; - PythonJob job2 = new PythonJob("job2", code2, true); - - PythonVariables inputs = new PythonVariables(); - inputs.addInt("a", 2); - inputs.addInt("b", 3); - - PythonVariables outputs = new PythonVariables(); - outputs.addInt("c"); - - job1.exec(inputs, outputs); - - assertEquals(10L, (long)outputs.getIntValue("c")); - - job2.exec(inputs, outputs); - - assertEquals(0L, (long)outputs.getIntValue("c")); - - inputs = new PythonVariables(); - inputs.addFloat("a", 3.0); - inputs.addFloat("b", 4.0); - - outputs = new PythonVariables(); - outputs.addFloat("c"); - - - job1.exec(inputs, outputs); - - assertEquals(12.0, outputs.getFloatValue("c"), 1e-5); - - job2.exec(inputs, outputs); - - assertEquals(2L, outputs.getFloatValue("c"), 1e-5); - - - inputs = new PythonVariables(); - inputs.addNDArray("a", Nd4j.zeros(3, 2).add(4)); - inputs.addNDArray("b", Nd4j.zeros(3, 2).add(5)); - - outputs = new PythonVariables(); - outputs.addNDArray("c"); - - - job1.exec(inputs, outputs); - - assertEquals(Nd4j.zeros(3, 2).add(14), outputs.getNDArrayValue("c")); - - job2.exec(inputs, outputs); - - assertEquals(Nd4j.zeros(3, 2).add(4), outputs.getNDArrayValue("c")); - } - -} diff --git a/contrib/attic/datavec-python/src/test/java/org/datavec/python/TestPythonList.java b/contrib/attic/datavec-python/src/test/java/org/datavec/python/TestPythonList.java deleted file mode 100644 index 6dde5b116..000000000 --- a/contrib/attic/datavec-python/src/test/java/org/datavec/python/TestPythonList.java +++ /dev/null 
@@ -1,106 +0,0 @@ - -/* - * ****************************************************************************** - * * Copyright (c) 2021 Deeplearning4j Contributors - * * - * * This program and the accompanying materials are made available under the - * * terms of the Apache License, Version 2.0 which is available at - * * https://www.apache.org/licenses/LICENSE-2.0. - * * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * * License for the specific language governing permissions and limitations - * * under the License. - * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ - -package org.datavec.python; - - -import org.junit.Test; -import org.nd4j.linalg.factory.Nd4j; - -import java.util.*; - -import static org.junit.Assert.assertEquals; - -@javax.annotation.concurrent.NotThreadSafe -public class TestPythonList { - - @Test - public void testPythonListFromIntArray() { - PythonObject pyList = new PythonObject(new Integer[]{1, 2, 3, 4, 5}); - pyList.attr("append").call(6); - pyList.attr("append").call(7); - pyList.attr("append").call(8); - assertEquals(8, Python.len(pyList).toInt()); - for (int i = 0; i < 8; i++) { - assertEquals(i + 1, pyList.get(i).toInt()); - } - - } - - @Test - public void testPythonListFromLongArray() { - PythonObject pyList = new PythonObject(new Long[]{1L, 2L, 3L, 4L, 5L}); - pyList.attr("append").call(6); - pyList.attr("append").call(7); - pyList.attr("append").call(8); - assertEquals(8, Python.len(pyList).toInt()); - for (int i = 0; i < 8; i++) { - assertEquals(i + 1, pyList.get(i).toInt()); - } - - } - - @Test - public void testPythonListFromDoubleArray() { - PythonObject pyList = new PythonObject(new Double[]{1., 2., 3., 4., 5.}); - pyList.attr("append").call(6); - 
pyList.attr("append").call(7); - pyList.attr("append").call(8); - assertEquals(8, Python.len(pyList).toInt()); - for (int i = 0; i < 8; i++) { - assertEquals(i + 1, pyList.get(i).toInt()); - assertEquals((double) i + 1, pyList.get(i).toDouble(), 1e-5); - } - - } - - @Test - public void testPythonListFromStringArray() { - PythonObject pyList = new PythonObject(new String[]{"abcd", "efg"}); - pyList.attr("append").call("hijk"); - pyList.attr("append").call("lmnop"); - assertEquals("abcdefghijklmnop", new PythonObject("").attr("join").call(pyList).toString()); - } - - @Test - public void testPythonListFromMixedArray()throws Exception { - Map map = new HashMap<>(); - map.put(1, "a"); - map.put("a", Arrays.asList("a", "b", "c")); - map.put("arr", Nd4j.linspace(1, 4, 4)); - Object[] objs = new Object[]{ - 1, 2, "a", 3f, 4L, 5.0, Arrays.asList(10, - 20, "b", 30f, 40L, 50.0, map - - ), map - }; - PythonObject pyList = new PythonObject(objs); - System.out.println(pyList.toString()); - String expectedStr = "[1, 2, 'a', 3.0, 4, 5.0, [10" + - ", 20, 'b', 30.0, 40, 50.0, {'arr': array([1.," + - " 2., 3., 4.], dtype=float32), 1: 'a', 'a': [" + - "'a', 'b', 'c']}], {'arr': array([1., 2., 3.," + - " 4.], dtype=float32), 1: 'a', 'a': ['a', 'b', 'c']}]"; - assertEquals(expectedStr, pyList.toString()); - List objs2 = pyList.toList(); - PythonObject pyList2 = new PythonObject(objs2); - assertEquals(pyList.toString(), pyList2.toString()); - } - -} diff --git a/contrib/attic/datavec-python/src/test/java/org/datavec/python/TestPythonVariables.java b/contrib/attic/datavec-python/src/test/java/org/datavec/python/TestPythonVariables.java deleted file mode 100644 index 82a537a22..000000000 --- a/contrib/attic/datavec-python/src/test/java/org/datavec/python/TestPythonVariables.java +++ /dev/null @@ -1,94 +0,0 @@ -/* - * ****************************************************************************** - * * Copyright (c) 2021 Deeplearning4j Contributors - * * - * * This program and the 
accompanying materials are made available under the - * * terms of the Apache License, Version 2.0 which is available at - * * https://www.apache.org/licenses/LICENSE-2.0. - * * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * * License for the specific language governing permissions and limitations - * * under the License. - * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ - -package org.datavec.python; - -import org.bytedeco.javacpp.BytePointer; -import org.junit.Test; -import org.nd4j.linalg.api.buffer.BaseDataBuffer; -import org.nd4j.linalg.api.ndarray.INDArray; -import org.nd4j.linalg.factory.Nd4j; - -import java.util.Arrays; -import java.util.Collections; -import java.util.List; - -import static junit.framework.TestCase.assertNotNull; -import static junit.framework.TestCase.assertNull; -import static org.junit.Assert.assertArrayEquals; -import static org.junit.Assert.assertEquals; -import static org.junit.Assert.assertTrue; - -public class TestPythonVariables { - - @Test - public void testDataAssociations() throws PythonException{ - PythonVariables pythonVariables = new PythonVariables(); - PythonType[] types = { - PythonType.INT, - PythonType.FLOAT, - PythonType.STR, - PythonType.BOOL, - PythonType.DICT, - PythonType.LIST, - PythonType.LIST, - PythonType.NDARRAY, - PythonType.BYTES - }; - - INDArray arr = Nd4j.scalar(1.0); - ((BaseDataBuffer)arr.data()).syncToPrimary(); - BytePointer bp = new BytePointer(arr.data().pointer()); - Object[] values = { - 1L,1.0,"1",true, Collections.singletonMap("1",1), - new Object[]{1}, Arrays.asList(1), arr, bp - }; - - Object[] expectedValues = { - 1L,1.0,"1",true, Collections.singletonMap("1",1), - Arrays.asList(1), Arrays.asList(1), arr, bp - }; - - 
for(int i = 0; i < types.length; i++) { - testInsertGet(pythonVariables,types[i].getName().name() + i,values[i],types[i],expectedValues[i]); - } - - assertEquals(types.length,pythonVariables.getVariables().length); - - } - - private void testInsertGet(PythonVariables pythonVariables,String key,Object value,PythonType type,Object expectedValue) throws PythonException{ - pythonVariables.add(key, type); - assertNull(pythonVariables.getValue(key)); - pythonVariables.setValue(key,value); - assertNotNull(pythonVariables.getValue(key)); - Object actualValue = pythonVariables.getValue(key); - if (expectedValue instanceof Object[]){ - assertTrue(actualValue instanceof List); - Object[] actualArr = ((List)actualValue).toArray(); - Object[] expectedArr = (Object[])expectedValue; - assertArrayEquals(expectedArr, actualArr); - } - else{ - assertEquals(expectedValue,pythonVariables.getValue(key)); - } - - } - - -} diff --git a/contrib/attic/datavec-python/src/test/java/org/datavec/python/TestSerde.java b/contrib/attic/datavec-python/src/test/java/org/datavec/python/TestSerde.java deleted file mode 100644 index 146c68331..000000000 --- a/contrib/attic/datavec-python/src/test/java/org/datavec/python/TestSerde.java +++ /dev/null @@ -1,55 +0,0 @@ -/* - * ****************************************************************************** - * * Copyright (c) 2021 Deeplearning4j Contributors - * * - * * This program and the accompanying materials are made available under the - * * terms of the Apache License, Version 2.0 which is available at - * * https://www.apache.org/licenses/LICENSE-2.0. - * * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * * License for the specific language governing permissions and limitations - * * under the License. 
- * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ - -package org.datavec.python; - -import org.datavec.api.transform.Transform; -import org.datavec.api.transform.schema.Schema; -import org.datavec.api.transform.serde.JsonSerializer; -import org.datavec.api.transform.serde.YamlSerializer; -import static org.junit.Assert.assertEquals; -import org.junit.Test; - -public class TestSerde { - - public static YamlSerializer y = new YamlSerializer(); - public static JsonSerializer j = new JsonSerializer(); - - @Test(timeout = 60000L) - public void testBasicSerde(){ - Schema schema = new Schema.Builder() - .addColumnInteger("col1") - .addColumnFloat("col2") - .addColumnString("col3") - .addColumnDouble("col4") - .build(); - - Transform t = PythonTransform.builder().code( - "col1+=3\ncol2+=2\ncol3+='a'\ncol4+=2.0" - ).inputSchema(schema).outputSchema(schema).build(); - - String yaml = y.serialize(t); - String json = j.serialize(t); - - Transform t2 = y.deserializeTransform(yaml); - Transform t3 = j.deserializeTransform(json); - assertEquals(t, t2); - assertEquals(t, t3); - } - -} diff --git a/contrib/attic/deeplearning4j-remote/deeplearning4j-json-server/pom.xml b/contrib/attic/deeplearning4j-remote/deeplearning4j-json-server/pom.xml deleted file mode 100644 index cd4f4f8d6..000000000 --- a/contrib/attic/deeplearning4j-remote/deeplearning4j-json-server/pom.xml +++ /dev/null @@ -1,118 +0,0 @@ - - - - - - 4.0.0 - - - org.deeplearning4j - deeplearning4j-remote - 1.0.0-SNAPSHOT - - - deeplearning4j-json-server - - deeplearning4j-json-server - - - - junit - junit - - - org.projectlombok - lombok - ${lombok.version} - provided - - - org.nd4j - nd4j-api - ${project.version} - - - org.nd4j - nd4j-json-client - ${project.version} - - - org.nd4j - nd4j-json-server - ${project.version} - - - org.deeplearning4j - deeplearning4j-parallel-wrapper - ${project.version} - - - org.slf4j - slf4j-api - - - 
ch.qos.logback - logback-core - test - - - ch.qos.logback - logback-classic - test - - - org.deeplearning4j - deeplearning4j-common-tests - ${project.version} - test - - - - - - testresources - - - test-nd4j-native - - - org.nd4j - nd4j-native - ${project.version} - test - - - - - test-nd4j-cuda-11.0 - - - org.nd4j - nd4j-cuda-11.0 - ${project.version} - test - - - - - diff --git a/contrib/attic/deeplearning4j-remote/deeplearning4j-json-server/src/main/java/org/deeplearning4j/remote/DL4jServlet.java b/contrib/attic/deeplearning4j-remote/deeplearning4j-json-server/src/main/java/org/deeplearning4j/remote/DL4jServlet.java deleted file mode 100644 index ffe43176a..000000000 --- a/contrib/attic/deeplearning4j-remote/deeplearning4j-json-server/src/main/java/org/deeplearning4j/remote/DL4jServlet.java +++ /dev/null @@ -1,290 +0,0 @@ -/* - * ****************************************************************************** - * * Copyright (c) 2021 Deeplearning4j Contributors - * * - * * This program and the accompanying materials are made available under the - * * terms of the Apache License, Version 2.0 which is available at - * * https://www.apache.org/licenses/LICENSE-2.0. - * * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * * License for the specific language governing permissions and limitations - * * under the License. 
- * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ - -package org.deeplearning4j.remote; - -import lombok.*; -import lombok.extern.slf4j.Slf4j; -import org.deeplearning4j.nn.api.Model; -import org.deeplearning4j.nn.graph.ComputationGraph; -import org.deeplearning4j.nn.multilayer.MultiLayerNetwork; -import org.deeplearning4j.parallelism.ParallelInference; -import org.nd4j.adapters.InferenceAdapter; -import org.nd4j.common.base.Preconditions; -import org.nd4j.linalg.dataset.MultiDataSet; -import org.nd4j.remote.clients.serde.BinaryDeserializer; -import org.nd4j.remote.clients.serde.BinarySerializer; -import org.nd4j.remote.clients.serde.JsonDeserializer; -import org.nd4j.remote.clients.serde.JsonSerializer; -import org.nd4j.remote.serving.SameDiffServlet; - -import javax.servlet.http.HttpServletRequest; -import javax.servlet.http.HttpServletResponse; -import java.io.BufferedReader; -import java.io.IOException; -import java.io.InputStreamReader; - - - -/** - * - * @author astoyakin - */ -@Slf4j -@NoArgsConstructor -public class DL4jServlet extends SameDiffServlet { - - protected ParallelInference parallelInference; - protected Model model; - protected boolean parallelEnabled = true; - - public DL4jServlet(@NonNull ParallelInference parallelInference, @NonNull InferenceAdapter inferenceAdapter, - JsonSerializer serializer, JsonDeserializer deserializer) { - super(inferenceAdapter, serializer, deserializer); - this.parallelInference = parallelInference; - this.model = null; - this.parallelEnabled = true; - } - - public DL4jServlet(@NonNull Model model, @NonNull InferenceAdapter inferenceAdapter, - JsonSerializer serializer, JsonDeserializer deserializer) { - super(inferenceAdapter, serializer, deserializer); - this.model = model; - this.parallelInference = null; - this.parallelEnabled = false; - } - - public DL4jServlet(@NonNull ParallelInference parallelInference, @NonNull 
InferenceAdapter inferenceAdapter, - BinarySerializer serializer, BinaryDeserializer deserializer) { - super(inferenceAdapter, serializer, deserializer); - this.parallelInference = parallelInference; - this.model = null; - this.parallelEnabled = true; - } - - public DL4jServlet(@NonNull Model model, @NonNull InferenceAdapter inferenceAdapter, - JsonSerializer jsonSerializer, JsonDeserializer jsonDeserializer, - BinarySerializer binarySerializer, BinaryDeserializer binaryDeserializer) { - super(inferenceAdapter, jsonSerializer, jsonDeserializer, binarySerializer, binaryDeserializer); - this.model = model; - this.parallelInference = null; - this.parallelEnabled = false; - } - - public DL4jServlet(@NonNull ParallelInference parallelInference, @NonNull InferenceAdapter inferenceAdapter, - JsonSerializer jsonSerializer, JsonDeserializer jsonDeserializer, - BinarySerializer binarySerializer, BinaryDeserializer binaryDeserializer) { - super(inferenceAdapter, jsonSerializer, jsonDeserializer, binarySerializer, binaryDeserializer); - this.parallelInference = parallelInference; - this.model = null; - this.parallelEnabled = true; - } - - private O process(MultiDataSet mds) { - O result = null; - if (parallelEnabled) { - // process result - result = inferenceAdapter.apply(parallelInference.output(mds.getFeatures(), mds.getFeaturesMaskArrays())); - } else { - synchronized (this) { - if (model instanceof ComputationGraph) - result = inferenceAdapter.apply(((ComputationGraph) model).output(false, mds.getFeatures(), mds.getFeaturesMaskArrays())); - else if (model instanceof MultiLayerNetwork) { - Preconditions.checkArgument(mds.getFeatures().length > 0 || (mds.getFeaturesMaskArrays() != null && mds.getFeaturesMaskArrays().length > 0), - "Input data for MultiLayerNetwork is invalid!"); - result = inferenceAdapter.apply(((MultiLayerNetwork) model).output(mds.getFeatures()[0], false, - mds.getFeaturesMaskArrays() != null ? 
mds.getFeaturesMaskArrays()[0] : null, null)); - } - } - } - return result; - } - - @Override - protected void doPost(HttpServletRequest request, HttpServletResponse response) throws IOException { - String processorReturned = ""; - MultiDataSet mds = null; - String path = request.getPathInfo(); - if (path.equals(SERVING_ENDPOINT)) { - val contentType = request.getContentType(); - if (contentType.equals(typeJson)) { - if (validateRequest(request, response)) { - val stream = request.getInputStream(); - val bufferedReader = new BufferedReader(new InputStreamReader(stream)); - char[] charBuffer = new char[128]; - int bytesRead = -1; - val buffer = new StringBuilder(); - while ((bytesRead = bufferedReader.read(charBuffer)) > 0) { - buffer.append(charBuffer, 0, bytesRead); - } - val requestString = buffer.toString(); - - mds = inferenceAdapter.apply(deserializer.deserialize(requestString)); - } - } - else if (contentType.equals(typeBinary)) { - val stream = request.getInputStream(); - int available = request.getContentLength(); - if (available <= 0) { - response.sendError(411, "Content length is unavailable"); - } - else { - byte[] data = new byte[available]; - stream.read(data, 0, available); - - mds = inferenceAdapter.apply(binaryDeserializer.deserialize(data)); - } - } - if (mds == null) - log.error("InferenceAdapter failed"); - else { - val result = process(mds); - if (binarySerializer != null) { - byte[] serialized = binarySerializer.serialize(result); - response.setContentType(typeBinary); - response.setContentLength(serialized.length); - val out = response.getOutputStream(); - out.write(serialized); - } - else { - processorReturned = serializer.serialize(result); - try { - val out = response.getWriter(); - out.write(processorReturned); - } catch (IOException e) { - log.error(e.getMessage()); - } - } - } - } else { - // we return error otherwise - sendError(request.getRequestURI(), response); - } - } - - /** - * Creates servlet to serve models - * - * @param type 
of Input class - * @param type of Output class - * - * @author raver119@gmail.com - * @author astoyakin - */ - public static class Builder { - - private ParallelInference pi; - private Model model; - - private InferenceAdapter inferenceAdapter; - private JsonSerializer serializer; - private JsonDeserializer deserializer; - private BinarySerializer binarySerializer; - private BinaryDeserializer binaryDeserializer; - private int port; - private boolean parallelEnabled = true; - - public Builder(@NonNull ParallelInference pi) { - this.pi = pi; - } - - public Builder(@NonNull Model model) { - this.model = model; - } - - public Builder inferenceAdapter(@NonNull InferenceAdapter inferenceAdapter) { - this.inferenceAdapter = inferenceAdapter; - return this; - } - - /** - * This method is required to specify serializer - * - * @param serializer - * @return - */ - public Builder serializer(JsonSerializer serializer) { - this.serializer = serializer; - return this; - } - - /** - * This method allows to specify deserializer - * - * @param deserializer - * @return - */ - public Builder deserializer(JsonDeserializer deserializer) { - this.deserializer = deserializer; - return this; - } - - /** - * This method is required to specify serializer - * - * @param serializer - * @return - */ - public Builder binarySerializer(BinarySerializer serializer) { - this.binarySerializer = serializer; - return this; - } - - /** - * This method allows to specify deserializer - * - * @param deserializer - * @return - */ - public Builder binaryDeserializer(BinaryDeserializer deserializer) { - this.binaryDeserializer = deserializer; - return this; - } - - /** - * This method allows to specify port - * - * @param port - * @return - */ - public Builder port(int port) { - this.port = port; - return this; - } - - /** - * This method activates parallel inference - * - * @param parallelEnabled - * @return - */ - public Builder parallelEnabled(boolean parallelEnabled) { - this.parallelEnabled = 
parallelEnabled; - return this; - } - - public DL4jServlet build() { - return parallelEnabled ? new DL4jServlet(pi, inferenceAdapter, serializer, deserializer, binarySerializer, binaryDeserializer) : - new DL4jServlet(model, inferenceAdapter, serializer, deserializer, binarySerializer, binaryDeserializer); - } - } -} - - - - diff --git a/contrib/attic/deeplearning4j-remote/deeplearning4j-json-server/src/main/java/org/deeplearning4j/remote/JsonModelServer.java b/contrib/attic/deeplearning4j-remote/deeplearning4j-json-server/src/main/java/org/deeplearning4j/remote/JsonModelServer.java deleted file mode 100644 index ec8c5398a..000000000 --- a/contrib/attic/deeplearning4j-remote/deeplearning4j-json-server/src/main/java/org/deeplearning4j/remote/JsonModelServer.java +++ /dev/null @@ -1,451 +0,0 @@ -/* - * ****************************************************************************** - * * Copyright (c) 2021 Deeplearning4j Contributors - * * - * * This program and the accompanying materials are made available under the - * * terms of the Apache License, Version 2.0 which is available at - * * https://www.apache.org/licenses/LICENSE-2.0. - * * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * * License for the specific language governing permissions and limitations - * * under the License. 
- * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ - -package org.deeplearning4j.remote; - -import lombok.NonNull; -import lombok.val; -import org.deeplearning4j.nn.api.Model; -import org.deeplearning4j.nn.api.ModelAdapter; -import org.deeplearning4j.nn.graph.ComputationGraph; -import org.deeplearning4j.nn.multilayer.MultiLayerNetwork; -import org.deeplearning4j.parallelism.ParallelInference; -import org.deeplearning4j.parallelism.inference.InferenceMode; -import org.deeplearning4j.parallelism.inference.LoadBalanceMode; -import org.nd4j.adapters.InferenceAdapter; -import org.nd4j.adapters.InputAdapter; -import org.nd4j.adapters.OutputAdapter; -import org.nd4j.autodiff.samediff.SameDiff; -import org.nd4j.common.base.Preconditions; -import org.nd4j.linalg.api.ndarray.INDArray; -import org.nd4j.linalg.dataset.MultiDataSet; -import org.nd4j.linalg.factory.Nd4j; -import org.nd4j.remote.SameDiffJsonModelServer; -import org.nd4j.remote.clients.serde.BinaryDeserializer; -import org.nd4j.remote.clients.serde.BinarySerializer; -import org.nd4j.remote.clients.serde.JsonDeserializer; -import org.nd4j.remote.clients.serde.JsonSerializer; - - -import java.util.List; - -/** - * This class provides JSON-based model serving capability for Deeplearning4j/SameDiff models - * - * The server URL will be http://0.0.0.0:{port}/v1/serving - * The server only accepts POST requests - * - * @param <I> type of the input class, i.e. String - * @param <O> type of the output class, i.e. 
Sentiment - * - * @author raver119@gmail.com - * @author astoyakin - */ -public class JsonModelServer extends SameDiffJsonModelServer { - - // all serving goes through ParallelInference - protected ParallelInference parallelInference; - - - protected ModelAdapter modelAdapter; - - // actual models - protected ComputationGraph cgModel; - protected MultiLayerNetwork mlnModel; - - // service stuff - protected InferenceMode inferenceMode; - protected int numWorkers; - - protected boolean enabledParallel = true; - - protected JsonModelServer(@NonNull SameDiff sdModel, InferenceAdapter inferenceAdapter, - JsonSerializer serializer, JsonDeserializer deserializer, - BinarySerializer binarySerializer, BinaryDeserializer binaryDeserializer, - int port, String[] orderedInputNodes, String[] orderedOutputNodes) { - super(sdModel, inferenceAdapter, serializer, deserializer, binarySerializer, binaryDeserializer, port, orderedInputNodes, orderedOutputNodes); - } - - protected JsonModelServer(@NonNull ComputationGraph cgModel, InferenceAdapter inferenceAdapter, - JsonSerializer serializer, JsonDeserializer deserializer, - BinarySerializer binarySerializer, BinaryDeserializer binaryDeserializer, - int port, @NonNull InferenceMode inferenceMode, int numWorkers) { - super(inferenceAdapter, serializer, deserializer, binarySerializer, binaryDeserializer, port); - - this.cgModel = cgModel; - this.inferenceMode = inferenceMode; - this.numWorkers = numWorkers; - } - - protected JsonModelServer(@NonNull MultiLayerNetwork mlnModel, InferenceAdapter inferenceAdapter, - JsonSerializer serializer, JsonDeserializer deserializer, - BinarySerializer binarySerializer, BinaryDeserializer binaryDeserializer, - int port, @NonNull InferenceMode inferenceMode, int numWorkers) { - super(inferenceAdapter, serializer, deserializer, binarySerializer, binaryDeserializer, port); - - this.mlnModel = mlnModel; - this.inferenceMode = inferenceMode; - this.numWorkers = numWorkers; - } - - protected 
JsonModelServer(@NonNull ParallelInference pi, InferenceAdapter inferenceAdapter, - JsonSerializer serializer, JsonDeserializer deserializer, - BinarySerializer binarySerializer, BinaryDeserializer binaryDeserializer, - int port) { - super(inferenceAdapter, serializer, deserializer, binarySerializer, binaryDeserializer, port); - - this.parallelInference = pi; - } - - /** - * This method stops the server - * - * @throws Exception - */ - @Override - public void stop() throws Exception { - if (parallelInference != null) - parallelInference.shutdown(); - super.stop(); - } - - /** - * This method starts the server - * @throws Exception - */ - @Override - public void start() throws Exception { - // If we're only serving sdModel, we can simply delegate to super; no DL4J functionality is required in this case - if (sdModel != null) { - super.start(); - return; - } - Preconditions.checkArgument(cgModel != null || mlnModel != null, "Model serving requires either MultiLayerNetwork or ComputationGraph defined"); - - val model = cgModel != null ? 
(Model) cgModel : (Model) mlnModel; - // PI construction is optional, since we can have it defined - if (enabledParallel) { - if (parallelInference == null) { - Preconditions.checkArgument(numWorkers >= 1, "Number of workers should be >= 1, got " + numWorkers + " instead"); - - parallelInference = new ParallelInference.Builder(model) - .inferenceMode(inferenceMode) - .workers(numWorkers) - .loadBalanceMode(LoadBalanceMode.FIFO) - .batchLimit(16) - .queueLimit(128) - .build(); - } - servingServlet = new DL4jServlet.Builder(parallelInference) - .parallelEnabled(true) - .serializer(serializer) - .deserializer(deserializer) - .binarySerializer(binarySerializer) - .binaryDeserializer(binaryDeserializer) - .inferenceAdapter(inferenceAdapter) - .build(); - } - else { - servingServlet = new DL4jServlet.Builder(model) - .parallelEnabled(false) - .serializer(serializer) - .deserializer(deserializer) - .binarySerializer(binarySerializer) - .binaryDeserializer(binaryDeserializer) - .inferenceAdapter(inferenceAdapter) - .build(); - } - start(port, servingServlet); - } - - /** - * Creates servlet to serve different types of models - * - * @param type of Input class - * @param type of Output class - * - * @author raver119@gmail.com - * @author astoyakin - */ - public static class Builder { - - private SameDiff sdModel; - private ComputationGraph cgModel; - private MultiLayerNetwork mlnModel; - private ParallelInference pi; - - private String[] orderedInputNodes; - private String[] orderedOutputNodes; - - private InferenceAdapter inferenceAdapter; - private JsonSerializer serializer; - private JsonDeserializer deserializer; - private BinarySerializer binarySerializer; - private BinaryDeserializer binaryDeserializer; - - private InputAdapter inputAdapter; - private OutputAdapter outputAdapter; - - private int port; - - private boolean parallelMode = true; - - // these fields actually require defaults - private InferenceMode inferenceMode = InferenceMode.BATCHED; - private int 
numWorkers = Nd4j.getAffinityManager().getNumberOfDevices(); - - public Builder(@NonNull SameDiff sdModel) { - this.sdModel = sdModel; - } - - public Builder(@NonNull MultiLayerNetwork mlnModel) { - this.mlnModel = mlnModel; - } - - public Builder(@NonNull ComputationGraph cgModel) { - this.cgModel = cgModel; - } - - public Builder(@NonNull ParallelInference pi) { - this.pi = pi; - } - - /** - * This method defines the InferenceAdapter implementation used to convert an object of the Input type into a set of INDArrays, and to convert the resulting INDArrays back into an object of the Output type - * @param inferenceAdapter - * @return - */ - public Builder inferenceAdapter(@NonNull InferenceAdapter inferenceAdapter) { - this.inferenceAdapter = inferenceAdapter; - return this; - } - - /** - * This method allows you to specify the InputAdapter to be used for inference - * - * PLEASE NOTE: This method is optional, and requires an OutputAdapter to be defined as well - * @param inputAdapter - * @return - */ - public Builder inputAdapter(@NonNull InputAdapter inputAdapter) { - this.inputAdapter = inputAdapter; - return this; - } - - /** - * This method allows you to specify the OutputAdapter to be used for inference - * - * PLEASE NOTE: This method is optional, and requires an InputAdapter to be defined as well - * @param outputAdapter - * @return - */ - public Builder outputAdapter(@NonNull OutputAdapter outputAdapter) { - this.outputAdapter = outputAdapter; - return this; - } - - /** - * This method allows you to specify the JSON serializer. - * Incompatible with {@link #outputBinarySerializer(BinarySerializer)} - * Only one serializer - deserializer pair can be used by client and server. - * - * @param serializer - * @return - */ - public Builder outputSerializer(@NonNull JsonSerializer serializer) { - this.serializer = serializer; - return this; - } - - /** - * This method allows you to specify the JSON deserializer. 
- * Incompatible with {@link #inputBinaryDeserializer(BinaryDeserializer)} - * Only one serializer - deserializer pair can be used by client and server. - * - * @param deserializer - * @return - */ - public Builder inputDeserializer(@NonNull JsonDeserializer deserializer) { - this.deserializer = deserializer; - return this; - } - - /** - * This method allows you to specify binary serializer. - * Incompatible with {@link #outputSerializer(JsonSerializer)} - * Only one serializer - deserializer pair can be used by client and server. - * - * @param serializer - * @return - */ - public Builder outputBinarySerializer(@NonNull BinarySerializer serializer) { - this.binarySerializer = serializer; - return this; - } - - /** - * This method allows you to specify binary deserializer - * Incompatible with {@link #inputDeserializer(JsonDeserializer)} - * Only one serializer - deserializer pair can be used by client and server. - * - * @param deserializer - * @return - */ - public Builder inputBinaryDeserializer(@NonNull BinaryDeserializer deserializer) { - this.binaryDeserializer = deserializer; - return this; - } - - /** - * This method allows you to specify inference mode for parallel mode. See {@link InferenceMode} for more details - * - * @param inferenceMode - * @return - */ - public Builder inferenceMode(@NonNull InferenceMode inferenceMode) { - this.inferenceMode = inferenceMode; - return this; - } - - /** - * This method allows you to specify number of worker threads for ParallelInference - * - * @param numWorkers - * @return - */ - public Builder numWorkers(int numWorkers) { - this.numWorkers = numWorkers; - return this; - } - - /** - * This method allows you to specify the order in which the inputs should be mapped to the model placeholder arrays. 
This is only required for {@link SameDiff} models, not {@link MultiLayerNetwork} or {@link ComputationGraph} models - * - * PLEASE NOTE: this argument is only used for SameDiff models - * @param args - * @return - */ - public Builder orderedInputNodes(String... args) { - orderedInputNodes = args; - return this; - } - - /** - * This method allows you to specify the order in which the inputs should be mapped to the model placeholder arrays. This is only required for {@link SameDiff} models, not {@link MultiLayerNetwork} or {@link ComputationGraph} models - * - * PLEASE NOTE: this argument is only used for SameDiff models - * @param args - * @return - */ - public Builder orderedInputNodes(@NonNull List<String> args) { - orderedInputNodes = args.toArray(new String[args.size()]); - return this; - } - - /** - * This method allows you to specify the output nodes - * - * PLEASE NOTE: this argument is only used for SameDiff models - * @param args - * @return - */ - public Builder orderedOutputNodes(String... args) { - Preconditions.checkArgument(args != null && args.length > 0, "OutputNodes should contain at least 1 element"); - orderedOutputNodes = args; - return this; - } - - /** - * This method allows you to specify the output nodes - * - * PLEASE NOTE: this argument is only used for SameDiff models - * @param args - * @return - */ - public Builder orderedOutputNodes(@NonNull List<String> args) { - Preconditions.checkArgument(args.size() > 0, "OutputNodes should contain at least 1 element"); - orderedOutputNodes = args.toArray(new String[args.size()]); - return this; - } - - /** - * This method allows you to specify the HTTP port - * - * PLEASE NOTE: the port must be free and within the valid TCP/IP port range - * @param port - * @return - */ - public Builder port(int port) { - this.port = port; - return this; - } - - /** - * This method switches on ParallelInference usage - * @param enable true to use ParallelInference, false to use ComputationGraph or - * MultiLayerNetwork directly - * - * PLEASE NOTE: this 
doesn't apply to SameDiff models - * - * @return - */ - public Builder parallelMode(boolean enable) { - this.parallelMode = enable; - return this; - } - - public JsonModelServer build() { - if (inferenceAdapter == null) { - if (inputAdapter != null && outputAdapter != null) { - inferenceAdapter = new InferenceAdapter() { - @Override - public MultiDataSet apply(I input) { - return inputAdapter.apply(input); - } - - @Override - public O apply(INDArray... outputs) { - return outputAdapter.apply(outputs); - } - }; - } else - throw new IllegalArgumentException("Either InferenceAdapter or InputAdapter + OutputAdapter should be configured"); - } - - JsonModelServer server = null; - if (sdModel != null) { - server = new JsonModelServer(sdModel, inferenceAdapter, serializer, deserializer, binarySerializer, binaryDeserializer, port, orderedInputNodes, orderedOutputNodes); - } - else if (cgModel != null) { - server = new JsonModelServer(cgModel, inferenceAdapter, serializer, deserializer, binarySerializer, binaryDeserializer, port, inferenceMode, numWorkers); - } - else if (mlnModel != null) { - server = new JsonModelServer(mlnModel, inferenceAdapter, serializer, deserializer, binarySerializer, binaryDeserializer, port, inferenceMode, numWorkers); - } - else if (pi != null) { - server = new JsonModelServer(pi, inferenceAdapter, serializer, deserializer, binarySerializer, binaryDeserializer, port); - } - else - throw new IllegalStateException("No models were defined for JsonModelServer"); - - server.enabledParallel = parallelMode; - return server; - } - } - -} diff --git a/contrib/attic/deeplearning4j-remote/deeplearning4j-json-server/src/test/java/org/deeplearning4j/remote/AssertTestsExtendBaseClass.java b/contrib/attic/deeplearning4j-remote/deeplearning4j-json-server/src/test/java/org/deeplearning4j/remote/AssertTestsExtendBaseClass.java deleted file mode 100644 index 0480e4b95..000000000 --- 
a/contrib/attic/deeplearning4j-remote/deeplearning4j-json-server/src/test/java/org/deeplearning4j/remote/AssertTestsExtendBaseClass.java +++ /dev/null @@ -1,50 +0,0 @@ -/* - * ****************************************************************************** - * * Copyright (c) 2021 Deeplearning4j Contributors - * * - * * This program and the accompanying materials are made available under the - * * terms of the Apache License, Version 2.0 which is available at - * * https://www.apache.org/licenses/LICENSE-2.0. - * * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * * License for the specific language governing permissions and limitations - * * under the License. - * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ -package org.deeplearning4j.remote; - -import lombok.extern.slf4j.Slf4j; -import java.util.*; - -import org.deeplearning4j.BaseDL4JTest; -import org.nd4j.common.tests.AbstractAssertTestsClass; - -/** - * This class checks that all test classes (i.e., anything with one or more methods annotated with @Test) - * extends BaseDl4JTest - either directly or indirectly. 
- * Other than a small set of exceptions, all tests must extend this class - * - * @author Alexander Stoyakin - */ -@Slf4j -public class AssertTestsExtendBaseClass extends AbstractAssertTestsClass { - - @Override - protected Set<Class<?>> getExclusions() { - Set<Class<?>> exclusions = new HashSet<>(); - return exclusions; - } - - @Override - protected String getPackageName() { - return "org.deeplearning4j.remote"; - } - - @Override - protected Class<?> getBaseClass() { return BaseDL4JTest.class; } -} - diff --git a/contrib/attic/deeplearning4j-remote/deeplearning4j-json-server/src/test/java/org/deeplearning4j/remote/BinaryModelServerTest.java b/contrib/attic/deeplearning4j-remote/deeplearning4j-json-server/src/test/java/org/deeplearning4j/remote/BinaryModelServerTest.java deleted file mode 100644 index b2c519051..000000000 --- a/contrib/attic/deeplearning4j-remote/deeplearning4j-json-server/src/test/java/org/deeplearning4j/remote/BinaryModelServerTest.java +++ /dev/null @@ -1,294 +0,0 @@ -/* - * ****************************************************************************** - * * Copyright (c) 2021 Deeplearning4j Contributors - * * - * * This program and the accompanying materials are made available under the - * * terms of the Apache License, Version 2.0 which is available at - * * https://www.apache.org/licenses/LICENSE-2.0. - * * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * * License for the specific language governing permissions and limitations - * * under the License. 
- * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ - -package org.deeplearning4j.remote; - -import lombok.extern.slf4j.Slf4j; -import lombok.val; -import org.datavec.image.loader.Java2DNativeImageLoader; -import org.deeplearning4j.BaseDL4JTest; -import org.deeplearning4j.nn.multilayer.MultiLayerNetwork; -import org.deeplearning4j.remote.helpers.ImageConversionUtils; -import org.deeplearning4j.util.ModelSerializer; -import org.junit.After; -import org.junit.Test; -import org.nd4j.adapters.InferenceAdapter; -import org.nd4j.linalg.api.ndarray.INDArray; -import org.nd4j.linalg.dataset.MultiDataSet; -import org.nd4j.common.io.ClassPathResource; -import org.nd4j.remote.clients.JsonRemoteInference; -import org.nd4j.remote.clients.serde.BinaryDeserializer; -import org.nd4j.remote.clients.serde.BinarySerializer; -import org.nd4j.remote.clients.serde.impl.IntegerSerde; -import org.nd4j.common.resources.Resources; -import org.nd4j.shade.jackson.databind.ObjectMapper; - -import javax.imageio.ImageIO; -import java.awt.image.BufferedImage; -import java.io.*; -import java.util.concurrent.Future; -import java.util.concurrent.TimeUnit; - -import static org.deeplearning4j.parallelism.inference.InferenceMode.SEQUENTIAL; -import static org.junit.Assert.*; - -@Slf4j -public class BinaryModelServerTest extends BaseDL4JTest { - private final int PORT = 18080; - - @After - public void pause() throws Exception { - // TODO: the same port is reused by consecutive tests and is not released immediately after shutdown; there may be a better solution than a fixed sleep. 
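The fixed two-second sleep above works around the listening socket lingering (typically in TIME_WAIT) after server shutdown. A more robust alternative, sketched below under the assumption that only bindability matters, is to poll until the port can actually be bound again. The helper name `waitForPortRelease` is illustrative, not part of the DL4J API.

```java
import java.io.IOException;
import java.net.ServerSocket;

public class PortUtil {

    // Returns true once the given TCP port can be bound again, polling every
    // 100 ms until the timeout elapses; returns false if the port never
    // became free within the timeout.
    public static boolean waitForPortRelease(int port, long timeoutMs) throws InterruptedException {
        long deadline = System.currentTimeMillis() + timeoutMs;
        while (System.currentTimeMillis() < deadline) {
            try (ServerSocket ignored = new ServerSocket(port)) {
                return true; // bind succeeded, so the previous server has released the port
            } catch (IOException stillBound) {
                Thread.sleep(100); // port still held (or in TIME_WAIT); retry shortly
            }
        }
        return false;
    }
}
```

A test's `@After` method could then call `waitForPortRelease(PORT, 5000)` instead of sleeping unconditionally, which wastes no time when the port is already free.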
- TimeUnit.SECONDS.sleep(2); - } - - // Internal test for locally defined serializers - @Test - public void testBufferedImageSerde() { - BinarySerializer serde = new BinaryModelServerTest.BufferedImageSerde(); - BufferedImage image = ImageConversionUtils.makeRandomBufferedImage(28,28,1); - byte[] serialized = serde.serialize(image); - - BufferedImage deserialized = ((BufferedImageSerde) serde).deserialize(serialized); - int originalSize = image.getData().getDataBuffer().getSize(); - assertEquals(originalSize, deserialized.getData().getDataBuffer().getSize()); - for (int i = 0; i < originalSize; ++i) { - assertEquals(deserialized.getData().getDataBuffer().getElem(i), - image.getData().getDataBuffer().getElem(i)); - } - } - - @Test - public void testImageToINDArray() { - INDArray data = ImageConversionUtils.makeRandomImageAsINDArray(28,28,1); - assertNotNull(data); - } - - @Test - public void testMlnMnist_ImageInput() throws Exception { - - val modelFile = Resources.asFile("models/mnist/mnist-model.zip"); - MultiLayerNetwork net = ModelSerializer.restoreMultiLayerNetwork(modelFile); - - val server = new JsonModelServer.Builder(net) - .outputSerializer(new IntegerSerde()) - .inputBinaryDeserializer(new BufferedImageSerde()) - .inferenceAdapter(new InferenceAdapter() { - @Override - public MultiDataSet apply(BufferedImage input) { - INDArray data = null; - try { - data = new Java2DNativeImageLoader().asMatrix(input); - data = data.reshape(1, 784); - } - catch (IOException e) { - throw new RuntimeException(e); - } - return new MultiDataSet(data, null); - } - - @Override - public Integer apply(INDArray... 
nnOutput) { - return nnOutput[0].argMax().getInt(0); - } - }) - .port(PORT) - .inferenceMode(SEQUENTIAL) - .numWorkers(1) - .parallelMode(false) - .build(); - - val client = JsonRemoteInference.builder() - .endpointAddress("http://localhost:" + PORT + "/v1/serving") - .inputBinarySerializer(new BufferedImageSerde()) - .outputDeserializer(new IntegerSerde()) - .build(); - - try { - server.start(); - BufferedImage image = ImageConversionUtils.makeRandomBufferedImage(28,28,1); - Integer result = client.predict(image); - assertNotNull(result); - - File file = new ClassPathResource("datavec-local/imagetest/0/b.bmp").getFile(); - image = ImageIO.read(new FileInputStream(file)); - result = client.predict(image); - assertEquals(new Integer(0), result); - - file = new ClassPathResource("datavec-local/imagetest/1/a.bmp").getFile(); - image = ImageIO.read(new FileInputStream(file)); - result = client.predict(image); - assertEquals(new Integer(1), result); - - } catch (Exception e){ - log.error("",e); - throw e; - } finally { - server.stop(); - } - } - - @Test - public void testMlnMnist_ImageInput_Async() throws Exception { - - val modelFile = Resources.asFile("models/mnist/mnist-model.zip"); - MultiLayerNetwork net = ModelSerializer.restoreMultiLayerNetwork(modelFile); - - val server = new JsonModelServer.Builder(net) - .outputSerializer(new IntegerSerde()) - .inputBinaryDeserializer(new BufferedImageSerde()) - .inferenceAdapter(new InferenceAdapter() { - @Override - public MultiDataSet apply(BufferedImage input) { - INDArray data = null; - try { - data = new Java2DNativeImageLoader().asMatrix(input); - data = data.reshape(1, 784); - } - catch (IOException e) { - throw new RuntimeException(e); - } - return new MultiDataSet(data, null); - } - - @Override - public Integer apply(INDArray... 
nnOutput) { - return nnOutput[0].argMax().getInt(0); - } - }) - .port(PORT) - .inferenceMode(SEQUENTIAL) - .numWorkers(1) - .parallelMode(false) - .build(); - - val client = JsonRemoteInference.builder() - .endpointAddress("http://localhost:" + PORT + "/v1/serving") - .inputBinarySerializer(new BufferedImageSerde()) - .outputDeserializer(new IntegerSerde()) - .build(); - - try { - server.start(); - BufferedImage[] images = new BufferedImage[3]; - images[0] = ImageConversionUtils.makeRandomBufferedImage(28,28,1); - - File file = new ClassPathResource("datavec-local/imagetest/0/b.bmp").getFile(); - images[1] = ImageIO.read(new FileInputStream(file)); - - file = new ClassPathResource("datavec-local/imagetest/1/a.bmp").getFile(); - images[2] = ImageIO.read(new FileInputStream(file)); - - Future[] results = new Future[3]; - for (int i = 0; i < images.length; ++i) { - results[i] = client.predictAsync(images[i]); - assertNotNull(results[i]); - } - - assertNotNull(results[0].get()); - assertEquals(new Integer(0), results[1].get()); - assertEquals(new Integer(1), results[2].get()); - - } catch (Exception e){ - log.error("",e); - throw e; - } finally { - server.stop(); - } - } - - @Test - public void testBinaryIn_BinaryOut() throws Exception { - - val modelFile = Resources.asFile("models/mnist/mnist-model.zip"); - MultiLayerNetwork net = ModelSerializer.restoreMultiLayerNetwork(modelFile); - - val server = new JsonModelServer.Builder(net) - .outputBinarySerializer(new BufferedImageSerde()) - .inputBinaryDeserializer(new BufferedImageSerde()) - .inferenceAdapter(new InferenceAdapter() { - @Override - public MultiDataSet apply(BufferedImage input) { - INDArray data = null; - try { - data = new Java2DNativeImageLoader().asMatrix(input); - } - catch (IOException e) { - throw new RuntimeException(e); - } - return new MultiDataSet(data, null); - } - - @Override - public BufferedImage apply(INDArray... 
nnOutput) { - return ImageConversionUtils.makeRandomBufferedImage(28,28,3); - } - }) - .port(PORT) - .inferenceMode(SEQUENTIAL) - .numWorkers(1) - .parallelMode(false) - .build(); - - val client = JsonRemoteInference.builder() - .endpointAddress("http://localhost:" + PORT + "/v1/serving") - .inputBinarySerializer(new BufferedImageSerde()) - .outputBinaryDeserializer(new BufferedImageSerde()) - .build(); - - try { - server.start(); - BufferedImage image = ImageConversionUtils.makeRandomBufferedImage(28,28,1); - BufferedImage result = client.predict(image); - assertNotNull(result); - assertEquals(28, result.getHeight()); - assertEquals(28, result.getWidth()); - - } catch (Exception e){ - log.error("",e); - throw e; - } finally { - server.stop(); - } - } - - private static class BufferedImageSerde implements BinarySerializer, BinaryDeserializer { - - @Override - public BufferedImage deserialize(byte[] buffer) { - try { - BufferedImage img = ImageIO.read(new ByteArrayInputStream(buffer)); - return img; - } catch (IOException e){ - throw new RuntimeException(e); - } - } - - @Override - public byte[] serialize(BufferedImage image) { - try{ - val baos = new ByteArrayOutputStream(); - ImageIO.write(image, "bmp", baos); - byte[] bytes = baos.toByteArray(); - return bytes; - } catch (IOException e){ - throw new RuntimeException(e); - } - } - } -} diff --git a/contrib/attic/deeplearning4j-remote/deeplearning4j-json-server/src/test/java/org/deeplearning4j/remote/JsonModelServerTest.java b/contrib/attic/deeplearning4j-remote/deeplearning4j-json-server/src/test/java/org/deeplearning4j/remote/JsonModelServerTest.java deleted file mode 100644 index 9ab3448a1..000000000 --- a/contrib/attic/deeplearning4j-remote/deeplearning4j-json-server/src/test/java/org/deeplearning4j/remote/JsonModelServerTest.java +++ /dev/null @@ -1,761 +0,0 @@ -/* - * ****************************************************************************** - * * Copyright (c) 2021 Deeplearning4j Contributors - * * - * * 
This program and the accompanying materials are made available under the - * * terms of the Apache License, Version 2.0 which is available at - * * https://www.apache.org/licenses/LICENSE-2.0. - * * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * * License for the specific language governing permissions and limitations - * * under the License. - * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ - -package org.deeplearning4j.remote; - -import lombok.AllArgsConstructor; -import lombok.Data; -import lombok.NoArgsConstructor; -import lombok.extern.slf4j.Slf4j; -import lombok.val; -import org.deeplearning4j.BaseDL4JTest; -import org.deeplearning4j.nn.conf.ComputationGraphConfiguration; -import org.deeplearning4j.nn.conf.MultiLayerConfiguration; -import org.deeplearning4j.nn.conf.NeuralNetConfiguration; -import org.deeplearning4j.nn.conf.graph.MergeVertex; -import org.deeplearning4j.nn.conf.layers.*; -import org.deeplearning4j.nn.graph.ComputationGraph; -import org.deeplearning4j.nn.multilayer.MultiLayerNetwork; -import org.deeplearning4j.nn.weights.WeightInit; -import org.deeplearning4j.parallelism.inference.InferenceMode; -import org.deeplearning4j.remote.helpers.House; -import org.deeplearning4j.remote.helpers.HouseToPredictedPriceAdapter; -import org.deeplearning4j.remote.helpers.PredictedPrice; -import org.junit.After; -import org.junit.Before; -import org.junit.Test; -import org.nd4j.adapters.InferenceAdapter; -import org.nd4j.autodiff.samediff.SDVariable; -import org.nd4j.autodiff.samediff.SameDiff; -import org.nd4j.linalg.activations.Activation; -import org.nd4j.linalg.api.buffer.DataType; -import org.nd4j.linalg.api.ndarray.INDArray; -import org.nd4j.linalg.dataset.MultiDataSet; -import 
org.nd4j.linalg.factory.Nd4j; -import org.nd4j.linalg.learning.config.Adam; -import org.nd4j.linalg.learning.config.Sgd; -import org.nd4j.linalg.lossfunctions.LossFunctions; -import org.nd4j.remote.clients.JsonRemoteInference; -import org.nd4j.remote.clients.serde.JsonDeserializer; -import org.nd4j.remote.clients.serde.JsonSerializer; -import org.nd4j.shade.jackson.databind.ObjectMapper; - - -import java.io.IOException; -import java.util.Collections; -import java.util.concurrent.ExecutionException; -import java.util.concurrent.Future; -import java.util.concurrent.TimeUnit; -import java.util.concurrent.atomic.AtomicInteger; - -import static org.deeplearning4j.parallelism.inference.InferenceMode.INPLACE; -import static org.deeplearning4j.parallelism.inference.InferenceMode.SEQUENTIAL; -import static org.junit.Assert.*; - -@Slf4j -public class JsonModelServerTest extends BaseDL4JTest { - private static final MultiLayerNetwork model; - - static { - val conf = new NeuralNetConfiguration.Builder() - .seed(119) - .updater(new Adam(0.119f)) - .weightInit(WeightInit.XAVIER) - .list() - .layer(0, new DenseLayer.Builder().activation(Activation.TANH).nIn(4).nOut(10).build()) - .layer(1, new OutputLayer.Builder(LossFunctions.LossFunction.SQUARED_LOSS).activation(Activation.SIGMOID).nIn(10).nOut(1).build()) - .build(); - - model = new MultiLayerNetwork(conf); - model.init(); - } - - @After - public void pause() throws Exception { - // Need to wait for server shutdown; without sleep, tests will fail if starting immediately after shutdown - TimeUnit.SECONDS.sleep(2); - } - - private AtomicInteger portCount = new AtomicInteger(18080); - private int PORT; - - @Before - public void setPort(){ - PORT = portCount.getAndIncrement(); - } - - - @Test - public void testStartStopParallel() throws Exception { - val sd = SameDiff.create(); - val sdVariable = sd.placeHolder("input", DataType.INT, 1,4); - val result = sdVariable.add(1.0); - val total = result.mean("total", Integer.MAX_VALUE); - 
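The SameDiff graph defined just above computes `total = mean(input + 1.0)` over all elements. As a plain-Java sketch of that arithmetic (the premise that `HouseToPredictedPriceAdapter` fills each input slot with the district value is an inference from the `district + 1` assertions in these tests, not confirmed here):

```java
public class SdGraphSketch {

    // Mirrors the SameDiff graph: add 1.0 to every element, then take the mean.
    static double meanPlusOne(double[] input) {
        double sum = 0;
        for (double v : input) {
            sum += v + 1.0;
        }
        return sum / input.length;
    }

    public static void main(String[] args) {
        double[] input = {2, 2, 2, 2}; // district = 2 in each of the 4 slots
        System.out.println(meanPlusOne(input)); // prints 3.0
    }
}
```

With every slot equal to the district, the result is `district + 1`, matching the `assertEquals((float) 3.0, price.getPrice(), 1e-5)` check for the SameDiff server below.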
- val serverDL = new JsonModelServer.Builder(model) - .outputSerializer(new PredictedPrice.PredictedPriceSerializer()) - .inputDeserializer(new House.HouseDeserializer()) - .numWorkers(1) - .inferenceMode(SEQUENTIAL) - .inferenceAdapter(new HouseToPredictedPriceAdapter()) - .port(PORT) - .build(); - - val serverSD = new JsonModelServer.Builder(sd) - .outputSerializer(new PredictedPrice.PredictedPriceSerializer()) - .inputDeserializer(new House.HouseDeserializer()) - .inferenceAdapter(new HouseToPredictedPriceAdapter()) - .orderedInputNodes(new String[]{"input"}) - .orderedOutputNodes(new String[]{"total"}) - .port(PORT+1) - .build(); - try { - serverDL.start(); - serverSD.start(); - - val clientDL = JsonRemoteInference.builder() - .inputSerializer(new House.HouseSerializer()) - .outputDeserializer(new PredictedPrice.PredictedPriceDeserializer()) - .endpointAddress("http://localhost:" + PORT + "/v1/serving") - .build(); - - int district = 2; - House house = House.builder().area(100).bathrooms(2).bedrooms(3).district(district).build(); - PredictedPrice price = clientDL.predict(house); - long timeStart = System.currentTimeMillis(); - price = clientDL.predict(house); - long timeStop = System.currentTimeMillis(); - log.info("Time spent: {} ms", timeStop - timeStart); - assertNotNull(price); - assertEquals((float) 0.421444, price.getPrice(), 1e-5); - - val clientSD = JsonRemoteInference.builder() - .inputSerializer(new House.HouseSerializer()) - .outputDeserializer(new PredictedPrice.PredictedPriceDeserializer()) - .endpointAddress("http://localhost:" + (PORT+1) + "/v1/serving") - .build(); - - PredictedPrice price2 = clientSD.predict(house); - timeStart = System.currentTimeMillis(); - price = clientSD.predict(house); - timeStop = System.currentTimeMillis(); - log.info("Time spent: {} ms", timeStop - timeStart); - assertNotNull(price); - assertEquals((float) 3.0, price.getPrice(), 1e-5); - - } - finally { - serverSD.stop(); - serverDL.stop(); - } - } - - @Test - public 
void testStartStopSequential() throws Exception { - val sd = SameDiff.create(); - val sdVariable = sd.placeHolder("input", DataType.INT, 1,4); - val result = sdVariable.add(1.0); - val total = result.mean("total", Integer.MAX_VALUE); - - val serverDL = new JsonModelServer.Builder(model) - .outputSerializer(new PredictedPrice.PredictedPriceSerializer()) - .inputDeserializer(new House.HouseDeserializer()) - .numWorkers(1) - .inferenceMode(SEQUENTIAL) - .inferenceAdapter(new HouseToPredictedPriceAdapter()) - .port(PORT) - .build(); - - val serverSD = new JsonModelServer.Builder(sd) - .outputSerializer(new PredictedPrice.PredictedPriceSerializer()) - .inputDeserializer(new House.HouseDeserializer()) - .inferenceAdapter(new HouseToPredictedPriceAdapter()) - .orderedInputNodes(new String[]{"input"}) - .orderedOutputNodes(new String[]{"total"}) - .port(PORT+1) - .build(); - - serverDL.start(); - serverDL.stop(); - - serverSD.start(); - serverSD.stop(); - } - - @Test - public void basicServingTestForSD() throws Exception { - val sd = SameDiff.create(); - val sdVariable = sd.placeHolder("input", DataType.INT, 1,4); - val result = sdVariable.add(1.0); - val total = result.mean("total", Integer.MAX_VALUE); - - val server = new JsonModelServer.Builder(sd) - .outputSerializer(new PredictedPrice.PredictedPriceSerializer()) - .inputDeserializer(new House.HouseDeserializer()) - .inferenceAdapter(new HouseToPredictedPriceAdapter()) - .orderedInputNodes(new String[]{"input"}) - .orderedOutputNodes(new String[]{"total"}) - .port(PORT) - .build(); - - try { - server.start(); - - val client = JsonRemoteInference.builder() - .inputSerializer(new House.HouseSerializer()) - .outputDeserializer(new PredictedPrice.PredictedPriceDeserializer()) - .endpointAddress("http://localhost:" + PORT + "/v1/serving") - .build(); - - int district = 2; - House house = House.builder().area(100).bathrooms(2).bedrooms(3).district(district).build(); - - // warmup - PredictedPrice price = 
client.predict(house); - - val timeStart = System.currentTimeMillis(); - price = client.predict(house); - val timeStop = System.currentTimeMillis(); - - log.info("Time spent: {} ms", timeStop - timeStart); - - assertNotNull(price); - assertEquals((float) district + 1.0f, price.getPrice(), 1e-5); - } - finally { - server.stop(); - } - } - - @Test - public void basicServingTestForDLSynchronized() throws Exception { - val server = new JsonModelServer.Builder(model) - .outputSerializer(new PredictedPrice.PredictedPriceSerializer()) - .inputDeserializer(new House.HouseDeserializer()) - .numWorkers(1) - .inferenceMode(INPLACE) - .inferenceAdapter(new HouseToPredictedPriceAdapter()) - .port(PORT) - .build(); - - try { - server.start(); - - val client = JsonRemoteInference.builder() - .inputSerializer(new House.HouseSerializer()) - .outputDeserializer(new PredictedPrice.PredictedPriceDeserializer()) - .endpointAddress("http://localhost:" + PORT + "/v1/serving") - .build(); - - int district = 2; - House house1 = House.builder().area(100).bathrooms(2).bedrooms(3).district(district).build(); - House house2 = House.builder().area(50).bathrooms(1).bedrooms(2).district(district).build(); - House house3 = House.builder().area(80).bathrooms(1).bedrooms(3).district(district).build(); - - // warmup - PredictedPrice price = client.predict(house1); - - val timeStart = System.currentTimeMillis(); - PredictedPrice price1 = client.predict(house1); - PredictedPrice price2 = client.predict(house2); - PredictedPrice price3 = client.predict(house3); - val timeStop = System.currentTimeMillis(); - - log.info("Time spent: {} ms", timeStop - timeStart); - - assertNotNull(price); - assertEquals((float) 0.421444, price.getPrice(), 1e-5); - - } finally { - server.stop(); - } - } - - @Test - public void basicServingTestForDL() throws Exception { - - val server = new JsonModelServer.Builder(model) - .outputSerializer(new PredictedPrice.PredictedPriceSerializer()) - .inputDeserializer(new 
House.HouseDeserializer()) - .numWorkers(1) - .inferenceMode(SEQUENTIAL) - .inferenceAdapter(new HouseToPredictedPriceAdapter()) - .port(PORT) - .parallelMode(false) - .build(); - - try { - server.start(); - - val client = JsonRemoteInference.builder() - .inputSerializer(new House.HouseSerializer()) - .outputDeserializer(new PredictedPrice.PredictedPriceDeserializer()) - .endpointAddress("http://localhost:" + PORT + "/v1/serving") - .build(); - - int district = 2; - House house = House.builder().area(100).bathrooms(2).bedrooms(3).district(district).build(); - - // warmup - PredictedPrice price = client.predict(house); - - val timeStart = System.currentTimeMillis(); - price = client.predict(house); - val timeStop = System.currentTimeMillis(); - - log.info("Time spent: {} ms", timeStop - timeStart); - - assertNotNull(price); - assertEquals((float) 0.421444, price.getPrice(), 1e-5); - - } finally { - server.stop(); - } - } - - @Test - public void testDeserialization_1() { - String request = "{\"bedrooms\":3,\"area\":100,\"district\":2,\"bathrooms\":2}"; - val deserializer = new House.HouseDeserializer(); - val result = deserializer.deserialize(request); - assertEquals(2, result.getDistrict()); - assertEquals(100, result.getArea()); - assertEquals(2, result.getBathrooms()); - assertEquals(3, result.getBedrooms()); - - } - - @Test - public void testDeserialization_2() { - String request = "{\"price\":1}"; - val deserializer = new PredictedPrice.PredictedPriceDeserializer(); - val result = deserializer.deserialize(request); - assertEquals(1.0, result.getPrice(), 1e-4); - } - - @Test(expected = NullPointerException.class) - public void negativeServingTest_1() throws Exception { - - val server = new JsonModelServer.Builder(model) - .outputSerializer(new PredictedPrice.PredictedPriceSerializer()) - .inputDeserializer(null) - .port(PORT) - .build(); - } - - @Test //(expected = NullPointerException.class) - public void negativeServingTest_2() throws Exception { - - val server 
= new JsonModelServer.Builder(model) - .outputSerializer(new PredictedPrice.PredictedPriceSerializer()) - .inputDeserializer(new House.HouseDeserializer()) - .inferenceAdapter(new HouseToPredictedPriceAdapter()) - .port(PORT) - .build(); - - } - - @Test(expected = IOException.class) - public void negativeServingTest_3() throws Exception { - - val server = new JsonModelServer.Builder(model) - .outputSerializer(new PredictedPrice.PredictedPriceSerializer()) - .inputDeserializer(new House.HouseDeserializer()) - .inferenceAdapter(new HouseToPredictedPriceAdapter()) - .inferenceMode(SEQUENTIAL) - .numWorkers(1) - .port(PORT) - .build(); - - try { - server.start(); - - val client = JsonRemoteInference.builder() - .inputSerializer(new House.HouseSerializer()) - .outputDeserializer(new JsonDeserializer() { - @Override - public PredictedPrice deserialize(String json) { - return null; - } - }) - .endpointAddress("http://localhost:" + PORT + "/v1/serving") - .build(); - - int district = 2; - House house = House.builder().area(100).bathrooms(2).bedrooms(3).district(district).build(); - - // warmup - PredictedPrice price = client.predict(house); - } finally { - server.stop(); - } - } - - @Test - public void asyncServingTest() throws Exception { - - val server = new JsonModelServer.Builder(model) - .outputSerializer(new PredictedPrice.PredictedPriceSerializer()) - .inputDeserializer(new House.HouseDeserializer()) - .inferenceAdapter(new HouseToPredictedPriceAdapter()) - .inferenceMode(SEQUENTIAL) - .numWorkers(1) - .port(PORT) - .build(); - - try { - server.start(); - - val client = JsonRemoteInference.builder() - .inputSerializer(new House.HouseSerializer()) - .outputDeserializer(new PredictedPrice.PredictedPriceDeserializer()) - .endpointAddress("http://localhost:" + PORT + "/v1/serving") - .build(); - - int district = 2; - House house = House.builder().area(100).bathrooms(2).bedrooms(3).district(district).build(); - - val timeStart = System.currentTimeMillis(); - Future price 
= client.predictAsync(house); - assertNotNull(price); - assertEquals((float) 0.421444, price.get().getPrice(), 1e-5); - val timeStop = System.currentTimeMillis(); - - log.info("Time spent: {} ms", timeStop - timeStart); - } - finally { - server.stop(); - } - } - - @Test - public void negativeAsyncTest() throws Exception { - - val server = new JsonModelServer.Builder(model) - .outputSerializer(new PredictedPrice.PredictedPriceSerializer()) - .inputDeserializer(new House.HouseDeserializer()) - .inferenceAdapter(new HouseToPredictedPriceAdapter()) - .inferenceMode(InferenceMode.BATCHED) - .numWorkers(1) - .port(PORT) - .build(); - - try { - server.start(); - - // Fake deserializer to test failure - val client = JsonRemoteInference.builder() - .inputSerializer(new House.HouseSerializer()) - .outputDeserializer(new JsonDeserializer() { - @Override - public PredictedPrice deserialize(String json) { - return null; - } - }) - .endpointAddress("http://localhost:" + PORT + "/v1/serving") - .build(); - - int district = 2; - House house = House.builder().area(100).bathrooms(2).bedrooms(3).district(district).build(); - - val timeStart = System.currentTimeMillis(); - try { - Future price = client.predictAsync(house); - assertNotNull(price); - assertEquals((float) district + 1.0f, price.get().getPrice(), 1e-5); - val timeStop = System.currentTimeMillis(); - - log.info("Time spent: {} ms", timeStop - timeStart); - } catch (ExecutionException e) { - assertTrue(e.getMessage().contains("Deserialization failed")); - } - } finally { - server.stop(); - } - } - - - @Test - public void testSameDiffMnist() throws Exception { - - SameDiff sd = SameDiff.create(); - SDVariable in = sd.placeHolder("in", DataType.FLOAT, -1, 28*28); - SDVariable w = sd.var("w", Nd4j.rand(DataType.FLOAT, 28*28, 10)); - SDVariable b = sd.var("b", Nd4j.rand(DataType.FLOAT, 1, 10)); - SDVariable sm = sd.nn.softmax("softmax", in.mmul(w).add(b), -1); - - val server = new JsonModelServer.Builder(sd) - 
.outputSerializer( new IntSerde()) - .inputDeserializer(new FloatSerde()) - .inferenceAdapter(new InferenceAdapter() { - @Override - public MultiDataSet apply(float[] input) { - return new MultiDataSet(Nd4j.create(input, 1, input.length), null); - } - - @Override - public Integer apply(INDArray... nnOutput) { - return nnOutput[0].argMax().getInt(0); - } - }) - .orderedInputNodes("in") - .orderedOutputNodes("softmax") - .port(PORT+1) - .build(); - - val client = JsonRemoteInference.builder() - .endpointAddress("http://localhost:" + (PORT+1) + "/v1/serving") - .outputDeserializer(new IntSerde()) - .inputSerializer( new FloatSerde()) - .build(); - - try{ - server.start(); - for( int i=0; i<10; i++ ){ - INDArray f = Nd4j.rand(DataType.FLOAT, 1, 28*28); - INDArray exp = sd.output(Collections.singletonMap("in", f), "softmax").get("softmax"); - float[] fArr = f.toFloatVector(); - int out = client.predict(fArr); - assertEquals(exp.argMax().getInt(0), out); - } - } finally { - server.stop(); - } - } - - @Test - public void testMlnMnist() throws Exception { - - MultiLayerConfiguration conf = new NeuralNetConfiguration.Builder() - .list() - .layer(new DenseLayer.Builder().nIn(784).nOut(10).build()) - .layer(new LossLayer.Builder().activation(Activation.SOFTMAX).build()) - .build(); - - MultiLayerNetwork net = new MultiLayerNetwork(conf); - net.init(); - - val server = new JsonModelServer.Builder(net) - .outputSerializer( new IntSerde()) - .inputDeserializer(new FloatSerde()) - .inferenceAdapter(new InferenceAdapter() { - @Override - public MultiDataSet apply(float[] input) { - return new MultiDataSet(Nd4j.create(input, 1, input.length), null); - } - - @Override - public Integer apply(INDArray... 
nnOutput) { - return nnOutput[0].argMax().getInt(0); - } - }) - .orderedInputNodes("in") - .orderedOutputNodes("softmax") - .port(PORT + 1) - .inferenceMode(SEQUENTIAL) - .numWorkers(2) - .build(); - - val client = JsonRemoteInference.builder() - .endpointAddress("http://localhost:" + (PORT + 1) + "/v1/serving") - .outputDeserializer(new IntSerde()) - .inputSerializer( new FloatSerde()) - .build(); - - try { - server.start(); - for (int i = 0; i < 10; i++) { - INDArray f = Nd4j.rand(DataType.FLOAT, 1, 28 * 28); - INDArray exp = net.output(f); - float[] fArr = f.toFloatVector(); - int out = client.predict(fArr); - assertEquals(exp.argMax().getInt(0), out); - } - } catch (Exception e){ - log.error("",e); - throw e; - } finally { - server.stop(); - } - } - - @Test - public void testCompGraph() throws Exception { - - ComputationGraphConfiguration conf = new NeuralNetConfiguration.Builder() - .graphBuilder() - .addInputs("input1", "input2") - .addLayer("L1", new DenseLayer.Builder().nIn(3).nOut(4).build(), "input1") - .addLayer("L2", new DenseLayer.Builder().nIn(3).nOut(4).build(), "input2") - .addVertex("merge", new MergeVertex(), "L1", "L2") - .addLayer("out", new OutputLayer.Builder().nIn(4+4).nOut(3).build(), "merge") - .setOutputs("out") - .build(); - - ComputationGraph net = new ComputationGraph(conf); - net.init(); - - val server = new JsonModelServer.Builder(net) - .outputSerializer( new IntSerde()) - .inputDeserializer(new FloatSerde()) - .inferenceAdapter(new InferenceAdapter() { - @Override - public MultiDataSet apply(float[] input) { - return new MultiDataSet(Nd4j.create(input, 1, input.length), null); - } - - @Override - public Integer apply(INDArray... 
nnOutput) { - return nnOutput[0].argMax().getInt(0); - } - }) - .orderedInputNodes("in") - .orderedOutputNodes("softmax") - .port(PORT + 1) - .inferenceMode(SEQUENTIAL) - .numWorkers(2) - .parallelMode(false) - .build(); - - val client = JsonRemoteInference.builder() - .endpointAddress("http://localhost:" + (PORT + 1) + "/v1/serving") - .outputDeserializer(new IntSerde()) - .inputSerializer( new FloatSerde()) - .build(); - - try { - server.start(); - //client.predict(new float[]{0.0f, 1.0f, 2.0f}); - } catch (Exception e){ - log.error("",e); - throw e; - } finally { - server.stop(); - } - } - - @Test - public void testCompGraph_1() throws Exception { - - ComputationGraphConfiguration conf = new NeuralNetConfiguration.Builder() - .updater(new Sgd(0.01)) - .graphBuilder() - .addInputs("input") - .addLayer("L1", new DenseLayer.Builder().nIn(8).nOut(4).build(), "input") - .addLayer("out1", new OutputLayer.Builder() - .lossFunction(LossFunctions.LossFunction.NEGATIVELOGLIKELIHOOD) - .nIn(4).nOut(3).build(), "L1") - .addLayer("out2", new OutputLayer.Builder() - .lossFunction(LossFunctions.LossFunction.MSE) - .nIn(4).nOut(2).build(), "L1") - .setOutputs("out1","out2") - .build(); - - final ComputationGraph net = new ComputationGraph(conf); - net.init(); - - val server = new JsonModelServer.Builder(net) - .outputSerializer( new IntSerde()) - .inputDeserializer(new FloatSerde()) - .inferenceAdapter(new InferenceAdapter() { - @Override - public MultiDataSet apply(float[] input) { - return new MultiDataSet(Nd4j.create(input, 1, input.length), null); - } - - @Override - public Integer apply(INDArray... 
nnOutput) { - return nnOutput[0].argMax().getInt(0); - } - }) - .orderedInputNodes("input") - .orderedOutputNodes("out") - .port(PORT + 1) - .inferenceMode(SEQUENTIAL) - .numWorkers(2) - .parallelMode(false) - .build(); - - val client = JsonRemoteInference.builder() - .endpointAddress("http://localhost:" + (PORT + 1) + "/v1/serving") - .outputDeserializer(new IntSerde()) - .inputSerializer( new FloatSerde()) - .build(); - - try { - server.start(); - val result = client.predict(new float[]{0.0f, 1.0f, 2.0f, 3.0f, 4.0f, 5.0f, 6.0f, 7.0f}); - assertNotNull(result); - } catch (Exception e){ - log.error("",e); - throw e; - } finally { - server.stop(); - } - } - - private static class FloatSerde implements JsonSerializer, JsonDeserializer{ - private final ObjectMapper om = new ObjectMapper(); - - @Override - public float[] deserialize(String json) { - try { - return om.readValue(json, FloatHolder.class).getFloats(); - } catch (IOException e){ - throw new RuntimeException(e); - } - } - - @Override - public String serialize(float[] o) { - try{ - return om.writeValueAsString(new FloatHolder(o)); - } catch (IOException e){ - throw new RuntimeException(e); - } - } - - //Use float holder so Jackson does ser/de properly (no "{}" otherwise) - @AllArgsConstructor @NoArgsConstructor @Data - private static class FloatHolder { - private float[] floats; - } - } - - private static class IntSerde implements JsonSerializer, JsonDeserializer { - private final ObjectMapper om = new ObjectMapper(); - - @Override - public Integer deserialize(String json) { - try { - return om.readValue(json, Integer.class); - } catch (IOException e){ - throw new RuntimeException(e); - } - } - - @Override - public String serialize(Integer o) { - try{ - return om.writeValueAsString(o); - } catch (IOException e){ - throw new RuntimeException(e); - } - } - } -} \ No newline at end of file diff --git 
a/contrib/attic/deeplearning4j-remote/deeplearning4j-json-server/src/test/java/org/deeplearning4j/remote/ServletTest.java b/contrib/attic/deeplearning4j-remote/deeplearning4j-json-server/src/test/java/org/deeplearning4j/remote/ServletTest.java deleted file mode 100644 index 802aa18dd..000000000 --- a/contrib/attic/deeplearning4j-remote/deeplearning4j-json-server/src/test/java/org/deeplearning4j/remote/ServletTest.java +++ /dev/null @@ -1,136 +0,0 @@ -/* - * ****************************************************************************** - * * Copyright (c) 2021 Deeplearning4j Contributors - * * - * * This program and the accompanying materials are made available under the - * * terms of the Apache License, Version 2.0 which is available at - * * https://www.apache.org/licenses/LICENSE-2.0. - * * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * * License for the specific language governing permissions and limitations - * * under the License. 
- * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ - -package org.deeplearning4j.remote; - -import lombok.val; -import org.apache.http.client.methods.HttpGet; -import org.apache.http.client.methods.HttpPost; -import org.apache.http.impl.client.HttpClientBuilder; -import org.deeplearning4j.BaseDL4JTest; -import org.junit.After; -import org.junit.Before; -import org.junit.Test; -import org.nd4j.adapters.InferenceAdapter; -import org.nd4j.autodiff.samediff.SameDiff; -import org.nd4j.linalg.api.ndarray.INDArray; -import org.nd4j.linalg.dataset.MultiDataSet; -import org.nd4j.remote.clients.serde.JsonDeserializer; -import org.nd4j.remote.clients.serde.JsonSerializer; - -import java.io.IOException; - -import static org.junit.Assert.assertEquals; - -public class ServletTest extends BaseDL4JTest { - - private JsonModelServer server; - - @Before - public void setUp() throws Exception { - val sd = SameDiff.create(); - server = new JsonModelServer.Builder(sd) - .port(8080) - .inferenceAdapter(new InferenceAdapter() { - @Override - public MultiDataSet apply(String input) { - return null; - } - - @Override - public String apply(INDArray... 
nnOutput) { - return null; - } - }) - .outputSerializer(new JsonSerializer() { - @Override - public String serialize(String o) { - return ""; - } - }) - .inputDeserializer(new JsonDeserializer() { - @Override - public String deserialize(String json) { - return ""; - } - }) - .orderedInputNodes("input") - .orderedOutputNodes("output") - .build(); - - server.start(); - //server.join(); - } - - @After - public void tearDown() throws Exception { - server.stop(); - } - - @Test - public void getEndpoints() throws IOException { - val request = new HttpGet( "http://localhost:8080/v1" ); - request.setHeader("Content-type", "application/json"); - - val response = HttpClientBuilder.create().build().execute( request ); - assertEquals(200, response.getStatusLine().getStatusCode()); - } - - @Test - public void testContentTypeGet() throws IOException { - val request = new HttpGet( "http://localhost:8080/v1" ); - request.setHeader("Content-type", "text/plain"); - - val response = HttpClientBuilder.create().build().execute( request ); - assertEquals(415, response.getStatusLine().getStatusCode()); - } - - @Test - public void testContentTypePost() throws Exception { - val request = new HttpPost("http://localhost:8080/v1/serving"); - request.setHeader("Content-type", "text/plain"); - val response = HttpClientBuilder.create().build().execute( request ); - assertEquals(415, response.getStatusLine().getStatusCode()); - } - - @Test - public void postForServing() throws Exception { - val request = new HttpPost("http://localhost:8080/v1/serving"); - request.setHeader("Content-type", "application/json"); - val response = HttpClientBuilder.create().build().execute( request ); - assertEquals(500, response.getStatusLine().getStatusCode()); - } - - @Test - public void testNotFoundPost() throws Exception { - val request = new HttpPost("http://localhost:8080/v1/serving/some"); - request.setHeader("Content-type", "application/json"); - val response = HttpClientBuilder.create().build().execute( 
request ); - assertEquals(404, response.getStatusLine().getStatusCode()); - } - - @Test - public void testNotFoundGet() throws Exception { - val requestGet = new HttpGet( "http://localhost:8080/v1/not_found" ); - requestGet.setHeader("Content-type", "application/json"); - - val responseGet = HttpClientBuilder.create().build().execute( requestGet ); - assertEquals(404, responseGet.getStatusLine().getStatusCode()); - } - -} diff --git a/contrib/attic/deeplearning4j-remote/deeplearning4j-json-server/src/test/java/org/deeplearning4j/remote/helpers/House.java b/contrib/attic/deeplearning4j-remote/deeplearning4j-json-server/src/test/java/org/deeplearning4j/remote/helpers/House.java deleted file mode 100644 index d54e9f3ba..000000000 --- a/contrib/attic/deeplearning4j-remote/deeplearning4j-json-server/src/test/java/org/deeplearning4j/remote/helpers/House.java +++ /dev/null @@ -1,50 +0,0 @@ -/* - * ****************************************************************************** - * * Copyright (c) 2021 Deeplearning4j Contributors - * * - * * This program and the accompanying materials are made available under the - * * terms of the Apache License, Version 2.0 which is available at - * * https://www.apache.org/licenses/LICENSE-2.0. - * * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * * License for the specific language governing permissions and limitations - * * under the License. 
- * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ - -package org.deeplearning4j.remote.helpers; - -import com.google.gson.Gson; -import lombok.*; -import org.nd4j.remote.clients.serde.JsonDeserializer; -import org.nd4j.remote.clients.serde.JsonSerializer; - -@Data -@Builder -@AllArgsConstructor -@NoArgsConstructor -public class House { - private int district; - private int bedrooms; - private int bathrooms; - private int area; - - - public static class HouseSerializer implements JsonSerializer { - @Override - public String serialize(@NonNull House o) { - return new Gson().toJson(o); - } - } - - public static class HouseDeserializer implements JsonDeserializer { - @Override - public House deserialize(@NonNull String json) { - return new Gson().fromJson(json, House.class); - } - } -} diff --git a/contrib/attic/deeplearning4j-remote/deeplearning4j-json-server/src/test/java/org/deeplearning4j/remote/helpers/HouseToPredictedPriceAdapter.java b/contrib/attic/deeplearning4j-remote/deeplearning4j-json-server/src/test/java/org/deeplearning4j/remote/helpers/HouseToPredictedPriceAdapter.java deleted file mode 100644 index 71bca43d3..000000000 --- a/contrib/attic/deeplearning4j-remote/deeplearning4j-json-server/src/test/java/org/deeplearning4j/remote/helpers/HouseToPredictedPriceAdapter.java +++ /dev/null @@ -1,42 +0,0 @@ -/* - * ****************************************************************************** - * * Copyright (c) 2021 Deeplearning4j Contributors - * * - * * This program and the accompanying materials are made available under the - * * terms of the Apache License, Version 2.0 which is available at - * * https://www.apache.org/licenses/LICENSE-2.0. - * * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. 
See the - * * License for the specific language governing permissions and limitations - * * under the License. - * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ - -package org.deeplearning4j.remote.helpers; - -import lombok.NonNull; -import lombok.extern.slf4j.Slf4j; -import org.nd4j.adapters.InferenceAdapter; -import org.nd4j.linalg.api.buffer.DataType; -import org.nd4j.linalg.api.ndarray.INDArray; -import org.nd4j.linalg.dataset.MultiDataSet; -import org.nd4j.linalg.factory.Nd4j; - -@Slf4j -public class HouseToPredictedPriceAdapter implements InferenceAdapter { - - @Override - public MultiDataSet apply(@NonNull House input) { - // we just create a vector array with shape [1, 4] and fill it with the district value - return new MultiDataSet(Nd4j.create(DataType.FLOAT, 1, 4).assign(input.getDistrict()), null); - } - - @Override - public PredictedPrice apply(INDArray... nnOutput) { - return new PredictedPrice(nnOutput[0].getFloat(0)); - } -} diff --git a/contrib/attic/deeplearning4j-remote/deeplearning4j-json-server/src/test/java/org/deeplearning4j/remote/helpers/ImageConversionUtils.java b/contrib/attic/deeplearning4j-remote/deeplearning4j-json-server/src/test/java/org/deeplearning4j/remote/helpers/ImageConversionUtils.java deleted file mode 100644 index 0b5e15884..000000000 --- a/contrib/attic/deeplearning4j-remote/deeplearning4j-json-server/src/test/java/org/deeplearning4j/remote/helpers/ImageConversionUtils.java +++ /dev/null @@ -1,100 +0,0 @@ -/* - * ****************************************************************************** - * * Copyright (c) 2021 Deeplearning4j Contributors - * * - * * This program and the accompanying materials are made available under the - * * terms of the Apache License, Version 2.0 which is available at - * * https://www.apache.org/licenses/LICENSE-2.0.
- * * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * * License for the specific language governing permissions and limitations - * * under the License. - * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ - -package org.deeplearning4j.remote.helpers; - -import lombok.val; -import org.bytedeco.javacpp.indexer.UByteIndexer; -import org.bytedeco.javacv.Java2DFrameConverter; -import org.bytedeco.javacv.OpenCVFrameConverter; -import org.bytedeco.opencv.opencv_core.Mat; -import org.datavec.image.loader.Java2DNativeImageLoader; -import org.datavec.image.loader.NativeImageLoader; -import org.nd4j.linalg.api.ndarray.INDArray; - -import java.awt.image.BufferedImage; -import java.io.IOException; -import java.util.Random; - -import static org.bytedeco.opencv.global.opencv_core.CV_8UC; - -public class ImageConversionUtils { - - public static Mat makeRandomImage(int height, int width, int channels) { - if (height <= 0) { - - height = new Random().nextInt() % 100 + 100; - } - if (width <= 0) { - width = new Random().nextInt() % 100 + 100; - } - - Mat img = new Mat(height, width, CV_8UC(channels)); - UByteIndexer idx = img.createIndexer(); - for (int i = 0; i < height; i++) { - for (int j = 0; j < width; j++) { - for (int k = 0; k < channels; k++) { - idx.put(i, j, k, new Random().nextInt()); - } - } - } - return img; - } - - public static BufferedImage makeRandomBufferedImage(int height, int width, int channels) { - Mat img = makeRandomImage(height, width, channels); - - OpenCVFrameConverter.ToMat c = new OpenCVFrameConverter.ToMat(); - Java2DFrameConverter c2 = new Java2DFrameConverter(); - - return c2.convert(c.convert(img)); - } - - public static INDArray convert(BufferedImage image) { - INDArray retVal = 
null; - try { - retVal = new Java2DNativeImageLoader(image.getHeight(), image.getWidth(), image.getRaster().getNumBands()). - asRowVector(image); - } - catch (IOException e) { - throw new RuntimeException(e); - } - return retVal; - } - - public static INDArray convert(Mat image) { - INDArray retVal = null; - try { - retVal = new NativeImageLoader().asRowVector(image); - } - catch (IOException e) { - throw new RuntimeException(e); - } - return retVal; - } - - public static BufferedImage convert(INDArray input) { - return new Java2DNativeImageLoader(input.rows(),input.columns()).asBufferedImage(input); - } - - public static INDArray makeRandomImageAsINDArray(int height, int width, int channels) { - val image = makeRandomBufferedImage(height, width, channels); - INDArray retVal = convert(image); - return retVal; - } -} diff --git a/contrib/attic/deeplearning4j-remote/deeplearning4j-json-server/src/test/java/org/deeplearning4j/remote/helpers/PredictedPrice.java b/contrib/attic/deeplearning4j-remote/deeplearning4j-json-server/src/test/java/org/deeplearning4j/remote/helpers/PredictedPrice.java deleted file mode 100644 index 2b12191b9..000000000 --- a/contrib/attic/deeplearning4j-remote/deeplearning4j-json-server/src/test/java/org/deeplearning4j/remote/helpers/PredictedPrice.java +++ /dev/null @@ -1,49 +0,0 @@ -/* - * ****************************************************************************** - * * Copyright (c) 2021 Deeplearning4j Contributors - * * - * * This program and the accompanying materials are made available under the - * * terms of the Apache License, Version 2.0 which is available at - * * https://www.apache.org/licenses/LICENSE-2.0. - * * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the - * * License for the specific language governing permissions and limitations - * * under the License. - * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ - -package org.deeplearning4j.remote.helpers; - -import com.google.gson.Gson; -import lombok.AllArgsConstructor; -import lombok.Data; -import lombok.NoArgsConstructor; -import lombok.NonNull; -import lombok.extern.slf4j.Slf4j; -import org.nd4j.remote.clients.serde.JsonDeserializer; -import org.nd4j.remote.clients.serde.JsonSerializer; - -@Data -@AllArgsConstructor -@NoArgsConstructor -public class PredictedPrice { - private float price; - - public static class PredictedPriceSerializer implements JsonSerializer { - @Override - public String serialize(@NonNull PredictedPrice o) { - return new Gson().toJson(o); - } - } - - public static class PredictedPriceDeserializer implements JsonDeserializer { - @Override - public PredictedPrice deserialize(@NonNull String json) { - return new Gson().fromJson(json, PredictedPrice.class); - } - } -} diff --git a/contrib/attic/deeplearning4j-remote/deeplearning4j-json-server/src/test/resources/logback.xml b/contrib/attic/deeplearning4j-remote/deeplearning4j-json-server/src/test/resources/logback.xml deleted file mode 100644 index 27e08c0d5..000000000 --- a/contrib/attic/deeplearning4j-remote/deeplearning4j-json-server/src/test/resources/logback.xml +++ /dev/null @@ -1,52 +0,0 @@ - - - - - - - - logs/application.log - - %logger{15} - %message%n%xException{5} - - - - - - - %logger{15} - %message%n%xException{5} - - - - - - - - - - - - - - - - - \ No newline at end of file diff --git a/contrib/attic/deeplearning4j-remote/pom.xml b/contrib/attic/deeplearning4j-remote/pom.xml deleted file mode 100644 index 54f5d3e8c..000000000 --- a/contrib/attic/deeplearning4j-remote/pom.xml +++ /dev/null @@ -1,54 +0,0 @@ - - - - - - 4.0.0 - - - org.deeplearning4j - deeplearning4j-parent - 1.0.0-SNAPSHOT - - - 
deeplearning4j-remote - pom - - deeplearning4j-remote - - - deeplearning4j-json-server - - - - - testresources - - - test-nd4j-native - - - test-nd4j-cuda-11.0 - - - diff --git a/contrib/attic/jumpy/.gitignore b/contrib/attic/jumpy/.gitignore deleted file mode 100644 index 15f456096..000000000 --- a/contrib/attic/jumpy/.gitignore +++ /dev/null @@ -1,68 +0,0 @@ -# Byte-compiled / optimized / DLL files -__pycache__/ -*.py[cod] -*$py.class - -# C extensions -*.so - -# Distribution / packaging -.Python -env/ -build/ -develop-eggs/ -dist/ -downloads/ -eggs/ -.eggs/ -lib/ -lib64/ -parts/ -sdist/ -var/ -*.egg-info/ -.installed.cfg -*.egg - -# PyInstaller -# Usually these files are written by a python script from a template -# before PyInstaller builds the exe, so as to inject date/other infos into it. -*.manifest -*.spec - -# Installer logs -pip-log.txt -pip-delete-this-directory.txt - -# Unit test / coverage reports -htmlcov/ -.tox/ -.coverage -.coverage.* -.cache -nosetests.xml -coverage.xml -*,cover -.hypothesis/ - -# Translations -*.mo -*.pot - -# Django stuff: -*.log - -# Sphinx documentation -docs/_build/ - -# PyBuilder -target/ - -#Ipython Notebook -.ipynb_checkpoints - -# IDE settings -.idea/ - -.pytest_cache/ -venv/ diff --git a/contrib/attic/jumpy/README.md b/contrib/attic/jumpy/README.md deleted file mode 100644 index b18179024..000000000 --- a/contrib/attic/jumpy/README.md +++ /dev/null @@ -1,73 +0,0 @@ -Jumpy: Python interface for [nd4j](https://nd4j.org) -=========================================== - -[![Join the chat at https://gitter.im/deeplearning4j/deeplearning4j](https://badges.gitter.im/Join%20Chat.svg)](https://gitter.im/deeplearning4j/deeplearning4j?utm_source=badge&utm_medium=badge&utm_campaign=pr-badge&utm_content=badge) -[![License](https://img.shields.io/badge/License-Apache%202.0-blue.svg)](https://github.com/eclipse/deeplearning4j/blob/master/jumpy/LICENSE) -[![PyPI version](https://badge.fury.io/py/jumpy.svg)](https://badge.fury.io/py/jumpy) 
- -Jumpy allows you to use ND4J from Python _without any network communication_. Many other Python libraries that bridge Java -incur considerable overhead; Jumpy instead uses pointers to access your numpy arrays directly. Under the hood, Jumpy uses `pydl4j` -for dependency management and `pyjnius` to load Java classes. - -## Installation - -Jumpy is on PyPI; install it with: - -```bash -pip install jumpy -``` - -or build it from source: - -```bash -python setup.py install -``` - -## Using Jumpy - -### Creating arrays - -Just like numpy, you can initialize an array using `.zeros()` or `.ones()`: - -```python -import jumpy as jp - -x = jp.zeros((32, 16)) -y = jp.ones((32, 16)) -``` - -### Converting a numpy array to a jumpy array - -A numpy `ndarray` instance can be converted to a jumpy `ndarray` instance (and vice versa) without copying the data: - -```python -import jumpy as jp -import numpy as np - -x_np = np.random.random((100, 50)) -x_jp = jp.array(x_np) -``` - -### Converting a jumpy array to a numpy array - -Simply call the `.numpy()` method of `jumpy.ndarray.ndarray`: - -```python -import jumpy as jp - -x_jp = jp.zeros((100, 50)) -x_np = x_jp.numpy() -``` - -### Operations - -* Basic operators like `+` `-` `*` `/` `+=` `-=` `*=` `/=` are overloaded, and broadcasting is supported. -* Indexing, slicing and assignment behaviour has been made as close to numpy as possible. -* Check `jumpy/ops/` to see the available ops. - ---- -## Contribute - -* Check for open issues, or open a new issue to start a discussion around a feature idea or a bug. -* We could use more ops! Have a look at the available ops (`jumpy/ops/`); it's quite easy to add new ones. -* Send a pull request and bug us on Gitter until it gets merged and published.
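The Operations bullets above say the overloaded operators follow numpy broadcasting. Since jumpy itself is archived here and may no longer be installable, the sketch below uses plain numpy to illustrate the broadcasting semantics those operators were documented to mirror (shapes and values are illustrative, not taken from the deleted code):

```python
import numpy as np

# Broadcasting semantics that jumpy's overloaded operators mirror:
# combining a (3, 1) array with a (1, 4) array yields a (3, 4) result,
# with each size-1 dimension stretched to match the other operand.
x = np.arange(3).reshape(3, 1)   # shape (3, 1)
y = np.arange(4).reshape(1, 4)   # shape (1, 4)

z = x + y                        # shape (3, 4)
assert z.shape == (3, 4)

# In-place operators broadcast too, provided the target's shape is unchanged.
x2 = np.ones((3, 4))
x2 += y                          # y is broadcast across the rows of x2
assert x2[0, 3] == 4.0
```

Mixed jumpy/numpy operands followed the same rules, since the deleted `ndarray` wrappers converted operands to numpy before applying the operator.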
:) diff --git a/contrib/attic/jumpy/benchmarks/benchmark.py b/contrib/attic/jumpy/benchmarks/benchmark.py deleted file mode 100644 index 2ab299f54..000000000 --- a/contrib/attic/jumpy/benchmarks/benchmark.py +++ /dev/null @@ -1,85 +0,0 @@ -# /* ****************************************************************************** -# * -# * -# * This program and the accompanying materials are made available under the -# * terms of the Apache License, Version 2.0 which is available at -# * https://www.apache.org/licenses/LICENSE-2.0. -# * -# * See the NOTICE file distributed with this work for additional -# * information regarding copyright ownership. -# * Unless required by applicable law or agreed to in writing, software -# * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT -# * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the -# * License for the specific language governing permissions and limitations -# * under the License. -# * -# * SPDX-License-Identifier: Apache-2.0 -# ******************************************************************************/ - -################################################################################ -# -# -# -################################################################################ - - -import jumpy as jp -import numpy as np -from random import randint -import timeit -import gc - -gc.disable() -jp.disable_gc() - - -class Benchmark(object): - def __init__(self, n=1000): - print 'Running tests with [', n, 'x', n, '] dimensionality' - self.n = n - self.m = 200 - self.np_arr = [] - self.nd4j_arr = [] - for counter in range(0, self.m + 1): - self.np_arr.append(np.linspace(1, n * n, n * n).reshape((n, n))) - - for counter in range(0, self.m + 1): - self.nd4j_arr.append(jp.array(self.np_arr[counter])) - - def run_nd4j_scalar(self): - self.nd4j_arr[randint(0, self.m)] += 1.0172 - - def run_numpy_scalar(self): - self.np_arr[randint(0, self.m)] += 1.0172 - - def run_nd4j_add(self): - 
self.nd4j_arr[randint(0, self.m)] += self.nd4j_arr[randint(0, self.m)] - - def run_numpy_add(self): - self.np_arr[randint(0, self.m)] += self.np_arr[randint(0, self.m)] - - def run_numpy_sub(self): - self.np_arr[randint(0, self.m)] -= self.np_arr[randint(0, self.m)] - - def run_nd4j_sub(self): - self.nd4j_arr[randint(0, self.m)] -= self.nd4j_arr[randint(0, self.m)] - - def run_nd4j_mmul(self): - jp.dot(self.nd4j_arr[randint(0, self.m)], self.nd4j_arr[randint(0, self.m)]) - - def run_numpy_mmul(self): - np.dot(self.np_arr[randint(0, self.m)], self.np_arr[randint(0, self.m)]) - - def run_benchmark(self, n_trials=1000): - print 'nd4j scalar ', timeit.timeit(self.run_nd4j_scalar, number=n_trials) - print 'numpy scalar ', timeit.timeit(self.run_numpy_scalar, number=n_trials) - print 'nd4j add ', timeit.timeit(self.run_nd4j_add, number=n_trials) - print 'numpy add ', timeit.timeit(self.run_numpy_add, number=n_trials) - print 'nd4j sub ', timeit.timeit(self.run_nd4j_sub, number=n_trials) - print 'numpy sub ', timeit.timeit(self.run_numpy_sub, number=n_trials) - print 'nd4j mmul ', timeit.timeit(self.run_nd4j_mmul, number=n_trials) - print 'numpy mmul ', timeit.timeit(self.run_numpy_mmul, number=n_trials) - - -benchmark = Benchmark() -benchmark.run_benchmark() diff --git a/contrib/attic/jumpy/jumpy/__init__.py b/contrib/attic/jumpy/jumpy/__init__.py deleted file mode 100644 index 563fbace8..000000000 --- a/contrib/attic/jumpy/jumpy/__init__.py +++ /dev/null @@ -1,32 +0,0 @@ -# /* ****************************************************************************** -# * -# * -# * This program and the accompanying materials are made available under the -# * terms of the Apache License, Version 2.0 which is available at -# * https://www.apache.org/licenses/LICENSE-2.0. -# * -# * See the NOTICE file distributed with this work for additional -# * information regarding copyright ownership. 
-# * Unless required by applicable law or agreed to in writing, software -# * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT -# * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the -# * License for the specific language governing permissions and limitations -# * under the License. -# * -# * SPDX-License-Identifier: Apache-2.0 -# ******************************************************************************/ - -################################################################################ -# -# -# -################################################################################ - - -from .ndarray import * -from .matlib import * -from .memory_manager import * -from .ops import * -from .tf_model import * -from .keras_model import * -from .spark import * diff --git a/contrib/attic/jumpy/jumpy/java_classes.py b/contrib/attic/jumpy/jumpy/java_classes.py deleted file mode 100644 index 33e4ee2b5..000000000 --- a/contrib/attic/jumpy/jumpy/java_classes.py +++ /dev/null @@ -1,83 +0,0 @@ -# /* ****************************************************************************** -# * -# * -# * This program and the accompanying materials are made available under the -# * terms of the Apache License, Version 2.0 which is available at -# * https://www.apache.org/licenses/LICENSE-2.0. -# * -# * See the NOTICE file distributed with this work for additional -# * information regarding copyright ownership. -# * Unless required by applicable law or agreed to in writing, software -# * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT -# * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the -# * License for the specific language governing permissions and limitations -# * under the License. 
-# * -# * SPDX-License-Identifier: Apache-2.0 -# ******************************************************************************/ - -################################################################################ -# -# -# -################################################################################ - - -import jnius_config -import os -import warnings -import pydl4j - -pydl4j.validate_nd4j_jars() - - -# -------------JVM starts here------------- - -from jnius import autoclass - -Nd4j = autoclass('org.nd4j.linalg.factory.Nd4j') -INDArray = autoclass('org.nd4j.linalg.api.ndarray.INDArray') -Transforms = autoclass('org.nd4j.linalg.ops.transforms.Transforms') -NDArrayIndex = autoclass('org.nd4j.linalg.indexing.NDArrayIndex') -DataBuffer = autoclass('org.nd4j.linalg.api.buffer.DataBuffer') -DataType = autoclass('org.nd4j.linalg.api.buffer.DataType') -System = autoclass('java.lang.System') -Integer = autoclass('java.lang.Integer') -Long = autoclass('java.lang.Long') -Float = autoclass('java.lang.Float') -Double = autoclass('java.lang.Double') -Shape = autoclass('org.nd4j.linalg.api.shape.Shape') -BinarySerde = autoclass('org.nd4j.serde.binary.BinarySerde') -NativeOpsHolder = autoclass('org.nd4j.nativeblas.NativeOpsHolder') - -DoublePointer = autoclass('org.bytedeco.javacpp.DoublePointer') -FloatPointer = autoclass('org.bytedeco.javacpp.FloatPointer') -HalfPointer = autoclass('org.bytedeco.javacpp.ShortPointer') -LongPointer = autoclass('org.bytedeco.javacpp.LongPointer') -IntPointer = autoclass('org.bytedeco.javacpp.IntPointer') -ShortPointer = autoclass('org.bytedeco.javacpp.ShortPointer') -BoolPointer = autoclass('org.bytedeco.javacpp.BoolPointer') - - -DataTypeUtil = autoclass('org.nd4j.linalg.api.buffer.util.DataTypeUtil') -MemoryManager = autoclass('org.nd4j.linalg.memory.MemoryManager') -SameDiff = autoclass('org.nd4j.autodiff.samediff.SameDiff') -TFGraphMapper = autoclass('org.nd4j.imports.graphmapper.tf.TFGraphMapper') -JDataset = 
autoclass('org.nd4j.linalg.dataset.DataSet') -ArrayList = autoclass('java.util.ArrayList') - - -def KerasModelImport(): - return autoclass('org.deeplearning4j.nn.modelimport.keras.KerasModelImport') - - -def ArrayDescriptor(): - return autoclass('org.deeplearning4j.spark.parameterserver.python.ArrayDescriptor') - - -def DatasetDescriptor(): - return autoclass('org.deeplearning4j.spark.parameterserver.python.DataSetDescriptor') - - -def spark_utils(): - return autoclass('org.deeplearning4j.spark.parameterserver.python.Utils') diff --git a/contrib/attic/jumpy/jumpy/keras_model.py b/contrib/attic/jumpy/jumpy/keras_model.py deleted file mode 100644 index cf363fc5d..000000000 --- a/contrib/attic/jumpy/jumpy/keras_model.py +++ /dev/null @@ -1,59 +0,0 @@ -# /* ****************************************************************************** -# * -# * -# * This program and the accompanying materials are made available under the -# * terms of the Apache License, Version 2.0 which is available at -# * https://www.apache.org/licenses/LICENSE-2.0. -# * -# * See the NOTICE file distributed with this work for additional -# * information regarding copyright ownership. -# * Unless required by applicable law or agreed to in writing, software -# * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT -# * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the -# * License for the specific language governing permissions and limitations -# * under the License. 
-# * -# * SPDX-License-Identifier: Apache-2.0 -# ******************************************************************************/ - -################################################################################ -# -# -# -################################################################################ - -from .java_classes import KerasModelImport -from .ndarray import array - - -class KerasModel(object): - def __init__(self, filepath): - KerasModelImport = KerasModelImport() - try: - self.model = KerasModelImport.importKerasModelAndWeights(filepath) - self.is_sequential = False - except Exception: - self.model = KerasModelImport.importKerasSequentialModelAndWeights(filepath) - self.is_sequential = True - - def __call__(self, input): - if self.is_sequential: - if type(input) in [list, tuple]: - n = len(input) - if n != 1: - err = 'Expected 1 input to sequential model. Received {}.'.format(n) - raise ValueError(err) - input = input[0] - input = array(input).array - out = self.model.output(input, False) - out = array(out) - return out - else: - if not isinstance(input, list): - input = [input] - input = [array(x).array for x in input] - out = self.model.output(False, *input) - out = [array(x) for x in out] - if len(out) == 1: - return out[0] - return out diff --git a/contrib/attic/jumpy/jumpy/matlib.py b/contrib/attic/jumpy/jumpy/matlib.py deleted file mode 100644 index 699b68ed8..000000000 --- a/contrib/attic/jumpy/jumpy/matlib.py +++ /dev/null @@ -1,59 +0,0 @@ -# /* ****************************************************************************** -# * -# * -# * This program and the accompanying materials are made available under the -# * terms of the Apache License, Version 2.0 which is available at -# * https://www.apache.org/licenses/LICENSE-2.0. -# * -# * See the NOTICE file distributed with this work for additional -# * information regarding copyright ownership. 
-# * Unless required by applicable law or agreed to in writing, software -# * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT -# * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the -# * License for the specific language governing permissions and limitations -# * under the License. -# * -# * SPDX-License-Identifier: Apache-2.0 -# ******************************************************************************/ - -################################################################################ -# -# -# -################################################################################ - - -from .ndarray import ndarray -from .java_classes import Nd4j - - -def zeros(shape): - return ndarray(Nd4j.zeros(*shape)) - - -def ones(shape): - return ndarray(Nd4j.ones(*shape)) - - -def zeros_like(array): - array = ndarray(array).array - return ndarray(Nd4j.zerosLike(array)) - - -def ones_like(array): - array = ndarray(array).array - return ndarray(Nd4j.onesLike(array)) - - -def eye(size): - return ndarray(Nd4j.eye(size)) - - -def arange(m, n=None): - if n is None: - return ndarray(Nd4j.arange(m)) - return ndarray(Nd4j.arange(m, n)) - - -def linspace(start, stop, num): - return ndarray(Nd4j.linspace(start, stop, num)) diff --git a/contrib/attic/jumpy/jumpy/memory_manager.py b/contrib/attic/jumpy/jumpy/memory_manager.py deleted file mode 100644 index 032dd7f52..000000000 --- a/contrib/attic/jumpy/jumpy/memory_manager.py +++ /dev/null @@ -1,40 +0,0 @@ -# /* ****************************************************************************** -# * -# * -# * This program and the accompanying materials are made available under the -# * terms of the Apache License, Version 2.0 which is available at -# * https://www.apache.org/licenses/LICENSE-2.0. -# * -# * See the NOTICE file distributed with this work for additional -# * information regarding copyright ownership. 
-# * Unless required by applicable law or agreed to in writing, software -# * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT -# * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the -# * License for the specific language governing permissions and limitations -# * under the License. -# * -# * SPDX-License-Identifier: Apache-2.0 -# ******************************************************************************/ - -################################################################################ -# -# -# -################################################################################ - - -from .java_classes import Nd4j - -memory_manager = Nd4j.getMemoryManager() - - -def disable_gc(): - memory_manager.togglePeriodicGc(False) - - -def enable_gc(): - memory_manager.togglePeriodicGc(True) - - -def set_gc_interval(interval=5000): - memory_manager.setAutoGcWindow(interval) diff --git a/contrib/attic/jumpy/jumpy/ndarray.py b/contrib/attic/jumpy/jumpy/ndarray.py deleted file mode 100644 index d39b72a4b..000000000 --- a/contrib/attic/jumpy/jumpy/ndarray.py +++ /dev/null @@ -1,520 +0,0 @@ -# /* ****************************************************************************** -# * -# * -# * This program and the accompanying materials are made available under the -# * terms of the Apache License, Version 2.0 which is available at -# * https://www.apache.org/licenses/LICENSE-2.0. -# * -# * See the NOTICE file distributed with this work for additional -# * information regarding copyright ownership. -# * Unless required by applicable law or agreed to in writing, software -# * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT -# * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the -# * License for the specific language governing permissions and limitations -# * under the License. 
-# * -# * SPDX-License-Identifier: Apache-2.0 -# ******************************************************************************/ - -################################################################################ -# -# -# -################################################################################ - - -from .java_classes import * -import numpy as np -import ctypes -import warnings - - -native_ops = NativeOpsHolder.getInstance().getDeviceNativeOps() - - -# DATA TYPE MANAGEMENT - - -DOUBLE = DataType.DOUBLE -FLOAT = DataType.FLOAT -HALF = DataType.HALF -LONG = DataType.LONG -INT = DataType.INT -SHORT = DataType.SHORT -UBYTE = DataType.UBYTE -BYTE = DataType.BYTE -BOOL = DataType.BOOL -UTF8 = DataType.UTF8 -COMPRESSED = DataType.COMPRESSED -UNKNOWN = DataType.UNKNOWN - -SUPPORTED_JAVA_DTYPES = [ - DOUBLE, - FLOAT, - HALF, - - LONG, - INT, - SHORT, - - BOOL - #UTF8 -] - -SUPPORTED_PYTHON_DTYPES = [ - np.float64, - np.float32, - np.float16, - - np.int64, - np.int32, - np.int16, - - np.bool_ - #np.str_ -] - - - - -_PY2J = {SUPPORTED_PYTHON_DTYPES[i] : SUPPORTED_JAVA_DTYPES[i] for i in range(len(SUPPORTED_JAVA_DTYPES))} -_J2PY = {SUPPORTED_JAVA_DTYPES[i] : SUPPORTED_PYTHON_DTYPES[i] for i in range(len(SUPPORTED_JAVA_DTYPES))} - - -def _dtype_py2j(dtype): - if isinstance(dtype, str): - dtype = np.dtype(dtype).type - elif isinstance(dtype, np.dtype): - dtype = dtype.type - jtype = _PY2J.get(dtype) - if jtype is None: - raise NotImplementedError("Unsupported type: " + dtype.name) - return jtype - - -def _dtype_j2py(dtype): - pytype = _J2PY.get(dtype) - if pytype is None: - raise NotImplementedError("Unsupported type: " + (str(dtype))) - return pytype - - -def set_context_dtype(dtype): - ''' - Sets the dtype for nd4j - # Arguments - dtype: 'float' or 'double' - ''' - dtype_map = { - 'float32': 'float', - 'float64': 'double' - } - dtype = dtype_map.get(dtype, dtype) - if dtype not in ['float', 'double']: - raise ValueError("Invalid dtype '{}'. 
Available dtypes are 'float' and 'double'.".format(dtype)) - dtype_ = DataTypeUtil.getDtypeFromContext(dtype) - DataTypeUtil.setDTypeForContext(dtype_) - if get_context_dtype() != dtype: - warnings.warn("Can not set context dtype now. Set it at the beginning of your program.") - - -def get_context_dtype(): - ''' - Returns the nd4j dtype - ''' - dtype = DataTypeUtil.getDtypeFromContext() - return DataTypeUtil.getDTypeForName(dtype) - -_refs = [] - - -def _from_numpy(np_array): - ''' - Convert numpy array to nd4j array - ''' - pointer_address, _ = np_array.__array_interface__['data'] - _refs.append(np_array) - pointer = native_ops.pointerForAddress(pointer_address) - size = np_array.size - pointer.limit(size) - jdtype = _dtype_py2j(np_array.dtype) - ''' - mapping = { - DOUBLE: DoublePointer, - FLOAT: FloatPointer, - HALF: HalfPointer, - LONG: LongPointer, - INT: IntPointer, - SHORT: ShortPointer, - BOOL: BoolPointer - } - pc = mapping[jdtype] - #pointer = pc(pointer) - ''' - buff = Nd4j.createBuffer(pointer, size, jdtype) - assert buff.address() == pointer_address - _refs.append(buff) - elem_size = buff.getElementSize() - assert elem_size == np_array.dtype.itemsize - strides = np_array.strides - strides = [dim / elem_size for dim in strides] - shape = np_array.shape - nd4j_array = Nd4j.create(buff, shape, strides, 0) - assert buff.address() == nd4j_array.data().address() - return nd4j_array - - -def _to_numpy(nd4j_array): - ''' - Convert nd4j array to numpy array - ''' - buff = nd4j_array.data() - address = buff.pointer().address() - dtype = nd4j_array.dataType().toString() - mapping = { - 'DOUBLE': ctypes.c_double, - 'FLOAT': ctypes.c_float, - 'HALF': ctypes.c_short, - 'LONG': ctypes.c_long, - 'INT': ctypes.c_int, - 'SHORT': ctypes.c_short, - 'BOOL': ctypes.c_bool - } - Pointer = ctypes.POINTER(mapping[dtype]) - pointer = ctypes.cast(address, Pointer) - np_array = np.ctypeslib.as_array(pointer, tuple(nd4j_array.shape())) - return np_array - - -def _indarray(x): - 
typ = type(x) - if typ is INDArray: - return x - elif typ is ndarray: - return x.array - elif 'numpy' in str(typ): - return _from_numpy(x) - elif typ in (list, tuple): - return _from_numpy(np.array(x)) - elif typ in (int, float): - return Nd4j.scalar(x) - else: - raise Exception('Data type not understood :' + str(typ)) - - -def _nparray(x): - typ = type(x) - if typ is INDArray: - return ndarray(x).numpy() - elif typ is ndarray: - return x.numpy() - elif 'numpy' in str(typ): - return x - elif typ in (list, tuple): - return np.array(x) - elif typ in (int, float): - return np.array(x) - else: - raise Exception('Data type not understood :' + str(typ)) - - -def broadcast_like(y, x): - xs = x.shape() - ys = y.shape() - if xs == ys: - return y - _xs = tuple(xs) - _ys = tuple(ys) - nx = len(xs) - ny = len(ys) - if nx > ny: - diff = nx - ny - ys = ([1] * diff) + ys - y = y.reshape(ys) - ny = nx - elif ny > nx: - raise Exception('Unable to broadcast shapes ' + str(_xs) + '' - ' and ' + str(_ys)) - yt = [] - rep_y = False - for xd, yd in zip(xs, ys): - if xd == yd: - yt.append(1) - elif xd == 1: - raise Exception('Unable to broadcast shapes ' + str(_xs) + '' - ' and ' + str(_ys)) - elif yd == 1: - yt.append(xd) - rep_y = True - else: - raise Exception('Unable to broadcast shapes ' + str(_xs) + '' - ' and ' + str(_ys)) - if rep_y: - y = y.repmat(*yt) - return y - - -def broadcast(x, y): - xs = x.shape() - ys = y.shape() - if xs == ys: - return x, y - _xs = tuple(xs) - _ys = tuple(ys) - nx = len(xs) - ny = len(ys) - if nx > ny: - diff = nx - ny - ys = ([1] * diff) + ys - y = y.reshape(*ys) - ny = nx - elif ny > nx: - diff = ny - nx - xs = ([1] * diff) + xs - x = x.reshape(*xs) - nx = ny - xt = [] - yt = [] - rep_x = False - rep_y = False - for xd, yd in zip(xs, ys): - if xd == yd: - xt.append(1) - yt.append(1) - elif xd == 1: - xt.append(yd) - yt.append(1) - rep_x = True - elif yd == 1: - xt.append(1) - yt.append(xd) - rep_y = True - else: - raise Exception('Unable to broadcast 
shapes ' + str(_xs) + '' - ' and ' + str(_ys)) - if rep_x: - x = Nd4j.tile(x, *xt) - if rep_y: - try: - y = Nd4j.tile(y, *yt) - except: - y = Nd4j.tile(y, *yt) - return x, y - - -class ndarray(object): - - def __init__(self, data, dtype=None): - # we ignore dtype for now - typ = type(data) - if 'nd4j' in typ.__name__: - # Note that we don't make a copy here - self.array = data - elif typ is ndarray: - self.array = data.array.dup() - else: - if typ is not np.ndarray: - data = np.array(data) - self.array = _from_numpy(data) - - def numpy(self): - try: - return self.np_array - except AttributeError: - self.np_array = _to_numpy(self.array) - return self.np_array - - @property - def size(self): - return self.array.length() - - @property - def shape(self): - return tuple(self.array.shape()) - - @shape.setter - def shape(self, value): - arr = self.reshape(value) - self.array = arr.array - - @property - def ndim(self): - return len(self.array.shape()) - - def __getitem__(self, key): - return ndarray(self.numpy()[key]) - if type(key) is int: - return ndarray(self.array.get(NDArrayIndex.point(key))) - if type(key) is slice: - start = key.start - stop = key.stop - step = key.step - if start is None: - start = 0 - if stop is None: - shape = self.array.shape() - if shape[0] == 1: - stop = shape[1] - else: - stop = shape[0] - if stop - start <= 0: - return None - if step is None or step == 1: - return ndarray(self.array.get(NDArrayIndex.interval(start, stop))) - else: - return ndarray(self.array.get(NDArrayIndex.interval(start, step, stop))) - if type(key) is list: - raise NotImplementedError( - 'Sorry, this type of indexing is not supported yet.') - if type(key) is tuple: - key = list(key) - shape = self.array.shape() - ndim = len(shape) - nk = len(key) - key += [slice(None)] * (ndim - nk) - args = [] - for i, dim in enumerate(key): - if type(dim) is int: - args.append(NDArrayIndex.point(dim)) - elif type(dim) is slice: - if dim == slice(None): - args.append(NDArrayIndex.all()) 
- else: - start = dim.start - stop = dim.stop - step = dim.step - if start is None: - start = 0 - if stop is None: - stop = shape[i] - if stop - start <= 0: - return None - if step is None or step == 1: - args.append(NDArrayIndex.interval(start, stop)) - else: - args.append(NDArrayIndex.interval( - start, step, stop)) - elif type(dim) in (list, tuple): - raise NotImplementedError( - 'Sorry, this type of indexing is not supported yet.') - return ndarray(self.array.get(*args)) - - def __setitem__(self, key, other): - self.numpy()[key] = _nparray(other) - return - other = _indarray(other) - view = self[key] - if view is None: - return - view = view.array - other = broadcast_like(other, view) - view.assign(other) - - def __add__(self, other): - return ndarray(self.numpy() + _nparray(other)) - other = _indarray(other) - x, y = broadcast(self.array, other) - return ndarray(x.add(y)) - - def __sub__(self, other): - return ndarray(self.numpy() - _nparray(other)) - other = _indarray(other) - x, y = broadcast(self.array, other) - return ndarray(x.sub(y)) - - def __mul__(self, other): - return ndarray(self.numpy() * _nparray(other)) - other = _indarray(other) - x, y = broadcast(self.array, other) - return ndarray(x.mul(y)) - - def __div__(self, other): - return ndarray(self.numpy() / _nparray(other)) - other = _indarray(other) - x, y = broadcast(self.array, other) - return ndarray(x.div(y)) - - def __pow__(self, other): - return ndarray(self.numpy() ** _nparray(other)) - other = _indarray(other) - x, y = broadcast(self.array, other) - return ndarray(Transforms.pow(x, y)) - - def __iadd__(self, other): - self.numpy().__iadd__(_nparray(other)) - return self - other = _indarray(other) - if self.array.shape() == other.shape(): - self.array = self.array.addi(other) - else: - x, y = broadcast(self.array, other) - self.array = x.add(y) - return self - - def __isub__(self, other): - self.numpy().__isub__(_nparray(other)) - return self - other = _indarray(other) - if 
self.array.shape() == other.shape(): - self.array = self.array.subi(other) - else: - x, y = broadcast(self.array, other) - self.array = x.sub(y) - return self - - def __imul__(self, other): - self.numpy().__imul__(_nparray(other)) - return self - other = _indarray(other) - if self.array.shape() == other.shape(): - self.array = self.array.muli(other) - else: - x, y = broadcast(self.array, other) - self.array = x.mul(y) - return self - - def __idiv__(self, other): - self.numpy().__idiv__(_nparray(other)) - return self - other = _indarray(other) - if self.array.shape() == other.shape(): - self.array = self.array.divi(other) - else: - x, y = broadcast(self.array, other) - self.array = x.div(y) - return self - - def __ipow__(self, other): - self.numpy().__ipow__(_nparray(other)) - return self - other = _indarray(other) - if self.array.shape() == other.shape(): - # raise to a power element-wise (was a copy-paste of __idiv__'s divi) - self.array = Transforms.pow(self.array, other) - else: - x, y = broadcast(self.array, other) - self.array = Transforms.pow(x, y) - return self - - def __getattr__(self, attr): - from . import ops # relative import; a bare 'import ops' fails under Python 3 - f = getattr(ops, attr) - setattr(ndarray, attr, f) - return getattr(self, attr) - - def __int__(self): - if self.array.length() == 1: - return self.array.getInt(0) - raise Exception('Applicable only for scalars') - - def __float__(self): - if self.array.length() == 1: - return self.array.getDouble(0) - raise Exception('Applicable only for scalars') - - @property - def T(self): - return self.transpose() - - -def array(*args, **kwargs): - return ndarray(*args, **kwargs) diff --git a/contrib/attic/jumpy/jumpy/ops/__init__.py b/contrib/attic/jumpy/jumpy/ops/__init__.py deleted file mode 100644 index 642726e39..000000000 --- a/contrib/attic/jumpy/jumpy/ops/__init__.py +++ /dev/null @@ -1,28 +0,0 @@ -# /* ****************************************************************************** -# * -# * -# * This program and the accompanying materials are made available under the -# * terms of the Apache License, Version 2.0 which is available at -# *
https://www.apache.org/licenses/LICENSE-2.0. -# * -# * See the NOTICE file distributed with this work for additional -# * information regarding copyright ownership. -# * Unless required by applicable law or agreed to in writing, software -# * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT -# * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the -# * License for the specific language governing permissions and limitations -# * under the License. -# * -# * SPDX-License-Identifier: Apache-2.0 -# ******************************************************************************/ - -################################################################################ -# -# -# -################################################################################ - - -from .array_manip import * -from .linalg import * -from .reduction import * diff --git a/contrib/attic/jumpy/jumpy/ops/array_manip.py b/contrib/attic/jumpy/jumpy/ops/array_manip.py deleted file mode 100644 index 7cd55219c..000000000 --- a/contrib/attic/jumpy/jumpy/ops/array_manip.py +++ /dev/null @@ -1,158 +0,0 @@ -# /* ****************************************************************************** -# * -# * -# * This program and the accompanying materials are made available under the -# * terms of the Apache License, Version 2.0 which is available at -# * https://www.apache.org/licenses/LICENSE-2.0. -# * -# * See the NOTICE file distributed with this work for additional -# * information regarding copyright ownership. -# * Unless required by applicable law or agreed to in writing, software -# * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT -# * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the -# * License for the specific language governing permissions and limitations -# * under the License. 
-# * -# * SPDX-License-Identifier: Apache-2.0 -# ******************************************************************************/ - -################################################################################ -# -# -# -################################################################################ - - -from .op import op -from ..java_classes import Nd4j -from ..ndarray import _nparray, ndarray, _indarray - -# Array manipulation routines -# https://docs.scipy.org/doc/numpy-1.13.0/reference/routines.array-manipulation.html - - -@op -def reshape(arr, *args): - # accept both reshape(arr, (2, 3)) and reshape(arr, 2, 3) - if len(args) == 1 and type(args[0]) in (list, tuple): - args = tuple(args[0]) - return arr.reshape(*args) - - -@op -def transpose(arr, *axis): - if len(axis) == 0: - return arr.transpose() - else: - if len(axis) == 1: - axis = axis[0] - assert set(axis) in [set(list(range(len(axis)))), - set(list(range(len(arr.shape()))))] - return arr.permute(*axis) - - -@op -def ravel(arr): - return arr.ravel() - - -@op -def flatten(arr): - return arr.ravel().dup() - - -@op -def moveaxis(arr, source, destination): - assert type(source) == type( - destination), 'source and destination should be of same type.'
- shape = arr.shape() - ndim = len(shape) - x = list(range(ndim)) - if type(source) is int: - if source < 0: - source += ndim - if destination < 0: - destination += ndim - z = x.pop(source) - x.insert(destination, z) - return arr.permute(*x) - if type(source) in (list, tuple): - source = list(source) - destination = list(destination) - assert len(source) == len(destination) - for src, dst in zip(source, destination): - if src < 0: - src += ndim - if dst < 0: - dst += ndim - z = x.pop(src) - x.insert(dst, z) - return arr.permute(*x) - - -@op -def permute(arr, *axis): - if len(axis) == 1: - axis = axis[0] - assert set(axis) in [set(list(range(len(axis)))), - set(list(range(len(arr.shape()))))] - return arr.permute(*axis) - - -@op -def expand_dims(arr, axis): - return Nd4j.expandDims(arr, axis) - - -@op -def squeeze(arr, axis): - shape = arr.shape() - if type(axis) in (list, tuple): - shape = [shape[i] for i in range(len(shape)) if i not in axis] - else: - shape.pop(axis) - return arr.reshape(*shape) - - -@op -def concatenate(arrs, axis=-1): - return Nd4j.concat(axis, *arrs) - - -@op -def hstack(arrs): - return Nd4j.hstack(arrs) - - -@op -def vstack(arrs): - return Nd4j.vstack(arrs) - - -@op -def stack(arrs, axis): - for i, arr in enumerate(arrs): - shape = arr.shape() - shape.insert(axis, 1) - arrs[i] = arr.reshape(*shape) - return Nd4j.concat(axis, *arrs) - - -@op -def tile(arr, reps): - import numpy as np - return _indarray(np.tile(_nparray(arr), reps)) - - if type(reps) is int: - return Nd4j.tile(arr, reps) - else: - return Nd4j.tile(arr, *reps) - - -@op -def repeat(arr, repeats, axis=None): - if type(repeats) is int: - repeats = (repeats,) - if axis is None: - return arr.repeat(-1, *repeats).reshape(-1) - else: - return arr.repeat(axis, *repeats) diff --git a/contrib/attic/jumpy/jumpy/ops/linalg.py b/contrib/attic/jumpy/jumpy/ops/linalg.py deleted file mode 100644 index 823723abc..000000000 --- a/contrib/attic/jumpy/jumpy/ops/linalg.py +++ /dev/null @@ -1,50 +0,0 
@@ -# /* ****************************************************************************** -# * -# * -# * This program and the accompanying materials are made available under the -# * terms of the Apache License, Version 2.0 which is available at -# * https://www.apache.org/licenses/LICENSE-2.0. -# * -# * See the NOTICE file distributed with this work for additional -# * information regarding copyright ownership. -# * Unless required by applicable law or agreed to in writing, software -# * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT -# * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the -# * License for the specific language governing permissions and limitations -# * under the License. -# * -# * SPDX-License-Identifier: Apache-2.0 -# ******************************************************************************/ - -################################################################################ -# -# -# -################################################################################ - - -from .op import op -from ..java_classes import * - - -# Linear algebra -# https://docs.scipy.org/doc/numpy-1.13.0/reference/routines.linalg.html - - -@op -def dot(arr, other): - return arr.mmul(other) - - -@op -def tensordot(arr1, arr2, axes=2): - shape1 = arr1.shape() - shape2 = arr2.shape() - if type(axes) is int: - axes = [shape1[axes:], shape2[:axes]] - elif type(axes) in [list, tuple]: - axes = list(axes) - for i in range(2): - if type(axes[i]) is int: - axes[i] = [axes[i]] - return Nd4j.tensorMmul(arr1, arr2, axes) diff --git a/contrib/attic/jumpy/jumpy/ops/op.py b/contrib/attic/jumpy/jumpy/ops/op.py deleted file mode 100644 index 776ba9809..000000000 --- a/contrib/attic/jumpy/jumpy/ops/op.py +++ /dev/null @@ -1,103 +0,0 @@ -# /* ****************************************************************************** -# * -# * -# * This program and the accompanying materials are made available under the -# * terms of the Apache 
License, Version 2.0 which is available at -# * https://www.apache.org/licenses/LICENSE-2.0. -# * -# * See the NOTICE file distributed with this work for additional -# * information regarding copyright ownership. -# * Unless required by applicable law or agreed to in writing, software -# * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT -# * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the -# * License for the specific language governing permissions and limitations -# * under the License. -# * -# * SPDX-License-Identifier: Apache-2.0 -# ******************************************************************************/ - -################################################################################ -# -# -# -################################################################################ - - -from jumpy.java_classes import * -from jumpy.ndarray import array -from jumpy.ndarray import ndarray -import inspect - -_INDArray_class = 'org.nd4j.linalg.api.ndarray.INDArray' - - -def _is_nd4j(x): - return type(x).__name__ == _INDArray_class - - -def _is_jumpy(x): - return type(x) == ndarray - - -''' -Use the @op decorator over a method to automatically -take care of nd4j<->jumpy conversions. e.g: - -```python - -@op -def reshape(arr, shape): - # we are in nd4j space now - # arr is an INDArray instance - # we return a INDArray instance as well - return arr.reshape(*shape) - - -# use in jumpy space: - -x = jp.zeros((2, 2, 3)) # x is jumpy ndarray -y = reshape(x, (4, 3)) # y is a jumpy ndarray - -``` - -Note that methods with first argument named 'arr' -will be automatically bound to ndarray class. 
- -''' - - -def op(f): - def wrapper(*args, **kwargs): - args = list(args) - for i, arg in enumerate(args): - if _is_jumpy(arg): - args[i] = arg.array - elif type(arg) is list: - for j, a in enumerate(arg): - if _is_jumpy(a): - arg[j] = a.array - elif type(arg) is tuple: - arg = list(arg) - for j, a in enumerate(arg): - if _is_jumpy(a): - arg[j] = a.array - args[i] = tuple(arg) - for k in kwargs: - v = kwargs[k] - if _is_jumpy(v): - kwargs[k] = v.array - out = f(*args, **kwargs) - if _is_nd4j(out): - return array(out) - elif type(out) is list: - for i, v in enumerate(out): - if _is_nd4j(v): - out[i] = array(v) - return out - elif type(out) is tuple: - out = list(out) - for i, v in enumerate(out): - if _is_nd4j(v): - out[i] = array(v) - return tuple(out) - return wrapper diff --git a/contrib/attic/jumpy/jumpy/ops/reduction.py b/contrib/attic/jumpy/jumpy/ops/reduction.py deleted file mode 100644 index 93a51efa4..000000000 --- a/contrib/attic/jumpy/jumpy/ops/reduction.py +++ /dev/null @@ -1,42 +0,0 @@ -# /* ****************************************************************************** -# * -# * -# * This program and the accompanying materials are made available under the -# * terms of the Apache License, Version 2.0 which is available at -# * https://www.apache.org/licenses/LICENSE-2.0. -# * -# * See the NOTICE file distributed with this work for additional -# * information regarding copyright ownership. -# * Unless required by applicable law or agreed to in writing, software -# * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT -# * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the -# * License for the specific language governing permissions and limitations -# * under the License. 
-# * -# * SPDX-License-Identifier: Apache-2.0 -# ******************************************************************************/ - -################################################################################ -# -# -# -################################################################################ - -from .op import op -from ..java_classes import Nd4j - - -template = """ -@op -def {}(arr, axis=None): - if axis is None: - return Nd4j.{}(arr) - return Nd4j.{}(arr) -""" - -reduction_ops = [['max'], ['min'], ['sum'], ['prod'], ['mean'], [ - 'std'], ['var'], ['argmax', 'argMax'], ['argmin', 'argMin']] - -for rop in reduction_ops: - code = template.format(rop[0], rop[-1], rop[-1]) - exec(code) diff --git a/contrib/attic/jumpy/jumpy/spark/__init__.py b/contrib/attic/jumpy/jumpy/spark/__init__.py deleted file mode 100644 index a1b9c2dcc..000000000 --- a/contrib/attic/jumpy/jumpy/spark/__init__.py +++ /dev/null @@ -1,29 +0,0 @@ -# /* ****************************************************************************** -# * -# * -# * This program and the accompanying materials are made available under the -# * terms of the Apache License, Version 2.0 which is available at -# * https://www.apache.org/licenses/LICENSE-2.0. -# * -# * See the NOTICE file distributed with this work for additional -# * information regarding copyright ownership. -# * Unless required by applicable law or agreed to in writing, software -# * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT -# * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the -# * License for the specific language governing permissions and limitations -# * under the License. 
-# * -# * SPDX-License-Identifier: Apache-2.0 -# ******************************************************************************/ - -################################################################################ -# -# -# -################################################################################ - - -from .fast_impl import py2javaArrayRDD, java2pyArrayRDD -from .fast_impl import py2javaDatasetRDD, java2pyDatasetRDD -from .dataset import Dataset -# from .naive_impl import py2javaRDD, java2pyRDD diff --git a/contrib/attic/jumpy/jumpy/spark/dataset.py b/contrib/attic/jumpy/jumpy/spark/dataset.py deleted file mode 100644 index 84daadada..000000000 --- a/contrib/attic/jumpy/jumpy/spark/dataset.py +++ /dev/null @@ -1,57 +0,0 @@ -# /* ****************************************************************************** -# * -# * -# * This program and the accompanying materials are made available under the -# * terms of the Apache License, Version 2.0 which is available at -# * https://www.apache.org/licenses/LICENSE-2.0. -# * -# * See the NOTICE file distributed with this work for additional -# * information regarding copyright ownership. -# * Unless required by applicable law or agreed to in writing, software -# * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT -# * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the -# * License for the specific language governing permissions and limitations -# * under the License. 
-# * -# * SPDX-License-Identifier: Apache-2.0 -# ******************************************************************************/ - -################################################################################ -# -# -# -################################################################################ - -from ..ndarray import ndarray -from ..java_classes import JDataset - - -class Dataset(object): - - def __init__(self, features, labels, features_mask=None, labels_mask=None): - self.features = ndarray(features) - self.labels = ndarray(labels) - if features_mask is None: - self.features_mask = None - else: - self.features_mask = ndarray(features_mask) - if labels_mask is None: - self.labels_mask = None - else: - self.labels_mask = ndarray(labels_mask) - - def to_java(self): - return JDataset(self.features.array, self.labels.array, self.features_mask, self.labels_mask) - - def __getstate__(self): - return [self.features.numpy(), - self.labels.numpy(), - self.features_mask.numpy() if self.features_mask is not None else None, - self.labels_mask.numpy() if self.labels_mask is not None else None] - - def __setstate__(self, state): - ds = Dataset(*state) - self.features = ds.features - self.labels = ds.labels - self.features_mask = ds.features_mask - self.labels_mask = ds.labels_mask diff --git a/contrib/attic/jumpy/jumpy/spark/fast_impl.py b/contrib/attic/jumpy/jumpy/spark/fast_impl.py deleted file mode 100644 index 27d23ff32..000000000 --- a/contrib/attic/jumpy/jumpy/spark/fast_impl.py +++ /dev/null @@ -1,168 +0,0 @@ -# /* ****************************************************************************** -# * -# * -# * This program and the accompanying materials are made available under the -# * terms of the Apache License, Version 2.0 which is available at -# * https://www.apache.org/licenses/LICENSE-2.0. -# * -# * See the NOTICE file distributed with this work for additional -# * information regarding copyright ownership. 
-# * Unless required by applicable law or agreed to in writing, software -# * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT -# * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the -# * License for the specific language governing permissions and limitations -# * under the License. -# * -# * SPDX-License-Identifier: Apache-2.0 -# ******************************************************************************/ - -################################################################################ -# -# -# -################################################################################ - -import numpy as np -from ..java_classes import ArrayList -from ..java_classes import ArrayDescriptor as getArrayDescriptor -from ..java_classes import DatasetDescriptor as getDatasetDescriptor -from ..java_classes import DataType -from ..java_classes import spark_utils as get_spark_utils -from ..java_classes import JDataset -from ..ndarray import array -from .utils import np2desc -from .utils import py2j_ds_desc -from .utils import j2py_ds_desc -from .utils import j2py_arr_desc -from .utils import py2j_arr_desc -from .utils import desc2np -from .utils import desc2ds -from .utils import ds2desc - - -ArrayDescriptor = None -JDatasetDescriptor = None -spark_utils = None - - - - -def java2pyArrayRDD(java_rdd, py_sc): - ''' - Arguments - - `java_rdd`: JavaRDD instance - `py_sc`: Pyspark context instance - - Returns - - pyspark.RDD instance - ''' - global spark_utils - if spark_utils is None: - spark_utils = get_spark_utils() - desc_rdd = spark_utils.getArrayDescriptorRDD(java_rdd) - descriptors = desc_rdd.collect() - num_descriptors = descriptors.size() - nparrays = [] - pydescriptors = [] - for i in range(num_descriptors): - jdesc = descriptors.get(i) - pydesc = j2py_arr_desc(jdesc) - nparrays.append(desc2np(pydesc)) - #pydescriptors.append(pydesc) - #pyrdd = py_sc.parallelize(pydescriptors) - #pyrdd = pyrdd.map(desc2np) - pyrdd = 
py_sc.parallelize(nparrays) - return pyrdd - - -def py2javaArrayRDD(py_rdd, java_sc): - ''' - Arguments - - `py_rdd`: pyspark.RDD instance - `java_sc`: JavaSparkContext instance - - Returns - - JavaRDD instance - ''' - global ArrayDescriptor, spark_utils - if ArrayDescriptor is None: - ArrayDescriptor = getArrayDescriptor() - if spark_utils is None: - spark_utils = get_spark_utils() - - #desc_rdd = py_rdd.map(np2desc) - #descriptors = desc_rdd.collect() - arrlist = ArrayList() - nparrays = py_rdd.collect() - for nparr in nparrays: - arrlist.add(array(nparr).array) - return java_sc.parallelize(arrlist) - for d in descriptors: - #arrlist.add(array(desc2np(d)).array) - arrlist.add(ArrayDescriptor(d[0], d[1], d[2], dtype_map[d[3]], 'c').getArray()) - java_rdd = java_sc.parallelize(arrlist) - #return java_rdd - java_rdd = spark_utils.getArrayRDD(java_rdd) - return java_rdd - - -def java2pyDatasetRDD(java_rdd, py_sc): - global spark_utils, JDatasetDescriptor - if spark_utils is None: - spark_utils = get_spark_utils() - if JDatasetDescriptor is None: - JDatasetDescriptor = getDatasetDescriptor() - jdatasets = java_rdd.collect() - num_ds = jdatasets.size() - pydatasets = [] - for i in range(num_ds): - jds = jdatasets.get(i) - jdesc = JDatasetDescriptor(jds) - pydesc = j2py_ds_desc(jdesc) - pyds = desc2ds(pydesc) - pydatasets.append(pyds) - return py_sc.parallelize(pydatasets) - - - #### - desc_rdd = spark_utils.getDataSetDescriptorRDD(java_rdd) - descriptors = desc_rdd.collect() - num_descriptors = descriptors.size() - pydescriptors = [] - for i in range(num_descriptors): - jdesc = descriptors.get(i) - pydesc = j2py_ds_desc(jdesc) - pydescriptors.append(pydesc) - pyrdd = py_sc.parallelize(pydescriptors) - pyrdd = pyrdd.map(desc2ds) - return pyrdd - - -def py2javaDatasetRDD(py_rdd, java_sc): - global spark_utils - if spark_utils is None: - spark_utils = get_spark_utils() - - ### - pydatasets = py_rdd.collect() - jdatasets = ArrayList() - for pyds in pydatasets: - pydesc = 
ds2desc(pyds) - jdesc = py2j_ds_desc(pydesc) - jds = jdesc.getDataSet() - jdatasets.add(jds) - return java_sc.parallelize(jdatasets) - - ### - desc_rdd = py_rdd.map(ds2desc) - pydescriptors = desc_rdd.collect() - jdescriptors = ArrayList() - for pydesc in pydescriptors: - jdescriptors.add(py2j_ds_desc(pydesc)) - java_rdd = java_sc.parallelize(jdescriptors) - java_rdd = spark_utils.getDataSetRDD(java_rdd) - return java_rdd diff --git a/contrib/attic/jumpy/jumpy/spark/naive_impl.py b/contrib/attic/jumpy/jumpy/spark/naive_impl.py deleted file mode 100644 index 049264971..000000000 --- a/contrib/attic/jumpy/jumpy/spark/naive_impl.py +++ /dev/null @@ -1,72 +0,0 @@ -# /* ****************************************************************************** -# * -# * -# * This program and the accompanying materials are made available under the -# * terms of the Apache License, Version 2.0 which is available at -# * https://www.apache.org/licenses/LICENSE-2.0. -# * -# * See the NOTICE file distributed with this work for additional -# * information regarding copyright ownership. -# * Unless required by applicable law or agreed to in writing, software -# * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT -# * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the -# * License for the specific language governing permissions and limitations -# * under the License. 
-# * -# * SPDX-License-Identifier: Apache-2.0 -# ******************************************************************************/ - -################################################################################ -# -# -# -################################################################################ - -import numpy as np -from ..java_classes import ArrayList -from ..ndarray import array - - -def java2pyRDD(java_rdd, py_sc): - ''' - Arguments - - `java_rdd`: JavaRDD instance - `py_sc`: Pyspark context instance - - Returns - - pyspark.RDD instance - ''' - indarray_list = java_rdd.collect() - num_arrays = indarray_list.size() - - nparray_list = [] - for i in range(num_arrays): - indarray = indarray_list.get(i) - jparray = array(indarray) - nparray = jparray.numpy() - nparray_list.append(nparray) - - return py_sc.parallelize(nparray_list) - - -def py2javaRDD(py_rdd, java_sc): - ''' - Arguments - - `py_rdd`: pyspark.RDD instance - `java_sc`: JavaSparkContext instance - - Returns - - JavaRDD instance - ''' - nparray_list = py_rdd.collect() - indarray_list = ArrayList() - - for nparray in nparray_list: - jparray = array(nparray) - indarray = jparray.array - indarray_list.add(indarray) - return java_sc.parallelize(indarray_list) diff --git a/contrib/attic/jumpy/jumpy/spark/utils.py b/contrib/attic/jumpy/jumpy/spark/utils.py deleted file mode 100644 index fdddf7073..000000000 --- a/contrib/attic/jumpy/jumpy/spark/utils.py +++ /dev/null @@ -1,130 +0,0 @@ -# /* ****************************************************************************** -# * -# * -# * This program and the accompanying materials are made available under the -# * terms of the Apache License, Version 2.0 which is available at -# * https://www.apache.org/licenses/LICENSE-2.0. -# * -# * See the NOTICE file distributed with this work for additional -# * information regarding copyright ownership. 
-# * Unless required by applicable law or agreed to in writing, software -# * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT -# * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the -# * License for the specific language governing permissions and limitations -# * under the License. -# * -# * SPDX-License-Identifier: Apache-2.0 -# ******************************************************************************/ - -################################################################################ -# -# -# -################################################################################ - - -import numpy as np -from ..java_classes import DataType -from ..java_classes import ArrayDescriptor as getArrayDescriptor -from ..java_classes import DatasetDescriptor -import ctypes -from .dataset import Dataset -from ..ndarray import array - - -ArrayDescriptor = None - - -def np2desc(nparray): - if nparray is None: - return None - nparray = array(nparray).numpy() - address = nparray.__array_interface__['data'][0] - shape = nparray.shape - stride = nparray.strides - nptype = nparray.dtype - if nptype == np.float32: - dtype = "float" - elif nptype == np.float64: - dtype = "double" - else: - raise Exception("Unsupported data type: " + str(nptype)) - return (address, shape, stride, dtype) - - -def desc2np(desc): - if desc is None: - return None - address, shape, stride, dtype = desc - mapping = { - 'double': ctypes.c_double, - 'float': ctypes.c_float, - 'half': ctypes.c_short, - 'long': ctypes.c_long, - 'int': ctypes.c_int, - 'short': ctypes.c_short, - 'bool': ctypes.c_bool - } - Pointer = ctypes.POINTER(mapping[dtype]) - pointer = ctypes.cast(address, Pointer) - np_array = np.ctypeslib.as_array(pointer, shape) - return np_array - - -def desc2ds(desc): - if desc is None: - return None - return Dataset(*list(map(desc2np, desc))) - - -def ds2desc(ds): - if ds is None: - return None - items = [ds.features, ds.labels, ds.features_mask, 
ds.labels_mask] - return tuple(map(np2desc, items)) - - -def j2py_arr_desc(jdesc): - if jdesc is None: - return None - address = jdesc.getAddress() - shape = tuple(jdesc.getShape()) - stride = tuple(jdesc.getStride()) - dtype = jdesc.getType().toString().lower() - supported_dtypes = ["float", "double"] - if dtype not in supported_dtypes: - raise Exception("Unsupported data type: " + dtype) - return (address, shape, stride, dtype) - - -def py2j_arr_desc(pydesc): - global ArrayDescriptor - if pydesc is None: - return None - address = pydesc[0] - shape = pydesc[1] - stride = pydesc[2] - dtype = pydesc[3] - dtype = {"float": DataType.FLOAT, "double": DataType.DOUBLE}[dtype] - if ArrayDescriptor is None: - ArrayDescriptor = getArrayDescriptor() - return ArrayDescriptor(address, shape, stride, dtype, 'c') - - -def j2py_ds_desc(jdesc): - jfeaturesdesc = jdesc.getFeatures() - pyfeaturesdesc = j2py_arr_desc(jfeaturesdesc) - jlabelsdesc = jdesc.getLabels() - pylabelsdesc = j2py_arr_desc(jlabelsdesc) - - jfmaskdesc = jdesc.getFeaturesMask() - pyfmaskdesc = j2py_arr_desc(jfmaskdesc) - - jlmaskdesc = jdesc.getLabelsMask() - pylmaskdesc = j2py_arr_desc(jlmaskdesc) - - return (pyfeaturesdesc, pylabelsdesc, pyfmaskdesc, pylmaskdesc) - - -def py2j_ds_desc(pydesc): - return DatasetDescriptor()(*list(map(py2j_arr_desc, pydesc))) diff --git a/contrib/attic/jumpy/jumpy/tf_model.py b/contrib/attic/jumpy/jumpy/tf_model.py deleted file mode 100644 index 94deb07e3..000000000 --- a/contrib/attic/jumpy/jumpy/tf_model.py +++ /dev/null @@ -1,77 +0,0 @@ -# /* ****************************************************************************** -# * -# * -# * This program and the accompanying materials are made available under the -# * terms of the Apache License, Version 2.0 which is available at -# * https://www.apache.org/licenses/LICENSE-2.0. -# * -# * See the NOTICE file distributed with this work for additional -# * information regarding copyright ownership. 
-# * Unless required by applicable law or agreed to in writing, software -# * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT -# * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the -# * License for the specific language governing permissions and limitations -# * under the License. -# * -# * SPDX-License-Identifier: Apache-2.0 -# ******************************************************************************/ - -################################################################################ -# -# -# -################################################################################ - -from .java_classes import TFGraphMapper, Nd4j, NDArrayIndex -from .ndarray import array - - -class TFModel(object): - def __init__(self, filepath, inputs=None, outputs=None): - self.sd = TFGraphMapper.getInstance().importGraph(filepath) - self.inputs = inputs - self.outputs = outputs - if inputs is None: - input_vars = [self.sd.variables().get(0)] - elif type(inputs) in [list, tuple]: - input_vars = [] - for x in inputs: - var = self.sd.getVariable(x) - if var is None: - raise ValueError('Variable not found in samediff graph: ' + x) - input_vars.append(var) - else: - input_vars = [self.sd.getVariable(inputs)] - if input_vars[0] is None: - raise ValueError('Variable not found in samediff graph: ' + inputs) - if outputs is None: - nvars = self.sd.variables().size() - output_vars = [self.sd.variables().get(nvars - 1)] - elif type(outputs) in [list, tuple]: - output_vars = [] - for x in outputs: - var = self.sd.getVariable(x) - if var is None: - raise ValueError('Variable not found in samediff graph: ' + x) - output_vars.append(var) - else: - output_vars = [self.sd.getVariable(outputs)] - if output_vars[0] is None: - raise ValueError('Variable not found in samediff graph: ' + outputs) - self.input_vars = input_vars - self.output_vars = output_vars - - def __call__(self, input): - if type(input) in (list, tuple): - input_arrays = 
[array(x).array for x in input] - else: - input_arrays = [array(input).array] - for arr, var in zip(input_arrays, self.input_vars): - self.sd.associateArrayWithVariable(arr, var) - output_arrays = [] - getattr(self.sd, 'exec')() - for var in self.output_vars: - output_arrays.append(array(var.getArr())) - if len(output_arrays) == 1: - return output_arrays[0] - return output_arrays diff --git a/contrib/attic/jumpy/pom.xml b/contrib/attic/jumpy/pom.xml deleted file mode 100644 index d3ee2ebc0..000000000 --- a/contrib/attic/jumpy/pom.xml +++ /dev/null @@ -1,175 +0,0 @@ - - - - - - 4.0.0 - - - org.deeplearning4j - deeplearning4j - 1.0.0-SNAPSHOT - - - jumpy - - jumpy - - - false - 0.2.4 - nd4j-native - - - - - org.nd4j - ${nd4j.backend} - ${dl4j.version} - - - - - - - org.apache.maven.plugins - maven-shade-plugin - ${maven-shade-plugin.version} - - - package - - shade - - - - - org.deeplearning4j.example.App - - - - - - - - org.apache.maven.plugins - maven-compiler-plugin - - - org.apache.maven.plugins - maven-jar-plugin - ${maven-jar-plugin.version} - - true - - - - empty-javadoc-jar - package - - jar - - - javadoc - ${basedir}/javadoc - - - - empty-sources-jar - package - - jar - - - sources - ${basedir}/src - - - - - - org.codehaus.mojo - exec-maven-plugin - ${exec-maven-plugin.version} - - - python-install-cython - install - - exec - - - pip - ${basedir} - - install - --user - Cython - --install-option=--no-cython-compile - - - - - python-build - install - - exec - - - pip - ${basedir} - - install - --user - -e - .[tests, spark] - - - - - python-test - test - - exec - - - python - ${basedir} - ${jumpy.test.skip} - - -m - pytest - --pep8 - -m - pep8 - tests/ - - - - - - - - - diff --git a/contrib/attic/jumpy/pytest.ini b/contrib/attic/jumpy/pytest.ini deleted file mode 100644 index d65bb4cc4..000000000 --- a/contrib/attic/jumpy/pytest.ini +++ /dev/null @@ -1,28 +0,0 @@ -################################################################################ -# Copyright (c) 
2015-2018 Skymind, Inc. -# -# This program and the accompanying materials are made available under the -# terms of the Apache License, Version 2.0 which is available at -# https://www.apache.org/licenses/LICENSE-2.0. -# -# Unless required by applicable law or agreed to in writing, software -# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT -# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the -# License for the specific language governing permissions and limitations -# under the License. -# -# SPDX-License-Identifier: Apache-2.0 -################################################################################ - -[pytest] - -norecursedirs= build - -# PEP-8 The following are ignored: -# E501 line too long (82 > 79 characters) -# W503 line break occurred before a binary operator -# E402 module level import not at top of file - -pep8ignore=* E501 \ - * W503 \ - * E402 diff --git a/contrib/attic/jumpy/release.sh b/contrib/attic/jumpy/release.sh deleted file mode 100644 index 73c3264e9..000000000 --- a/contrib/attic/jumpy/release.sh +++ /dev/null @@ -1,34 +0,0 @@ -#!/bin/bash -# -# /* ****************************************************************************** -# * -# * -# * This program and the accompanying materials are made available under the -# * terms of the Apache License, Version 2.0 which is available at -# * https://www.apache.org/licenses/LICENSE-2.0. -# * -# * See the NOTICE file distributed with this work for additional -# * information regarding copyright ownership. -# * Unless required by applicable law or agreed to in writing, software -# * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT -# * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the -# * License for the specific language governing permissions and limitations -# * under the License. 
-# * -# * SPDX-License-Identifier: Apache-2.0 -# ******************************************************************************/ -# - - - - - -# remove old wheels -sudo rm -rf dist/* - -# Build Python 2 & 3 wheels for current version -sudo python2 setup.py sdist bdist_wheel -sudo python3 setup.py sdist bdist_wheel - -# Upload to PyPI with twine. Needs full "skymind" credentials in ~/.pypirc -twine upload dist/* \ No newline at end of file diff --git a/contrib/attic/jumpy/requirements.txt b/contrib/attic/jumpy/requirements.txt deleted file mode 100644 index 1c5cbce15..000000000 --- a/contrib/attic/jumpy/requirements.txt +++ /dev/null @@ -1,5 +0,0 @@ -Cython==0.28.2 -numpy==1.14.2 -pyjnius==1.1.1 -pytest==3.5.1 -six==1.11.0 diff --git a/contrib/attic/jumpy/setup.cfg b/contrib/attic/jumpy/setup.cfg deleted file mode 100644 index fbf73a23a..000000000 --- a/contrib/attic/jumpy/setup.cfg +++ /dev/null @@ -1,5 +0,0 @@ -[metadata] -description-file = README.md - -[aliases] -test=pytest \ No newline at end of file diff --git a/contrib/attic/jumpy/setup.py b/contrib/attic/jumpy/setup.py deleted file mode 100644 index 4170a7497..000000000 --- a/contrib/attic/jumpy/setup.py +++ /dev/null @@ -1,54 +0,0 @@ -# /* ****************************************************************************** -# * -# * -# * This program and the accompanying materials are made available under the -# * terms of the Apache License, Version 2.0 which is available at -# * https://www.apache.org/licenses/LICENSE-2.0. -# * -# * See the NOTICE file distributed with this work for additional -# * information regarding copyright ownership. -# * Unless required by applicable law or agreed to in writing, software -# * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT -# * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the -# * License for the specific language governing permissions and limitations -# * under the License. 
-# * -# * SPDX-License-Identifier: Apache-2.0 -# ******************************************************************************/ - -################################################################################ -# -# -# -################################################################################ - -from setuptools import setup -from setuptools import find_packages - -setup(name='jumpy', - version='0.2.4', - description='Numpy and nd4j interop', - long_description='Mapping of the numpy & nd4j array representations', - author='Adam Gibson', - author_email='adam@skymind.io', - classifiers=[ - 'Development Status :: 3 - Alpha', - 'License :: OSI Approved :: Apache Software License', - 'Operating System :: OS Independent', - 'Programming Language :: Python', - 'Programming Language :: Python :: 2', - 'Programming Language :: Python :: 3', - 'Topic :: Software Development :: Libraries', - ], - keywords='numpy jumpy java nd4j deeplearning4j', - url='https://github.com/eclipse/deeplearning4j.git', - license='Apache', - setup_requires=['Cython', 'pytest-runner'], - install_requires=['Cython', 'requests', 'pydl4j', 'numpy'], - extras_require={ - 'spark': ['pyspark'], - 'tests': ['pytest', - 'pytest-pep8', - 'mock'], - }, - packages=find_packages()) diff --git a/contrib/attic/jumpy/tests/__init__.py b/contrib/attic/jumpy/tests/__init__.py deleted file mode 100644 index 96f8e0902..000000000 --- a/contrib/attic/jumpy/tests/__init__.py +++ /dev/null @@ -1,29 +0,0 @@ -# /* ****************************************************************************** -# * -# * -# * This program and the accompanying materials are made available under the -# * terms of the Apache License, Version 2.0 which is available at -# * https://www.apache.org/licenses/LICENSE-2.0. -# * -# * See the NOTICE file distributed with this work for additional -# * information regarding copyright ownership. 
-# * Unless required by applicable law or agreed to in writing, software -# * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT -# * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the -# * License for the specific language governing permissions and limitations -# * under the License. -# * -# * SPDX-License-Identifier: Apache-2.0 -# ******************************************************************************/ - -################################################################################ -# -# -# -################################################################################ - - -import unittest - -if __name__ == '__main__': - unittest.main() diff --git a/contrib/attic/jumpy/tests/jumpy/__init__.py b/contrib/attic/jumpy/tests/jumpy/__init__.py deleted file mode 100644 index ae84499f2..000000000 --- a/contrib/attic/jumpy/tests/jumpy/__init__.py +++ /dev/null @@ -1,23 +0,0 @@ -# /* ****************************************************************************** -# * -# * -# * This program and the accompanying materials are made available under the -# * terms of the Apache License, Version 2.0 which is available at -# * https://www.apache.org/licenses/LICENSE-2.0. -# * -# * See the NOTICE file distributed with this work for additional -# * information regarding copyright ownership. -# * Unless required by applicable law or agreed to in writing, software -# * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT -# * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the -# * License for the specific language governing permissions and limitations -# * under the License. 
-# * -# * SPDX-License-Identifier: Apache-2.0 -# ******************************************************************************/ - -################################################################################ -# -# -# -################################################################################ diff --git a/contrib/attic/jumpy/tests/jumpy/test_array_creation.py b/contrib/attic/jumpy/tests/jumpy/test_array_creation.py deleted file mode 100644 index ef27eacf9..000000000 --- a/contrib/attic/jumpy/tests/jumpy/test_array_creation.py +++ /dev/null @@ -1,40 +0,0 @@ -# /* ****************************************************************************** -# * -# * -# * This program and the accompanying materials are made available under the -# * terms of the Apache License, Version 2.0 which is available at -# * https://www.apache.org/licenses/LICENSE-2.0. -# * -# * See the NOTICE file distributed with this work for additional -# * information regarding copyright ownership. -# * Unless required by applicable law or agreed to in writing, software -# * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT -# * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the -# * License for the specific language governing permissions and limitations -# * under the License. 
-# * -# * SPDX-License-Identifier: Apache-2.0 -# ******************************************************************************/ - -################################################################################ -# -# -# -################################################################################ - - -import pytest - -import jumpy as jp -import numpy as np - - -def test_array_creation(): - a = jp.zeros((32, 10)) - assert int(jp.sum(a)) == 0 - a = jp.ones((32, 12)) - assert int(jp.sum(a)) == 32 * 12 - - -if __name__ == '__main__': - pytest.main([__file__]) diff --git a/contrib/attic/jumpy/tests/jumpy/test_broadcast.py b/contrib/attic/jumpy/tests/jumpy/test_broadcast.py deleted file mode 100644 index 53b5bd4ba..000000000 --- a/contrib/attic/jumpy/tests/jumpy/test_broadcast.py +++ /dev/null @@ -1,83 +0,0 @@ -# /* ****************************************************************************** -# * -# * -# * This program and the accompanying materials are made available under the -# * terms of the Apache License, Version 2.0 which is available at -# * https://www.apache.org/licenses/LICENSE-2.0. -# * -# * See the NOTICE file distributed with this work for additional -# * information regarding copyright ownership. -# * Unless required by applicable law or agreed to in writing, software -# * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT -# * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the -# * License for the specific language governing permissions and limitations -# * under the License. 
-# * -# * SPDX-License-Identifier: Apache-2.0 -# ******************************************************************************/ - -################################################################################ -# -# -# -################################################################################ - - -import pytest - -import jumpy as jp -import numpy as np -from numpy.testing import assert_allclose - - -def _test_ufunc(op, shape1, shape2): - a_np = np.random.random(shape1) - b_np = np.random.random(shape2) - - c_np = eval('a_np {} b_np'.format(op)) - - a_jp = jp.array(a_np) - b_jp = jp.array(b_np) - - c_jp = eval('a_jp {} b_jp'.format(op)) - - c_jp = c_jp.numpy() - - assert_allclose(c_jp, c_np) - - -def _test_ufunc_inplace(op, shape1, shape2): - a_np = np.random.random(shape1) - b_np = np.random.random(shape2) - a_np2 = a_np.copy() - exec('a_np {}= b_np'.format(op)) - - a_jp = jp.array(a_np2) - b_jp = jp.array(b_np) - - exec('a_jp {}= b_jp'.format(op)) - - a_jp = a_jp.numpy() - - assert_allclose(a_jp, a_np) - - -def test_broadcast(): - shapes = [ - [(2, 3), (3, )], - [(2, 3, 4), (3, 4)], - [(2, 3), (1, 1)], - [(2, 3), (1, 1, 1)] - ] - - ops = ['+', '-', '*', '/'] - for op in ops: - for shape in shapes: - _test_ufunc(op, *shape) - _test_ufunc(op, *reversed(shape)) - if len(shape[0]) > len(shape[1]): - _test_ufunc_inplace(op, *shape) - - -if __name__ == '__main__': - pytest.main([__file__]) diff --git a/contrib/attic/jumpy/tests/jumpy/test_conversion_32.py b/contrib/attic/jumpy/tests/jumpy/test_conversion_32.py deleted file mode 100644 index 14ffc9cf8..000000000 --- a/contrib/attic/jumpy/tests/jumpy/test_conversion_32.py +++ /dev/null @@ -1,47 +0,0 @@ -# /* ****************************************************************************** -# * -# * -# * This program and the accompanying materials are made available under the -# * terms of the Apache License, Version 2.0 which is available at -# * https://www.apache.org/licenses/LICENSE-2.0. 
-# * -# * See the NOTICE file distributed with this work for additional -# * information regarding copyright ownership. -# * Unless required by applicable law or agreed to in writing, software -# * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT -# * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the -# * License for the specific language governing permissions and limitations -# * under the License. -# * -# * SPDX-License-Identifier: Apache-2.0 -# ******************************************************************************/ - -################################################################################ -# -# -# -################################################################################ - - -import pytest - -import jumpy as jp -import numpy as np -from numpy.testing import assert_allclose - - -def test_conversion_32(): - jp.set_context_dtype('float32') - shapes = [(1, 1), (2, 1), (1, 2), (32, 12), (100, 32, 16)] - for shape in shapes: - x_np = np.random.random(shape) - x_np = np.cast['float32'](x_np) - x_jp = jp.array(x_np) - x_np += np.cast['float32'](np.random.random(shape)) - x_jp = x_jp.numpy() - - assert_allclose(x_jp, x_np) - - -if __name__ == '__main__': - pytest.main([__file__]) diff --git a/contrib/attic/jumpy/tests/jumpy/test_conversion_64.py b/contrib/attic/jumpy/tests/jumpy/test_conversion_64.py deleted file mode 100644 index 6d8d3bc10..000000000 --- a/contrib/attic/jumpy/tests/jumpy/test_conversion_64.py +++ /dev/null @@ -1,45 +0,0 @@ -# /* ****************************************************************************** -# * -# * -# * This program and the accompanying materials are made available under the -# * terms of the Apache License, Version 2.0 which is available at -# * https://www.apache.org/licenses/LICENSE-2.0. -# * -# * See the NOTICE file distributed with this work for additional -# * information regarding copyright ownership. 
-# * Unless required by applicable law or agreed to in writing, software -# * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT -# * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the -# * License for the specific language governing permissions and limitations -# * under the License. -# * -# * SPDX-License-Identifier: Apache-2.0 -# ******************************************************************************/ - -################################################################################ -# -# -# -################################################################################ - - -import pytest - -import jumpy as jp -import numpy as np -from numpy.testing import assert_allclose - - -def test_conversion_64(): - jp.set_context_dtype('float64') - shapes = [(1, 1), (2, 1), (1, 2), (32, 12), (100, 32, 16)] - for shape in shapes: - x_np = np.random.random(shape) - x_jp = jp.array(x_np) - x_np += np.random.random(shape) - x_jp = x_jp.numpy() - - assert_allclose(x_jp, x_np) - -if __name__ == '__main__': - pytest.main([__file__]) diff --git a/contrib/attic/jumpy/tests/jumpy/test_reduction_ops.py b/contrib/attic/jumpy/tests/jumpy/test_reduction_ops.py deleted file mode 100644 index 313edc06e..000000000 --- a/contrib/attic/jumpy/tests/jumpy/test_reduction_ops.py +++ /dev/null @@ -1,57 +0,0 @@ -# /* ****************************************************************************** -# * -# * -# * This program and the accompanying materials are made available under the -# * terms of the Apache License, Version 2.0 which is available at -# * https://www.apache.org/licenses/LICENSE-2.0. -# * -# * See the NOTICE file distributed with this work for additional -# * information regarding copyright ownership. -# * Unless required by applicable law or agreed to in writing, software -# * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT -# * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. 
See the -# * License for the specific language governing permissions and limitations -# * under the License. -# * -# * SPDX-License-Identifier: Apache-2.0 -# ******************************************************************************/ - -################################################################################ -# -# -# -################################################################################ - - -import pytest - -import jumpy as jp -import numpy as np -from numpy.testing import assert_allclose - - -def _test_reduction_op(op, shape): - for axis in range(len(shape)): - x_np = np.random.random(shape) - y_np = getattr(np, op)(x_np, axis=axis) - - x_jp = jp.array(x_np) - y_jp = getattr(jp, op)(x_jp, axis=axis) - - y_jp = y_jp.numpy() - - assert_allclose(y_jp, y_np) - - -def test_reduction_ops(): - shapes = [(2, 3), (2, 3, 4), (2, 3, 4, 5)] - reduction_ops = ['max', 'min', 'sum', 'prod', 'mean', - 'std', 'var', 'argmax', 'argmin'] - - for op in reduction_ops: - for shape in shapes: - _test_reduction_op(op, shape) - - -if __name__ == '__main__': - pytest.main([__file__]) diff --git a/contrib/attic/jumpy/tests/jumpy/test_shape_ops.py b/contrib/attic/jumpy/tests/jumpy/test_shape_ops.py deleted file mode 100644 index 5dfb022c7..000000000 --- a/contrib/attic/jumpy/tests/jumpy/test_shape_ops.py +++ /dev/null @@ -1,168 +0,0 @@ -# /* ****************************************************************************** -# * -# * -# * This program and the accompanying materials are made available under the -# * terms of the Apache License, Version 2.0 which is available at -# * https://www.apache.org/licenses/LICENSE-2.0. -# * -# * See the NOTICE file distributed with this work for additional -# * information regarding copyright ownership. -# * Unless required by applicable law or agreed to in writing, software -# * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT -# * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the -# * License for the specific language governing permissions and limitations -# * under the License. -# * -# * SPDX-License-Identifier: Apache-2.0 -# ******************************************************************************/ - -################################################################################ -# -# -# -################################################################################ - - -import pytest - -import jumpy as jp -import numpy as np -from numpy.testing import assert_allclose - - -def test_reshape(): - jp.set_context_dtype('float64') - - shapes = [ - [(2, 3), (6, 1)], - [(1, 2, 3), (3, 2)], - [(3, 2, 1), (2, -1)], - [(3, 1, 2), (-1, 3, 1)] - ] - - for shape1, shape2 in shapes: - x_np = np.random.random(shape1) - y_np = np.reshape(x_np, shape2) - - x_jp = jp.array(x_np) - y_jp = jp.reshape(x_jp, shape2) - - assert y_jp.shape == y_np.shape - - -def test_transpose(): - shapes = [(2, 3), (3, 1), (2, 3, 4)] - for shape in shapes: - x_np = np.random.random(shape) - x_jp = jp.array(x_np) - - y_np = np.transpose(x_np) - y_jp = jp.transpose(x_jp) - - y_jp = y_jp.numpy() - - assert y_jp.shape == y_np.shape - - -def test_permute(): - shapes = [] - shapes.append([(2, 3), [0, 1], [1, 0]]) - shapes.append([(2, 1), [0, 1], [1, 0]]) - shapes.append([(2, 3, 4), [0, 1, 2], [0, 2, 1], [1, 0, 2], [1, 2, 0], [2, 0, 1], [2, 1, 0]]) - - for shape in shapes: - x_np = np.random.random(shape[0]) - x_jp = jp.array(x_np) - for dims in shape[1:]: - y_np = np.transpose(x_np, dims) - y_jp = jp.transpose(x_jp, dims) - assert y_jp.shape == y_np.shape - - -def test_expand_dims(): - shapes = [(2, 3), (2, 1), (2, 3, 4)] - for shape in shapes: - x_np = np.random.random(shape) - x_jp = jp.array(x_np) - for axis in range(len(shape) + 1): - y_np = np.expand_dims(x_np, axis) - y_jp = jp.expand_dims(x_jp, axis) - assert y_jp.shape == y_np.shape - - -def test_squeeze(): - shapes = [[2, 3, 1, 4], [2, 1, 3]] - for shape in shapes: - x_np = np.random.random(shape) - x_jp = 
jp.array(x_np) - axis = shape.index(1) - y_np = np.squeeze(x_np, axis) - y_jp = jp.squeeze(x_jp, axis) - assert y_jp.shape == y_np.shape - - -def test_concatenate(): - shapes = [ - [(2, 3, 4), (3, 3, 4), 0], - [(2, 3, 5), (2, 4, 5), 1], - [(3, 2, 4), (3, 2, 2), 2] - ] - - for shape in shapes: - x1_np = np.random.random(shape[0]) - x2_np = np.random.random(shape[1]) - - x1_jp = jp.array(x1_np) - x2_jp = jp.array(x2_np) - - axis = shape[2] - - y_np = np.concatenate([x1_np, x2_np], axis) - y_jp = jp.concatenate([x1_jp, x2_jp], axis) - - assert y_jp.shape == y_np.shape - - -def test_stack(): - shapes = [ - (2, 3), (2, 3, 4) - ] - - for shape in shapes: - x1_np = np.random.random(shape) - x2_np = np.random.random(shape) - - x1_jp = jp.array(x1_np) - x2_jp = jp.array(x2_np) - - for axis in range(len(shape)): - y_np = np.stack([x1_np, x2_np], axis) - y_jp = jp.stack([x1_jp, x2_jp], axis) - - assert y_jp.shape == y_np.shape - - -def test_tile(): - shapes = [ - (2, 3), (2, 3, 4) - ] - - repeats = [ - [3, 2], [3, 2, 2] - ] - - for i in range(len(shapes)): - shape = shapes[i] - rep = repeats[i] - - x_np = np.random.random(shape) - x_jp = jp.array(x_np) - - y_np = np.tile(x_np, rep) - y_jp = jp.tile(x_jp, rep) - - assert y_jp.shape == y_np.shape - - -if __name__ == '__main__': - pytest.main([__file__]) diff --git a/contrib/attic/jumpy/tests/jumpy/test_spark.py b/contrib/attic/jumpy/tests/jumpy/test_spark.py deleted file mode 100644 index 17d2b7480..000000000 --- a/contrib/attic/jumpy/tests/jumpy/test_spark.py +++ /dev/null @@ -1,134 +0,0 @@ -# /* ****************************************************************************** -# * -# * -# * This program and the accompanying materials are made available under the -# * terms of the Apache License, Version 2.0 which is available at -# * https://www.apache.org/licenses/LICENSE-2.0. -# * -# * See the NOTICE file distributed with this work for additional -# * information regarding copyright ownership. 
-# * Unless required by applicable law or agreed to in writing, software -# * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT -# * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the -# * License for the specific language governing permissions and limitations -# * under the License. -# * -# * SPDX-License-Identifier: Apache-2.0 -# ******************************************************************************/ - -################################################################################ -# -# -# -################################################################################ - -from numpy.testing import assert_allclose -from jumpy.spark import py2javaArrayRDD -from jumpy.spark import py2javaDatasetRDD -from jumpy.spark import java2pyArrayRDD -from jumpy.spark import java2pyDatasetRDD -from jumpy.java_classes import JDataset -from jumpy.spark import Dataset -from jumpy.java_classes import ArrayList -from numpy.testing import assert_allclose -from jnius import autoclass -import jumpy as jp -import numpy as np -import pyspark -import pytest - - - -SparkConf = autoclass('org.apache.spark.SparkConf') -SparkContext = autoclass('org.apache.spark.api.java.JavaSparkContext') - - - -class TestSparkConverters(object): - - @pytest.fixture(scope='module') - def java_sc(self): - config = SparkConf() - config.setAppName("test") - config.setMaster("local[*]") - return SparkContext(config) - - @pytest.fixture(scope='module') - def py_sc(self): - return pyspark.SparkContext(master='local[*]', appName='test') - - def test_java2py_array(self, java_sc, py_sc): - data = ArrayList() - - for _ in range(100): - arr = jp.array(np.random.random((32, 20))).array - data.add(arr) - - java_rdd = java_sc.parallelize(data) - py_rdd = java2pyArrayRDD(java_rdd, py_sc) - - data2 = py_rdd.collect() - - data = [data.get(i) for i in range(data.size())] - - assert len(data) == len(data2) - - for d1, d2 in zip(data, data2): - 
assert_allclose(jp.array(d1).numpy(), d2) - - def test_py2java_array(self, java_sc, py_sc): - data = [np.random.random((32, 20)) for _ in range(100)] - - jdata = [jp.array(x) for x in data] # required - - py_rdd = py_sc.parallelize(data) - java_rdd = py2javaArrayRDD(py_rdd, java_sc) - - data2 = java_rdd.collect() - data2 = [data2.get(i) for i in range(data2.size())] - assert len(data) == len(data2) - for d1, d2 in zip(data, data2): - d2 = jp.array(d2).numpy() - assert_allclose(d1, d2) - - def test_java2py_dataset(self, java_sc, py_sc): - data = ArrayList() - - for _ in range(100): - arr = jp.array(np.random.random((32, 20))).array - ds = JDataset(arr, arr) - data.add(ds) - - java_rdd = java_sc.parallelize(data) - py_rdd = java2pyDatasetRDD(java_rdd, py_sc) - - data2 = py_rdd.collect() - - data = [data.get(i) for i in range(data.size())] - - assert len(data) == len(data2) - - for d1, d2 in zip(data, data2): - assert_allclose(jp.array(d1.getFeatures()).numpy(), d2.features.numpy()) - - def test_py2java_dataset(self, java_sc, py_sc): - data = [np.random.random((32, 20)) for _ in range(100)] - jdata = [jp.array(x) for x in data] # required - data = [Dataset(x, x) for x in data] - - - py_rdd = py_sc.parallelize(data) - java_rdd = py2javaDatasetRDD(py_rdd, java_sc) - - data2 = java_rdd.collect() - data2 = [data2.get(i) for i in range(data2.size())] - assert len(data) == len(data2) - for d1, d2 in zip(data, data2): - d2 = jp.array(d2.getFeatures()).numpy() - assert_allclose(d1.features.numpy(), d2) - - - -if __name__ == '__main__': - pytest.main([__file__]) diff --git a/contrib/attic/jumpy/tests/jumpy/test_ufuncs.py b/contrib/attic/jumpy/tests/jumpy/test_ufuncs.py deleted file mode 100644 index 12394701f..000000000 --- a/contrib/attic/jumpy/tests/jumpy/test_ufuncs.py +++ /dev/null @@ -1,76 +0,0 @@ -# /* ****************************************************************************** -# * -# * -# * This program and the accompanying materials are made available under the -# * 
terms of the Apache License, Version 2.0 which is available at -# * https://www.apache.org/licenses/LICENSE-2.0. -# * -# * See the NOTICE file distributed with this work for additional -# * information regarding copyright ownership. -# * Unless required by applicable law or agreed to in writing, software -# * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT -# * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the -# * License for the specific language governing permissions and limitations -# * under the License. -# * -# * SPDX-License-Identifier: Apache-2.0 -# ******************************************************************************/ - -################################################################################ -# -# -# -################################################################################ - - -import pytest - -import jumpy as jp -import numpy as np -from numpy.testing import assert_allclose - - -def _test_ufunc(op, shape): - a_np = np.random.random(shape) - b_np = np.random.random(shape) - - c_np = eval('a_np {} b_np'.format(op)) - - a_jp = jp.array(a_np) - b_jp = jp.array(b_np) - - c_jp = eval('a_jp {} b_jp'.format(op)) - - c_jp = c_jp.numpy() - - assert_allclose(c_jp, c_np) - - -def _test_ufunc_inplace(op, shape): - a_np = np.random.random(shape) - b_np = np.random.random(shape) - a_np2 = a_np.copy() - exec('a_np {}= b_np'.format(op)) - - a_jp = jp.array(a_np2) - b_jp = jp.array(b_np) - - exec('a_jp {}= b_jp'.format(op)) - - a_jp = a_jp.numpy() - - assert_allclose(a_jp, a_np) - - -def test_ufuncs(): - jp.set_context_dtype('float64') - shapes = [(1, 1), (1, 2), (2, 2), (2, 3), (2, 3, 4)] - ops = ['+', '-', '/', '*'] # TODO: investigate issue with ** - for op in ops: - for shape in shapes: - _test_ufunc(op, shape) - _test_ufunc_inplace(op, shape) - - -if __name__ == '__main__': - pytest.main([__file__]) diff --git a/contrib/attic/nd4j-jdbc/nd4j-jdbc-api/pom.xml 
b/contrib/attic/nd4j-jdbc/nd4j-jdbc-api/pom.xml deleted file mode 100644 index 423d85558..000000000 --- a/contrib/attic/nd4j-jdbc/nd4j-jdbc-api/pom.xml +++ /dev/null @@ -1,55 +0,0 @@ - - - - - - 4.0.0 - - - org.nd4j - nd4j-jdbc - 1.0.0-SNAPSHOT - - - nd4j-jdbc-api - - nd4j-jdbc-api - - - - com.mchange - c3p0 - 0.9.5.4 - - - org.nd4j - nd4j-api - - - - - - testresources - - - diff --git a/contrib/attic/nd4j-jdbc/nd4j-jdbc-api/src/main/java/org/nd4j/jdbc/driverfinder/DriverFinder.java b/contrib/attic/nd4j-jdbc/nd4j-jdbc-api/src/main/java/org/nd4j/jdbc/driverfinder/DriverFinder.java deleted file mode 100644 index 29f61e1b9..000000000 --- a/contrib/attic/nd4j-jdbc/nd4j-jdbc-api/src/main/java/org/nd4j/jdbc/driverfinder/DriverFinder.java +++ /dev/null @@ -1,93 +0,0 @@ -/* - * ****************************************************************************** - * * Copyright (c) 2021 Deeplearning4j Contributors - * * - * * This program and the accompanying materials are made available under the - * * terms of the Apache License, Version 2.0 which is available at - * * https://www.apache.org/licenses/LICENSE-2.0. - * * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * * License for the specific language governing permissions and limitations - * * under the License. 
- * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ - -package org.nd4j.jdbc.driverfinder; - -import lombok.extern.slf4j.Slf4j; -import org.nd4j.common.config.ND4JClassLoading; - -import java.io.IOException; -import java.io.InputStream; -import java.sql.Driver; -import java.util.HashSet; -import java.util.Objects; -import java.util.Properties; -import java.util.ServiceLoader; -import java.util.Set; - -/** - * JDBC Driver finder - * - * @author Adam Gibson - */ -@Slf4j -public class DriverFinder { - - public final static String ND4j_JDBC_PROPERTIES = "nd4j.jdbc.properties"; - public final static String JDBC_KEY = "jdbc.driver"; - private static Class<? extends Driver> clazz; - private static Driver driver; - - public static Driver getDriver() { - if (driver == null) { - if (clazz == null) - discoverDriverClazz(); - try { - driver = clazz.newInstance(); - } catch (InstantiationException | IllegalAccessException e) { - log.error("",e); - } - } - return driver; - } - - private static void discoverDriverClazz() { - //All JDBC4 compliant drivers support ServiceLoader mechanism for discovery - https://stackoverflow.com/a/18297412 - ServiceLoader<Driver> drivers = ND4JClassLoading.loadService(Driver.class); - Set<Class<? extends Driver>> driverClasses = new HashSet<>(); - for(Driver driver : drivers){ - driverClasses.add(driver.getClass()); - } - - if(driverClasses.isEmpty()){ - throw new IllegalStateException("No org.nd4j.jdbc drivers found on classpath via ServiceLoader"); - } - - if(driverClasses.size() != 1) { - InputStream i = DriverFinder.class.getResourceAsStream("/" + ND4j_JDBC_PROPERTIES); - if (i == null) - throw new IllegalStateException("Only one jdbc driver allowed on the class path"); - else { - Properties props = new Properties(); - try { - props.load(i); - } catch (IOException e) { - throw new RuntimeException(e); - } - - String jdbcKeyClassName = props.getProperty(JDBC_KEY); - Objects.requireNonNull(jdbcKeyClassName, 
"Unable to find jdbc driver. Please specify a " - + ND4j_JDBC_PROPERTIES + " with the key " + JDBC_KEY); - - DriverFinder.clazz = ND4JClassLoading.loadClassByName(jdbcKeyClassName); - Objects.requireNonNull(DriverFinder.clazz, "Unable to find jdbc driver. Please specify a " - + ND4j_JDBC_PROPERTIES + " with the key " + JDBC_KEY); - } - } - } -} diff --git a/contrib/attic/nd4j-jdbc/nd4j-jdbc-api/src/main/java/org/nd4j/jdbc/loader/api/JDBCNDArrayIO.java b/contrib/attic/nd4j-jdbc/nd4j-jdbc-api/src/main/java/org/nd4j/jdbc/loader/api/JDBCNDArrayIO.java deleted file mode 100644 index 1f4be4b72..000000000 --- a/contrib/attic/nd4j-jdbc/nd4j-jdbc-api/src/main/java/org/nd4j/jdbc/loader/api/JDBCNDArrayIO.java +++ /dev/null @@ -1,104 +0,0 @@ -/* - * ****************************************************************************** - * * Copyright (c) 2021 Deeplearning4j Contributors - * * - * * This program and the accompanying materials are made available under the - * * terms of the Apache License, Version 2.0 which is available at - * * https://www.apache.org/licenses/LICENSE-2.0. - * * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * * License for the specific language governing permissions and limitations - * * under the License. - * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ - -package org.nd4j.jdbc.loader.api; - -import org.nd4j.linalg.api.ndarray.INDArray; - -import java.io.IOException; -import java.sql.Blob; -import java.sql.SQLException; - -/** - * Load a complex ndarray via org.nd4j.jdbc - * - * @author Adam Gibson - */ -public interface JDBCNDArrayIO { - - - /** - * Loads an array for the given id. 
- * @param id - * @return - */ - INDArray loadArrayForId(String id) throws SQLException; - - /** - * Convert an ndarray to a blob - * - * @param toConvert the ndarray to convert - * @return the converted ndarray - */ - Blob convert(INDArray toConvert) throws SQLException, IOException; - - /** - * Load an ndarray from a blob - * - * @param blob the blob to load from - * @return the loaded ndarray - */ - INDArray load(Blob blob) throws IOException, SQLException; - - - /** - * Create an insert statement - * - * @return a new insert statement - */ - String insertStatement(); - - /** - * Create an insert statement - * - * @return a new insert statement - */ - String loadStatement(); - - - /** - * Create an insert statement - * - * @return a new insert statement - */ - String deleteStatement(); - - /** - * Save the ndarray - * - * @param save the ndarray to save - */ - void save(INDArray save, String id) throws SQLException, IOException; - - /** - * Load an ndarray blob given an id - * - * @param id the id to load - * @return the blob - */ - Blob loadForID(String id) throws SQLException; - - /** - * Delete the given ndarray - * - * @param id the id of the ndarray to delete - */ - void delete(String id) throws SQLException; - - -} diff --git a/contrib/attic/nd4j-jdbc/nd4j-jdbc-api/src/main/java/org/nd4j/jdbc/loader/impl/BaseLoader.java b/contrib/attic/nd4j-jdbc/nd4j-jdbc-api/src/main/java/org/nd4j/jdbc/loader/impl/BaseLoader.java deleted file mode 100644 index 89d3c8957..000000000 --- a/contrib/attic/nd4j-jdbc/nd4j-jdbc-api/src/main/java/org/nd4j/jdbc/loader/impl/BaseLoader.java +++ /dev/null @@ -1,193 +0,0 @@ -/* - * ****************************************************************************** - * * Copyright (c) 2021 Deeplearning4j Contributors - * * - * * This program and the accompanying materials are made available under the - * * terms of the Apache License, Version 2.0 which is available at - * * https://www.apache.org/licenses/LICENSE-2.0. 
- * * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * * License for the specific language governing permissions and limitations - * * under the License. - * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ - -package org.nd4j.jdbc.loader.impl; - -import com.mchange.v2.c3p0.ComboPooledDataSource; -import org.nd4j.jdbc.driverfinder.DriverFinder; -import org.nd4j.jdbc.loader.api.JDBCNDArrayIO; -import org.nd4j.linalg.api.ndarray.INDArray; -import org.nd4j.serde.binary.BinarySerde; - -import javax.sql.DataSource; -import java.io.*; -import java.nio.Buffer; -import java.nio.ByteBuffer; -import java.nio.channels.Channels; -import java.nio.channels.ReadableByteChannel; -import java.sql.*; - -/** - * Base class for loading ndarrays via org.nd4j.jdbc - * - * @author Adam Gibson - */ - -public abstract class BaseLoader implements JDBCNDArrayIO { - - protected String tableName, columnName, idColumnName, jdbcUrl; - protected DataSource dataSource; - - protected BaseLoader(DataSource dataSource, String jdbcUrl, String tableName, String idColumnName, - String columnName) throws Exception { - this.dataSource = dataSource; - this.jdbcUrl = jdbcUrl; - this.tableName = tableName; - this.columnName = columnName; - this.idColumnName = idColumnName; - if (dataSource == null) { - dataSource = new ComboPooledDataSource(); - ComboPooledDataSource c = (ComboPooledDataSource) dataSource; - c.setJdbcUrl(jdbcUrl); - c.setDriverClass(DriverFinder.getDriver().getClass().getName()); - - } - } - - - protected BaseLoader(String jdbcUrl, String tableName, String idColumnName, String columnName) throws Exception { - this.jdbcUrl = jdbcUrl; - this.tableName = tableName; - this.columnName = columnName; - dataSource = new 
ComboPooledDataSource(); - ComboPooledDataSource c = (ComboPooledDataSource) dataSource; - c.setJdbcUrl(jdbcUrl); - c.setDriverClass(DriverFinder.getDriver().getClass().getName()); - this.idColumnName = idColumnName; - - } - - protected BaseLoader(DataSource dataSource, String jdbcUrl, String tableName, String columnName) throws Exception { - this(dataSource, jdbcUrl, tableName, "id", columnName); - - } - - /** - * Convert an ndarray to a blob - * - * @param toConvert the ndarray to convert - * @return the converted ndarray - */ - @Override - public Blob convert(INDArray toConvert) throws SQLException { - ByteBuffer byteBuffer = BinarySerde.toByteBuffer(toConvert); - Buffer buffer = (Buffer) byteBuffer; - buffer.rewind(); - byte[] arr = new byte[byteBuffer.capacity()]; - byteBuffer.get(arr); - Connection c = dataSource.getConnection(); - Blob b = c.createBlob(); - b.setBytes(1, arr); - return b; - } - - /** - * Load an ndarray from a blob - * - * @param blob the blob to load from - * @return the loaded ndarray - */ - @Override - public INDArray load(Blob blob) throws SQLException { - if (blob == null) - return null; - try(InputStream is = blob.getBinaryStream()) { - ByteBuffer direct = ByteBuffer.allocateDirect((int) blob.length()); - ReadableByteChannel readableByteChannel = Channels.newChannel(is); - readableByteChannel.read(direct); - Buffer byteBuffer = (Buffer) direct; - byteBuffer.rewind(); - return BinarySerde.toArray(direct); - } catch (Exception e) { - throw new RuntimeException(e); - } - - - } - - /** - * Save the ndarray - * - * @param save the ndarray to save - */ - @Override - public void save(INDArray save, String id) throws SQLException, IOException { - doSave(save, id); - - } - - - private void doSave(INDArray save, String id) throws SQLException, IOException { - Connection c = dataSource.getConnection(); - ByteArrayOutputStream bos = new ByteArrayOutputStream(); - DataOutputStream dos = new DataOutputStream(bos); - 
BinarySerde.writeArrayToOutputStream(save,bos); - - byte[] bytes = bos.toByteArray(); - - PreparedStatement preparedStatement = c.prepareStatement(insertStatement()); - preparedStatement.setString(1, id); - preparedStatement.setBytes(2, bytes); - preparedStatement.executeUpdate(); - - - } - - - /** - * Load an ndarray blob given an id - * - * @param id the id to load - * @return the blob - */ - @Override - public Blob loadForID(String id) throws SQLException { - Connection c = dataSource.getConnection(); - PreparedStatement preparedStatement = c.prepareStatement(loadStatement()); - preparedStatement.setString(1, id); - ResultSet r = preparedStatement.executeQuery(); - if (r.wasNull() || !r.next()) { - return null; - } else { - Blob first = r.getBlob(2); - return first; - } - - - } - - @Override - public INDArray loadArrayForId(String id) throws SQLException { - return load(loadForID(id)); - } - - /** - * Delete the given ndarray - * - * @param id the id of the ndarray to delete - */ - @Override - public void delete(String id) throws SQLException { - Connection c = dataSource.getConnection(); - PreparedStatement p = c.prepareStatement(deleteStatement()); - p.setString(1, id); - p.execute(); - - - } -} diff --git a/contrib/attic/nd4j-jdbc/nd4j-jdbc-hsql/pom.xml b/contrib/attic/nd4j-jdbc/nd4j-jdbc-hsql/pom.xml deleted file mode 100644 index 54e333495..000000000 --- a/contrib/attic/nd4j-jdbc/nd4j-jdbc-hsql/pom.xml +++ /dev/null @@ -1,104 +0,0 @@ - - - - - - 4.0.0 - - - org.nd4j - nd4j-jdbc - 1.0.0-SNAPSHOT - - - nd4j-jdbc-hsql - - nd4j-jdbc-hsql - - - 1.7 - 2.4.0 - - - - - junit - junit - - - commons-dbutils - commons-dbutils - ${commons-db.version} - test - - - org.nd4j - nd4j-jdbc-api - ${project.version} - - - org.hsqldb - hsqldb - ${hsqldb.version} - - - org.nd4j - nd4j-common-tests - - - - - - testresources - - - nd4j-testresources - - - nd4j-tests-cpu - - false - - - - org.nd4j - nd4j-native - ${project.version} - - - - - nd4j-tests-cuda - - false - - - - 
org.nd4j - nd4j-cuda-11.0 - ${project.version} - - - - - diff --git a/contrib/attic/nd4j-jdbc/nd4j-jdbc-hsql/src/main/java/org/nd4j/jdbc/hsql/HsqlLoader.java b/contrib/attic/nd4j-jdbc/nd4j-jdbc-hsql/src/main/java/org/nd4j/jdbc/hsql/HsqlLoader.java deleted file mode 100644 index 445668b74..000000000 --- a/contrib/attic/nd4j-jdbc/nd4j-jdbc-hsql/src/main/java/org/nd4j/jdbc/hsql/HsqlLoader.java +++ /dev/null @@ -1,78 +0,0 @@ -/* - * ****************************************************************************** - * * Copyright (c) 2021 Deeplearning4j Contributors - * * - * * This program and the accompanying materials are made available under the - * * terms of the Apache License, Version 2.0 which is available at - * * https://www.apache.org/licenses/LICENSE-2.0. - * * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * * License for the specific language governing permissions and limitations - * * under the License. - * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ - -package org.nd4j.jdbc.hsql; - -import org.nd4j.jdbc.loader.impl.BaseLoader; - -import javax.sql.DataSource; - -/** - * HSQLDB loader for ndarrays. 
- * - * @author Adam Gibson - */ -public class HsqlLoader extends BaseLoader { - - public HsqlLoader(DataSource dataSource, String jdbcUrl, String tableName, String idColumnName, String columnName) throws Exception { - super(dataSource, jdbcUrl, tableName, idColumnName, columnName); - } - - public HsqlLoader(String jdbcUrl, String tableName, String idColumnName, String columnName) throws Exception { - super(jdbcUrl, tableName, idColumnName, columnName); - } - - public HsqlLoader(DataSource dataSource, String jdbcUrl, String tableName, String columnName) throws Exception { - super(dataSource, jdbcUrl, tableName, columnName); - } - - - /** - * Create an insert statement - * - * @return a new insert statement - */ - @Override - public String insertStatement() { - return "INSERT INTO " + tableName + " VALUES(?,?)"; - } - - /** - * Create an insert statement. This should be a templated query. - * IE: Question mark at the end, we will take care of setting the proper value. - * - * @return a new insert statement - */ - @Override - public String loadStatement() { - return "SELECT * FROM " + tableName + " WHERE " + this.idColumnName + " =?"; - - - } - - /** - * Create an delete statement - * - * @return a new delete statement - */ - @Override - public String deleteStatement() { - return "DELETE FROM " + tableName + " WHERE " + this.idColumnName + " =?"; - - } -} diff --git a/contrib/attic/nd4j-jdbc/nd4j-jdbc-hsql/src/main/resources/nd4j.jdbc.properties b/contrib/attic/nd4j-jdbc/nd4j-jdbc-hsql/src/main/resources/nd4j.jdbc.properties deleted file mode 100644 index d1c4c4b90..000000000 --- a/contrib/attic/nd4j-jdbc/nd4j-jdbc-hsql/src/main/resources/nd4j.jdbc.properties +++ /dev/null @@ -1,21 +0,0 @@ -# -# /* ****************************************************************************** -# * -# * -# * This program and the accompanying materials are made available under the -# * terms of the Apache License, Version 2.0 which is available at -# * 
https://www.apache.org/licenses/LICENSE-2.0. -# * -# * See the NOTICE file distributed with this work for additional -# * information regarding copyright ownership. -# * Unless required by applicable law or agreed to in writing, software -# * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT -# * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the -# * License for the specific language governing permissions and limitations -# * under the License. -# * -# * SPDX-License-Identifier: Apache-2.0 -# ******************************************************************************/ -# - -jdbc.driver=org.hsqldb.jdbc.JDBCDriver \ No newline at end of file diff --git a/contrib/attic/nd4j-jdbc/nd4j-jdbc-hsql/src/test/java/org/nd4j/jdbc/hsql/HSqlLoaderTest.java b/contrib/attic/nd4j-jdbc/nd4j-jdbc-hsql/src/test/java/org/nd4j/jdbc/hsql/HSqlLoaderTest.java deleted file mode 100644 index 46348b27b..000000000 --- a/contrib/attic/nd4j-jdbc/nd4j-jdbc-hsql/src/test/java/org/nd4j/jdbc/hsql/HSqlLoaderTest.java +++ /dev/null @@ -1,134 +0,0 @@ -/* - * ****************************************************************************** - * * Copyright (c) 2021 Deeplearning4j Contributors - * * - * * This program and the accompanying materials are made available under the - * * terms of the Apache License, Version 2.0 which is available at - * * https://www.apache.org/licenses/LICENSE-2.0. - * * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * * License for the specific language governing permissions and limitations - * * under the License. 
- * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ - -package org.nd4j.jdbc.hsql; - -import lombok.extern.slf4j.Slf4j; -import org.hsqldb.jdbc.JDBCDataSource; -import org.junit.AfterClass; -import org.junit.BeforeClass; -import org.junit.Test; -import org.nd4j.common.config.ND4JClassLoading; -import org.nd4j.common.tests.BaseND4JTest; -import org.nd4j.linalg.api.ndarray.INDArray; -import org.nd4j.linalg.factory.Nd4j; - -import javax.sql.DataSource; -import java.sql.*; - -import static org.hamcrest.CoreMatchers.is; -import static org.junit.Assert.assertEquals; -import static org.junit.Assert.assertNotNull; -import static org.junit.Assert.assertThat; - -@Slf4j -public class HSqlLoaderTest extends BaseND4JTest { - private static HsqlLoader hsqlLoader; - private static DataSource dataSource; - - public final static String JDBC_URL = "jdbc:hsqldb:mem:ndarrays"; - public final static String TABLE_NAME = "testarrays"; - public final static String ID_COLUMN_NAME = "id"; - public final static String COLUMN_NAME = "array"; - - @BeforeClass - public static void init() throws Exception { - hsqlLoader = new HsqlLoader(dataSource(),JDBC_URL,TABLE_NAME,ID_COLUMN_NAME,COLUMN_NAME); - ND4JClassLoading.loadClassByName("org.hsqldb.jdbc.JDBCDriver"); - - // initialize database - initDatabase(); - } - - public static DataSource dataSource() { - if (dataSource != null) - return dataSource; - JDBCDataSource dataSource = new JDBCDataSource(); - dataSource.setDatabase(JDBC_URL); - dataSource.setUrl(JDBC_URL); - dataSource.setPassword("test"); - dataSource.setUser("test"); - HSqlLoaderTest.dataSource = dataSource; - return dataSource; - } - - @AfterClass - public static void destroy() throws SQLException { - try (Connection connection = getConnection(); Statement statement = connection.createStatement()) { - statement.executeUpdate("DROP TABLE " + TABLE_NAME); - connection.commit(); - } - } - - /** - * 
Database initialization for testing, i.e.:
- * <ul>
- *   <li>Creating the table</li>
- *   <li>Inserting a record</li>
- * </ul>
- * - * @throws SQLException - */ - private static void initDatabase() throws Exception { - try (Connection connection = getConnection(); Statement statement = connection.createStatement()) { - statement.execute(String.format("CREATE TABLE %s (%s INT NOT NULL," - + " %s BLOB NOT NULL, PRIMARY KEY (id))",TABLE_NAME,ID_COLUMN_NAME,COLUMN_NAME)); - connection.commit(); - hsqlLoader.save(Nd4j.linspace(1,4,4, Nd4j.dataType()),"1"); - connection.commit(); - } - } - - /** - * Create a connection - * - * @return connection object - * @throws SQLException - */ - private static Connection getConnection() throws SQLException { - return DriverManager.getConnection(JDBC_URL, "test", "test"); - } - - /** - * Get total records in table - * - * @return total number of records. In case of exception 0 is returned - */ - private int getTotalRecords() { - try (Connection connection = getConnection(); Statement statement = connection.createStatement()) { - ResultSet result = statement.executeQuery(String.format("SELECT count(*) as total FROM %s",TABLE_NAME)); - if (result.next()) { - return result.getInt("total"); - } - } catch (SQLException e) { - log.error("",e); - } - return 0; - } - - @Test - public void getTotalRecordsTest() throws Exception { - assertThat(1, is(getTotalRecords())); - - INDArray load = hsqlLoader.load(hsqlLoader.loadForID("1")); - assertNotNull(load); - assertEquals(Nd4j.linspace(1,4,4, load.dataType()),load); - - - } -} diff --git a/contrib/attic/nd4j-jdbc/nd4j-jdbc-mysql/pom.xml b/contrib/attic/nd4j-jdbc/nd4j-jdbc-mysql/pom.xml deleted file mode 100644 index 2092082ea..000000000 --- a/contrib/attic/nd4j-jdbc/nd4j-jdbc-mysql/pom.xml +++ /dev/null @@ -1,57 +0,0 @@ - - - - - - 4.0.0 - - - org.nd4j - nd4j-jdbc - 1.0.0-SNAPSHOT - - - nd4j-jdbc-mysql - - - - org.nd4j - nd4j-jdbc-api - ${project.version} - - - junit - junit - - - org.nd4j - nd4j-common-tests - - - - - - testresources - - - diff --git 
a/contrib/attic/nd4j-jdbc/nd4j-jdbc-mysql/src/main/java/org/nd4j/jdbc/mysql/MysqlLoader.java b/contrib/attic/nd4j-jdbc/nd4j-jdbc-mysql/src/main/java/org/nd4j/jdbc/mysql/MysqlLoader.java deleted file mode 100644 index 00adad3ec..000000000 --- a/contrib/attic/nd4j-jdbc/nd4j-jdbc-mysql/src/main/java/org/nd4j/jdbc/mysql/MysqlLoader.java +++ /dev/null @@ -1,69 +0,0 @@ -/* - * ****************************************************************************** - * * Copyright (c) 2021 Deeplearning4j Contributors - * * - * * This program and the accompanying materials are made available under the - * * terms of the Apache License, Version 2.0 which is available at - * * https://www.apache.org/licenses/LICENSE-2.0. - * * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * * License for the specific language governing permissions and limitations - * * under the License. - * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ - -package org.nd4j.jdbc.mysql; - -import org.nd4j.jdbc.loader.impl.BaseLoader; - -import javax.sql.DataSource; - -/** - * Mysql loader for ndarrays - * - * @author Adam Gibson - */ -public class MysqlLoader extends BaseLoader { - - public MysqlLoader(DataSource dataSource, String jdbcUrl, String tableName, String columnName) throws Exception { - super(dataSource, jdbcUrl, tableName, columnName); - } - - /** - * Create an insert statement - * - * @return a new insert statement - */ - @Override - public String insertStatement() { - return "INSERT INTO " + tableName + " VALUES(?,?)"; - } - - /** - * Create an insert statement. This should be a templated query. - * IE: Question mark at the end, we will take care of setting the proper value. 
- * - * @return a new insert statement - */ - @Override - public String loadStatement() { - return "SELECT * FROM " + tableName + " WHERE " + this.idColumnName + " =?"; - - - } - - /** - * Create an delete statement - * - * @return a new delete statement - */ - @Override - public String deleteStatement() { - return "DELETE FROM " + tableName + " WHERE " + this.idColumnName + " =?"; - - } -} diff --git a/contrib/attic/nd4j-jdbc/nd4j-jdbc-mysql/src/main/resources/nd4j.jdbc.properties b/contrib/attic/nd4j-jdbc/nd4j-jdbc-mysql/src/main/resources/nd4j.jdbc.properties deleted file mode 100644 index b11644c2b..000000000 --- a/contrib/attic/nd4j-jdbc/nd4j-jdbc-mysql/src/main/resources/nd4j.jdbc.properties +++ /dev/null @@ -1,21 +0,0 @@ -# -# /* ****************************************************************************** -# * -# * -# * This program and the accompanying materials are made available under the -# * terms of the Apache License, Version 2.0 which is available at -# * https://www.apache.org/licenses/LICENSE-2.0. -# * -# * See the NOTICE file distributed with this work for additional -# * information regarding copyright ownership. -# * Unless required by applicable law or agreed to in writing, software -# * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT -# * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the -# * License for the specific language governing permissions and limitations -# * under the License. 
-# * -# * SPDX-License-Identifier: Apache-2.0 -# ******************************************************************************/ -# - -jdbc.driver=com.mysql.jdbc.Driver \ No newline at end of file diff --git a/contrib/attic/nd4j-jdbc/nd4j-jdbc-mysql/src/test/java/org/nd4j/jdbc/mysql/MysqlLoaderTest.java b/contrib/attic/nd4j-jdbc/nd4j-jdbc-mysql/src/test/java/org/nd4j/jdbc/mysql/MysqlLoaderTest.java deleted file mode 100644 index 9b64eb065..000000000 --- a/contrib/attic/nd4j-jdbc/nd4j-jdbc-mysql/src/test/java/org/nd4j/jdbc/mysql/MysqlLoaderTest.java +++ /dev/null @@ -1,54 +0,0 @@ -/* - * ****************************************************************************** - * * Copyright (c) 2021 Deeplearning4j Contributors - * * - * * This program and the accompanying materials are made available under the - * * terms of the Apache License, Version 2.0 which is available at - * * https://www.apache.org/licenses/LICENSE-2.0. - * * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * * License for the specific language governing permissions and limitations - * * under the License. 
- * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ - -package org.nd4j.jdbc.mysql; - -import com.mchange.v2.c3p0.ComboPooledDataSource; -import org.junit.Ignore; -import org.junit.Test; -import org.nd4j.common.tests.BaseND4JTest; -import org.nd4j.linalg.api.ndarray.INDArray; -import org.nd4j.linalg.factory.Nd4j; - -import java.sql.Blob; - -import static org.junit.Assert.assertEquals; - -public class MysqlLoaderTest extends BaseND4JTest { - - - //simple litmus test, unfortunately relies on an external database - @Test - @Ignore - public void testMysqlLoader() throws Exception { - ComboPooledDataSource ds = new ComboPooledDataSource(); - ds.setJdbcUrl("jdbc:mysql://localhost:3306/nd4j?user=nd4j&password=nd4j"); - MysqlLoader loader = new MysqlLoader(ds, "jdbc:mysql://localhost:3306/nd4j?user=nd4j&password=nd4j", "ndarrays", - "array"); - loader.delete("1"); - INDArray load = loader.load(loader.loadForID("1")); - if (load != null) { - loader.delete("1"); - } - loader.save(Nd4j.create(new float[] {1, 2, 3}), "1"); - Blob b = loader.loadForID("1"); - INDArray loaded = loader.load(b); - assertEquals((Nd4j.create(new float[] {1, 2, 3})), loaded); - } - -} diff --git a/contrib/attic/nd4j-jdbc/pom.xml b/contrib/attic/nd4j-jdbc/pom.xml deleted file mode 100644 index 885c2fd71..000000000 --- a/contrib/attic/nd4j-jdbc/pom.xml +++ /dev/null @@ -1,70 +0,0 @@ - - - - - - 4.0.0 - - - org.nd4j - nd4j - 1.0.0-SNAPSHOT - - - nd4j-jdbc - pom - - nd4j-jdbc - - - nd4j-jdbc-api - nd4j-jdbc-mysql - nd4j-jdbc-hsql - - - - - - org.nd4j - nd4j-common-tests - ${project.version} - test - - - - - - - testresources - - - nd4j-testresources - - - nd4j-tests-cpu - - - nd4j-tests-cuda - - - diff --git a/contrib/attic/nd4j-remote/nd4j-grpc-client/pom.xml b/contrib/attic/nd4j-remote/nd4j-grpc-client/pom.xml deleted file mode 100644 index 1a0008f75..000000000 --- a/contrib/attic/nd4j-remote/nd4j-grpc-client/pom.xml +++ 
/dev/null @@ -1,105 +0,0 @@ - - - - - - 4.0.0 - - - org.nd4j - nd4j-remote - 1.0.0-SNAPSHOT - - - nd4j-grpc-client - - nd4j-grpc - - - - org.nd4j - nd4j-api - - - com.google.flatbuffers - flatbuffers-java-grpc - ${flatbuffers.version} - - - - javax.annotation - javax.annotation-api - ${javax.annotation-api.version} - provided - - - io.grpc - grpc-all - ${grpc.version} - - - ch.qos.logback - logback-classic - test - - - ch.qos.logback - logback-core - test - - - org.nd4j - nd4j-common-tests - ${project.version} - test - - - - - - testresources - - - nd4j-tests-cpu - - - org.nd4j - nd4j-native - ${project.version} - test - - - - - nd4j-tests-cuda - - - org.nd4j - nd4j-cuda-11.0 - ${project.version} - test - - - - - diff --git a/contrib/attic/nd4j-remote/nd4j-grpc-client/src/main/java/org/nd4j/remote/grpc/GraphInferenceGrpcClient.java b/contrib/attic/nd4j-remote/nd4j-grpc-client/src/main/java/org/nd4j/remote/grpc/GraphInferenceGrpcClient.java deleted file mode 100644 index 622c32546..000000000 --- a/contrib/attic/nd4j-remote/nd4j-grpc-client/src/main/java/org/nd4j/remote/grpc/GraphInferenceGrpcClient.java +++ /dev/null @@ -1,213 +0,0 @@ -/* - * ****************************************************************************** - * * Copyright (c) 2021 Deeplearning4j Contributors - * * - * * This program and the accompanying materials are made available under the - * * terms of the Apache License, Version 2.0 which is available at - * * https://www.apache.org/licenses/LICENSE-2.0. - * * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * * License for the specific language governing permissions and limitations - * * under the License. 
- * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ - -package org.nd4j.remote.grpc; - -import com.google.flatbuffers.FlatBufferBuilder; -import io.grpc.ManagedChannel; -import io.grpc.ManagedChannelBuilder; -import lombok.NonNull; -import lombok.extern.slf4j.Slf4j; -import lombok.val; -import org.nd4j.autodiff.execution.conf.ExecutorConfiguration; -import org.nd4j.autodiff.execution.input.OperandsAdapter; -import org.nd4j.autodiff.execution.input.Operands; -import org.nd4j.autodiff.samediff.SameDiff; -import org.nd4j.autodiff.samediff.serde.FlatBuffersMapper; -import org.nd4j.graph.FlatDropRequest; -import org.nd4j.graph.FlatInferenceRequest; -import org.nd4j.graph.FlatVariable; -import org.nd4j.graph.IntPair; -import org.nd4j.remote.grpc.grpc.GraphInferenceServerGrpc; -import org.nd4j.linalg.api.ndarray.INDArray; -import org.nd4j.linalg.exception.ND4JIllegalStateException; -import org.nd4j.linalg.factory.Nd4j; -import org.nd4j.common.primitives.Pair; - -import java.util.ArrayList; -import java.util.concurrent.TimeUnit; - -/** - * This class is a wrapper over GraphServer gRPC complex - * - * @author raver119@gmail.com - */ -@Slf4j -public class GraphInferenceGrpcClient { - private final ManagedChannel channel; - private final GraphInferenceServerGrpc.GraphInferenceServerBlockingStub blockingStub; - - /** - * This method creates new GraphInferenceGrpcClient, with plain text connection - * @param host - * @param port - */ - public GraphInferenceGrpcClient(@NonNull String host, int port) { - this(host, port, false); - } - - /** - * This method creates new GraphInferenceGrpcClient, with optional TLS support - * @param host - * @param port - */ - public GraphInferenceGrpcClient(@NonNull String host, int port, boolean useTLS) { - this(useTLS ? 
ManagedChannelBuilder.forAddress(host, port).build() - : ManagedChannelBuilder.forAddress(host, port).usePlaintext().build()); - } - - /** - * This method creates new GraphInferenceGrpcClient over given ManagedChannel - * @param channel - */ - public GraphInferenceGrpcClient(@NonNull ManagedChannel channel) { - this.channel = channel; - this.blockingStub = GraphInferenceServerGrpc.newBlockingStub(this.channel); - } - - /** - * This method shuts down gRPC connection - * - * @throws InterruptedException - */ - public void shutdown() throws InterruptedException { - this.channel.shutdown().awaitTermination(10, TimeUnit.SECONDS); - } - - /** - * This method adds given graph to the GraphServer storage - * @param graph - */ - public void registerGraph(@NonNull SameDiff graph) { - blockingStub.registerGraph(graph.asFlatGraph(false)); - } - - /** - * This method adds given graph to the GraphServer storage - * - * PLEASE NOTE: You don't need to register graph more then once - * PLEASE NOTE: You don't need to register graph if GraphServer was used with -f argument - * @param graphId id of the graph, if not 0 - should be used in subsequent output() requests - * @param graph - * - */ - public void registerGraph(long graphId, @NonNull SameDiff graph, ExecutorConfiguration configuration) { - val g = graph.asFlatGraph(graphId, configuration, false); - val v = blockingStub.registerGraph(g); - if (v.status() != 0) - throw new ND4JIllegalStateException("registerGraph() gRPC call failed"); - } - - /** - * This method sends inference request to the GraphServer instance, and returns result as array of INDArrays - * - * PLEASE NOTE: This call will be routed to default graph with id 0 - * @param inputs graph inputs with their string ides - * @return - */ - public INDArray[] output(Pair... 
inputs) { - return output(0, inputs); - } - - /** - * This method is suited for use of custom OperandsAdapters - * @param adapter - * @param - * @return - */ - public T output(long graphId, T value, OperandsAdapter adapter) { - return adapter.output(this.output(graphId, adapter.input(value))); - } - - - public Operands output(long graphId, @NonNull Operands operands) { - val result = new ArrayList(); - val builder = new FlatBufferBuilder(1024); - - val ins = new int[operands.size()]; - - val col = operands.asCollection(); - - int cnt = 0; - for (val input: col) { - val id = input.getFirst(); - val array = input.getSecond(); - - val idPair = IntPair.createIntPair(builder, id.getId(), id.getIndex()); - val nameOff = id.getName() != null ? builder.createString(id.getName()) : 0; - - val arrOff = array.toFlatArray(builder); - byte variableType = 0; //TODO is this OK here? - val varOff = FlatVariable.createFlatVariable(builder, idPair, nameOff, FlatBuffersMapper.getDataTypeAsByte(array.dataType()),0, arrOff, -1, variableType, 0, 0, 0); - ins[cnt++] = varOff; - } - - val varsOff = FlatInferenceRequest.createVariablesVector(builder, ins); - - val off = FlatInferenceRequest.createFlatInferenceRequest(builder, graphId, varsOff, 0); - builder.finish(off); - - val req = FlatInferenceRequest.getRootAsFlatInferenceRequest(builder.dataBuffer()); - - val fr = blockingStub.inferenceRequest(req); - - val res = new Operands(); - - for (int e = 0; e < fr.variablesLength(); e++) { - val v = fr.variables(e); - - val array = Nd4j.createFromFlatArray(v.ndarray()); - res.addArgument(v.name(), array); - res.addArgument(v.id().first(), v.id().second(), array); - res.addArgument(v.name(), v.id().first(), v.id().second(), array); - } - - return res; - } - - /** - * This method sends inference request to the GraphServer instance, and returns result as array of INDArrays - * @param graphId id of the graph - * @param inputs graph inputs with their string ides - * @return - */ - public INDArray[] 
output(long graphId, Pair... inputs) { - val operands = new Operands(); - for (val in:inputs) - operands.addArgument(in.getFirst(), in.getSecond()); - - return output(graphId, operands).asArray(); - } - - /** - * This method allows to remove graph from the GraphServer instance - * @param graphId - */ - public void dropGraph(long graphId) { - val builder = new FlatBufferBuilder(128); - - val off = FlatDropRequest.createFlatDropRequest(builder, graphId); - builder.finish(off); - - val req = FlatDropRequest.getRootAsFlatDropRequest(builder.dataBuffer()); - - val v = blockingStub.forgetGraph(req); - if (v.status() != 0) - throw new ND4JIllegalStateException("registerGraph() gRPC call failed"); - } -} diff --git a/contrib/attic/nd4j-remote/nd4j-grpc-client/src/main/java/org/nd4j/remote/grpc/grpc/GraphInferenceServerGrpc.java b/contrib/attic/nd4j-remote/nd4j-grpc-client/src/main/java/org/nd4j/remote/grpc/grpc/GraphInferenceServerGrpc.java deleted file mode 100644 index 170f16f50..000000000 --- a/contrib/attic/nd4j-remote/nd4j-grpc-client/src/main/java/org/nd4j/remote/grpc/grpc/GraphInferenceServerGrpc.java +++ /dev/null @@ -1,561 +0,0 @@ -/* - * ****************************************************************************** - * * Copyright (c) 2021 Deeplearning4j Contributors - * * - * * This program and the accompanying materials are made available under the - * * terms of the Apache License, Version 2.0 which is available at - * * https://www.apache.org/licenses/LICENSE-2.0. - * * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * * License for the specific language governing permissions and limitations - * * under the License. 
- * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ - -//Generated by flatc compiler (version 1.10.0) -//If you make any local changes, they will be lost -//source: graph.fbs - -package org.nd4j.remote.grpc.grpc; - -import com.google.flatbuffers.grpc.FlatbuffersUtils; - -import java.nio.ByteBuffer; -import static io.grpc.MethodDescriptor.generateFullMethodName; -import static io.grpc.stub.ClientCalls.asyncUnaryCall; -import static io.grpc.stub.ClientCalls.blockingUnaryCall; -import static io.grpc.stub.ClientCalls.futureUnaryCall; -import static io.grpc.stub.ServerCalls.asyncUnaryCall; -import static io.grpc.stub.ServerCalls.asyncUnimplementedUnaryCall; - -/** - */ -@javax.annotation.Generated( - value = "by gRPC proto compiler", - comments = "Source: graph.fbs") -public final class GraphInferenceServerGrpc { - - private GraphInferenceServerGrpc() {} - - public static final String SERVICE_NAME = "org.nd4j.graph.GraphInferenceServer"; - - // Static method descriptors that strictly reflect the proto. - @io.grpc.ExperimentalApi("https://github.com/grpc/grpc-java/issues/1901") - @Deprecated // Use {@link #getRegisterGraphMethod()} instead. 
- public static final io.grpc.MethodDescriptor METHOD_REGISTER_GRAPH = getRegisterGraphMethod(); - - private static volatile io.grpc.MethodDescriptor getRegisterGraphMethod; - - private static volatile FlatbuffersUtils.FBExtactor extractorOfFlatGraph; - private static FlatbuffersUtils.FBExtactor getExtractorOfFlatGraph() { - if (extractorOfFlatGraph != null) return extractorOfFlatGraph; - synchronized (GraphInferenceServerGrpc.class) { - if (extractorOfFlatGraph != null) return extractorOfFlatGraph; - extractorOfFlatGraph = new FlatbuffersUtils.FBExtactor() { - public org.nd4j.graph.FlatGraph extract (ByteBuffer buffer) { - return org.nd4j.graph.FlatGraph.getRootAsFlatGraph(buffer); - } - }; - return extractorOfFlatGraph; - } - } - - private static volatile FlatbuffersUtils.FBExtactor extractorOfFlatResponse; - private static FlatbuffersUtils.FBExtactor getExtractorOfFlatResponse() { - if (extractorOfFlatResponse != null) return extractorOfFlatResponse; - synchronized (GraphInferenceServerGrpc.class) { - if (extractorOfFlatResponse != null) return extractorOfFlatResponse; - extractorOfFlatResponse = new FlatbuffersUtils.FBExtactor() { - public org.nd4j.graph.FlatResponse extract (ByteBuffer buffer) { - return org.nd4j.graph.FlatResponse.getRootAsFlatResponse(buffer); - } - }; - return extractorOfFlatResponse; - } - } - - @io.grpc.ExperimentalApi("https://github.com/grpc/grpc-java/issues/1901") - public static io.grpc.MethodDescriptor getRegisterGraphMethod() { - io.grpc.MethodDescriptor getRegisterGraphMethod; - if ((getRegisterGraphMethod = GraphInferenceServerGrpc.getRegisterGraphMethod) == null) { - synchronized (GraphInferenceServerGrpc.class) { - if ((getRegisterGraphMethod = GraphInferenceServerGrpc.getRegisterGraphMethod) == null) { - GraphInferenceServerGrpc.getRegisterGraphMethod = getRegisterGraphMethod = - io.grpc.MethodDescriptor.newBuilder() - .setType(io.grpc.MethodDescriptor.MethodType.UNARY) - .setFullMethodName(generateFullMethodName( - 
"org.nd4j.graph.GraphInferenceServer", "RegisterGraph")) - .setSampledToLocalTracing(true) - .setRequestMarshaller(FlatbuffersUtils.marshaller( - org.nd4j.graph.FlatGraph.class, getExtractorOfFlatGraph())) - .setResponseMarshaller(FlatbuffersUtils.marshaller( - org.nd4j.graph.FlatResponse.class, getExtractorOfFlatResponse())) - .setSchemaDescriptor(null) - .build(); - } - } - } - return getRegisterGraphMethod; - } - - @io.grpc.ExperimentalApi("https://github.com/grpc/grpc-java/issues/1901") - @Deprecated // Use {@link #getForgetGraphMethod()} instead. - public static final io.grpc.MethodDescriptor METHOD_FORGET_GRAPH = getForgetGraphMethod(); - - private static volatile io.grpc.MethodDescriptor getForgetGraphMethod; - - private static volatile FlatbuffersUtils.FBExtactor extractorOfFlatDropRequest; - private static FlatbuffersUtils.FBExtactor getExtractorOfFlatDropRequest() { - if (extractorOfFlatDropRequest != null) return extractorOfFlatDropRequest; - synchronized (GraphInferenceServerGrpc.class) { - if (extractorOfFlatDropRequest != null) return extractorOfFlatDropRequest; - extractorOfFlatDropRequest = new FlatbuffersUtils.FBExtactor() { - public org.nd4j.graph.FlatDropRequest extract (ByteBuffer buffer) { - return org.nd4j.graph.FlatDropRequest.getRootAsFlatDropRequest(buffer); - } - }; - return extractorOfFlatDropRequest; - } - } - - @io.grpc.ExperimentalApi("https://github.com/grpc/grpc-java/issues/1901") - public static io.grpc.MethodDescriptor getForgetGraphMethod() { - io.grpc.MethodDescriptor getForgetGraphMethod; - if ((getForgetGraphMethod = GraphInferenceServerGrpc.getForgetGraphMethod) == null) { - synchronized (GraphInferenceServerGrpc.class) { - if ((getForgetGraphMethod = GraphInferenceServerGrpc.getForgetGraphMethod) == null) { - GraphInferenceServerGrpc.getForgetGraphMethod = getForgetGraphMethod = - io.grpc.MethodDescriptor.newBuilder() - .setType(io.grpc.MethodDescriptor.MethodType.UNARY) - .setFullMethodName(generateFullMethodName( - 
"org.nd4j.graph.GraphInferenceServer", "ForgetGraph")) - .setSampledToLocalTracing(true) - .setRequestMarshaller(FlatbuffersUtils.marshaller( - org.nd4j.graph.FlatDropRequest.class, getExtractorOfFlatDropRequest())) - .setResponseMarshaller(FlatbuffersUtils.marshaller( - org.nd4j.graph.FlatResponse.class, getExtractorOfFlatResponse())) - .setSchemaDescriptor(null) - .build(); - } - } - } - return getForgetGraphMethod; - } - - @io.grpc.ExperimentalApi("https://github.com/grpc/grpc-java/issues/1901") - @Deprecated // Use {@link #getReplaceGraphMethod()} instead. - public static final io.grpc.MethodDescriptor METHOD_REPLACE_GRAPH = getReplaceGraphMethod(); - - private static volatile io.grpc.MethodDescriptor getReplaceGraphMethod; - - @io.grpc.ExperimentalApi("https://github.com/grpc/grpc-java/issues/1901") - public static io.grpc.MethodDescriptor getReplaceGraphMethod() { - io.grpc.MethodDescriptor getReplaceGraphMethod; - if ((getReplaceGraphMethod = GraphInferenceServerGrpc.getReplaceGraphMethod) == null) { - synchronized (GraphInferenceServerGrpc.class) { - if ((getReplaceGraphMethod = GraphInferenceServerGrpc.getReplaceGraphMethod) == null) { - GraphInferenceServerGrpc.getReplaceGraphMethod = getReplaceGraphMethod = - io.grpc.MethodDescriptor.newBuilder() - .setType(io.grpc.MethodDescriptor.MethodType.UNARY) - .setFullMethodName(generateFullMethodName( - "org.nd4j.graph.GraphInferenceServer", "ReplaceGraph")) - .setSampledToLocalTracing(true) - .setRequestMarshaller(FlatbuffersUtils.marshaller( - org.nd4j.graph.FlatGraph.class, getExtractorOfFlatGraph())) - .setResponseMarshaller(FlatbuffersUtils.marshaller( - org.nd4j.graph.FlatResponse.class, getExtractorOfFlatResponse())) - .setSchemaDescriptor(null) - .build(); - } - } - } - return getReplaceGraphMethod; - } - - @io.grpc.ExperimentalApi("https://github.com/grpc/grpc-java/issues/1901") - @Deprecated // Use {@link #getInferenceRequestMethod()} instead. 
- public static final io.grpc.MethodDescriptor METHOD_INFERENCE_REQUEST = getInferenceRequestMethod(); - - private static volatile io.grpc.MethodDescriptor getInferenceRequestMethod; - - private static volatile FlatbuffersUtils.FBExtactor extractorOfFlatInferenceRequest; - private static FlatbuffersUtils.FBExtactor getExtractorOfFlatInferenceRequest() { - if (extractorOfFlatInferenceRequest != null) return extractorOfFlatInferenceRequest; - synchronized (GraphInferenceServerGrpc.class) { - if (extractorOfFlatInferenceRequest != null) return extractorOfFlatInferenceRequest; - extractorOfFlatInferenceRequest = new FlatbuffersUtils.FBExtactor() { - public org.nd4j.graph.FlatInferenceRequest extract (ByteBuffer buffer) { - return org.nd4j.graph.FlatInferenceRequest.getRootAsFlatInferenceRequest(buffer); - } - }; - return extractorOfFlatInferenceRequest; - } - } - - private static volatile FlatbuffersUtils.FBExtactor extractorOfFlatResult; - private static FlatbuffersUtils.FBExtactor getExtractorOfFlatResult() { - if (extractorOfFlatResult != null) return extractorOfFlatResult; - synchronized (GraphInferenceServerGrpc.class) { - if (extractorOfFlatResult != null) return extractorOfFlatResult; - extractorOfFlatResult = new FlatbuffersUtils.FBExtactor() { - public org.nd4j.graph.FlatResult extract (ByteBuffer buffer) { - return org.nd4j.graph.FlatResult.getRootAsFlatResult(buffer); - } - }; - return extractorOfFlatResult; - } - } - - @io.grpc.ExperimentalApi("https://github.com/grpc/grpc-java/issues/1901") - public static io.grpc.MethodDescriptor getInferenceRequestMethod() { - io.grpc.MethodDescriptor getInferenceRequestMethod; - if ((getInferenceRequestMethod = GraphInferenceServerGrpc.getInferenceRequestMethod) == null) { - synchronized (GraphInferenceServerGrpc.class) { - if ((getInferenceRequestMethod = GraphInferenceServerGrpc.getInferenceRequestMethod) == null) { - GraphInferenceServerGrpc.getInferenceRequestMethod = getInferenceRequestMethod = - 
io.grpc.MethodDescriptor.newBuilder() - .setType(io.grpc.MethodDescriptor.MethodType.UNARY) - .setFullMethodName(generateFullMethodName( - "org.nd4j.graph.GraphInferenceServer", "InferenceRequest")) - .setSampledToLocalTracing(true) - .setRequestMarshaller(FlatbuffersUtils.marshaller( - org.nd4j.graph.FlatInferenceRequest.class, getExtractorOfFlatInferenceRequest())) - .setResponseMarshaller(FlatbuffersUtils.marshaller( - org.nd4j.graph.FlatResult.class, getExtractorOfFlatResult())) - .setSchemaDescriptor(null) - .build(); - } - } - } - return getInferenceRequestMethod; - } - - /** - * Creates a new async stub that supports all call types for the service - */ - public static GraphInferenceServerStub newStub(io.grpc.Channel channel) { - return new GraphInferenceServerStub(channel); - } - - /** - * Creates a new blocking-style stub that supports unary and streaming output calls on the service - */ - public static GraphInferenceServerBlockingStub newBlockingStub( - io.grpc.Channel channel) { - return new GraphInferenceServerBlockingStub(channel); - } - - /** - * Creates a new ListenableFuture-style stub that supports unary calls on the service - */ - public static GraphInferenceServerFutureStub newFutureStub( - io.grpc.Channel channel) { - return new GraphInferenceServerFutureStub(channel); - } - - /** - */ - public static abstract class GraphInferenceServerImplBase implements io.grpc.BindableService { - - /** - */ - public void registerGraph(org.nd4j.graph.FlatGraph request, - io.grpc.stub.StreamObserver responseObserver) { - asyncUnimplementedUnaryCall(getRegisterGraphMethod(), responseObserver); - } - - /** - */ - public void forgetGraph(org.nd4j.graph.FlatDropRequest request, - io.grpc.stub.StreamObserver responseObserver) { - asyncUnimplementedUnaryCall(getForgetGraphMethod(), responseObserver); - } - - /** - */ - public void replaceGraph(org.nd4j.graph.FlatGraph request, - io.grpc.stub.StreamObserver responseObserver) { - 
asyncUnimplementedUnaryCall(getReplaceGraphMethod(), responseObserver); - } - - /** - */ - public void inferenceRequest(org.nd4j.graph.FlatInferenceRequest request, - io.grpc.stub.StreamObserver responseObserver) { - asyncUnimplementedUnaryCall(getInferenceRequestMethod(), responseObserver); - } - - @Override public final io.grpc.ServerServiceDefinition bindService() { - return io.grpc.ServerServiceDefinition.builder(getServiceDescriptor()) - .addMethod( - getRegisterGraphMethod(), - asyncUnaryCall( - new MethodHandlers< - org.nd4j.graph.FlatGraph, - org.nd4j.graph.FlatResponse>( - this, METHODID_REGISTER_GRAPH))) - .addMethod( - getForgetGraphMethod(), - asyncUnaryCall( - new MethodHandlers< - org.nd4j.graph.FlatDropRequest, - org.nd4j.graph.FlatResponse>( - this, METHODID_FORGET_GRAPH))) - .addMethod( - getReplaceGraphMethod(), - asyncUnaryCall( - new MethodHandlers< - org.nd4j.graph.FlatGraph, - org.nd4j.graph.FlatResponse>( - this, METHODID_REPLACE_GRAPH))) - .addMethod( - getInferenceRequestMethod(), - asyncUnaryCall( - new MethodHandlers< - org.nd4j.graph.FlatInferenceRequest, - org.nd4j.graph.FlatResult>( - this, METHODID_INFERENCE_REQUEST))) - .build(); - } - } - - /** - */ - public static final class GraphInferenceServerStub extends io.grpc.stub.AbstractStub { - private GraphInferenceServerStub(io.grpc.Channel channel) { - super(channel); - } - - private GraphInferenceServerStub(io.grpc.Channel channel, - io.grpc.CallOptions callOptions) { - super(channel, callOptions); - } - - @Override - protected GraphInferenceServerStub build(io.grpc.Channel channel, - io.grpc.CallOptions callOptions) { - return new GraphInferenceServerStub(channel, callOptions); - } - - /** - */ - public void registerGraph(org.nd4j.graph.FlatGraph request, - io.grpc.stub.StreamObserver responseObserver) { - asyncUnaryCall( - getChannel().newCall(getRegisterGraphMethod(), getCallOptions()), request, responseObserver); - } - - /** - */ - public void 
forgetGraph(org.nd4j.graph.FlatDropRequest request, - io.grpc.stub.StreamObserver responseObserver) { - asyncUnaryCall( - getChannel().newCall(getForgetGraphMethod(), getCallOptions()), request, responseObserver); - } - - /** - */ - public void replaceGraph(org.nd4j.graph.FlatGraph request, - io.grpc.stub.StreamObserver responseObserver) { - asyncUnaryCall( - getChannel().newCall(getReplaceGraphMethod(), getCallOptions()), request, responseObserver); - } - - /** - */ - public void inferenceRequest(org.nd4j.graph.FlatInferenceRequest request, - io.grpc.stub.StreamObserver responseObserver) { - asyncUnaryCall( - getChannel().newCall(getInferenceRequestMethod(), getCallOptions()), request, responseObserver); - } - } - - /** - */ - public static final class GraphInferenceServerBlockingStub extends io.grpc.stub.AbstractStub { - private GraphInferenceServerBlockingStub(io.grpc.Channel channel) { - super(channel); - } - - private GraphInferenceServerBlockingStub(io.grpc.Channel channel, - io.grpc.CallOptions callOptions) { - super(channel, callOptions); - } - - @Override - protected GraphInferenceServerBlockingStub build(io.grpc.Channel channel, - io.grpc.CallOptions callOptions) { - return new GraphInferenceServerBlockingStub(channel, callOptions); - } - - /** - */ - public org.nd4j.graph.FlatResponse registerGraph(org.nd4j.graph.FlatGraph request) { - return blockingUnaryCall( - getChannel(), getRegisterGraphMethod(), getCallOptions(), request); - } - - /** - */ - public org.nd4j.graph.FlatResponse forgetGraph(org.nd4j.graph.FlatDropRequest request) { - return blockingUnaryCall( - getChannel(), getForgetGraphMethod(), getCallOptions(), request); - } - - /** - */ - public org.nd4j.graph.FlatResponse replaceGraph(org.nd4j.graph.FlatGraph request) { - return blockingUnaryCall( - getChannel(), getReplaceGraphMethod(), getCallOptions(), request); - } - - /** - */ - public org.nd4j.graph.FlatResult inferenceRequest(org.nd4j.graph.FlatInferenceRequest request) { - return 
blockingUnaryCall( - getChannel(), getInferenceRequestMethod(), getCallOptions(), request); - } - } - - /** - */ - public static final class GraphInferenceServerFutureStub extends io.grpc.stub.AbstractStub { - private GraphInferenceServerFutureStub(io.grpc.Channel channel) { - super(channel); - } - - private GraphInferenceServerFutureStub(io.grpc.Channel channel, - io.grpc.CallOptions callOptions) { - super(channel, callOptions); - } - - @Override - protected GraphInferenceServerFutureStub build(io.grpc.Channel channel, - io.grpc.CallOptions callOptions) { - return new GraphInferenceServerFutureStub(channel, callOptions); - } - - /** - */ - public com.google.common.util.concurrent.ListenableFuture registerGraph( - org.nd4j.graph.FlatGraph request) { - return futureUnaryCall( - getChannel().newCall(getRegisterGraphMethod(), getCallOptions()), request); - } - - /** - */ - public com.google.common.util.concurrent.ListenableFuture forgetGraph( - org.nd4j.graph.FlatDropRequest request) { - return futureUnaryCall( - getChannel().newCall(getForgetGraphMethod(), getCallOptions()), request); - } - - /** - */ - public com.google.common.util.concurrent.ListenableFuture replaceGraph( - org.nd4j.graph.FlatGraph request) { - return futureUnaryCall( - getChannel().newCall(getReplaceGraphMethod(), getCallOptions()), request); - } - - /** - */ - public com.google.common.util.concurrent.ListenableFuture inferenceRequest( - org.nd4j.graph.FlatInferenceRequest request) { - return futureUnaryCall( - getChannel().newCall(getInferenceRequestMethod(), getCallOptions()), request); - } - } - - private static final int METHODID_REGISTER_GRAPH = 0; - private static final int METHODID_FORGET_GRAPH = 1; - private static final int METHODID_REPLACE_GRAPH = 2; - private static final int METHODID_INFERENCE_REQUEST = 3; - - private static final class MethodHandlers implements - io.grpc.stub.ServerCalls.UnaryMethod, - io.grpc.stub.ServerCalls.ServerStreamingMethod, - 
io.grpc.stub.ServerCalls.ClientStreamingMethod, - io.grpc.stub.ServerCalls.BidiStreamingMethod { - private final GraphInferenceServerImplBase serviceImpl; - private final int methodId; - - MethodHandlers(GraphInferenceServerImplBase serviceImpl, int methodId) { - this.serviceImpl = serviceImpl; - this.methodId = methodId; - } - - @Override - @SuppressWarnings("unchecked") - public void invoke(Req request, io.grpc.stub.StreamObserver responseObserver) { - switch (methodId) { - case METHODID_REGISTER_GRAPH: - serviceImpl.registerGraph((org.nd4j.graph.FlatGraph) request, - (io.grpc.stub.StreamObserver) responseObserver); - break; - case METHODID_FORGET_GRAPH: - serviceImpl.forgetGraph((org.nd4j.graph.FlatDropRequest) request, - (io.grpc.stub.StreamObserver) responseObserver); - break; - case METHODID_REPLACE_GRAPH: - serviceImpl.replaceGraph((org.nd4j.graph.FlatGraph) request, - (io.grpc.stub.StreamObserver) responseObserver); - break; - case METHODID_INFERENCE_REQUEST: - serviceImpl.inferenceRequest((org.nd4j.graph.FlatInferenceRequest) request, - (io.grpc.stub.StreamObserver) responseObserver); - break; - default: - throw new AssertionError(); - } - } - - @Override - @SuppressWarnings("unchecked") - public io.grpc.stub.StreamObserver invoke( - io.grpc.stub.StreamObserver responseObserver) { - switch (methodId) { - default: - throw new AssertionError(); - } - } - } - - private static volatile io.grpc.ServiceDescriptor serviceDescriptor; - - public static io.grpc.ServiceDescriptor getServiceDescriptor() { - io.grpc.ServiceDescriptor result = serviceDescriptor; - if (result == null) { - synchronized (GraphInferenceServerGrpc.class) { - result = serviceDescriptor; - if (result == null) { - serviceDescriptor = result = io.grpc.ServiceDescriptor.newBuilder(SERVICE_NAME) - .setSchemaDescriptor(null) - .addMethod(getRegisterGraphMethod()) - .addMethod(getForgetGraphMethod()) - .addMethod(getReplaceGraphMethod()) - .addMethod(getInferenceRequestMethod()) - .build(); - } - } 
- } - return result; - } -} diff --git a/contrib/attic/nd4j-remote/nd4j-grpc-client/src/test/java/org/nd4j/graph/GraphInferenceGrpcClientTest.java b/contrib/attic/nd4j-remote/nd4j-grpc-client/src/test/java/org/nd4j/graph/GraphInferenceGrpcClientTest.java deleted file mode 100644 index 72ab97341..000000000 --- a/contrib/attic/nd4j-remote/nd4j-grpc-client/src/test/java/org/nd4j/graph/GraphInferenceGrpcClientTest.java +++ /dev/null @@ -1,85 +0,0 @@ -/* - * ****************************************************************************** - * * Copyright (c) 2021 Deeplearning4j Contributors - * * - * * This program and the accompanying materials are made available under the - * * terms of the Apache License, Version 2.0 which is available at - * * https://www.apache.org/licenses/LICENSE-2.0. - * * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * * License for the specific language governing permissions and limitations - * * under the License. 
- * *
- * * SPDX-License-Identifier: Apache-2.0
- * *****************************************************************************
- */
-
-package org.nd4j.graph;
-
-import lombok.extern.slf4j.Slf4j;
-import lombok.val;
-import org.apache.commons.lang3.RandomUtils;
-import org.junit.Ignore;
-import org.junit.Test;
-import org.nd4j.common.tests.BaseND4JTest;
-import org.nd4j.autodiff.execution.conf.ExecutorConfiguration;
-import org.nd4j.autodiff.execution.conf.OutputMode;
-import org.nd4j.autodiff.execution.input.Operands;
-import org.nd4j.imports.graphmapper.tf.TFGraphMapper;
-import org.nd4j.linalg.factory.Nd4j;
-import org.nd4j.common.io.ClassPathResource;
-import org.nd4j.remote.grpc.GraphInferenceGrpcClient;
-
-import static org.junit.Assert.*;
-
-@Slf4j
-@Ignore
-public class GraphInferenceGrpcClientTest extends BaseND4JTest {
-    @Test
-    public void testSimpleGraph_1() throws Exception {
-        val exp = Nd4j.create(new double[] {-0.95938617, -1.20301781, 1.22260064, 0.50172403, 0.59972949, 0.78568028, 0.31609724, 1.51674747, 0.68013491, -0.05227458, 0.25903158, 1.13243439}, new long[]{3, 1, 4});
-
-        // configuring client
-        val client = new GraphInferenceGrpcClient("127.0.0.1", 40123);
-
-        val graphId = RandomUtils.nextLong(0, Long.MAX_VALUE);
-
-        // preparing and registering the graph (this is optional; the graph might be embedded into the Docker image)
-        val tg = TFGraphMapper.importGraph(new ClassPathResource("tf_graphs/examples/expand_dim/frozen_model.pb").getInputStream());
-        assertNotNull(tg);
-        client.registerGraph(graphId, tg, ExecutorConfiguration.builder().outputMode(OutputMode.IMPLICIT).build());
-
-        // defining input
-        val input0 = Nd4j.create(new double[] {0.09753360, 0.76124972, 0.24693797, 0.13813169, 0.33144656, 0.08299957, 0.67197708, 0.80659380, 0.98274191, 0.63566073, 0.21592326, 0.54902743}, new int[] {3, 4});
-        val operands = new Operands().addArgument("input_0", input0);
-
-        // sending request and getting result
-        val result = client.output(graphId,
operands); - assertEquals(exp, result.getById("output")); - } - - @Test - public void testSimpleGraph_2() throws Exception { - val exp = Nd4j.create(new double[] {-0.95938617, -1.20301781, 1.22260064, 0.50172403, 0.59972949, 0.78568028, 0.31609724, 1.51674747, 0.68013491, -0.05227458, 0.25903158,1.13243439}, new long[]{3, 1, 4}); - - // configuring client - val client = new GraphInferenceGrpcClient("127.0.0.1", 40123); - - val graphId = RandomUtils.nextLong(0, Long.MAX_VALUE); - - // preparing and registering graph (it's optional, and graph might be embedded into Docker image - val tg = TFGraphMapper.importGraph(new ClassPathResource("tf_graphs/examples/expand_dim/frozen_model.pb").getInputStream()); - assertNotNull(tg); - client.registerGraph(graphId, tg, ExecutorConfiguration.builder().outputMode(OutputMode.IMPLICIT).build()); - - //defining input - val input0 = Nd4j.create(new double[] {0.09753360, 0.76124972, 0.24693797, 0.13813169, 0.33144656, 0.08299957, 0.67197708, 0.80659380, 0.98274191, 0.63566073, 0.21592326, 0.54902743}, new int[] {3, 4}); - val operands = new Operands().addArgument(1, 0, input0); - - // sending request and getting result - val result = client.output(graphId, operands); - assertEquals(exp, result.getById("output")); - } -} \ No newline at end of file diff --git a/contrib/attic/nd4j-remote/nd4j-json-client/pom.xml b/contrib/attic/nd4j-remote/nd4j-json-client/pom.xml deleted file mode 100644 index dc2a0b836..000000000 --- a/contrib/attic/nd4j-remote/nd4j-json-client/pom.xml +++ /dev/null @@ -1,117 +0,0 @@ - - - - - - 4.0.0 - - - org.nd4j - nd4j-remote - 1.0.0-SNAPSHOT - - - nd4j-json-client - - nd4j-json-client - - - - com.mashape.unirest - unirest-java - ${unirest.version} - - - org.slf4j - slf4j-api - - - org.nd4j - jackson - ${project.version} - - - - - - testresources - - - nd4j-testresources - - - nd4j-tests-cpu - - false - - - - org.nd4j - nd4j-native - ${project.version} - - - - - - org.apache.maven.plugins - maven-surefire-plugin - 
- src/test/java - - *.java - **/*.java - - -Ddtype=float -Xmx8g - - - - - - - - nd4j-tests-cuda - - false - - - - org.nd4j - nd4j-cuda-11.0 - ${project.version} - - - - - - org.apache.maven.plugins - maven-surefire-plugin - - true - - - - - - - diff --git a/contrib/attic/nd4j-remote/nd4j-json-client/src/main/java/org/nd4j/remote/clients/JsonRemoteInference.java b/contrib/attic/nd4j-remote/nd4j-json-client/src/main/java/org/nd4j/remote/clients/JsonRemoteInference.java deleted file mode 100644 index 49244bfaa..000000000 --- a/contrib/attic/nd4j-remote/nd4j-json-client/src/main/java/org/nd4j/remote/clients/JsonRemoteInference.java +++ /dev/null @@ -1,219 +0,0 @@ -/* - * ****************************************************************************** - * * Copyright (c) 2021 Deeplearning4j Contributors - * * - * * This program and the accompanying materials are made available under the - * * terms of the Apache License, Version 2.0 which is available at - * * https://www.apache.org/licenses/LICENSE-2.0. - * * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * * License for the specific language governing permissions and limitations - * * under the License. 
- * *
- * * SPDX-License-Identifier: Apache-2.0
- * *****************************************************************************
- */
-
-package org.nd4j.remote.clients;
-
-import com.mashape.unirest.http.HttpResponse;
-import com.mashape.unirest.http.Unirest;
-import com.mashape.unirest.http.exceptions.UnirestException;
-import lombok.Builder;
-import lombok.NonNull;
-import lombok.extern.slf4j.Slf4j;
-import lombok.val;
-import org.json.JSONObject;
-import org.nd4j.remote.clients.serde.*;
-
-import java.io.IOException;
-import java.io.InputStream;
-import java.util.List;
-import java.util.concurrent.ExecutionException;
-import java.util.concurrent.Future;
-import java.util.concurrent.TimeUnit;
-import java.util.concurrent.TimeoutException;
-
-/**
- * This class provides remote inference functionality via JSON-powered REST APIs.
- *
- * We assume that a remote JSON server is available (on bare metal or in a k8s/swarm cluster); with proper serializers/deserializers provided, we can issue REST requests and get responses back.
- * This way, application logic can be separated from DL logic.
- *
- * You just need to provide a serializer/deserializer and the address of the REST server, e.g. "http://model:8080/v1/serving"
- *
- * @param <I> type of the input class, e.g. String
- * @param <O> type of the output class, e.g. Sentiment
- *
- * @author raver119@gmail.com
- */
-@Slf4j
-public class JsonRemoteInference<I, O> {
-    private String endpointAddress;
-    // JSON serializer/deserializer and binary serializer/deserializer are mutually exclusive.
-    private JsonSerializer<I> serializer;
-    private JsonDeserializer<O> deserializer;
-    private BinarySerializer<I> binarySerializer;
-    private BinaryDeserializer<O> binaryDeserializer;
-
-    private final static String APPLICATION_JSON = "application/json";
-    private final static String APPLICATION_OCTET_STREAM = "application/octet-stream";
-
-    @Builder
-    public JsonRemoteInference(@NonNull String endpointAddress,
-                               JsonSerializer<I> inputSerializer, JsonDeserializer<O> outputDeserializer,
-                               BinarySerializer<I> inputBinarySerializer, BinaryDeserializer<O> outputBinaryDeserializer) {
-
-        this.endpointAddress = endpointAddress;
-        this.serializer = inputSerializer;
-        this.deserializer = outputDeserializer;
-        this.binarySerializer = inputBinarySerializer;
-        this.binaryDeserializer = outputBinaryDeserializer;
-
-        if ((serializer != null && binarySerializer != null) || (serializer == null && binarySerializer == null))
-            throw new IllegalStateException("Exactly one of the JSON or binary serializers must be provided.");
-    }
-
-
-    private O processResponse(HttpResponse<String> response) throws IOException {
-        if (response.getStatus() != 200)
-            throw new IOException("Inference request returned error code: " + response.getStatus());
-
-        O result = deserializer.deserialize(response.getBody());
-
-        if (result == null) {
-            throw new IOException("Deserialization failed!");
-        }
-        return result;
-    }
-
-    private O processResponseBinary(HttpResponse<InputStream> response) throws IOException {
-        if (response.getStatus() != 200)
-            throw new IOException("Inference request returned error code: " + response.getStatus());
-
-        List<String> values = response.getHeaders().get("Content-Length");
-        if (values == null || values.size() < 1) {
-            throw new IOException("Content-Length header is required for binary data");
-        }
-
-        String strLength = values.get(0);
-        byte[] bytes = new byte[Integer.parseInt(strLength)];
-        response.getBody().read(bytes);
-        O result = binaryDeserializer.deserialize(bytes);
-
-        if (result == null) {
-            throw new
IOException("Deserialization failed!");
-        }
-        return result;
-    }
-
-    /**
-     * This method does remote inference in a blocking way.
-     *
-     * @param input input for the model
-     * @return inference result
-     * @throws IOException if the request fails or the response cannot be deserialized
-     */
-    public O predict(I input) throws IOException {
-        try {
-            if (binarySerializer != null && binaryDeserializer != null) {
-                HttpResponse<InputStream> response =
-                        Unirest.post(endpointAddress)
-                                .header("Content-Type", APPLICATION_OCTET_STREAM)
-                                .header("Accept", APPLICATION_OCTET_STREAM)
-                                .body(binarySerializer.serialize(input)).asBinary();
-                return processResponseBinary(response);
-            }
-            else if (binarySerializer != null && binaryDeserializer == null) {
-                HttpResponse<String> response =
-                        Unirest.post(endpointAddress)
-                                .header("Content-Type", APPLICATION_OCTET_STREAM)
-                                .header("Accept", APPLICATION_OCTET_STREAM)
-                                .body(binarySerializer.serialize(input)).asString();
-                return processResponse(response);
-            }
-            else {
-                HttpResponse<String> response = Unirest.post(endpointAddress)
-                        .header("Content-Type", APPLICATION_JSON)
-                        .header("Accept", APPLICATION_JSON)
-                        .body(new JSONObject(serializer.serialize(input))).asString();
-                return processResponse(response);
-            }
-
-        } catch (UnirestException e) {
-            throw new IOException(e);
-        }
-    }
-
-    /**
-     * This method does remote inference in an asynchronous way, returning a Future instead.
-     * @param input input for the model
-     * @return Future holding the inference result
-     */
-    public Future<O> predictAsync(I input) {
-
-        Future<HttpResponse<String>> response = binarySerializer != null ?
-                Unirest.post(endpointAddress)
-                        .header("Content-Type", "application/octet-stream")
-                        .header("Accept", "application/octet-stream")
-                        .body(binarySerializer.serialize(input)).asStringAsync() :
-
-                Unirest.post(endpointAddress)
-                        .header("Content-Type", "application/json")
-                        .header("Accept", "application/json")
-                        .body(new JSONObject(serializer.serialize(input))).asStringAsync();
-
-        return new InferenceFuture(response);
-    }
-
-    /**
-     * This class holds a Future of the object returned by the remote inference server
-     */
-    private class InferenceFuture implements Future<O> {
-        private Future<HttpResponse<String>> unirestFuture;
-
-        private InferenceFuture(@NonNull Future<HttpResponse<String>> future) {
-            this.unirestFuture = future;
-        }
-
-        @Override
-        public boolean cancel(boolean mayInterruptIfRunning) {
-            return unirestFuture.cancel(mayInterruptIfRunning);
-        }
-
-        @Override
-        public boolean isCancelled() {
-            return unirestFuture.isCancelled();
-        }
-
-        @Override
-        public boolean isDone() {
-            return unirestFuture.isDone();
-        }
-
-        @Override
-        public O get() throws InterruptedException, ExecutionException {
-            val stringResult = unirestFuture.get();
-
-            try {
-                return processResponse(stringResult);
-            } catch (IOException e) {
-                throw new ExecutionException(e);
-            }
-        }
-
-        @Override
-        public O get(long timeout, TimeUnit unit) throws InterruptedException, ExecutionException, TimeoutException {
-            val stringResult = unirestFuture.get(timeout, unit);
-
-            try {
-                return processResponse(stringResult);
-            } catch (IOException e) {
-                throw new ExecutionException(e);
-            }
-        }
-    }
-}
-
diff --git a/contrib/attic/nd4j-remote/nd4j-json-client/src/main/java/org/nd4j/remote/clients/serde/BinaryDeserializer.java b/contrib/attic/nd4j-remote/nd4j-json-client/src/main/java/org/nd4j/remote/clients/serde/BinaryDeserializer.java
deleted file mode 100644
index 9917a85ce..000000000
--- a/contrib/attic/nd4j-remote/nd4j-json-client/src/main/java/org/nd4j/remote/clients/serde/BinaryDeserializer.java
+++ /dev/null
@@ -1,34 +0,0 @@
-/*
- *
******************************************************************************
- * * Copyright (c) 2021 Deeplearning4j Contributors
- * *
- * * This program and the accompanying materials are made available under the
- * * terms of the Apache License, Version 2.0 which is available at
- * * https://www.apache.org/licenses/LICENSE-2.0.
- * *
- * * Unless required by applicable law or agreed to in writing, software
- * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
- * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
- * * License for the specific language governing permissions and limitations
- * * under the License.
- * *
- * * SPDX-License-Identifier: Apache-2.0
- * *****************************************************************************
- */
-package org.nd4j.remote.clients.serde;
-
-/**
- * This interface describes the basic binary deserializer used for remote inference
- * @param <T> type of the deserializable class
- *
- * @author Alexander Stoyakin
- */
-public interface BinaryDeserializer<T> {
-
-    /**
-     * This method deserializes binary data into an arbitrary object.
-     * @param buffer byte buffer containing the serialized object
-     * @return deserialized object
-     */
-    T deserialize(byte[] buffer);
-}
diff --git a/contrib/attic/nd4j-remote/nd4j-json-client/src/main/java/org/nd4j/remote/clients/serde/BinarySerializer.java b/contrib/attic/nd4j-remote/nd4j-json-client/src/main/java/org/nd4j/remote/clients/serde/BinarySerializer.java
deleted file mode 100644
index 3e35a8058..000000000
--- a/contrib/attic/nd4j-remote/nd4j-json-client/src/main/java/org/nd4j/remote/clients/serde/BinarySerializer.java
+++ /dev/null
@@ -1,36 +0,0 @@
-/*
- * ******************************************************************************
- * * Copyright (c) 2021 Deeplearning4j Contributors
- * *
- * * This program and the accompanying materials are made available under the
- * * terms of the Apache License, Version 2.0 which is available at
- * * https://www.apache.org/licenses/LICENSE-2.0.
- * *
- * * Unless required by applicable law or agreed to in writing, software
- * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
- * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
- * * License for the specific language governing permissions and limitations
- * * under the License.
- * *
- * * SPDX-License-Identifier: Apache-2.0
- * *****************************************************************************
- */
-
-package org.nd4j.remote.clients.serde;
-
-/**
- * This interface describes the basic binary serializer used for remote inference
- * @param <T> type of the serializable class
- *
- * @author Alexander Stoyakin
- */
-public interface BinarySerializer<T> {
-
-    /**
-     * This method serializes the given object into a byte buffer
-     *
-     * @param o object to be serialized
-     * @return serialized byte array
-     */
-    byte[] serialize(T o);
-}
diff --git a/contrib/attic/nd4j-remote/nd4j-json-client/src/main/java/org/nd4j/remote/clients/serde/JsonDeserializer.java b/contrib/attic/nd4j-remote/nd4j-json-client/src/main/java/org/nd4j/remote/clients/serde/JsonDeserializer.java
deleted file mode 100644
index 14c658296..000000000
--- a/contrib/attic/nd4j-remote/nd4j-json-client/src/main/java/org/nd4j/remote/clients/serde/JsonDeserializer.java
+++ /dev/null
@@ -1,35 +0,0 @@
-/*
- * ******************************************************************************
- * * Copyright (c) 2021 Deeplearning4j Contributors
- * *
- * * This program and the accompanying materials are made available under the
- * * terms of the Apache License, Version 2.0 which is available at
- * * https://www.apache.org/licenses/LICENSE-2.0.
- * *
- * * Unless required by applicable law or agreed to in writing, software
- * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
- * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
- * * License for the specific language governing permissions and limitations
- * * under the License.
- * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ - -package org.nd4j.remote.clients.serde; - -/** - * This interface describes basic JSON deserializer interface used for JsonRemoteInference - * @param <T> type of the deserializable class - * - * @author raver119@gmail.com - */ -public interface JsonDeserializer<T> { - - /** - * This method deserializes the given JSON string into an object - * @param json string containing JSON representation of the object - * @return deserialized object - */ - T deserialize(String json); -} diff --git a/contrib/attic/nd4j-remote/nd4j-json-client/src/main/java/org/nd4j/remote/clients/serde/JsonSerializer.java b/contrib/attic/nd4j-remote/nd4j-json-client/src/main/java/org/nd4j/remote/clients/serde/JsonSerializer.java deleted file mode 100644 index e1299d0f0..000000000 --- a/contrib/attic/nd4j-remote/nd4j-json-client/src/main/java/org/nd4j/remote/clients/serde/JsonSerializer.java +++ /dev/null @@ -1,36 +0,0 @@ -/* - * ****************************************************************************** - * * Copyright (c) 2021 Deeplearning4j Contributors - * * - * * This program and the accompanying materials are made available under the - * * terms of the Apache License, Version 2.0 which is available at - * * https://www.apache.org/licenses/LICENSE-2.0. - * * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * * License for the specific language governing permissions and limitations - * * under the License.
- * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ - -package org.nd4j.remote.clients.serde; - -/** - * This interface describes basic JSON serializer interface used for JsonRemoteInference - * @param <T> type of the serializable class - * - * @author raver119@gmail.com - */ -public interface JsonSerializer<T> { - - /** - * This method serializes given object into JSON-string - * - * @param o object to be serialized - * @return JSON string representation of the object - */ - String serialize(T o); -} diff --git a/contrib/attic/nd4j-remote/nd4j-json-client/src/main/java/org/nd4j/remote/clients/serde/impl/AbstractSerDe.java b/contrib/attic/nd4j-remote/nd4j-json-client/src/main/java/org/nd4j/remote/clients/serde/impl/AbstractSerDe.java deleted file mode 100644 index 707c79d05..000000000 --- a/contrib/attic/nd4j-remote/nd4j-json-client/src/main/java/org/nd4j/remote/clients/serde/impl/AbstractSerDe.java +++ /dev/null @@ -1,48 +0,0 @@ -/* - * ****************************************************************************** - * * Copyright (c) 2021 Deeplearning4j Contributors - * * - * * This program and the accompanying materials are made available under the - * * terms of the Apache License, Version 2.0 which is available at - * * https://www.apache.org/licenses/LICENSE-2.0. - * * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * * License for the specific language governing permissions and limitations - * * under the License.
- * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ - -package org.nd4j.remote.clients.serde.impl; - -import lombok.NonNull; -import org.nd4j.remote.clients.serde.JsonDeserializer; -import org.nd4j.remote.clients.serde.JsonSerializer; -import org.nd4j.shade.jackson.core.JsonProcessingException; -import org.nd4j.shade.jackson.databind.ObjectMapper; - -import java.io.IOException; - -public abstract class AbstractSerDe<T> implements JsonDeserializer<T>, JsonSerializer<T> { - protected ObjectMapper objectMapper = new ObjectMapper(); - - - protected String serializeClass(@NonNull T obj) { - try { - return objectMapper.writeValueAsString(obj); - } catch (JsonProcessingException e) { - throw new RuntimeException(e); - } - } - - protected T deserializeClass(@NonNull String json, @NonNull Class<T> cls) { - try { - return objectMapper.readValue(json, cls); - } catch (IOException e) { - throw new RuntimeException(e); - } - } -} diff --git a/contrib/attic/nd4j-remote/nd4j-json-client/src/main/java/org/nd4j/remote/clients/serde/impl/BooleanSerde.java b/contrib/attic/nd4j-remote/nd4j-json-client/src/main/java/org/nd4j/remote/clients/serde/impl/BooleanSerde.java deleted file mode 100644 index bb0a1df79..000000000 --- a/contrib/attic/nd4j-remote/nd4j-json-client/src/main/java/org/nd4j/remote/clients/serde/impl/BooleanSerde.java +++ /dev/null @@ -1,36 +0,0 @@ -/* - * ****************************************************************************** - * * Copyright (c) 2021 Deeplearning4j Contributors - * * - * * This program and the accompanying materials are made available under the - * * terms of the Apache License, Version 2.0 which is available at - * * https://www.apache.org/licenses/LICENSE-2.0.
- * * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * * License for the specific language governing permissions and limitations - * * under the License. - * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ - -package org.nd4j.remote.clients.serde.impl; - -import lombok.NonNull; - -/** - * This class provides JSON ser/de for Java Boolean. Single value only. - */ -public class BooleanSerde extends AbstractSerDe<Boolean> { - @Override - public Boolean deserialize(@NonNull String json) { - return deserializeClass(json, Boolean.class); - } - - @Override - public String serialize(@NonNull Boolean o) { - return serializeClass(o); - } -} diff --git a/contrib/attic/nd4j-remote/nd4j-json-client/src/main/java/org/nd4j/remote/clients/serde/impl/DoubleArraySerde.java b/contrib/attic/nd4j-remote/nd4j-json-client/src/main/java/org/nd4j/remote/clients/serde/impl/DoubleArraySerde.java deleted file mode 100644 index 7839d72ca..000000000 --- a/contrib/attic/nd4j-remote/nd4j-json-client/src/main/java/org/nd4j/remote/clients/serde/impl/DoubleArraySerde.java +++ /dev/null @@ -1,36 +0,0 @@ -/* - * ****************************************************************************** - * * Copyright (c) 2021 Deeplearning4j Contributors - * * - * * This program and the accompanying materials are made available under the - * * terms of the Apache License, Version 2.0 which is available at - * * https://www.apache.org/licenses/LICENSE-2.0. - * * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the - * * License for the specific language governing permissions and limitations - * * under the License. - * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ - -package org.nd4j.remote.clients.serde.impl; - -import lombok.*; -/** - * This class provides JSON ser/de for Java double[] - */ -public class DoubleArraySerde extends AbstractSerDe<double[]> { - - @Override - public String serialize(@NonNull double[] data) { - return serializeClass(data); - } - - @Override - public double[] deserialize(@NonNull String json) { - return deserializeClass(json, double[].class); - } -} diff --git a/contrib/attic/nd4j-remote/nd4j-json-client/src/main/java/org/nd4j/remote/clients/serde/impl/DoubleSerde.java b/contrib/attic/nd4j-remote/nd4j-json-client/src/main/java/org/nd4j/remote/clients/serde/impl/DoubleSerde.java deleted file mode 100644 index 6eae3714e..000000000 --- a/contrib/attic/nd4j-remote/nd4j-json-client/src/main/java/org/nd4j/remote/clients/serde/impl/DoubleSerde.java +++ /dev/null @@ -1,36 +0,0 @@ -/* - * ****************************************************************************** - * * Copyright (c) 2021 Deeplearning4j Contributors - * * - * * This program and the accompanying materials are made available under the - * * terms of the Apache License, Version 2.0 which is available at - * * https://www.apache.org/licenses/LICENSE-2.0. - * * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * * License for the specific language governing permissions and limitations - * * under the License.
- * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ - -package org.nd4j.remote.clients.serde.impl; - -import lombok.NonNull; - -/** - * This class provides JSON ser/de for Java Double. Single value only. - */ -public class DoubleSerde extends AbstractSerDe<Double> { - @Override - public Double deserialize(@NonNull String json) { - return deserializeClass(json, Double.class); - } - - @Override - public String serialize(@NonNull Double o) { - return serializeClass(o); - } -} diff --git a/contrib/attic/nd4j-remote/nd4j-json-client/src/main/java/org/nd4j/remote/clients/serde/impl/FloatArraySerde.java b/contrib/attic/nd4j-remote/nd4j-json-client/src/main/java/org/nd4j/remote/clients/serde/impl/FloatArraySerde.java deleted file mode 100644 index 396dacce9..000000000 --- a/contrib/attic/nd4j-remote/nd4j-json-client/src/main/java/org/nd4j/remote/clients/serde/impl/FloatArraySerde.java +++ /dev/null @@ -1,40 +0,0 @@ -/* - * ****************************************************************************** - * * Copyright (c) 2021 Deeplearning4j Contributors - * * - * * This program and the accompanying materials are made available under the - * * terms of the Apache License, Version 2.0 which is available at - * * https://www.apache.org/licenses/LICENSE-2.0. - * * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * * License for the specific language governing permissions and limitations - * * under the License.
- * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ -package org.nd4j.remote.clients.serde.impl; - - -import lombok.*; -import org.nd4j.shade.jackson.core.JsonProcessingException; -import org.nd4j.shade.jackson.databind.ObjectMapper; - - -/** - * This class provides JSON ser/de for Java float[] - */ -public class FloatArraySerde extends AbstractSerDe<float[]> { - - @Override - public String serialize(@NonNull float[] data) { - return serializeClass(data); - } - - @Override - public float[] deserialize(@NonNull String json) { - return deserializeClass(json, float[].class); - } -} diff --git a/contrib/attic/nd4j-remote/nd4j-json-client/src/main/java/org/nd4j/remote/clients/serde/impl/FloatSerde.java b/contrib/attic/nd4j-remote/nd4j-json-client/src/main/java/org/nd4j/remote/clients/serde/impl/FloatSerde.java deleted file mode 100644 index 1d087d228..000000000 --- a/contrib/attic/nd4j-remote/nd4j-json-client/src/main/java/org/nd4j/remote/clients/serde/impl/FloatSerde.java +++ /dev/null @@ -1,36 +0,0 @@ -/* - * ****************************************************************************** - * * Copyright (c) 2021 Deeplearning4j Contributors - * * - * * This program and the accompanying materials are made available under the - * * terms of the Apache License, Version 2.0 which is available at - * * https://www.apache.org/licenses/LICENSE-2.0. - * * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * * License for the specific language governing permissions and limitations - * * under the License.
- * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ - -package org.nd4j.remote.clients.serde.impl; - -import lombok.NonNull; - -/** - * This class provides JSON ser/de for Java Float. Single value only. - */ -public class FloatSerde extends AbstractSerDe<Float> { - @Override - public Float deserialize(@NonNull String json) { - return deserializeClass(json, Float.class); - } - - @Override - public String serialize(@NonNull Float o) { - return serializeClass(o); - } -} diff --git a/contrib/attic/nd4j-remote/nd4j-json-client/src/main/java/org/nd4j/remote/clients/serde/impl/IntegerSerde.java b/contrib/attic/nd4j-remote/nd4j-json-client/src/main/java/org/nd4j/remote/clients/serde/impl/IntegerSerde.java deleted file mode 100644 index 81dff8561..000000000 --- a/contrib/attic/nd4j-remote/nd4j-json-client/src/main/java/org/nd4j/remote/clients/serde/impl/IntegerSerde.java +++ /dev/null @@ -1,36 +0,0 @@ -/* - * ****************************************************************************** - * * Copyright (c) 2021 Deeplearning4j Contributors - * * - * * This program and the accompanying materials are made available under the - * * terms of the Apache License, Version 2.0 which is available at - * * https://www.apache.org/licenses/LICENSE-2.0. - * * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * * License for the specific language governing permissions and limitations - * * under the License. - * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ - -package org.nd4j.remote.clients.serde.impl; - -import lombok.NonNull; - -/** - * This class provides JSON ser/de for Java Integer. Single value only.
- */ -public class IntegerSerde extends AbstractSerDe<Integer> { - @Override - public Integer deserialize(@NonNull String json) { - return deserializeClass(json, Integer.class); - } - - @Override - public String serialize(@NonNull Integer o) { - return serializeClass(o); - } -} diff --git a/contrib/attic/nd4j-remote/nd4j-json-client/src/main/java/org/nd4j/remote/clients/serde/impl/StringSerde.java b/contrib/attic/nd4j-remote/nd4j-json-client/src/main/java/org/nd4j/remote/clients/serde/impl/StringSerde.java deleted file mode 100644 index 0d94ed7b6..000000000 --- a/contrib/attic/nd4j-remote/nd4j-json-client/src/main/java/org/nd4j/remote/clients/serde/impl/StringSerde.java +++ /dev/null @@ -1,40 +0,0 @@ -/* - * ****************************************************************************** - * * Copyright (c) 2021 Deeplearning4j Contributors - * * - * * This program and the accompanying materials are made available under the - * * terms of the Apache License, Version 2.0 which is available at - * * https://www.apache.org/licenses/LICENSE-2.0. - * * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * * License for the specific language governing permissions and limitations - * * under the License. - * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ - -package org.nd4j.remote.clients.serde.impl; - -import lombok.NonNull; -import org.nd4j.shade.jackson.core.JsonProcessingException; -import org.nd4j.shade.jackson.databind.ObjectMapper; - -/** - * This class provides fake JSON serializer/deserializer functionality for String.
- * It doesn't put any JSON-specific bits into the actual string - */ -public class StringSerde extends AbstractSerDe<String> { - - @Override - public String serialize(@NonNull String data) { - return serializeClass(data); - } - - @Override - public String deserialize(@NonNull String json) { - return deserializeClass(json, String.class); - } -} diff --git a/contrib/attic/nd4j-remote/nd4j-json-server/README.md b/contrib/attic/nd4j-remote/nd4j-json-server/README.md deleted file mode 100644 index 963890a7b..000000000 --- a/contrib/attic/nd4j-remote/nd4j-json-server/README.md +++ /dev/null @@ -1,35 +0,0 @@ -## SameDiff model serving - -This module provides JSON-based serving of SameDiff models. - -## Example - -First, we'll create a server instance. Most likely you'll do this in an application running in a container: -```java -val server = SameDiffJsonModelServer.builder() - .adapter(new StringToSentimentAdapter()) - .model(mySameDiffModel) - .port(8080) - .serializer(new SentimentSerializer()) - .deserializer(new StringDeserializer()) - .build(); - -server.start(); -server.join(); -``` - -Now, presumably in some other container, we'll set up a remote inference client: -```java -val client = JsonRemoteInference.builder() - .endpointAddress("http://youraddress:8080/v1/serving") - .serializer(new StringSerializer()) - .deserializer(new SentimentDeserializer()) - .build(); - -Sentiment result = client.predict(myText); -``` - On top of that, an async call is available for cases where you need to chain multiple requests to one or more remote model servers.
- -```java -Future result = client.predictAsync(myText); -``` \ No newline at end of file diff --git a/contrib/attic/nd4j-remote/nd4j-json-server/pom.xml b/contrib/attic/nd4j-remote/nd4j-json-server/pom.xml deleted file mode 100644 index 65f3548e0..000000000 --- a/contrib/attic/nd4j-remote/nd4j-json-server/pom.xml +++ /dev/null @@ -1,161 +0,0 @@ - - - - - - 4.0.0 - - - org.nd4j - nd4j-remote - 1.0.0-SNAPSHOT - - - nd4j-json-server - jar - - nd4j-json-server - - - - org.nd4j - nd4j-json-client - ${project.version} - - - org.slf4j - slf4j-api - - - org.nd4j - nd4j-api - - - org.glassfish.jersey.core - jersey-client - ${jersey.version} - - - org.glassfish.jersey.core - jersey-server - ${jersey.version} - - - org.eclipse.jetty - jetty-server - - 9.4.19.v20190610 - - - org.eclipse.jetty - jetty-servlet - - 9.4.19.v20190610 - - - org.glassfish.jersey.inject - jersey-hk2 - ${jersey.version} - - - org.glassfish.jersey.media - jersey-media-json-processing - ${jersey.version} - - - org.glassfish.jersey.containers - jersey-container-servlet-core - ${jersey.version} - - - ch.qos.logback - logback-core - test - - - ch.qos.logback - logback-classic - test - - - javax.xml.bind - jaxb-api - ${jaxb.version} - - - com.sun.xml.bind - jaxb-impl - ${jaxb.version} - - - com.sun.xml.bind - jaxb-core - ${jaxb.version} - - - javax.activation - activation - ${javax.activation.version} - - - com.google.code.gson - gson - ${gson.version} - test - - - org.nd4j - nd4j-common-tests - ${project.version} - test - - - - - - testresources - - - nd4j-tests-cpu - - - org.nd4j - nd4j-native - ${project.version} - test - - - - - nd4j-tests-cuda - - - org.nd4j - nd4j-cuda-11.0 - ${project.version} - test - - - - - diff --git a/contrib/attic/nd4j-remote/nd4j-json-server/src/main/java/org/nd4j/remote/SameDiffJsonModelServer.java b/contrib/attic/nd4j-remote/nd4j-json-server/src/main/java/org/nd4j/remote/SameDiffJsonModelServer.java deleted file mode 100644 index 9bf48873b..000000000 --- 
a/contrib/attic/nd4j-remote/nd4j-json-server/src/main/java/org/nd4j/remote/SameDiffJsonModelServer.java +++ /dev/null @@ -1,306 +0,0 @@ -/* - * ****************************************************************************** - * * Copyright (c) 2021 Deeplearning4j Contributors - * * - * * This program and the accompanying materials are made available under the - * * terms of the Apache License, Version 2.0 which is available at - * * https://www.apache.org/licenses/LICENSE-2.0. - * * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * * License for the specific language governing permissions and limitations - * * under the License. - * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ - -package org.nd4j.remote; - -import lombok.*; -import lombok.extern.slf4j.Slf4j; -import org.eclipse.jetty.server.Server; -import org.eclipse.jetty.servlet.ServletContextHandler; -import org.nd4j.adapters.InputAdapter; -import org.nd4j.adapters.OutputAdapter; -import org.nd4j.autodiff.samediff.SameDiff; -import org.nd4j.common.base.Preconditions; -import org.nd4j.linalg.api.ndarray.INDArray; -import org.nd4j.linalg.dataset.MultiDataSet; -import org.nd4j.remote.clients.serde.BinaryDeserializer; -import org.nd4j.remote.clients.serde.BinarySerializer; -import org.nd4j.remote.clients.serde.JsonDeserializer; -import org.nd4j.remote.clients.serde.JsonSerializer; -import org.nd4j.adapters.InferenceAdapter; -import org.nd4j.remote.serving.ModelServingServlet; -import org.nd4j.remote.serving.SameDiffServlet; - -import java.util.List; - -/** - * This class provides JSON-powered model serving functionality for SameDiff graphs. 
- * Server URL will be http://0.0.0.0:{port}/v1/serving - * Server only accepts POST requests - * - * @param <I> type of the input class, e.g. String - * @param <O> type of the output class, e.g. Sentiment - * - * @author raver119@gmail.com - */ -@Slf4j -public class SameDiffJsonModelServer<I, O> { - - - protected SameDiff sdModel; - protected final JsonSerializer<O> serializer; - protected final JsonDeserializer<I> deserializer; - protected final BinarySerializer<O> binarySerializer; - protected final BinaryDeserializer<I> binaryDeserializer; - protected final InferenceAdapter<I, O> inferenceAdapter; - protected final int port; - - // this servlet will be used to serve models - protected ModelServingServlet<I, O> servingServlet; - - // HTTP server instance - protected Server server; - - // for SameDiff only - protected String[] orderedInputNodes; - protected String[] orderedOutputNodes; - - protected SameDiffJsonModelServer(@NonNull InferenceAdapter<I, O> inferenceAdapter, - JsonSerializer<O> serializer, JsonDeserializer<I> deserializer, - BinarySerializer<O> binarySerializer, BinaryDeserializer<I> binaryDeserializer, - int port) { - Preconditions.checkArgument(port > 0 && port < 65536, "TCP port must be in range of 1..65535"); - // exactly one of the JSON or binary serializer pairs must be provided - Preconditions.checkArgument(serializer != null ^ binarySerializer != null, - "JSON and binary serializers/deserializers are mutually exclusive and mandatory."); - - this.binarySerializer = binarySerializer; - this.binaryDeserializer = binaryDeserializer; - this.inferenceAdapter = inferenceAdapter; - this.serializer = serializer; - this.deserializer = deserializer; - this.port = port; - } - - //@Builder - public SameDiffJsonModelServer(SameDiff sdModel, @NonNull InferenceAdapter<I, O> inferenceAdapter, - JsonSerializer<O> serializer, JsonDeserializer<I> deserializer, - BinarySerializer<O> binarySerializer, BinaryDeserializer<I> binaryDeserializer, - int port, String[] orderedInputNodes, @NonNull String[]
orderedOutputNodes) { - this(inferenceAdapter, serializer, deserializer, binarySerializer, binaryDeserializer, port); - this.sdModel = sdModel; - this.orderedInputNodes = orderedInputNodes; - this.orderedOutputNodes = orderedOutputNodes; - - // TODO: both lists of nodes should be validated, to make sure nodes specified here exist in actual model - if (orderedInputNodes != null) { - // input nodes list might be null. strange but ok - } - - Preconditions.checkArgument(orderedOutputNodes != null && orderedOutputNodes.length > 0, "SameDiff serving requires at least 1 output node"); - } - - protected void start(int port, @NonNull ModelServingServlet servlet) throws Exception { - val context = new ServletContextHandler(ServletContextHandler.SESSIONS); - context.setContextPath("/"); - - server = new Server(port); - server.setHandler(context); - - val jerseyServlet = context.addServlet(org.glassfish.jersey.servlet.ServletContainer.class, "/*"); - jerseyServlet.setInitOrder(0); - jerseyServlet.setServlet(servlet); - - server.start(); - } - - public void start() throws Exception { - Preconditions.checkArgument(sdModel != null, "SameDiff model wasn't defined"); - - servingServlet = SameDiffServlet.builder() - .sdModel(sdModel) - .serializer(serializer) - .deserializer(deserializer) - .inferenceAdapter(inferenceAdapter) - .orderedInputNodes(orderedInputNodes) - .orderedOutputNodes(orderedOutputNodes) - .build(); - - start(port, servingServlet); - } - - public void join() throws InterruptedException { - Preconditions.checkArgument(server != null, "Model server wasn't started yet"); - - server.join(); - } - - public void stop() throws Exception { - //Preconditions.checkArgument(server != null, "Model server wasn't started yet"); - - server.stop(); - } - - - public static class Builder { - private SameDiff sameDiff; - private String[] orderedInputNodes; - private String[] orderedOutputNodes; - private InferenceAdapter inferenceAdapter; - private JsonSerializer serializer; - 
private JsonDeserializer deserializer; - private int port; - - private InputAdapter inputAdapter; - private OutputAdapter outputAdapter; - - public Builder() {} - - public Builder sdModel(@NonNull SameDiff sameDiff) { - this.sameDiff = sameDiff; - return this; - } - - /** - * This method defines InferenceAdapter implementation, which will be used to convert object of Input type to the set of INDArray(s), and for conversion of resulting INDArray(s) into object of Output type - * @param inferenceAdapter - * @return - */ - public Builder inferenceAdapter(InferenceAdapter inferenceAdapter) { - this.inferenceAdapter = inferenceAdapter; - return this; - } - - /** - * This method allows you to specify InputAdapter to be used for inference - * - * PLEASE NOTE: This method is optional, and will require OutputAdapter defined - * @param inputAdapter - * @return - */ - public Builder inputAdapter(@NonNull InputAdapter inputAdapter) { - this.inputAdapter = inputAdapter; - return this; - } - - /** - * This method allows you to specify OutputAdapter to be used for inference - * - * PLEASE NOTE: This method is optional, and will require InputAdapter defined - * @param outputAdapter - * @return - */ - public Builder outputAdapter(@NonNull OutputAdapter outputAdapter) { - this.outputAdapter = outputAdapter; - return this; - } - - /** - * This method defines JsonSerializer instance to be used to convert object of output type into JSON format, so it could be sent over the wire - * - * @param serializer - * @return - */ - public Builder outputSerializer(@NonNull JsonSerializer serializer) { - this.serializer = serializer; - return this; - } - - /** - * This method defines JsonDeserializer instance to be used to convert JSON passed through HTTP into actual object of input type, that will be fed into SameDiff model - * - * @param deserializer - * @return - */ - public Builder inputDeserializer(@NonNull JsonDeserializer deserializer) { - this.deserializer = deserializer; - return this; - 
} - - /** - * This method defines the order of placeholders to be filled with INDArrays provided by Deserializer - * - * @param args - * @return - */ - public Builder orderedInputNodes(String... args) { - orderedInputNodes = args; - return this; - } - - /** - * This method defines the order of placeholders to be filled with INDArrays provided by Deserializer - * - * @param args - * @return - */ - public Builder orderedInputNodes(@NonNull List args) { - orderedInputNodes = args.toArray(new String[args.size()]); - return this; - } - - /** - * This method defines list of graph nodes to be extracted after feed-forward pass and used as OutputAdapter input - * @param args - * @return - */ - public Builder orderedOutputNodes(String... args) { - Preconditions.checkArgument(args != null && args.length > 0, "OutputNodes should contain at least 1 element"); - orderedOutputNodes = args; - return this; - } - - /** - * This method defines list of graph nodes to be extracted after feed-forward pass and used as OutputAdapter input - * @param args - * @return - */ - public Builder orderedOutputNodes(@NonNull List args) { - Preconditions.checkArgument(args.size() > 0, "OutputNodes should contain at least 1 element"); - orderedOutputNodes = args.toArray(new String[args.size()]); - return this; - } - - /** - * This method allows to configure HTTP port used for serving - * - * PLEASE NOTE: port must be free and be in range regular TCP/IP ports range - * @param port - * @return - */ - public Builder port(int port) { - this.port = port; - return this; - } - - /** - * This method builds SameDiffJsonModelServer instance - * @return - */ - public SameDiffJsonModelServer build() { - if (inferenceAdapter == null) { - if (inputAdapter != null && outputAdapter != null) { - inferenceAdapter = new InferenceAdapter() { - @Override - public MultiDataSet apply(I input) { - return inputAdapter.apply(input); - } - - @Override - public O apply(INDArray... 
outputs) { - return outputAdapter.apply(outputs); - } - }; - } else - throw new IllegalArgumentException("Either InferenceAdapter or InputAdapter + OutputAdapter should be configured"); - } - return new SameDiffJsonModelServer(sameDiff, inferenceAdapter, serializer, deserializer, null, null, port, orderedInputNodes, orderedOutputNodes); - } - } -} diff --git a/contrib/attic/nd4j-remote/nd4j-json-server/src/main/java/org/nd4j/remote/serving/ModelServingServlet.java b/contrib/attic/nd4j-remote/nd4j-json-server/src/main/java/org/nd4j/remote/serving/ModelServingServlet.java deleted file mode 100644 index 2d9b3708f..000000000 --- a/contrib/attic/nd4j-remote/nd4j-json-server/src/main/java/org/nd4j/remote/serving/ModelServingServlet.java +++ /dev/null @@ -1,32 +0,0 @@ -/* - * ****************************************************************************** - * * Copyright (c) 2021 Deeplearning4j Contributors - * * - * * This program and the accompanying materials are made available under the - * * terms of the Apache License, Version 2.0 which is available at - * * https://www.apache.org/licenses/LICENSE-2.0. - * * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * * License for the specific language governing permissions and limitations - * * under the License. 
- * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ - -package org.nd4j.remote.serving; - -import javax.servlet.Servlet; - -/** - * This interface describes Servlet interface extension, suited for ND4J/DL4J model serving - * @param <I> - * @param <O> - * - * @author raver119@gmail.com - */ -public interface ModelServingServlet<I, O> extends Servlet { - // -} diff --git a/contrib/attic/nd4j-remote/nd4j-json-server/src/main/java/org/nd4j/remote/serving/SameDiffServlet.java b/contrib/attic/nd4j-remote/nd4j-json-server/src/main/java/org/nd4j/remote/serving/SameDiffServlet.java deleted file mode 100644 index fdad78030..000000000 --- a/contrib/attic/nd4j-remote/nd4j-json-server/src/main/java/org/nd4j/remote/serving/SameDiffServlet.java +++ /dev/null @@ -1,238 +0,0 @@ -/* - * ****************************************************************************** - * * Copyright (c) 2021 Deeplearning4j Contributors - * * - * * This program and the accompanying materials are made available under the - * * terms of the Apache License, Version 2.0 which is available at - * * https://www.apache.org/licenses/LICENSE-2.0. - * * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * * License for the specific language governing permissions and limitations - * * under the License.
- * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ - -package org.nd4j.remote.serving; - -import lombok.*; -import lombok.extern.slf4j.Slf4j; -import org.apache.commons.lang3.StringUtils; -import org.nd4j.autodiff.samediff.SameDiff; -import org.nd4j.linalg.api.ndarray.INDArray; -import org.nd4j.remote.clients.serde.BinaryDeserializer; -import org.nd4j.remote.clients.serde.BinarySerializer; -import org.nd4j.remote.clients.serde.JsonDeserializer; -import org.nd4j.remote.clients.serde.JsonSerializer; -import org.nd4j.adapters.InferenceAdapter; - -import javax.servlet.*; -import javax.servlet.http.HttpServletRequest; -import javax.servlet.http.HttpServletResponse; -import javax.ws.rs.HttpMethod; -import java.io.BufferedReader; -import java.io.IOException; -import java.io.InputStreamReader; -import java.util.LinkedHashMap; - -import static javax.ws.rs.core.MediaType.APPLICATION_JSON; -import static javax.ws.rs.core.MediaType.APPLICATION_OCTET_STREAM; - -/** - * This servlet provides SameDiff model serving capabilities - * - * @param - * @param - * - * @author raver119@gmail.com - */ -@NoArgsConstructor -@AllArgsConstructor -@Slf4j -@Builder -public class SameDiffServlet implements ModelServingServlet { - - protected static final String typeJson = APPLICATION_JSON; - protected static final String typeBinary = APPLICATION_OCTET_STREAM; - - protected SameDiff sdModel; - protected JsonSerializer serializer; - protected JsonDeserializer deserializer; - protected BinarySerializer binarySerializer; - protected BinaryDeserializer binaryDeserializer; - protected InferenceAdapter inferenceAdapter; - - protected String[] orderedInputNodes; - protected String[] orderedOutputNodes; - - protected final static String SERVING_ENDPOINT = "/v1/serving"; - protected final static String LISTING_ENDPOINT = "/v1"; - protected final static int PAYLOAD_SIZE_LIMIT = 10 * 1024 * 1024; // TODO: should be 
customizable - - protected SameDiffServlet(@NonNull InferenceAdapter inferenceAdapter, JsonSerializer serializer, JsonDeserializer deserializer){ - this.serializer = serializer; - this.deserializer = deserializer; - this.inferenceAdapter = inferenceAdapter; - } - - protected SameDiffServlet(@NonNull InferenceAdapter inferenceAdapter, - BinarySerializer serializer, BinaryDeserializer deserializer){ - this.binarySerializer = serializer; - this.binaryDeserializer = deserializer; - this.inferenceAdapter = inferenceAdapter; - } - - protected SameDiffServlet(@NonNull InferenceAdapter inferenceAdapter, - JsonSerializer jsonSerializer, JsonDeserializer jsonDeserializer, - BinarySerializer binarySerializer, BinaryDeserializer binaryDeserializer){ - - this.serializer = jsonSerializer; - this.deserializer = jsonDeserializer; - this.binarySerializer = binarySerializer; - this.binaryDeserializer = binaryDeserializer; - this.inferenceAdapter = inferenceAdapter; - - if (serializer != null && binarySerializer != null || serializer == null && binarySerializer == null) - throw new IllegalStateException("Binary and JSON serializers/deserializers are mutually exclusive and mandatory."); - } - - - @Override - public void init(ServletConfig servletConfig) throws ServletException { - // - } - - @Override - public ServletConfig getServletConfig() { - return null; - } - - @Override - public void service(ServletRequest servletRequest, ServletResponse servletResponse) throws ServletException, IOException { - // we'll parse request here, and do model serving - val httpRequest = (HttpServletRequest) servletRequest; - val httpResponse = (HttpServletResponse) servletResponse; - - if (httpRequest.getMethod().equals(HttpMethod.GET)) { - doGet(httpRequest, httpResponse); - } - else if (httpRequest.getMethod().equals(HttpMethod.POST)) { - doPost(httpRequest, httpResponse); - } - - } - - protected void sendError(String uri, HttpServletResponse response) throws IOException { - val msg = "Requested 
endpoint [" + uri + "] not found"; - response.setStatus(404, msg); - response.sendError(404, msg); - } - - protected void sendBadContentType(String actualContentType, HttpServletResponse response) throws IOException { - val msg = "Content type [" + actualContentType + "] not supported"; - response.setStatus(415, msg); - response.sendError(415, msg); - } - - protected boolean validateRequest(HttpServletRequest request, HttpServletResponse response) - throws IOException{ - val contentType = request.getContentType(); - if (!StringUtils.equals(contentType, typeJson)) { - sendBadContentType(contentType, response); - int contentLength = request.getContentLength(); - if (contentLength > PAYLOAD_SIZE_LIMIT) { - response.sendError(500, "Payload size limit violated!"); - } - return false; - } - return true; - } - - protected void doGet(HttpServletRequest request, HttpServletResponse response) throws IOException { - val processor = new ServingProcessor(); - String processorReturned = ""; - String path = request.getPathInfo(); - if (path.equals(LISTING_ENDPOINT)) { - val contentType = request.getContentType(); - if (!StringUtils.equals(contentType, typeJson)) { - sendBadContentType(contentType, response); - } - processorReturned = processor.listEndpoints(); - } - else { - sendError(request.getRequestURI(), response); - } - try { - val out = response.getWriter(); - out.write(processorReturned); - } catch (IOException e) { - log.error(e.getMessage()); - } - } - - protected void doPost(HttpServletRequest request, HttpServletResponse response) throws IOException { - val processor = new ServingProcessor(); - String processorReturned = ""; - String path = request.getPathInfo(); - if (path.equals(SERVING_ENDPOINT)) { - val contentType = request.getContentType(); - /*Preconditions.checkArgument(StringUtils.equals(contentType, typeJson), - "Content type is " + contentType);*/ - if (validateRequest(request,response)) { - val stream = request.getInputStream(); - val bufferedReader = new 
BufferedReader(new InputStreamReader(stream)); - char[] charBuffer = new char[128]; - int bytesRead = -1; - val buffer = new StringBuilder(); - while ((bytesRead = bufferedReader.read(charBuffer)) > 0) { - buffer.append(charBuffer, 0, bytesRead); - } - val requestString = buffer.toString(); - - val mds = inferenceAdapter.apply(deserializer.deserialize(requestString)); - val map = new LinkedHashMap(); - - // optionally define placeholders with names provided in server constructor - if (orderedInputNodes != null && orderedInputNodes.length > 0) { - int cnt = 0; - for (val n : orderedInputNodes) - map.put(n, mds.getFeatures(cnt++)); - } - - val output = sdModel.output(map, orderedOutputNodes); - val arrays = new INDArray[output.size()]; - - // now we need to get ordered output arrays, as specified in server constructor - int cnt = 0; - for (val n : orderedOutputNodes) - arrays[cnt++] = output.get(n); - - // process result - val result = inferenceAdapter.apply(arrays); - processorReturned = serializer.serialize(result); - } - } else { - // we return error otherwise - sendError(request.getRequestURI(), response); - } - try { - val out = response.getWriter(); - out.write(processorReturned); - } catch (IOException e) { - log.error(e.getMessage()); - } - } - - @Override - public String getServletInfo() { - return null; - } - - @Override - public void destroy() { - // - } -} diff --git a/contrib/attic/nd4j-remote/nd4j-json-server/src/main/java/org/nd4j/remote/serving/ServingProcessor.java b/contrib/attic/nd4j-remote/nd4j-json-server/src/main/java/org/nd4j/remote/serving/ServingProcessor.java deleted file mode 100644 index a0fe30965..000000000 --- a/contrib/attic/nd4j-remote/nd4j-json-server/src/main/java/org/nd4j/remote/serving/ServingProcessor.java +++ /dev/null @@ -1,32 +0,0 @@ -/* - * ****************************************************************************** - * * Copyright (c) 2021 Deeplearning4j Contributors - * * - * * This program and the accompanying materials 
are made available under the - * * terms of the Apache License, Version 2.0 which is available at - * * https://www.apache.org/licenses/LICENSE-2.0. - * * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * * License for the specific language governing permissions and limitations - * * under the License. - * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ - -package org.nd4j.remote.serving; - -public class ServingProcessor { - - public String listEndpoints() { - String retVal = "/v1/ \n/v1/serving/"; - return retVal; - } - - public String processModel(String body) { - String response = null; //"Not implemented"; - return response; - } -} diff --git a/contrib/attic/nd4j-remote/nd4j-json-server/src/test/java/org/nd4j/remote/SameDiffJsonModelServerTest.java b/contrib/attic/nd4j-remote/nd4j-json-server/src/test/java/org/nd4j/remote/SameDiffJsonModelServerTest.java deleted file mode 100644 index a95521723..000000000 --- a/contrib/attic/nd4j-remote/nd4j-json-server/src/test/java/org/nd4j/remote/SameDiffJsonModelServerTest.java +++ /dev/null @@ -1,274 +0,0 @@ -/* - * ****************************************************************************** - * * Copyright (c) 2021 Deeplearning4j Contributors - * * - * * This program and the accompanying materials are made available under the - * * terms of the Apache License, Version 2.0 which is available at - * * https://www.apache.org/licenses/LICENSE-2.0. - * * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. 
See the - * * License for the specific language governing permissions and limitations - * * under the License. - * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ - -package org.nd4j.remote; - -import lombok.extern.slf4j.Slf4j; -import lombok.val; -import org.junit.Test; -import org.nd4j.common.tests.BaseND4JTest; -import org.nd4j.autodiff.samediff.SameDiff; -import org.nd4j.linalg.api.buffer.DataType; -import org.nd4j.remote.clients.JsonRemoteInference; -import org.nd4j.remote.clients.serde.JsonDeserializer; -import org.nd4j.remote.helpers.House; -import org.nd4j.remote.helpers.HouseToPredictedPriceAdapter; -import org.nd4j.remote.helpers.PredictedPrice; -import org.nd4j.remote.clients.serde.impl.FloatArraySerde; - -import java.io.IOException; -import java.util.concurrent.ExecutionException; -import java.util.concurrent.Future; - -import static org.junit.Assert.*; - -@Slf4j -public class SameDiffJsonModelServerTest extends BaseND4JTest { - - @Test - public void basicServingTest_1() throws Exception { - val sd = SameDiff.create(); - val sdVariable = sd.placeHolder("input", DataType.INT, 4); - val result = sdVariable.add(1.0); - val total = result.mean("total", Integer.MAX_VALUE); - - val server = new SameDiffJsonModelServer.Builder() - .outputSerializer(new PredictedPrice.PredictedPriceSerializer()) - .inputDeserializer(new House.HouseDeserializer()) - .inferenceAdapter(new HouseToPredictedPriceAdapter()) - .orderedInputNodes(new String[]{"input"}) - .orderedOutputNodes(new String[]{"total"}) - .sdModel(sd) - .port(18080) - .build(); - - server.start(); - - val client = JsonRemoteInference.builder() - .inputSerializer(new House.HouseSerializer()) - .outputDeserializer(new PredictedPrice.PredictedPriceDeserializer()) - .endpointAddress("http://localhost:18080/v1/serving") - .build(); - - int district = 2; - House house = 
House.builder().area(100).bathrooms(2).bedrooms(3).district(district).build(); - - // warmup - PredictedPrice price = client.predict(house); - - val timeStart = System.currentTimeMillis(); - price = client.predict(house); - val timeStop = System.currentTimeMillis(); - - log.info("Time spent: {} ms", timeStop - timeStart); - - assertNotNull(price); - assertEquals((float) district + 1.0f, price.getPrice(), 1e-5); - - server.stop(); - } - - @Test - public void testDeserialization_1() { - String request = "{\"bedrooms\":3,\"area\":100,\"district\":2,\"bathrooms\":2}"; - val deserializer = new House.HouseDeserializer(); - val result = deserializer.deserialize(request); - assertEquals(2, result.getDistrict()); - assertEquals(100, result.getArea()); - assertEquals(2, result.getBathrooms()); - assertEquals(3, result.getBedrooms()); - - } - - @Test - public void testDeserialization_2() { - String request = "{\"price\":1}"; - val deserializer = new PredictedPrice.PredictedPriceDeserializer(); - val result = deserializer.deserialize(request); - assertEquals(1.0, result.getPrice(), 1e-4); - } - - @Test - public void testDeserialization_3() { - float[] data = {0.0f, 0.1f, 0.2f}; - val serialized = new FloatArraySerde().serialize(data); - val deserialized = new FloatArraySerde().deserialize(serialized); - assertArrayEquals(data, deserialized, 1e-5f); - } - - @Test(expected = NullPointerException.class) - public void negativeServingTest_1() throws Exception { - val sd = SameDiff.create(); - val sdVariable = sd.placeHolder("input", DataType.INT, 4); - val result = sdVariable.add(1.0); - val total = result.mean("total", Integer.MAX_VALUE); - - val server = new SameDiffJsonModelServer.Builder() - .outputSerializer(new PredictedPrice.PredictedPriceSerializer()) - .inputDeserializer(null) - .sdModel(sd) - .port(18080) - .build(); - } - - @Test(expected = NullPointerException.class) - public void negativeServingTest_2() throws Exception { - val sd = SameDiff.create(); - val sdVariable 
= sd.placeHolder("input", DataType.INT, 4); - val result = sdVariable.add(1.0); - val total = result.mean("total", Integer.MAX_VALUE); - - val server = new SameDiffJsonModelServer.Builder() - .outputSerializer(new PredictedPrice.PredictedPriceSerializer()) - .inputDeserializer(new House.HouseDeserializer()) - .inferenceAdapter(new HouseToPredictedPriceAdapter()) - .sdModel(sd) - .port(18080) - .build(); - - } - - @Test(expected = IOException.class) - public void negativeServingTest_3() throws Exception { - val sd = SameDiff.create(); - val sdVariable = sd.placeHolder("input", DataType.INT, 4); - val result = sdVariable.add(1.0); - val total = result.mean("total", Integer.MAX_VALUE); - - val server = new SameDiffJsonModelServer.Builder() - .outputSerializer(new PredictedPrice.PredictedPriceSerializer()) - .inputDeserializer(new House.HouseDeserializer()) - .inferenceAdapter(new HouseToPredictedPriceAdapter()) - .orderedInputNodes(new String[]{"input"}) - .orderedOutputNodes(new String[]{"total"}) - .sdModel(sd) - .port(18080) - .build(); - - server.start(); - - val client = JsonRemoteInference.builder() - .inputSerializer(new House.HouseSerializer()) - .outputDeserializer(new JsonDeserializer() { - @Override - public PredictedPrice deserialize(String json) { - return null; - } - }) - .endpointAddress("http://localhost:18080/v1/serving") - .build(); - - int district = 2; - House house = House.builder().area(100).bathrooms(2).bedrooms(3).district(district).build(); - - // warmup - PredictedPrice price = client.predict(house); - - server.stop(); - } - - @Test - public void asyncServingTest() throws Exception { - val sd = SameDiff.create(); - val sdVariable = sd.placeHolder("input", DataType.INT, 4); - val result = sdVariable.add(1.0); - val total = result.mean("total", Integer.MAX_VALUE); - - val server = new SameDiffJsonModelServer.Builder() - .outputSerializer(new PredictedPrice.PredictedPriceSerializer()) - .inputDeserializer(new House.HouseDeserializer()) - 
.inferenceAdapter(new HouseToPredictedPriceAdapter()) - .orderedInputNodes(new String[]{"input"}) - .orderedOutputNodes(new String[]{"total"}) - .sdModel(sd) - .port(18080) - .build(); - - server.start(); - - val client = JsonRemoteInference.builder() - .inputSerializer(new House.HouseSerializer()) - .outputDeserializer(new PredictedPrice.PredictedPriceDeserializer()) - .endpointAddress("http://localhost:18080/v1/serving") - .build(); - - int district = 2; - House house = House.builder().area(100).bathrooms(2).bedrooms(3).district(district).build(); - - val timeStart = System.currentTimeMillis(); - Future price = client.predictAsync(house); - assertNotNull(price); - assertEquals((float) district + 1.0f, price.get().getPrice(), 1e-5); - val timeStop = System.currentTimeMillis(); - - log.info("Time spent: {} ms", timeStop - timeStart); - - - server.stop(); - } - - @Test - public void negativeAsyncTest() throws Exception { - val sd = SameDiff.create(); - val sdVariable = sd.placeHolder("input", DataType.INT, 4); - val result = sdVariable.add(1.0); - val total = result.mean("total", Integer.MAX_VALUE); - - val server = new SameDiffJsonModelServer.Builder() - .outputSerializer(new PredictedPrice.PredictedPriceSerializer()) - .inputDeserializer(new House.HouseDeserializer()) - .inferenceAdapter(new HouseToPredictedPriceAdapter()) - .orderedInputNodes(new String[]{"input"}) - .orderedOutputNodes(new String[]{"total"}) - .sdModel(sd) - .port(18080) - .build(); - - server.start(); - - // Fake deserializer to test failure - val client = JsonRemoteInference.builder() - .inputSerializer(new House.HouseSerializer()) - .outputDeserializer(new JsonDeserializer() { - @Override - public PredictedPrice deserialize(String json) { - return null; - } - }) - .endpointAddress("http://localhost:18080/v1/serving") - .build(); - - int district = 2; - House house = House.builder().area(100).bathrooms(2).bedrooms(3).district(district).build(); - - val timeStart = System.currentTimeMillis(); - 
try { - Future price = client.predictAsync(house); - assertNotNull(price); - assertEquals((float) district + 1.0f, price.get().getPrice(), 1e-5); - val timeStop = System.currentTimeMillis(); - - log.info("Time spent: {} ms", timeStop - timeStart); - } catch (ExecutionException e) { - assertTrue(e.getMessage().contains("Deserialization failed")); - } - - server.stop(); - } - -} \ No newline at end of file diff --git a/contrib/attic/nd4j-remote/nd4j-json-server/src/test/java/org/nd4j/remote/SameDiffServletTest.java b/contrib/attic/nd4j-remote/nd4j-json-server/src/test/java/org/nd4j/remote/SameDiffServletTest.java deleted file mode 100644 index 8f130e1d8..000000000 --- a/contrib/attic/nd4j-remote/nd4j-json-server/src/test/java/org/nd4j/remote/SameDiffServletTest.java +++ /dev/null @@ -1,135 +0,0 @@ -/* - * ****************************************************************************** - * * Copyright (c) 2021 Deeplearning4j Contributors - * * - * * This program and the accompanying materials are made available under the - * * terms of the Apache License, Version 2.0 which is available at - * * https://www.apache.org/licenses/LICENSE-2.0. - * * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * * License for the specific language governing permissions and limitations - * * under the License. 
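The `negativeAsyncTest` above relies on a failure inside the async pipeline surfacing as an `ExecutionException` whose message mentions "Deserialization failed". A minimal sketch of that mechanism with `CompletableFuture` (the method and class names are illustrative, not the real client API):

```java
import java.util.concurrent.CompletableFuture;

// Sketch: a broken deserializer inside an async call, as in negativeAsyncTest.
final class AsyncClientSketch {
    // Stand-in for a deserializer that cannot handle the server's response.
    static Double deserialize(String json) {
        return null;
    }

    static CompletableFuture<Double> predictAsync(String json) {
        return CompletableFuture.supplyAsync(() -> {
            Double parsed = deserialize(json);
            if (parsed == null)
                throw new IllegalStateException("Deserialization failed");
            return parsed;
        });
    }
}
```

Calling `get()` on the returned future rethrows the cause wrapped in `java.util.concurrent.ExecutionException`, whose message includes the cause's text, which is what the test's `assertTrue(e.getMessage().contains("Deserialization failed"))` checks.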
- * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ - -package org.nd4j.remote; - -import lombok.val; -import org.apache.http.client.methods.HttpGet; -import org.apache.http.client.methods.HttpPost; -import org.apache.http.impl.client.HttpClientBuilder; -import org.junit.After; -import org.junit.Before; -import org.junit.Test; -import org.nd4j.common.tests.BaseND4JTest; -import org.nd4j.autodiff.samediff.SameDiff; -import org.nd4j.linalg.api.ndarray.INDArray; -import org.nd4j.linalg.dataset.MultiDataSet; -import org.nd4j.remote.clients.serde.JsonDeserializer; -import org.nd4j.remote.clients.serde.JsonSerializer; -import org.nd4j.adapters.InferenceAdapter; - -import java.io.IOException; - -import static org.junit.Assert.assertEquals; - -public class SameDiffServletTest extends BaseND4JTest { - - private SameDiffJsonModelServer server; - - @Before - public void setUp() throws Exception { - server = new SameDiffJsonModelServer.Builder() - .sdModel(SameDiff.create()) - .port(8080) - .inferenceAdapter(new InferenceAdapter() { - @Override - public MultiDataSet apply(String input) { - return null; - } - - @Override - public String apply(INDArray... 
nnOutput) { - return null; - } - }) - .outputSerializer(new JsonSerializer() { - @Override - public String serialize(String o) { - return ""; - } - }) - .inputDeserializer(new JsonDeserializer() { - @Override - public String deserialize(String json) { - return ""; - } - }) - .orderedOutputNodes(new String[]{"output"}) - .build(); - - server.start(); - //server.join(); - } - - @After - public void tearDown() throws Exception { - server.stop(); - } - - @Test - public void getEndpoints() throws IOException { - val request = new HttpGet( "http://localhost:8080/v1" ); - request.setHeader("Content-type", "application/json"); - - val response = HttpClientBuilder.create().build().execute( request ); - assertEquals(200, response.getStatusLine().getStatusCode()); - } - - @Test - public void testContentTypeGet() throws IOException { - val request = new HttpGet( "http://localhost:8080/v1" ); - request.setHeader("Content-type", "text/plain"); - - val response = HttpClientBuilder.create().build().execute( request ); - assertEquals(415, response.getStatusLine().getStatusCode()); - } - - @Test - public void testContentTypePost() throws Exception { - val request = new HttpPost("http://localhost:8080/v1/serving"); - request.setHeader("Content-type", "text/plain"); - val response = HttpClientBuilder.create().build().execute( request ); - assertEquals(415, response.getStatusLine().getStatusCode()); - } - - @Test - public void postForServing() throws Exception { - val request = new HttpPost("http://localhost:8080/v1/serving"); - request.setHeader("Content-type", "application/json"); - val response = HttpClientBuilder.create().build().execute( request ); - assertEquals(500, response.getStatusLine().getStatusCode()); - } - - @Test - public void testNotFoundPost() throws Exception { - val request = new HttpPost("http://localhost:8080/v1/serving/some"); - request.setHeader("Content-type", "application/json"); - val response = HttpClientBuilder.create().build().execute( request ); - 
assertEquals(404, response.getStatusLine().getStatusCode()); - } - - @Test - public void testNotFoundGet() throws Exception { - val requestGet = new HttpGet( "http://localhost:8080/v1/not_found" ); - requestGet.setHeader("Content-type", "application/json"); - - val responseGet = HttpClientBuilder.create().build().execute( requestGet ); - assertEquals(404, responseGet.getStatusLine().getStatusCode()); - } - -} diff --git a/contrib/attic/nd4j-remote/nd4j-json-server/src/test/java/org/nd4j/remote/helpers/House.java b/contrib/attic/nd4j-remote/nd4j-json-server/src/test/java/org/nd4j/remote/helpers/House.java deleted file mode 100644 index 229d0c2d4..000000000 --- a/contrib/attic/nd4j-remote/nd4j-json-server/src/test/java/org/nd4j/remote/helpers/House.java +++ /dev/null @@ -1,50 +0,0 @@ -/* - * ****************************************************************************** - * * Copyright (c) 2021 Deeplearning4j Contributors - * * - * * This program and the accompanying materials are made available under the - * * terms of the Apache License, Version 2.0 which is available at - * * https://www.apache.org/licenses/LICENSE-2.0. - * * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * * License for the specific language governing permissions and limitations - * * under the License. 
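The servlet tests above pin down a simple status-code contract: unknown paths return 404, known paths with a non-JSON content type return 415, and a well-formed JSON GET of `/v1` returns 200. That routing rule can be sketched as a pure function (this covers only path and content-type dispatch, not the 500 from an empty serving payload):

```java
// Sketch of the status-code contract asserted by SameDiffServletTest.
final class RoutingSketch {
    static final String LISTING = "/v1";
    static final String SERVING = "/v1/serving";

    static int status(String path, String contentType) {
        if (!LISTING.equals(path) && !SERVING.equals(path))
            return 404;                              // endpoint not found
        if (!"application/json".equals(contentType))
            return 415;                              // unsupported media type
        return 200;
    }
}
```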
- * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ - -package org.nd4j.remote.helpers; - -import com.google.gson.Gson; -import lombok.*; -import org.nd4j.remote.clients.serde.JsonDeserializer; -import org.nd4j.remote.clients.serde.JsonSerializer; - -@Data -@Builder -@AllArgsConstructor -@NoArgsConstructor -public class House { - private int district; - private int bedrooms; - private int bathrooms; - private int area; - - - public static class HouseSerializer implements JsonSerializer { - @Override - public String serialize(@NonNull House o) { - return new Gson().toJson(o); - } - } - - public static class HouseDeserializer implements JsonDeserializer { - @Override - public House deserialize(@NonNull String json) { - return new Gson().fromJson(json, House.class); - } - } -} diff --git a/contrib/attic/nd4j-remote/nd4j-json-server/src/test/java/org/nd4j/remote/helpers/HouseToPredictedPriceAdapter.java b/contrib/attic/nd4j-remote/nd4j-json-server/src/test/java/org/nd4j/remote/helpers/HouseToPredictedPriceAdapter.java deleted file mode 100644 index 4b3f14826..000000000 --- a/contrib/attic/nd4j-remote/nd4j-json-server/src/test/java/org/nd4j/remote/helpers/HouseToPredictedPriceAdapter.java +++ /dev/null @@ -1,42 +0,0 @@ -/* - * ****************************************************************************** - * * Copyright (c) 2021 Deeplearning4j Contributors - * * - * * This program and the accompanying materials are made available under the - * * terms of the Apache License, Version 2.0 which is available at - * * https://www.apache.org/licenses/LICENSE-2.0. - * * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. 
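The `House` helper above delegates JSON serde to Gson. For illustration, the same round trip can be hand-rolled with only the JDK; the field order in `toJson` and the regex-based parsing are simplifications, not Gson's behavior:

```java
import java.util.regex.Matcher;
import java.util.regex.Pattern;

// Dependency-free stand-in for the Gson-based House serializer/deserializer.
final class HouseSketch {
    final int district, bedrooms, bathrooms, area;

    HouseSketch(int district, int bedrooms, int bathrooms, int area) {
        this.district = district;
        this.bedrooms = bedrooms;
        this.bathrooms = bathrooms;
        this.area = area;
    }

    String toJson() {
        return String.format("{\"district\":%d,\"bedrooms\":%d,\"bathrooms\":%d,\"area\":%d}",
                district, bedrooms, bathrooms, area);
    }

    static int field(String json, String name) {
        Matcher m = Pattern.compile("\"" + name + "\":(-?\\d+)").matcher(json);
        if (!m.find()) throw new IllegalArgumentException("missing field: " + name);
        return Integer.parseInt(m.group(1));
    }

    static HouseSketch fromJson(String json) {
        return new HouseSketch(field(json, "district"), field(json, "bedrooms"),
                field(json, "bathrooms"), field(json, "area"));
    }
}
```

Parsing by field name rather than position means the request string from `testDeserialization_1`, which lists the fields in a different order, deserializes the same way.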
See the - * * License for the specific language governing permissions and limitations - * * under the License. - * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ - -package org.nd4j.remote.helpers; - -import lombok.NonNull; -import lombok.extern.slf4j.Slf4j; -import org.nd4j.linalg.api.buffer.DataType; -import org.nd4j.linalg.api.ndarray.INDArray; -import org.nd4j.linalg.dataset.MultiDataSet; -import org.nd4j.linalg.factory.Nd4j; -import org.nd4j.adapters.InferenceAdapter; - -@Slf4j -public class HouseToPredictedPriceAdapter implements InferenceAdapter { - - @Override - public MultiDataSet apply(@NonNull House input) { - // we just create vector array with shape[4] and assign it's value to the district value - return new MultiDataSet(Nd4j.create(DataType.FLOAT, 4).assign(input.getDistrict()), null); - } - - @Override - public PredictedPrice apply(INDArray... nnOutput) { - return new PredictedPrice(nnOutput[0].getFloat(0)); - } -} diff --git a/contrib/attic/nd4j-remote/nd4j-json-server/src/test/java/org/nd4j/remote/helpers/PredictedPrice.java b/contrib/attic/nd4j-remote/nd4j-json-server/src/test/java/org/nd4j/remote/helpers/PredictedPrice.java deleted file mode 100644 index aa60ab79d..000000000 --- a/contrib/attic/nd4j-remote/nd4j-json-server/src/test/java/org/nd4j/remote/helpers/PredictedPrice.java +++ /dev/null @@ -1,48 +0,0 @@ -/* - * ****************************************************************************** - * * Copyright (c) 2021 Deeplearning4j Contributors - * * - * * This program and the accompanying materials are made available under the - * * terms of the Apache License, Version 2.0 which is available at - * * https://www.apache.org/licenses/LICENSE-2.0. 
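The adapter above builds a length-4 feature vector filled with the district value, and the test model adds 1.0 and takes the mean, so the predicted price works out to `district + 1` — exactly what the serving tests assert. The same arithmetic with plain arrays:

```java
import java.util.Arrays;

// Plain-array sketch of the adapter + model pipeline exercised in the tests.
final class PipelineSketch {
    // Mirrors HouseToPredictedPriceAdapter.apply(House): shape-[4] vector of the district value.
    static float[] houseToFeatures(int district) {
        float[] features = new float[4];
        Arrays.fill(features, district);
        return features;
    }

    // Stand-in for the SameDiff graph in the tests: add(1.0) then mean over all elements.
    static float model(float[] features) {
        float sum = 0f;
        for (float v : features) sum += v + 1.0f;
        return sum / features.length;
    }
}
```

For `district = 2` this yields 3.0, matching `assertEquals((float) district + 1.0f, price.getPrice(), 1e-5)` in the serving tests.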
- * * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * * License for the specific language governing permissions and limitations - * * under the License. - * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ - -package org.nd4j.remote.helpers; - -import com.google.gson.Gson; -import lombok.AllArgsConstructor; -import lombok.Data; -import lombok.NoArgsConstructor; -import lombok.NonNull; -import org.nd4j.remote.clients.serde.JsonDeserializer; -import org.nd4j.remote.clients.serde.JsonSerializer; - -@Data -@AllArgsConstructor -@NoArgsConstructor -public class PredictedPrice { - private float price; - - public static class PredictedPriceSerializer implements JsonSerializer { - @Override - public String serialize(@NonNull PredictedPrice o) { - return new Gson().toJson(o); - } - } - - public static class PredictedPriceDeserializer implements JsonDeserializer { - @Override - public PredictedPrice deserialize(@NonNull String json) { - return new Gson().fromJson(json, PredictedPrice.class); - } - } -} diff --git a/contrib/attic/nd4j-remote/nd4j-json-server/src/test/java/org/nd4j/remote/serde/BasicSerdeTests.java b/contrib/attic/nd4j-remote/nd4j-json-server/src/test/java/org/nd4j/remote/serde/BasicSerdeTests.java deleted file mode 100644 index c2ce409fc..000000000 --- a/contrib/attic/nd4j-remote/nd4j-json-server/src/test/java/org/nd4j/remote/serde/BasicSerdeTests.java +++ /dev/null @@ -1,109 +0,0 @@ -/* - * ****************************************************************************** - * * Copyright (c) 2021 Deeplearning4j Contributors - * * - * * This program and the accompanying materials are made available under the - * * terms of the Apache License, Version 2.0 which is available at - * * 
https://www.apache.org/licenses/LICENSE-2.0. - * * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * * License for the specific language governing permissions and limitations - * * under the License. - * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ - -package org.nd4j.remote.serde; - -import lombok.val; -import org.junit.Test; -import org.nd4j.common.tests.BaseND4JTest; -import org.nd4j.remote.clients.serde.impl.*; - -import static org.junit.Assert.assertArrayEquals; -import static org.junit.Assert.assertEquals; - -public class BasicSerdeTests extends BaseND4JTest { - private final static DoubleArraySerde doubleArraySerde = new DoubleArraySerde(); - private final static FloatArraySerde floatArraySerde = new FloatArraySerde(); - private final static StringSerde stringSerde = new StringSerde(); - private final static IntegerSerde integerSerde = new IntegerSerde(); - private final static FloatSerde floatSerde = new FloatSerde(); - private final static DoubleSerde doubleSerde = new DoubleSerde(); - private final static BooleanSerde booleanSerde = new BooleanSerde(); - - @Test - public void testStringSerde_1() { - val jvmString = "String with { strange } elements"; - - val serialized = stringSerde.serialize(jvmString); - val deserialized = stringSerde.deserialize(serialized); - - assertEquals(jvmString, deserialized); - } - - @Test - public void testFloatArraySerDe_1() { - val jvmArray = new float[] {1.0f, 2.0f, 3.0f, 4.0f, 5.0f}; - - val serialized = floatArraySerde.serialize(jvmArray); - val deserialized = floatArraySerde.deserialize(serialized); - - assertArrayEquals(jvmArray, deserialized, 1e-5f); - } - - @Test - public void testDoubleArraySerDe_1() { - val jvmArray = new double[] {1.0, 2.0, 3.0, 
4.0, 5.0}; - - val serialized = doubleArraySerde.serialize(jvmArray); - val deserialized = doubleArraySerde.deserialize(serialized); - - assertArrayEquals(jvmArray, deserialized, 1e-5); - } - - @Test - public void testFloatSerde_1() { - val f = 119.f; - - val serialized = floatSerde.serialize(f); - val deserialized = floatSerde.deserialize(serialized); - - assertEquals(f, deserialized, 1e-5f); - } - - @Test - public void testDoubleSerde_1() { - val d = 119.; - - val serialized = doubleSerde.serialize(d); - val deserialized = doubleSerde.deserialize(serialized); - - assertEquals(d, deserialized, 1e-5); - } - - @Test - public void testIntegerSerde_1() { - val f = 119; - - val serialized = integerSerde.serialize(f); - val deserialized = integerSerde.deserialize(serialized); - - - assertEquals(f, deserialized.intValue()); - } - - @Test - public void testBooleanSerde_1() { - val f = true; - - val serialized = booleanSerde.serialize(f); - val deserialized = booleanSerde.deserialize(serialized); - - - assertEquals(f, deserialized); - } -} diff --git a/contrib/attic/nd4j-remote/nd4j-json-server/src/test/resources/logback.xml b/contrib/attic/nd4j-remote/nd4j-json-server/src/test/resources/logback.xml deleted file mode 100644 index 27e08c0d5..000000000 --- a/contrib/attic/nd4j-remote/nd4j-json-server/src/test/resources/logback.xml +++ /dev/null @@ -1,52 +0,0 @@ - - - - - - - - logs/application.log - - %logger{15} - %message%n%xException{5} - - - - - - - %logger{15} - %message%n%xException{5} - - - - - - - - - - - - - - - - - \ No newline at end of file diff --git a/contrib/attic/nd4j-remote/pom.xml b/contrib/attic/nd4j-remote/pom.xml deleted file mode 100644 index ce0e78054..000000000 --- a/contrib/attic/nd4j-remote/pom.xml +++ /dev/null @@ -1,58 +0,0 @@ - - - - - - 4.0.0 - - - org.nd4j - nd4j - 1.0.0-SNAPSHOT - - - nd4j-remote - pom - - nd4j-remote - - - nd4j-json-client - nd4j-grpc-client - nd4j-json-server - - - - - junit - junit - test - - - - - - testresources - - - 
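The serde tests above all follow the same round-trip pattern: serialize a JVM value, deserialize it, and compare within a tolerance. As an illustration, a float-array round trip via a length-prefixed binary layout — note the actual `FloatArraySerde` wire format is not shown in this diff, so this `ByteBuffer` layout is an assumption, not the library's format:

```java
import java.nio.ByteBuffer;

// Assumed length-prefixed binary layout for a float[] round trip (illustrative only).
final class FloatArraySerdeSketch {
    static byte[] serialize(float[] data) {
        ByteBuffer buf = ByteBuffer.allocate(4 + 4 * data.length);
        buf.putInt(data.length);                 // length prefix
        for (float f : data) buf.putFloat(f);    // payload
        return buf.array();
    }

    static float[] deserialize(byte[] bytes) {
        ByteBuffer buf = ByteBuffer.wrap(bytes);
        float[] out = new float[buf.getInt()];
        for (int i = 0; i < out.length; i++) out[i] = buf.getFloat();
        return out;
    }
}
```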
diff --git a/contrib/attic/nd4j-uberjar/pom.xml b/contrib/attic/nd4j-uberjar/pom.xml deleted file mode 100644 index 9c1cd0e38..000000000 --- a/contrib/attic/nd4j-uberjar/pom.xml +++ /dev/null @@ -1,304 +0,0 @@ - - - - - - 4.0.0 - - - org.nd4j - nd4j - 1.0.0-SNAPSHOT - - - nd4j-uberjar - - - - - org.apache.maven.plugins - maven-shade-plugin - ${maven-shade-plugin.version} - - true - - - *:* - - org/datanucleus/** - META-INF/*.SF - META-INF/*.DSA - META-INF/*.RSA - - - - - - void - void - - - - - - - - false - - - - none - - shade - - - - - - org.apache.maven.plugins - maven-enforcer-plugin - - - package - enforce-choice-of-nd4j-backend - - enforce - - - ${skipBackendChoice} - - - native,cuda,cuda-snapshots,native-snapshots - false - - - true - - - - - - org.apache.maven.plugins - maven-jar-plugin - - true - - - - empty-javadoc-jar - package - - jar - - - javadoc - ${basedir}/javadoc - - - - empty-sources-jar - package - - jar - - - sources - ${basedir}/src - - - - - - - - - - - default - - true - - - true - - - - uberjar - - - - org.apache.maven.plugins - maven-shade-plugin - - - package - - - - - com.lewisd - lint-maven-plugin - - false - - - - - - - org.nd4j - jackson - ${project.version} - - - org.nd4j - nd4j-jdbc-api - ${project.version} - - - org.nd4j - nd4j-jdbc-mysql - ${project.version} - - - org.nd4j - nd4j-aeron - ${project.version} - - - org.nd4j - nd4j-kryo_2.11 - ${project.version} - - - org.nd4j - nd4j-common - ${project.version} - - - org.nd4j - nd4j-api - - - org.nd4j - nd4j-parameter-server - ${project.version} - - - org.nd4j - nd4j-parameter-server-client - ${project.version} - - - org.nd4j - nd4j-parameter-server-model - ${project.version} - - - org.nd4j - nd4j-parameter-server-status_2.11 - ${project.version} - - - org.nd4j - nd4j-parameter-server-rocksdb-storage - ${project.version} - - - org.nd4j - nd4j-parameter-server-node_2.11 - ${project.version} - - - - false - - - - native-snapshots - - - org.nd4j - nd4j-native - ${project.version} - - - 
org.nd4j - nd4j-native-api - ${project.version} - - - - - cuda-snapshots - - - org.nd4j - nd4j-cuda-11.0 - ${project.version} - - - - - native - - - org.nd4j - nd4j-native - ${project.version} - - - org.nd4j - nd4j-native-platform - ${project.version} - - - org.nd4j - nd4j-native-api - ${project.version} - - - - - cuda - - - org.nd4j - nd4j-cuda-11.0 - ${project.version} - - - org.nd4j - nd4j-cuda-11.0-platform - ${project.version} - - - - - testresources - - - diff --git a/contrib/attic/nd4s/.gitignore b/contrib/attic/nd4s/.gitignore deleted file mode 100644 index 0a11795f9..000000000 --- a/contrib/attic/nd4s/.gitignore +++ /dev/null @@ -1,21 +0,0 @@ -*.class -*.log -*.iml -.DS_Store - -# sbt specific -.cache -.history -.lib/ -dist/* -target/ -lib_managed/ -src_managed/ -project/boot/ -project/plugins/project/ - -# Scala-IDE specific -.scala_dependencies -.worksheet -.idea/ -.idea_modules/ diff --git a/contrib/attic/nd4s/.scalafmt.conf b/contrib/attic/nd4s/.scalafmt.conf deleted file mode 100644 index 00e9ced59..000000000 --- a/contrib/attic/nd4s/.scalafmt.conf +++ /dev/null @@ -1,9 +0,0 @@ -align = some -danglingParentheses = true -indentOperator = spray -maxColumn = 120 -lineEndings = unix -project.excludeFilters = [".*\\.sbt"] -rewrite.rules = [AsciiSortImports, RedundantBraces, RedundantParens] -spaces.inImportCurlyBraces = true -unindentTopLevelOperators = true diff --git a/contrib/attic/nd4s/README.md b/contrib/attic/nd4s/README.md deleted file mode 100644 index f862b2e69..000000000 --- a/contrib/attic/nd4s/README.md +++ /dev/null @@ -1,93 +0,0 @@ -# ND4S: Scala bindings for ND4J - -[![Join the chat at https://gitter.im/deeplearning4j/deeplearning4j](https://badges.gitter.im/Join%20Chat.svg)](https://gitter.im/deeplearning4j/deeplearning4j?utm_source=badge&utm_medium=badge&utm_campaign=pr-badge&utm_content=badge) - -ND4S is open-source Scala bindings for [ND4J](https://github.com/eclipse/deeplearning4j/tree/master/nd4j). Released under an Apache 2.0 license. 
- -# Main Features -* NDArray manipulation syntax sugar with safer types. -* NDArray slicing syntax, similar to NumPy. - -# Installation - -## Install via Maven -ND4S is already included in official Maven repositories. - -With IntelliJ, incorporating ND4S is easy: just create a new Scala project, go to "Project Settings"/Libraries, add "From Maven...", and search for nd4s. - -As an alternative, one may simply add the lines below to `build.sbt` and re-build the project. - -```scala -val nd4jVersion = "1.0.0-alpha" - -libraryDependencies += "org.nd4j" % "nd4j-native-platform" % nd4jVersion -libraryDependencies += "org.nd4j" %% "nd4s" % nd4jVersion -``` - -One may want to check our [maven repository page](https://mvnrepository.com/artifact/org.nd4j/nd4s_2.11) and replace `1.0.0-alpha` with the latest version. - -No need for git-cloning & compiling! - -## Clone from the GitHub Repo -ND4S is actively developed. You can clone the repository, compile it, and reference it in your project. - -Clone the repository: - -``` -$ git clone https://github.com/eclipse/deeplearning4j.git -``` - -Compile the project: - -``` -$ cd nd4s -$ sbt +publish-local -``` - -## Try ND4S in REPL -The easiest way to play around with ND4S is to clone this repository and run the following commands. - -``` -$ cd nd4s -$ sbt test:console -``` - -This starts a REPL that automatically imports `org.nd4s.Implicits._` and `org.nd4j.linalg.factory.Nd4j`. It uses the jblas backend by default. 
- -```scala -scala> val arr = (1 to 9).asNDArray(3,3) -arr: org.nd4j.linalg.api.ndarray.INDArray = -[[1.00,2.00,3.00] - [4.00,5.00,6.00] - [7.00,8.00,9.00]] - -scala> val sub = arr(0->2,1->3) -sub: org.nd4j.linalg.api.ndarray.INDArray = -[[2.00,3.00] - [5.00,6.00]] -``` - -# CheatSheet(WIP) - -| ND4S syntax | Equivalent NumPy syntax | Result | -|--------------------------------------------|---------------------------------------------|----------------------------------------------------------------| -| Array(Array(1,2,3),Array(4,5,6)).toNDArray | np.array([[1, 2 , 3], [4, 5, 6]]) | [[1.0, 2.0, 3.0] [4.0, 5.0, 6.0]] | -| val arr = (1 to 9).asNDArray(3,3) | arr = np.arange(1,10).reshape(3,3) | [[1.0, 2.0, 3.0] [4.0, 5.0, 6.0] ,[7.0, 8.0, 9.0]] | -| arr(0,0) | arr[0,0] | 1.0 | -| arr(0,->) | arr[0,:] | [1.0, 2.0, 3.0] | -| arr(--->) | arr[...] | [[1.0, 2.0, 3.0] [4.0, 5.0, 6.0] ,[7.0, 8.0, 9.0]] | -| arr(0 -> 3 by 2, ->) | arr[0:3:2,:] | [[1.0, 2.0, 3.0] [7.0, 8.0, 9.0]] | -| arr(0 to 2 by 2, ->) | arr[0:3:2,:] | [[1.0, 2.0, 3.0] [7.0, 8.0, 9.0]] | -| arr.filter(_ > 3) | np.where(arr > 3, arr, 0) | [[0.0, 0.0, 0.0] [4.0, 5.0, 6.0] ,[7.0, 8.0, 9.0]] | -| arr.map(_ % 3) | | [[1.0, 2.0, 0.0] [1.0, 2.0, 0.0] ,[1.0, 2.0, 0.0]] | -| arr.filterBit(_ < 4) | | [[1.0, 1.0, 1.0] [0.0, 0.0, 0.0] ,[0.0, 0.0, 0.0]] | -| arr + arr | arr + arr | [[2.0, 4.0, 6.0] [8.0, 10.0, 12.0] ,[14.0, 16.0, 18.0]] | -| arr * arr | arr * arr | [[1.0, 4.0, 9.0] [16.0, 25.0, 36.0] ,[49.0, 64.0, 81.0]] | -| arr dot arr | np.dot(arr, arr) | [[30.0, 36.0, 42.0] [66.0, 81.0, 96.0] ,[102.0, 126.0, 150.0]] | -| arr.sumT | np.sum(arr) | 45.0 //returns Double value | -| val comp = Array(1 + i, 1 + 2 * i).toNDArray | comp = np.array([1 + 1j, 1 + 2j]) | [1.0 + 1.0i ,1.0 + 2.0i] | -| comp.sumT | np.sum(comp) | 2.0 + 3.0i //returns IComplexNumber value | -| for(row <- arr.rowP if row.get(0) > 1) yield row*2 | | [[8.00,10.00,12.00] [14.00,16.00,18.00]] | -| val tensor = (1 to 8).asNDArray(2,2,2) | tensor = 
np.arange(1,9).reshape(2,2,2) | [[[1.00,2.00] [3.00,4.00]] [[5.00,6.00] [7.00,8.00]]] | -| for(slice <- tensor.sliceP if slice.get(0) > 1) yield slice*2 | |[[[10.00,12.00][14.00,16.00]]] | -|arr(0 -> 3 by 2, ->) = 0 | | [[0.00,0.00,0.00] [4.00,5.00,6.00] [0.00,0.00,0.00]] | diff --git a/contrib/attic/nd4s/build.sbt b/contrib/attic/nd4s/build.sbt deleted file mode 100644 index d523b754e..000000000 --- a/contrib/attic/nd4s/build.sbt +++ /dev/null @@ -1,100 +0,0 @@ -lazy val currentVersion = SettingKey[String]("currentVersion") -lazy val nd4jVersion = SettingKey[String]("nd4jVersion") -lazy val publishSomeThing = sys.props.getOrElse("repoType", default = "local").toLowerCase match { - case repoType if repoType.contains("nexus") => publishNexus - case repoType if repoType.contains("bintray") => publishBintray - case repoType if repoType.contains("sonatype") => publishSonatype - case _ => publishLocalLocal -} - -val nexusStagingRepoId = sys.props.getOrElse("stageRepoId", default = "deploy/maven2") -lazy val releaseRepositoryId = sys.props.getOrElse("stageRepoId", default = "deploy/maven2") match { - case stageRepoId if stageRepoId.equals("") => "deploy/maven2" - case stageRepoId if stageRepoId.equals("deploy/maven2") => "deploy/maven2" - case _ => "deployByRepositoryId/" + nexusStagingRepoId -} - -resolvers in ThisBuild ++= Seq( - Resolver.sonatypeRepo("snapshots") -) - -cleanFiles += baseDirectory.value / "lib" -val mvnInstall = Seq("mvn", "install", "-q", "-f", "sbt-pom.xml") -val operatingSystem = sys.props("os.name").toLowerCase.substring(0, 3) -update := { - operatingSystem match { - case "win" => { Seq("cmd", "/C") ++ mvnInstall !; update.value } - case _ => { mvnInstall !; update.value } - } -} - -lazy val commonSettings = Seq( - scalaVersion := "2.11.8", - crossScalaVersions := Seq("2.10.6", "2.11.8"), - name := "nd4s", - version := sys.props.getOrElse("currentVersion", default = "1.0.0-SNAPSHOT"), - organization := "org.nd4j", - resolvers += 
Resolver.mavenLocal, - resolvers in ThisBuild ++= Seq(Opts.resolver.sonatypeSnapshots), - nd4jVersion := sys.props.getOrElse("nd4jVersion", default = "1.0.0-SNAPSHOT"), - libraryDependencies ++= Seq( -// "com.nativelibs4java" %% "scalaxy-loops" % "0.3.4", -// "org.nd4j" % "nd4j-api" % nd4jVersion.value, -// "org.nd4j" % "nd4j-native-platform" % nd4jVersion.value % Test, - "org.scalatest" %% "scalatest" % "2.2.6" % Test, - "ch.qos.logback" % "logback-classic" % "1.2.1" % Test, - "org.scalacheck" %% "scalacheck" % "1.12.5" % Test, - "org.scalanlp" %% "breeze" % "0.12" % Test, - "com.github.julien-truffaut" %% "monocle-core" % "1.2.0" % Test - ), - scalacOptions ++= Seq("-unchecked", "-deprecation", "-feature", "-language:implicitConversions", "-language:higherKinds", "-language:postfixOps"), - publishMavenStyle := true, - publishArtifact in Test := false, - pomIncludeRepository := { _ => false }, - useGpg := true, - pgpPassphrase := Some(Array()), - credentials += Credentials(Path.userHome / ".ivy2" / ".credentials"), - releasePublishArtifactsAction := com.typesafe.sbt.pgp.PgpKeys.publishSigned.value, - releaseCrossBuild := true, - initialCommands in console := "import org.nd4j.linalg.factory.Nd4j; import org.nd4s.Implicits._" -) - -lazy val publishNexus = Seq( - publishTo := { - val nexus = "https://packages.konduit.ai/" - if (isSnapshot.value) - Some("snapshots" at nexus + "content/repositories/maven-snapshots") - else - Some("releases" at nexus + "service/local/staging/" + releaseRepositoryId) - } -) - -lazy val publishBintray = Seq( - publishTo := { - val jfrog = "https://oss.jfrog.org/artifactory/" - if (isSnapshot.value) - Some("snapshots" at jfrog + "oss-snapshot-local") - else - Some("releases" at jfrog + "oss-release-local") - } -) - -lazy val publishSonatype = Seq( - publishTo := { - val nexus = "https://oss.sonatype.org/" - if (isSnapshot.value) - Some("snapshots" at nexus + "content/repositories/snapshots") - else - Some("releases" at nexus + 
"service/local/staging/" + releaseRepositoryId) - } -) - -lazy val publishLocalLocal = Seq( - publish := {}, - publishLocal := {} -) - -lazy val root = (project in file(".")).settings( - commonSettings, - publishSomeThing -) diff --git a/contrib/attic/nd4s/pom.xml b/contrib/attic/nd4s/pom.xml deleted file mode 100644 index 8c9d5806b..000000000 --- a/contrib/attic/nd4s/pom.xml +++ /dev/null @@ -1,241 +0,0 @@ - - - - - - 4.0.0 - - - org.deeplearning4j - deeplearning4j - 1.0.0-SNAPSHOT - - - org.nd4j - nd4s_2.11 - - nd4s - - http://nd4j.org/ - - - - agibsonccc - Adam Gibson - adam@skymind.io - - - taisukeoe - Taisuke Oe - oeuia.t@gmail.com - - - maxpumperla - Max Pumperla - - - - - - 2.11.12 - 2.11 - - - - - org.nd4j - nd4j-api - ${nd4j.version} - - - ch.qos.logback - logback-classic - ${logback.version} - test - - - junit - junit - ${junit.version} - test - - - org.scalatest - scalatest_${scala.binary.version} - ${scalatest.version} - test - - - org.scalacheck - scalacheck_${scala.binary.version} - ${scalacheck.version} - test - - - org.scalanlp - breeze_${scala.binary.version} - ${breeze.version} - test - - - com.github.julien-truffaut - monocle-core_${scala.binary.version} - 1.4.0 - test - - - - - src/main/scala - src/test/scala - - - net.alchim31.maven - scala-maven-plugin - ${maven-scala-plugin.version} - - - - compile - testCompile - doc-jar - - - - - ${scala.version} - - -deprecation - -explaintypes - -nobootcp - -usejavacp - - - - - org.apache.maven.plugins - maven-eclipse-plugin - 2.10 - - true - - ch.epfl.lamp.sdt.core.scalabuilder - - - ch.epfl.lamp.sdt.core.scalanature - - - org.eclipse.jdt.launching.JRE_CONTAINER - - ch.epfl.lamp.sdt.launching.SCALA_CONTAINER - - - - - - org.antipathy - mvn-scalafmt - 0.7_${scalafmt.version} - - ${project.basedir}/.scalafmt.conf - - - - validate - - format - - - - - - org.apache.maven.plugins - maven-surefire-plugin - ${maven-surefire-plugin.version} - - true - - - - org.scalatest - scalatest-maven-plugin - 1.0 - - 
${project.build.directory}/surefire-reports - . - WDF TestSuite.txt - ${scala.test.skip} - - - - test - - test - - - - - - pl.project13.maven - git-commit-id-plugin - - - - - - - test-nd4j-native - - - org.nd4j - nd4j-native - ${project.version} - test - - - org.deeplearning4j - dl4j-test-resources - ${dl4j-test-resources.version} - test - - - - - test-nd4j-cuda-11.0 - - - org.nd4j - nd4j-cuda-11.0 - ${project.version} - test - - - org.deeplearning4j - dl4j-test-resources - ${dl4j-test-resources.version} - test - - - - - diff --git a/contrib/attic/nd4s/project/build.properties b/contrib/attic/nd4s/project/build.properties deleted file mode 100644 index 94a324670..000000000 --- a/contrib/attic/nd4s/project/build.properties +++ /dev/null @@ -1,21 +0,0 @@ -# -# /* ****************************************************************************** -# * -# * -# * This program and the accompanying materials are made available under the -# * terms of the Apache License, Version 2.0 which is available at -# * https://www.apache.org/licenses/LICENSE-2.0. -# * -# * See the NOTICE file distributed with this work for additional -# * information regarding copyright ownership. -# * Unless required by applicable law or agreed to in writing, software -# * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT -# * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the -# * License for the specific language governing permissions and limitations -# * under the License. 
-# * -# * SPDX-License-Identifier: Apache-2.0 -# ******************************************************************************/ -# - -sbt.version=0.13.11 diff --git a/contrib/attic/nd4s/project/plugins.sbt b/contrib/attic/nd4s/project/plugins.sbt deleted file mode 100644 index a76313593..000000000 --- a/contrib/attic/nd4s/project/plugins.sbt +++ /dev/null @@ -1,2 +0,0 @@ -addSbtPlugin("com.github.gseitz" % "sbt-release" % "1.0.5") -addSbtPlugin("com.typesafe.sbt" % "sbt-pgp" % "0.8.3") diff --git a/contrib/attic/nd4s/sbt-pom.xml b/contrib/attic/nd4s/sbt-pom.xml deleted file mode 100644 index cc3f30ef9..000000000 --- a/contrib/attic/nd4s/sbt-pom.xml +++ /dev/null @@ -1,87 +0,0 @@ - - - - 4.0.0 - org.deeplearning4j - nd4j-native-dependencies - 1.0.0-SNAPSHOT - - Minimal POM to install nd4j-native dependencies - - - 1.0.0-SNAPSHOT - - - - - sonatype-nexus-snapshots - Sonatype Nexus Snapshots - https://oss.sonatype.org/content/repositories/snapshots - - false - - - true - - - - - - - org.nd4j - nd4j-native - ${nd4j.version} - - - - - - - maven-dependency-plugin - 3.0.2 - - - install - - copy-dependencies - - - ${project.basedir}/lib - compile - false - false - true - - - - - - org.apache.maven.plugins - maven-surefire-plugin - 2.7 - - true - - - - - diff --git a/contrib/attic/nd4s/src/main/scala/org/nd4s/CollectionLikeNDArray.scala b/contrib/attic/nd4s/src/main/scala/org/nd4s/CollectionLikeNDArray.scala deleted file mode 100644 index f3ac6eb3f..000000000 --- a/contrib/attic/nd4s/src/main/scala/org/nd4s/CollectionLikeNDArray.scala +++ /dev/null @@ -1,117 +0,0 @@ -/* - * ****************************************************************************** - * * Copyright (c) 2021 Deeplearning4j Contributors - * * - * * This program and the accompanying materials are made available under the - * * terms of the Apache License, Version 2.0 which is available at - * * https://www.apache.org/licenses/LICENSE-2.0. 
- * * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * * License for the specific language governing permissions and limitations - * * under the License. - * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ -package org.nd4s - -import org.nd4s.Implicits._ -import org.nd4j.linalg.api.ndarray.INDArray -import org.nd4j.linalg.api.ops.Op -import org.nd4j.linalg.factory.Nd4j -import org.nd4s.ops.{ BitFilterOps, FilterOps, FunctionalOpExecutioner, MapOps } - -import scala.language.postfixOps -import scala.util.control.Breaks._ - -/* - This provides Scala Collection like APIs such as map, filter, exist, forall. - */ -trait CollectionLikeNDArray[A <: INDArray] { - val underlying: A - - def filter(f: Double => Boolean)(implicit ev: NDArrayEvidence[A, _]): A = notCleanedUp { _ => - val shape = underlying.shape() - ev.reshape(FunctionalOpExecutioner.apply - .exec(FilterOps(ev.linearView(underlying), f): Op) - .asInstanceOf[A], - shape.map(_.toInt): _*) - } - - def filterBit(f: Double => Boolean)(implicit ev: NDArrayEvidence[A, _]): A = notCleanedUp { _ => - val shape = underlying.shape() - ev.reshape(FunctionalOpExecutioner.apply - .exec(BitFilterOps(ev.linearView(underlying), f): Op) - .asInstanceOf[A], - shape.map(_.toInt): _*) - } - - def map(f: Double => Double)(implicit ev: NDArrayEvidence[A, _]): A = notCleanedUp { _ => - val shape = underlying.shape() - ev.reshape(FunctionalOpExecutioner.apply - .exec(MapOps(ev.linearView(underlying), f): Op) - .asInstanceOf[A], - shape.map(_.toInt): _*) - } - - def notCleanedUp[B](f: INDArray => B): B = - f(underlying) - - def exists(f: Double => Boolean)(implicit ev: NDArrayEvidence[A, Double]): Boolean = existsTyped[Double](f) - - def existsTyped[B](f: B => 
Boolean)(implicit ev: NDArrayEvidence[A, B]): Boolean = { - var result = false - val lv = ev.linearView(underlying) - breakable { - for { - i <- 0 until lv.length().toInt - } if (!f(ev.get(lv, i))) { - result = true - break() - } - } - result - } - - def forall(f: Double => Boolean)(implicit ev: NDArrayEvidence[A, Double]): Boolean = forallTyped[Double](f) - - def forallTyped[B](f: B => Boolean)(implicit ev: NDArrayEvidence[A, B]): Boolean = { - var result = true - val lv = ev.linearView(underlying) - breakable { - for { - i <- 0 until lv.length().toInt - } if (!f(ev.get(lv, i))) { - result = false - break() - } - } - result - } - - def >[B, C](d: C)(implicit ev: NDArrayEvidence[A, B], ev2: C => B): Boolean = - forallTyped { i: B => - ev.greaterThan(i, d) - } - - def <[B, C](d: C)(implicit ev: NDArrayEvidence[A, B], ev2: C => B): Boolean = - forallTyped { i: B => - ev.lessThan(i, d) - } - - def >=[B, C](d: C)(implicit ev: NDArrayEvidence[A, B], ev2: Equality[B], ev3: C => B): Boolean = forallTyped { i: B => - ev.greaterThan(i, d) || ev2.equal(i, d) - } - - def <=[B, C](d: C)(implicit ev: NDArrayEvidence[A, B], ev2: Equality[B], ev3: C => B): Boolean = forallTyped { i: B => - ev.lessThan(i, d) || ev2.equal(i, d) - } - - def columnP: ColumnProjectedNDArray = new ColumnProjectedNDArray(underlying) - - def rowP: RowProjectedNDArray = new RowProjectedNDArray(underlying) - - def sliceP: SliceProjectedNDArray = new SliceProjectedNDArray(underlying) -} diff --git a/contrib/attic/nd4s/src/main/scala/org/nd4s/ColumnProjectedNDArray.scala b/contrib/attic/nd4s/src/main/scala/org/nd4s/ColumnProjectedNDArray.scala deleted file mode 100644 index f16f69931..000000000 --- a/contrib/attic/nd4s/src/main/scala/org/nd4s/ColumnProjectedNDArray.scala +++ /dev/null @@ -1,54 +0,0 @@ -/* - * ****************************************************************************** - * * Copyright (c) 2021 Deeplearning4j Contributors - * * - * * This program and the accompanying materials are made 
available under the - * * terms of the Apache License, Version 2.0 which is available at - * * https://www.apache.org/licenses/LICENSE-2.0. - * * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * * License for the specific language governing permissions and limitations - * * under the License. - * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ -package org.nd4s - -import org.nd4j.linalg.api.ndarray.INDArray -import org.nd4j.linalg.indexing.{ NDArrayIndex, SpecifiedIndex } - -class ColumnProjectedNDArray(val array: INDArray, val filtered: Array[Int]) { - def this(ndarray: INDArray) { - this(ndarray, (0 until ndarray.columns()).toArray) - } - - def mapi(f: INDArray => INDArray): INDArray = { - for { - i <- filtered - } array.putColumn(i, f(array.getColumn(i))) - array.get(NDArrayIndex.all(), new SpecifiedIndex(filtered: _*)) - } - - def map(f: INDArray => INDArray): INDArray = - new ColumnProjectedNDArray(array.dup(), filtered).flatMapi(f) - - def flatMap(f: INDArray => INDArray): INDArray = map(f) - - def flatMapi(f: INDArray => INDArray): INDArray = mapi(f) - - def foreach(f: INDArray => Unit): Unit = - for { - i <- filtered - } f(array.getColumn(i)) - - def withFilter(f: INDArray => Boolean): ColumnProjectedNDArray = { - val targets = for { - i <- filtered - if f(array.getColumn(i)) - } yield i - new ColumnProjectedNDArray(array, targets) - } -} diff --git a/contrib/attic/nd4s/src/main/scala/org/nd4s/Equality.scala b/contrib/attic/nd4s/src/main/scala/org/nd4s/Equality.scala deleted file mode 100644 index d9e6892ec..000000000 --- a/contrib/attic/nd4s/src/main/scala/org/nd4s/Equality.scala +++ /dev/null @@ -1,37 +0,0 @@ -/* - * 
****************************************************************************** - * * Copyright (c) 2021 Deeplearning4j Contributors - * * - * * This program and the accompanying materials are made available under the - * * terms of the Apache License, Version 2.0 which is available at - * * https://www.apache.org/licenses/LICENSE-2.0. - * * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * * License for the specific language governing permissions and limitations - * * under the License. - * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ -package org.nd4s - -/** - * Created by taisukeoe on 16/02/12. - */ -trait Equality[A] { - def equal(left: A, right: A): Boolean -} -object Equality { - implicit lazy val doubleEquality = new Equality[Double] { - lazy val tolerance = 0.01D - override def equal(left: Double, right: Double): Boolean = - math.abs(left - right) < tolerance - } - implicit lazy val floatEquality = new Equality[Float] { - lazy val tolerance = 0.01F - override def equal(left: Float, right: Float): Boolean = - math.abs(left - right) < tolerance - } -} diff --git a/contrib/attic/nd4s/src/main/scala/org/nd4s/Implicits.scala b/contrib/attic/nd4s/src/main/scala/org/nd4s/Implicits.scala deleted file mode 100644 index 7063776eb..000000000 --- a/contrib/attic/nd4s/src/main/scala/org/nd4s/Implicits.scala +++ /dev/null @@ -1,438 +0,0 @@ -/* - * ****************************************************************************** - * * Copyright (c) 2021 Deeplearning4j Contributors - * * - * * This program and the accompanying materials are made available under the - * * terms of the Apache License, Version 2.0 which is available at - * * https://www.apache.org/licenses/LICENSE-2.0. 
- * * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * * License for the specific language governing permissions and limitations - * * under the License. - * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ -package org.nd4s - -import org.nd4j.common.primitives.{ Pair, Triple } -import org.nd4j.linalg.api.buffer.DataType -import org.nd4j.linalg.api.ndarray.INDArray -import org.nd4j.linalg.factory.Nd4j -import org.nd4j.linalg.indexing.{ INDArrayIndex, NDArrayIndex } - -import scala.collection.breakOut -object Implicits { - - implicit class RichINDArray[A <: INDArray](val underlying: A) - extends SliceableNDArray[A] - with OperatableNDArray[A] - with CollectionLikeNDArray[A] - - implicit def rowProjection2NDArray(row: RowProjectedNDArray): INDArray = - row.array - - implicit def columnProjection2NDArray(column: ColumnProjectedNDArray): INDArray = column.array - - implicit def sliceProjection2NDArray(sliced: SliceProjectedNDArray): INDArray = sliced.array - - /* - Avoid using Numeric[T].toDouble(t:T) for sequence transformation in XXColl2INDArray to minimize memory consumption. 
- */ - - implicit def floatArray2INDArray(s: Array[Float]): FloatArray2INDArray = - new FloatArray2INDArray(s) - implicit def floatColl2INDArray(s: Seq[Float]): FloatArray2INDArray = - new FloatArray2INDArray(s.toArray) - implicit def jfloatColl2INDArray(s: Seq[java.lang.Float]): FloatArray2INDArray = - new FloatArray2INDArray(s.map(x => x: Float)(breakOut)) - class FloatArray2INDArray(val underlying: Array[Float]) extends AnyVal { - def mkNDArray(shape: Array[Int], ord: NDOrdering = NDOrdering(Nd4j.order())): INDArray = - Nd4j.create(underlying, shape, ord.value) - - def asNDArray(shape: Int*): INDArray = - Nd4j.create(underlying.toArray, shape.toArray) - - def toNDArray: INDArray = Nd4j.create(underlying) - } - - implicit def doubleArray2INDArray(s: Array[Double]): DoubleArray2INDArray = - new DoubleArray2INDArray(s) - implicit def doubleArray2CollArray(s: Seq[Double]): DoubleArray2INDArray = - new DoubleArray2INDArray(s.toArray) - implicit def jdoubleColl2INDArray(s: Seq[java.lang.Double]): DoubleArray2INDArray = - new DoubleArray2INDArray(s.map(x => x: Double)(breakOut)) - class DoubleArray2INDArray(val underlying: Array[Double]) extends AnyVal { - def mkNDArray(shape: Array[Int], ord: NDOrdering = NDOrdering(Nd4j.order()), offset: Int = 0): INDArray = - Nd4j.create(underlying, shape, offset, ord.value) - - def asNDArray(shape: Int*): INDArray = - Nd4j.create(underlying.toArray, shape.toArray) - - def toNDArray: INDArray = Nd4j.create(underlying) - } - - implicit def intColl2INDArray(s: Seq[Int]): IntArray2INDArray = - new IntArray2INDArray(s.toArray) - implicit def intArray2INDArray(s: Array[Int]): IntArray2INDArray = - new IntArray2INDArray(s) - implicit def jintColl2INDArray(s: Seq[java.lang.Integer]): IntArray2INDArray = - new IntArray2INDArray(s.map(x => x: Int)(breakOut)) - class IntArray2INDArray(val underlying: Array[Int]) extends AnyVal { - def mkNDArray(shape: Array[Int], ord: NDOrdering = NDOrdering(Nd4j.order()), offset: Int = 0): INDArray = { - val 
strides = Nd4j.getStrides(shape, ord.value) - Nd4j.create(underlying.map(_.toInt), shape.map(_.toLong), strides.map(_.toLong), ord.value, DataType.INT) - } - - def toNDArray: INDArray = Nd4j.createFromArray(underlying: _*) - } - - implicit def longColl2INDArray(s: Seq[Long]): LongArray2INDArray = - new LongArray2INDArray(s.toArray) - implicit def longArray2INDArray(s: Array[Long]): LongArray2INDArray = - new LongArray2INDArray(s) - implicit def jlongColl2INDArray(s: Seq[java.lang.Long]): LongArray2INDArray = - new LongArray2INDArray(s.map(x => x: Long)(breakOut)) - class LongArray2INDArray(val underlying: Array[Long]) extends AnyVal { - def mkNDArray(shape: Array[Int], ord: NDOrdering = NDOrdering(Nd4j.order()), offset: Int = 0): INDArray = { - val strides = Nd4j.getStrides(shape, ord.value) - Nd4j.create(underlying, shape.map(_.toLong), strides.map(_.toLong), ord.value, DataType.LONG) - } - - def toNDArray: INDArray = Nd4j.createFromArray(underlying: _*) - } - - implicit def shortColl2INDArray(s: Seq[Short]): ShortArray2INDArray = - new ShortArray2INDArray(s.toArray) - implicit def shortArray2INDArray(s: Array[Short]): ShortArray2INDArray = - new ShortArray2INDArray(s) - implicit def jshortColl2INDArray(s: Seq[java.lang.Short]): ShortArray2INDArray = - new ShortArray2INDArray(s.map(x => x: Short)(breakOut)) - class ShortArray2INDArray(val underlying: Array[Short]) extends AnyVal { - def mkNDArray(shape: Array[Int], ord: NDOrdering = NDOrdering(Nd4j.order()), offset: Int = 0): INDArray = { - val strides = Nd4j.getStrides(shape, ord.value) - Nd4j.create(underlying, shape.map(_.toLong), strides.map(_.toLong), ord.value, DataType.SHORT) - } - - def toNDArray: INDArray = Nd4j.createFromArray(underlying: _*) - } - - implicit def byteColl2INDArray(s: Seq[Byte]): ByteArray2INDArray = - new ByteArray2INDArray(s.toArray) - implicit def byteArray2INDArray(s: Array[Byte]): ByteArray2INDArray = - new ByteArray2INDArray(s) - implicit def jbyteColl2INDArray(s: 
Seq[java.lang.Byte]): ByteArray2INDArray = - new ByteArray2INDArray(s.map(x => x: Byte)(breakOut)) - class ByteArray2INDArray(val underlying: Array[Byte]) extends AnyVal { - def mkNDArray(shape: Array[Int], ord: NDOrdering = NDOrdering(Nd4j.order()), offset: Int = 0): INDArray = { - val strides = Nd4j.getStrides(shape, ord.value) - Nd4j.create(underlying, shape.map(_.toLong), strides.map(_.toLong), ord.value, DataType.BYTE) - } - - def toNDArray: INDArray = Nd4j.createFromArray(underlying: _*) - } - - implicit def booleanColl2INDArray(s: Seq[Boolean]): BooleanArray2INDArray = - new BooleanArray2INDArray(s.toArray) - implicit def booleanArray2INDArray(s: Array[Boolean]): BooleanArray2INDArray = - new BooleanArray2INDArray(s) - implicit def jbooleanColl2INDArray(s: Seq[java.lang.Boolean]): BooleanArray2INDArray = - new BooleanArray2INDArray(s.map(x => x: Boolean)(breakOut)) - class BooleanArray2INDArray(val underlying: Array[Boolean]) extends AnyVal { - def mkNDArray(shape: Array[Int], ord: NDOrdering = NDOrdering(Nd4j.order()), offset: Int = 0): INDArray = { - val strides = Nd4j.getStrides(shape, ord.value) - Nd4j.create(underlying, shape.map(_.toLong), strides.map(_.toLong), ord.value, DataType.BOOL) - } - - def toNDArray: INDArray = Nd4j.createFromArray(underlying: _*) - } - - implicit def stringArray2INDArray(s: Array[String]): StringArray2INDArray = - new StringArray2INDArray(s) - implicit def stringArray2CollArray(s: Seq[String]): StringArray2INDArray = - new StringArray2INDArray(s.toArray) - implicit def jstringColl2INDArray(s: Seq[java.lang.String]): StringArray2INDArray = - new StringArray2INDArray(s.map(x => x: String)(breakOut)) - class StringArray2INDArray(val underlying: Array[String]) extends AnyVal { - def mkNDArray(shape: Array[Int], ord: NDOrdering = NDOrdering(Nd4j.order()), offset: Int = 0): INDArray = ??? - - def asNDArray(shape: Int*): INDArray = ??? 
- - def toNDArray: INDArray = Nd4j.create(underlying: _*) - } - - implicit class FloatMatrix2INDArray(val underlying: Seq[Seq[Float]]) extends AnyVal { - def mkNDArray(ord: NDOrdering): INDArray = - Nd4j.create(underlying.map(_.toArray).toArray, ord.value) - def toNDArray: INDArray = Nd4j.create(underlying.map(_.toArray).toArray) - } - - implicit class FloatArrayMatrix2INDArray(val underlying: Array[Array[Float]]) extends AnyVal { - def mkNDArray(ord: NDOrdering): INDArray = - Nd4j.create(underlying, ord.value) - def toNDArray: INDArray = Nd4j.create(underlying) - } - - implicit class DoubleMatrix2INDArray(val underlying: Seq[Seq[Double]]) extends AnyVal { - def mkNDArray(ord: NDOrdering): INDArray = - Nd4j.create(underlying.map(_.toArray).toArray, ord.value) - def toNDArray: INDArray = Nd4j.create(underlying.map(_.toArray).toArray) - } - - implicit class DoubleArrayMatrix2INDArray(val underlying: Array[Array[Double]]) extends AnyVal { - def mkNDArray(ord: NDOrdering): INDArray = - Nd4j.create(underlying, ord.value) - def toNDArray: INDArray = Nd4j.create(underlying) - } - - implicit class IntMatrix2INDArray(val underlying: Seq[Seq[Int]]) extends AnyVal { - def mkNDArray(ord: NDOrdering): INDArray = - Nd4j.createFromArray(underlying.map(_.toArray).toArray) - def toNDArray: INDArray = - Nd4j.createFromArray(underlying.map(_.toArray).toArray) - } - - implicit class IntArrayMatrix2INDArray(val underlying: Array[Array[Int]]) extends AnyVal { - def mkNDArray(ord: NDOrdering): INDArray = - Nd4j.createFromArray(underlying.map(_.toArray).toArray) - def toNDArray: INDArray = Nd4j.createFromArray(underlying.map(_.toArray).toArray) - } - - implicit class LongMatrix2INDArray(val underlying: Seq[Seq[Long]]) extends AnyVal { - def mkNDArray(ord: NDOrdering): INDArray = - Nd4j.createFromArray(underlying.map(_.toArray).toArray) - def toNDArray: INDArray = - Nd4j.createFromArray(underlying.map(_.toArray).toArray) - } - - implicit class LongArrayMatrix2INDArray(val underlying: 
Array[Array[Long]]) extends AnyVal { - def mkNDArray(ord: NDOrdering): INDArray = - Nd4j.createFromArray(underlying.map(_.toArray).toArray) - def toNDArray: INDArray = Nd4j.createFromArray(underlying.map(_.toArray).toArray) - } - - /*implicit class Num2Scalar[T](val underlying: T)(implicit ev: Numeric[T]) { - def toScalar: INDArray = Nd4j.scalar(ev.toDouble(underlying)) - }*/ - - // TODO: move ops to single trait - implicit class Float2Scalar(val underlying: Float) { - def +(x: INDArray) = underlying.toScalar + x - def *(x: INDArray) = underlying.toScalar * x - def /(x: INDArray) = underlying.toScalar / x - def \(x: INDArray) = underlying.toScalar \ x - def toScalar: INDArray = Nd4j.scalar(underlying) - } - - implicit class Double2Scalar(val underlying: Double) { - def +(x: INDArray) = underlying.toScalar + x - def *(x: INDArray) = underlying.toScalar * x - def /(x: INDArray) = underlying.toScalar / x - def \(x: INDArray) = underlying.toScalar \ x - def toScalar: INDArray = Nd4j.scalar(underlying) - } - - implicit class Long2Scalar(val underlying: Long) { - def +(x: INDArray) = underlying.toScalar + x - def *(x: INDArray) = underlying.toScalar * x - def /(x: INDArray) = underlying.toScalar / x - def \(x: INDArray) = underlying.toScalar \ x - def toScalar: INDArray = Nd4j.scalar(underlying) - } - - implicit class Int2Scalar(val underlying: Int) { - def +(x: INDArray) = underlying.toScalar + x - def *(x: INDArray) = underlying.toScalar * x - def /(x: INDArray) = underlying.toScalar / x - def \(x: INDArray) = underlying.toScalar \ x - def toScalar: INDArray = Nd4j.scalar(underlying) - } - - implicit class Byte2Scalar(val underlying: Byte) { - def +(x: INDArray) = underlying.toScalar + x - def *(x: INDArray) = underlying.toScalar * x - def /(x: INDArray) = underlying.toScalar / x - def \(x: INDArray) = underlying.toScalar \ x - def toScalar: INDArray = Nd4j.scalar(underlying) - } - - implicit class Boolean2Scalar(val underlying: Boolean) { - def +(x: INDArray) = 
underlying.toScalar + x - def *(x: INDArray) = underlying.toScalar * x - def toScalar: INDArray = Nd4j.scalar(underlying) - } - - implicit class String2Scalar(val underlying: String) { - def toScalar: INDArray = Nd4j.scalar(underlying) - } - - implicit def intArray2IndexRangeArray(arr: Array[Int]): Array[IndexRange] = - arr.map(new IntRange(_)) - - case object -> extends IndexRange { - override def hasNegative: Boolean = false - } - - case object ---> extends IndexRange { - override def hasNegative: Boolean = false - } - - case object --- extends IndexRange { - override def hasNegative: Boolean = false - } - - implicit class IntRange(val underlying: Int) extends IndexNumberRange { - protected[nd4s] override def asRange(max: => Int): DRange = - DRange(underlying, underlying, true, 1, max) - - override protected[nd4s] def asNDArrayIndex(max: => Int): INDArrayIndex = - NDArrayIndex.point(underlying) - - override def hasNegative: Boolean = false - - override def toString: String = s"$underlying" - } - - implicit class TupleRange(val underlying: _root_.scala.Tuple2[Int, Int]) extends IndexNumberRange { - protected[nd4s] override def asRange(max: => Int): DRange = - DRange(underlying._1, underlying._2, false, 1, max) - - override protected[nd4s] def asNDArrayIndex(max: => Int): INDArrayIndex = - IndexNumberRange.toNDArrayIndex(underlying._1, underlying._2, false, 1, max) - - override def toString: String = s"${underlying._1}->${underlying._2}" - - override def hasNegative: Boolean = underlying._1 < 0 || underlying._2 < 0 - - def by(i: Int) = - new IndexRangeWrapper(underlying._1 until underlying._2 by i) - } - - implicit class IntRangeFromGen(val underlying: Int) extends AnyVal { - def -> = IntRangeFrom(underlying) - } - - implicit class IntRangeFromGen1(val underlying: Int) extends AnyVal { - def :: = IntRangeFromReverse(underlying) - } - - implicit class IndexRangeWrapper(val underlying: Range) extends IndexNumberRange { - protected[nd4s] override def asRange(max: => 
Int): DRange = - DRange.from(underlying, max) - - override protected[nd4s] def asNDArrayIndex(max: => Int): INDArrayIndex = - IndexNumberRange.toNDArrayIndex(underlying.start, underlying.end, underlying.isInclusive, underlying.step, max) - - override def toString: String = - s"${underlying.start}->${underlying.end} by ${underlying.step}" - - override def hasNegative: Boolean = - underlying.start < 0 || underlying.end < 0 || underlying.step < 0 - } - - implicit class NDArrayIndexWrapper(val underlying: INDArrayIndex) extends IndexNumberRange { - protected[nd4s] override def asRange(max: => Int): DRange = - DRange(underlying.offset().asInstanceOf[Int], - underlying.end().asInstanceOf[Int], - false, - underlying.stride().asInstanceOf[Int], - max) - - override protected[nd4s] def asNDArrayIndex(max: => Int): INDArrayIndex = - underlying - - override def toString: String = - s"${underlying.offset()}->${underlying.end} by ${underlying.stride}" - - override def hasNegative: Boolean = false - } - - lazy val NDOrdering = org.nd4s.NDOrdering - - lazy val i = new ImaginaryNumber[Integer](1) - - implicit class Pair2Tuple[T, U](a: Pair[T, U]) { - def asScala: (T, U) = (a.getFirst, a.getSecond) - } - - implicit class Triple2Tuple[T, U, V](a: Triple[T, U, V]) { - def asScala: (T, U, V) = (a.getFirst, a.getSecond, a.getThird) - } - - implicit class Tuple2Pair[T, U](a: (T, U)) { - def toPair: Pair[T, U] = - new Pair(a._1, a._2) - } - - implicit class Tuple2Triple[T, U, V](a: (T, U, V)) { - def toTriple: Triple[T, U, V] = - new Triple(a._1, a._2, a._3) - } -} - -private[nd4s] class ImaginaryNumber[T <: Number](val value: T) extends AnyVal { - override def toString: String = s"${value}i" -} - -sealed trait IndexNumberRange extends IndexRange { - protected[nd4s] def asRange(max: => Int): DRange - protected[nd4s] def asNDArrayIndex(max: => Int): INDArrayIndex -} -object IndexNumberRange { - def toNDArrayIndex(startR: Int, endR: Int, isInclusive: Boolean, step: Int, max: Int): 
INDArrayIndex = { - val (start, end) = { - val start = if (startR >= 0) startR else max + startR - val diff = if (isInclusive) 1 else 0 - val endExclusive = if (endR >= 0) endR + diff else max + endR + diff - (start, endExclusive) - } - NDArrayIndex.interval(start, step, end, false) - } -} - -/*sealed*/ -trait IndexRange { - def hasNegative: Boolean -} - -case class IntRangeFrom(underlying: Int) extends IndexRange { - def apply[T](a: T): (Int, T) = - (underlying, a) - - override def toString: String = s"$underlying->" - - override def hasNegative: Boolean = false -} - -case class IntRangeFromReverse(underlying: Int) extends IndexRange { - def apply[T](a: T): (T, Int) = - (a, underlying) - - override def toString: String = s"$underlying->" - - override def hasNegative: Boolean = false -} - -private[nd4s] case class DRange(startR: Int, endR: Int, isInclusive: Boolean, step: Int, max: Int) { - lazy val (start, end) = { - val start = if (startR >= 0) startR else max + startR - val diff = if (isInclusive) 0 else if (step >= 0) -1 else +1 - val endInclusive = if (endR >= 0) endR + diff else max + endR + diff - (start, endInclusive) - } - lazy val length = (end - start) / step + 1 - - def toList: List[Int] = List.iterate(start, length)(_ + step) - - override def toString: String = s"[$start to $end by $step len:$length]" -} - -private[nd4s] object DRange extends { - def from(r: Range, max: => Int): DRange = - DRange(r.start, r.end, r.isInclusive, r.step, max) - - def apply(startR: Int, endR: Int, step: Int): DRange = - DRange(startR, endR, false, step, Int.MinValue) -} diff --git a/contrib/attic/nd4s/src/main/scala/org/nd4s/NDArrayEvidence.scala b/contrib/attic/nd4s/src/main/scala/org/nd4s/NDArrayEvidence.scala deleted file mode 100644 index 68fbdb2fa..000000000 --- a/contrib/attic/nd4s/src/main/scala/org/nd4s/NDArrayEvidence.scala +++ /dev/null @@ -1,560 +0,0 @@ -/* - * ****************************************************************************** - * * Copyright (c) 
2021 Deeplearning4j Contributors - * * - * * This program and the accompanying materials are made available under the - * * terms of the Apache License, Version 2.0 which is available at - * * https://www.apache.org/licenses/LICENSE-2.0. - * * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * * License for the specific language governing permissions and limitations - * * under the License. - * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ -package org.nd4s - -import org.nd4j.linalg.api.ndarray.INDArray -import org.nd4j.linalg.indexing.INDArrayIndex -import org.nd4s.Implicits._ - -object Evidences { - implicit val double = DoubleNDArrayEvidence - implicit val float = FloatNDArrayEvidence - implicit val int = IntNDArrayEvidence - implicit val long = LongNDArrayEvidence - implicit val byte = ByteNDArrayEvidence -} - -object NDArrayEvidence { - implicit val doubleNDArrayEvidence = DoubleNDArrayEvidence - -} - -trait NDArrayEvidence[NDArray <: INDArray, Value] { - - def sum(ndarray: NDArray): Value - - def mean(ndarray: NDArray): Value - - def normMax(ndarray: NDArray): Value - - def norm1(ndarray: NDArray): Value - - def norm2(ndarray: NDArray): Value - - def max(ndarray: NDArray): Value - - def min(ndarray: NDArray): Value - - def standardDeviation(ndarray: NDArray): Value - - def product(ndarray: NDArray): Value - - def variance(ndarray: NDArray): Value - - def remainder(a: NDArray, that: INDArray): NDArray - - def add(a: NDArray, that: INDArray): NDArray - - def sub(a: NDArray, that: INDArray): NDArray - - def mul(a: NDArray, that: INDArray): NDArray - - def mmul(a: NDArray, that: INDArray): NDArray - - def div(a: NDArray, that: INDArray): NDArray - - def rdiv(a: NDArray, that: INDArray): NDArray - 
- def addi(a: NDArray, that: INDArray): NDArray - - def subi(a: NDArray, that: INDArray): NDArray - - def muli(a: NDArray, that: INDArray): NDArray - - def mmuli(a: NDArray, that: INDArray): NDArray - - def remainderi(a: NDArray, that: INDArray): NDArray - - def remainderi(a: NDArray, that: Number): NDArray - - def divi(a: NDArray, that: INDArray): NDArray - - def rdivi(a: NDArray, that: INDArray): NDArray - - def remainder(a: NDArray, that: Number): NDArray - - def add(a: NDArray, that: Number): NDArray - - def sub(a: NDArray, that: Number): NDArray - - def mul(a: NDArray, that: Number): NDArray - - def div(a: NDArray, that: Number): NDArray - - def rdiv(a: NDArray, that: Number): NDArray - - def addi(a: NDArray, that: Number): NDArray - - def subi(a: NDArray, that: Number): NDArray - - def muli(a: NDArray, that: Number): NDArray - - def divi(a: NDArray, that: Number): NDArray - - def rdivi(a: NDArray, that: Number): NDArray - - def put(a: NDArray, i: Int, element: INDArray): NDArray - - def put(a: NDArray, i: Array[Int], element: INDArray): NDArray - - def get(a: NDArray, i: Long): Value - - //def get(a: NDArray, i: Long, j: Long): Value - - def get(a: NDArray, i: Long*): Value - - def get(a: NDArray, i: INDArrayIndex*): NDArray - - def reshape(a: NDArray, i: Int*): NDArray - - def linearView(a: NDArray): NDArray - - def dup(a: NDArray): NDArray - - def create(arr: Array[Value]): NDArray - - def create(arr: Array[Value], shape: Int*): NDArray - - def create(arr: Array[Value], shape: Array[Int], ordering: NDOrdering, offset: Int): NDArray - - def update(a: NDArray, indices: Array[IndexRange], i: Value): NDArray - - def update(a: NDArray, indices: Array[IndexRange], i: NDArray): NDArray - - def greaterThan(left: Value, right: Value): Boolean - - def lessThan(left: Value, right: Value): Boolean -} - -trait RealNDArrayEvidence[Value] extends NDArrayEvidence[INDArray, Value] { - - override def remainder(a: INDArray, that: INDArray): INDArray = - a.remainder(that) - - 
override def add(a: INDArray, that: INDArray): INDArray = a.add(that) - - override def div(a: INDArray, that: INDArray): INDArray = a.div(that) - - override def mul(a: INDArray, that: INDArray): INDArray = a.mul(that) - - override def rdiv(a: INDArray, that: INDArray): INDArray = a.rdiv(that) - - override def sub(a: INDArray, that: INDArray): INDArray = a.sub(that) - - override def mmul(a: INDArray, that: INDArray): INDArray = a.mmul(that) - - override def addi(a: INDArray, that: INDArray): INDArray = a.addi(that) - - override def subi(a: INDArray, that: INDArray): INDArray = a.subi(that) - - override def muli(a: INDArray, that: INDArray): INDArray = a.muli(that) - - override def mmuli(a: INDArray, that: INDArray): INDArray = a.mmuli(that) - - override def remainderi(a: INDArray, that: INDArray): INDArray = - a.remainderi(that) - - override def remainderi(a: INDArray, that: Number): INDArray = - a.remainderi(that) - - override def divi(a: INDArray, that: INDArray): INDArray = a.divi(that) - - override def rdivi(a: INDArray, that: INDArray): INDArray = a.rdivi(that) - - override def remainder(a: INDArray, that: Number): INDArray = - a.remainder(that) - - override def add(a: INDArray, that: Number): INDArray = a.add(that) - - override def sub(a: INDArray, that: Number): INDArray = a.sub(that) - - override def mul(a: INDArray, that: Number): INDArray = a.mul(that) - - override def div(a: INDArray, that: Number): INDArray = a.div(that) - - override def rdiv(a: INDArray, that: Number): INDArray = a.rdiv(that) - - override def addi(a: INDArray, that: Number): INDArray = a.addi(that) - - override def subi(a: INDArray, that: Number): INDArray = a.subi(that) - - override def muli(a: INDArray, that: Number): INDArray = a.muli(that) - - override def divi(a: INDArray, that: Number): INDArray = a.divi(that) - - override def rdivi(a: INDArray, that: Number): INDArray = a.rdivi(that) - - override def put(a: INDArray, i: Array[Int], element: INDArray): INDArray = - a.put(i, 
element) - - override def put(a: INDArray, i: Int, element: INDArray): INDArray = - a.put(i, element) - - override def get(a: INDArray, i: INDArrayIndex*): INDArray = a.get(i: _*) - - override def reshape(a: INDArray, i: Int*): INDArray = - a.reshape(i.map(_.toLong): _*) - - override def linearView(a: INDArray): INDArray = a.reshape(-1) - - override def dup(a: INDArray): INDArray = a.dup() - - override def update(underlying: INDArray, ir: Array[IndexRange], num: INDArray): INDArray = { - if (ir.exists(_.hasNegative)) - underlying.indicesFrom(ir: _*).indices.foreach { i => - underlying.put(i, num) - } else - underlying.put(underlying.getINDArrayIndexfrom(ir: _*).toArray, num) - underlying - } -} - -case object DoubleNDArrayEvidence extends RealNDArrayEvidence[Double] { - - override def sum(ndarray: INDArray): Double = - ndarray.sumNumber().doubleValue() - - override def mean(ndarray: INDArray): Double = - ndarray.meanNumber().doubleValue() - - override def variance(ndarray: INDArray): Double = - ndarray.varNumber().doubleValue() - - override def norm2(ndarray: INDArray): Double = - ndarray.norm2Number().doubleValue() - - override def max(ndarray: INDArray): Double = - ndarray.maxNumber().doubleValue() - - override def product(ndarray: INDArray): Double = - ndarray.prodNumber().doubleValue() - - override def standardDeviation(ndarray: INDArray): Double = - ndarray.stdNumber().doubleValue() - - override def normMax(ndarray: INDArray): Double = - ndarray.normmaxNumber().doubleValue() - - override def min(ndarray: INDArray): Double = - ndarray.minNumber().doubleValue() - - override def norm1(ndarray: INDArray): Double = - ndarray.norm1Number().doubleValue() - - override def get(a: INDArray, i: Long): Double = a.getDouble(i) - - //override def get(a: INDArray, i: Int): Double = a.getDouble(i.toLong) - - //override def get(a: INDArray, i: Int, j: Int): Double = a.getDouble(i.toLong, j.toLong) - - //override def get(a: INDArray, i: Int*): Double = a.getDouble(i: _*) - - 
override def get(a: INDArray, i: Long*): Double = a.getDouble(i: _*) - - override def create(arr: Array[Double]): INDArray = arr.toNDArray - - override def create(arr: Array[Double], shape: Int*): INDArray = - arr.asNDArray(shape: _*) - - override def create(arr: Array[Double], shape: Array[Int], ordering: NDOrdering, offset: Int): INDArray = - arr.mkNDArray(shape, ordering, offset) - - override def update(underlying: INDArray, ir: Array[IndexRange], num: Double): INDArray = { - if (ir.length == 1 && !ir.head.hasNegative && ir.head - .isInstanceOf[IntRange]) - underlying.putScalar(ir.head.asInstanceOf[IntRange].underlying, num) - else if (ir.exists(_.hasNegative)) - underlying.indicesFrom(ir: _*).indices.foreach { i => - underlying.putScalar(i, num) - } else - underlying.put(underlying.getINDArrayIndexfrom(ir: _*).toArray, num) - underlying - } - - override def greaterThan(left: Double, right: Double): Boolean = left > right - - override def lessThan(left: Double, right: Double): Boolean = left < right - -} - -case object FloatNDArrayEvidence extends RealNDArrayEvidence[Float] { - - override def sum(ndarray: INDArray): Float = ndarray.sumNumber().floatValue() - - override def mean(ndarray: INDArray): Float = - ndarray.meanNumber().floatValue() - - override def variance(ndarray: INDArray): Float = - ndarray.varNumber().floatValue() - - override def norm2(ndarray: INDArray): Float = - ndarray.norm2Number().floatValue() - - override def max(ndarray: INDArray): Float = ndarray.maxNumber().floatValue() - - override def product(ndarray: INDArray): Float = - ndarray.prodNumber().floatValue() - - override def standardDeviation(ndarray: INDArray): Float = - ndarray.stdNumber().floatValue() - - override def normMax(ndarray: INDArray): Float = - ndarray.normmaxNumber().floatValue() - - override def min(ndarray: INDArray): Float = ndarray.minNumber().floatValue() - - override def norm1(ndarray: INDArray): Float = - ndarray.norm1Number().floatValue() - - override def get(a: 
INDArray, i: Long): Float = a.getFloat(i) - - //override def get(a: INDArray, i: Long, j: Long): Float = a.getFloat(i, j) - - //override def get(a: INDArray, i: Int*): Float = a.getFloat(i: _*) - - override def get(a: INDArray, i: Long*): Float = a.getFloat(i: _*) - - override def create(arr: Array[Float]): INDArray = arr.toNDArray - - override def create(arr: Array[Float], shape: Int*): INDArray = - arr.asNDArray(shape: _*) - - override def create(arr: Array[Float], shape: Array[Int], ordering: NDOrdering, offset: Int): INDArray = - arr.mkNDArray(shape, ordering) - - override def update(underlying: INDArray, ir: Array[IndexRange], num: Float): INDArray = { - if (ir.length == 1 && !ir.head.hasNegative && ir.head - .isInstanceOf[IntRange]) - underlying.putScalar(ir.head.asInstanceOf[IntRange].underlying, num) - else if (ir.exists(_.hasNegative)) - underlying.indicesFrom(ir: _*).indices.foreach { i => - underlying.putScalar(i, num) - } else - underlying.put(underlying.getINDArrayIndexfrom(ir: _*).toArray, num) - underlying - } - - override def greaterThan(left: Float, right: Float): Boolean = left > right - - override def lessThan(left: Float, right: Float): Boolean = left < right -} - -trait IntegerNDArrayEvidence[Value] extends NDArrayEvidence[INDArray, Value] { - def remainder(a: INDArray, that: INDArray): INDArray = a.remainder(that) - - def add(a: INDArray, that: INDArray): INDArray = a.add(that) - - def sub(a: INDArray, that: INDArray): INDArray = a.sub(that) - - def mul(a: INDArray, that: INDArray): INDArray = a.mul(that) - - def mmul(a: INDArray, that: INDArray): INDArray = a.mmul(that) - - def div(a: INDArray, that: INDArray): INDArray = a.div(that) - - def rdiv(a: INDArray, that: INDArray): INDArray = a.rdiv(that) - - def addi(a: INDArray, that: INDArray): INDArray = a.addi(that) - - def subi(a: INDArray, that: INDArray): INDArray = a.subi(that) - - def muli(a: INDArray, that: INDArray): INDArray = a.muli(that) - - def mmuli(a: INDArray, that: INDArray): 
INDArray = a.mmuli(that) - - def remainderi(a: INDArray, that: INDArray): INDArray = a.remainderi(that) - - def remainderi(a: INDArray, that: Number): INDArray = a.remainderi(that) - - def divi(a: INDArray, that: INDArray): INDArray = a.divi(that) - - def rdivi(a: INDArray, that: INDArray): INDArray = a.rdivi(that) - - def remainder(a: INDArray, that: Number): INDArray = a.remainder(that) - - def add(a: INDArray, that: Number): INDArray = a.add(that) - - def sub(a: INDArray, that: Number): INDArray = a.sub(that) - - def mul(a: INDArray, that: Number): INDArray = a.mul(that) - - def div(a: INDArray, that: Number): INDArray = a.div(that) - - def rdiv(a: INDArray, that: Number): INDArray = a.rdiv(that) - - def addi(a: INDArray, that: Number): INDArray = a.addi(that) - - def subi(a: INDArray, that: Number): INDArray = a.subi(that) - - def muli(a: INDArray, that: Number): INDArray = a.muli(that) - - def divi(a: INDArray, that: Number): INDArray = a.divi(that) - - def rdivi(a: INDArray, that: Number): INDArray = a.rdivi(that) - - def put(a: INDArray, i: Int, element: INDArray): INDArray = a.put(i, element) - - def put(a: INDArray, i: Array[Int], element: INDArray): INDArray = a.put(i, element) - - def get(a: INDArray, i: INDArrayIndex*): INDArray = a.get(i: _*) - - def reshape(a: INDArray, i: Int*): INDArray = a.reshape(i.map(_.toLong): _*) - - def linearView(a: INDArray): INDArray = a.reshape(-1) - - def dup(a: INDArray): INDArray = a.dup() - - def update(a: INDArray, indices: Array[IndexRange], i: Int): INDArray = - a.update(indices, i) - - def update(a: INDArray, indices: Array[IndexRange], i: INDArray): INDArray = - a.update(indices, i) -} - -case object IntNDArrayEvidence extends IntegerNDArrayEvidence[Int] { - - def sum(ndarray: INDArray): Int = ndarray.sumNumber().intValue() - - def mean(ndarray: INDArray): Int = ndarray.meanNumber().intValue() - - def normMax(ndarray: INDArray): Int = ndarray.normmaxNumber().intValue() - - def norm1(ndarray: INDArray): Int = 
ndarray.norm1Number().intValue() - - def norm2(ndarray: INDArray): Int = ndarray.norm2Number().intValue() - - def max(ndarray: INDArray): Int = ndarray.maxNumber().intValue() - - def min(ndarray: INDArray): Int = ndarray.minNumber().intValue() - - def standardDeviation(ndarray: INDArray): Int = ndarray.stdNumber().intValue() - - def product(ndarray: INDArray): Int = ndarray.prodNumber().intValue() - - def variance(ndarray: INDArray): Int = ndarray.varNumber().intValue() - - def get(a: INDArray, i: Long): Int = a.getInt(i.toInt) - - def get(a: INDArray, i: Int): Int = a.getInt(i) - - def get(a: INDArray, i: Int, j: Int): Int = a.getInt(i, j) - - def get(a: INDArray, i: Long*): Int = - a.getInt(i.map(_.toInt): _*) - - def create(arr: Array[Int]): INDArray = arr.toNDArray - - def create(arr: Array[Int], shape: Int*): INDArray = arr.toNDArray - - def create(arr: Array[Int], shape: Array[Int], ordering: NDOrdering, offset: Int): INDArray = - arr.mkNDArray(shape, ordering, offset) - - def greaterThan(left: Int, right: Int): Boolean = left > right - - def lessThan(left: Int, right: Int): Boolean = left < right -} - -case object LongNDArrayEvidence extends IntegerNDArrayEvidence[Long] { - - def sum(ndarray: INDArray): Long = ndarray.sumNumber().longValue() - - def mean(ndarray: INDArray): Long = ndarray.meanNumber().longValue() - - def normMax(ndarray: INDArray): Long = ndarray.normmaxNumber().longValue() - - def norm1(ndarray: INDArray): Long = ndarray.norm1Number().longValue() - - def norm2(ndarray: INDArray): Long = ndarray.norm2Number().longValue() - - def max(ndarray: INDArray): Long = ndarray.maxNumber().longValue() - - def min(ndarray: INDArray): Long = ndarray.minNumber().longValue() - - def standardDeviation(ndarray: INDArray): Long = ndarray.stdNumber().longValue() - - def product(ndarray: INDArray): Long = ndarray.prodNumber().longValue() - - def variance(ndarray: INDArray): Long = ndarray.varNumber().longValue() - - def get(a: INDArray, i: Long): Long = 
a.getLong(i) - - def get(a: INDArray, i: Int): Long = a.getLong(i) - - def get(a: INDArray, i: Int, j: Int): Long = a.getLong(i, j) - - def get(a: INDArray, i: Long*): Long = a.getLong(i: _*) - - def create(arr: Array[Long]): INDArray = arr.toNDArray - - def create(arr: Array[Long], shape: Int*): INDArray = arr.toNDArray - - def create(arr: Array[Long], shape: Array[Int], ordering: NDOrdering, offset: Int): INDArray = - arr.mkNDArray(shape, ordering, offset) - - def greaterThan(left: Long, right: Long): Boolean = left > right - - def lessThan(left: Long, right: Long): Boolean = left < right - - def update(a: INDArray, indices: Array[IndexRange], i: Long): INDArray = - a.update(indices, i) -} - -case object ByteNDArrayEvidence extends IntegerNDArrayEvidence[Byte] { - - def sum(ndarray: INDArray): Byte = ndarray.sumNumber().byteValue() - - def mean(ndarray: INDArray): Byte = ndarray.meanNumber().byteValue() - - def normMax(ndarray: INDArray): Byte = ndarray.normmaxNumber().byteValue() - - def norm1(ndarray: INDArray): Byte = ndarray.norm1Number().byteValue() - - def norm2(ndarray: INDArray): Byte = ndarray.norm2Number().byteValue() - - def max(ndarray: INDArray): Byte = ndarray.maxNumber().byteValue() - - def min(ndarray: INDArray): Byte = ndarray.minNumber().byteValue() - - def standardDeviation(ndarray: INDArray): Byte = ndarray.stdNumber().byteValue() - - def product(ndarray: INDArray): Byte = ndarray.prodNumber().byteValue() - - def variance(ndarray: INDArray): Byte = ndarray.varNumber().byteValue() - - def get(a: INDArray, i: Long): Byte = a.getInt(i.toInt).toByte - - def get(a: INDArray, i: Int): Byte = a.getInt(i).toByte - - def get(a: INDArray, i: Int, j: Int): Byte = a.getInt(i, j).toByte - - def get(a: INDArray, i: Long*): Byte = a.getInt(i.map(_.toInt): _*).toByte - - def create(arr: Array[Byte]): INDArray = arr.toNDArray - - def create(arr: Array[Byte], shape: Int*): INDArray = arr.toNDArray - - def create(arr: Array[Byte], shape: Array[Int], ordering: 
NDOrdering, offset: Int): INDArray = - arr.mkNDArray(shape, ordering, offset) - - def greaterThan(left: Byte, right: Byte): Boolean = left > right - - def lessThan(left: Byte, right: Byte): Boolean = left < right - - def update(a: INDArray, indices: Array[IndexRange], i: Byte): INDArray = - a.update(indices, i) -} diff --git a/contrib/attic/nd4s/src/main/scala/org/nd4s/NDOrdering.scala b/contrib/attic/nd4s/src/main/scala/org/nd4s/NDOrdering.scala deleted file mode 100644 index 0323ee5d1..000000000 --- a/contrib/attic/nd4s/src/main/scala/org/nd4s/NDOrdering.scala +++ /dev/null @@ -1,38 +0,0 @@ -/* - * ****************************************************************************** - * * Copyright (c) 2021 Deeplearning4j Contributors - * * - * * This program and the accompanying materials are made available under the - * * terms of the Apache License, Version 2.0 which is available at - * * https://www.apache.org/licenses/LICENSE-2.0. - * * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * * License for the specific language governing permissions and limitations - * * under the License. 
- * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ -package org.nd4s - -import org.nd4j.linalg.factory.NDArrayFactory - -sealed trait NDOrdering { - val value: Char -} -object NDOrdering { - case object Fortran extends NDOrdering { - override val value: Char = NDArrayFactory.FORTRAN - } - case object C extends NDOrdering { - override val value: Char = NDArrayFactory.C - } - def apply(char: Char): NDOrdering = char.toLower match { - case 'c' => C - case 'f' => Fortran - case _ => - throw new IllegalArgumentException("NDimensional Ordering accepts only 'c' or 'f'.") - } -} diff --git a/contrib/attic/nd4s/src/main/scala/org/nd4s/OperatableNDArray.scala b/contrib/attic/nd4s/src/main/scala/org/nd4s/OperatableNDArray.scala deleted file mode 100644 index 955bda079..000000000 --- a/contrib/attic/nd4s/src/main/scala/org/nd4s/OperatableNDArray.scala +++ /dev/null @@ -1,167 +0,0 @@ -/* - * ****************************************************************************** - * * Copyright (c) 2021 Deeplearning4j Contributors - * * - * * This program and the accompanying materials are made available under the - * * terms of the Apache License, Version 2.0 which is available at - * * https://www.apache.org/licenses/LICENSE-2.0. - * * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * * License for the specific language governing permissions and limitations - * * under the License. 
- * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ -package org.nd4s - -import org.nd4j.linalg.api.ndarray.INDArray - -/** - * Scala DSL for arrays - */ -trait OperatableNDArray[A <: INDArray] { - val underlying: A - - // to keep compatibility with Predef.any2stringadd syntax. - def +(that: String): String = underlying.toString() + that - - // --- INDArray operators - def +(that: INDArray)(implicit ev: NDArrayEvidence[A, _]): A = - ev.add(underlying, that) - - def -(that: INDArray)(implicit ev: NDArrayEvidence[A, _]): A = - ev.sub(underlying, that) - - /** element-by-element multiplication */ - def *(that: INDArray)(implicit ev: NDArrayEvidence[A, _]): A = - ev.mul(underlying, that) - - /** matrix multiplication */ - def **(that: INDArray)(implicit ev: NDArrayEvidence[A, _]): A = - ev.mmul(underlying, that) - - /** matrix multiplication using Numpy syntax for arrays */ - def dot(that: INDArray)(implicit ev: NDArrayEvidence[A, _]): A = - ev.mmul(underlying, that) - - def /(that: INDArray)(implicit ev: NDArrayEvidence[A, _]): A = - ev.div(underlying, that) - - /** right division ... is this the correct symbol? 
*/ - def \(that: INDArray)(implicit ev: NDArrayEvidence[A, _]): A = - ev.rdiv(underlying, that) - - // --- In-place INDArray operators - /** In-place addition */ - def +=(that: INDArray)(implicit ev: NDArrayEvidence[A, _]): A = - ev.addi(underlying, that) - - /** In-place subtraction */ - def -=(that: INDArray)(implicit ev: NDArrayEvidence[A, _]): A = - ev.subi(underlying, that) - - /** In-place element-by-element multiplication */ - def *=(that: INDArray)(implicit ev: NDArrayEvidence[A, _]): A = - ev.muli(underlying, that) - - /** In-place matrix multiplication */ - def **=(that: INDArray)(implicit ev: NDArrayEvidence[A, _]): A = - ev.mmuli(underlying, that) - - /** In-place division */ - def /=(that: INDArray)(implicit ev: NDArrayEvidence[A, _]): A = - ev.divi(underlying, that) - - /** In-place right division */ - def \=(that: INDArray)(implicit ev: NDArrayEvidence[A, _]): A = - ev.rdivi(underlying, that) - - // --- Number operators - def +(that: Number)(implicit ev: NDArrayEvidence[A, _]): A = - ev.add(underlying, that) - - def -(that: Number)(implicit ev: NDArrayEvidence[A, _]): A = - ev.sub(underlying, that) - - def *(that: Number)(implicit ev: NDArrayEvidence[A, _]): A = - ev.mul(underlying, that) - - def /(that: Number)(implicit ev: NDArrayEvidence[A, _]): A = - ev.div(underlying, that) - - def \(that: Number)(implicit ev: NDArrayEvidence[A, _]): A = - ev.rdiv(underlying, that) - - // --- In-place Number operators - def +=(that: Number)(implicit ev: NDArrayEvidence[A, _]): A = - ev.addi(underlying, that) - - def -=(that: Number)(implicit ev: NDArrayEvidence[A, _]): A = - ev.subi(underlying, that) - - def *=(that: Number)(implicit ev: NDArrayEvidence[A, _]): A = - ev.muli(underlying, that) - - def /=(that: Number)(implicit ev: NDArrayEvidence[A, _]): A = - ev.divi(underlying, that) - - def %=(that: Number)(implicit ev: NDArrayEvidence[A, _]): A = - ev.remainderi(underlying, that) - - /** right division ... is this the correct symbol? 
*/ - def \=(that: Number)(implicit ev: NDArrayEvidence[A, _]): A = - ev.rdivi(underlying, that) - - def get[B](i: Int)(implicit ev: NDArrayEvidence[A, B]): B = - ev.get(underlying, i) - - def get[B](i: Int, j: Int)(implicit ev: NDArrayEvidence[A, B]): B = - ev.get(underlying, i, j) - - def get[B](indices: Int*)(implicit ev: NDArrayEvidence[A, B]): B = - ev.get(underlying, indices.map(_.toLong): _*) - - def apply[B](i: Int)(implicit ev: NDArrayEvidence[A, B]): B = get(i) - - def apply[B](i: Int, j: Int)(implicit ev: NDArrayEvidence[A, B]): B = - get(i, j) - - def apply[B](indices: Int*)(implicit ev: NDArrayEvidence[A, B]): B = - ev.get(underlying, indices.map(_.toLong): _*) - - def get[B](indices: Array[Int])(implicit ev: NDArrayEvidence[A, B]): B = - ev.get(underlying, indices.map(_.toLong): _*) - - def unary_-(): INDArray = underlying.neg() - - def T: INDArray = underlying.transpose() - - def ===(other: Number): INDArray = underlying.eq(other) - - def ===(other: INDArray): INDArray = underlying.eq(other) - - def sumT[B](implicit ev: NDArrayEvidence[A, B]): B = ev.sum(underlying) - - def meanT[B](implicit ev: NDArrayEvidence[A, B]): B = ev.mean(underlying) - - def normMaxT[B](implicit ev: NDArrayEvidence[A, B]): B = - ev.normMax(underlying) - - def norm1T[B](implicit ev: NDArrayEvidence[A, B]): B = ev.norm1(underlying) - - def norm2T[B](implicit ev: NDArrayEvidence[A, B]): B = ev.norm2(underlying) - - def maxT[B](implicit ev: NDArrayEvidence[A, B]): B = ev.max(underlying) - - def minT[B](implicit ev: NDArrayEvidence[A, B]): B = ev.min(underlying) - - def stdT[B](implicit ev: NDArrayEvidence[A, B]): B = - ev.standardDeviation(underlying) - - def prodT[B](implicit ev: NDArrayEvidence[A, B]): B = ev.product(underlying) - - def varT[B]()(implicit ev: NDArrayEvidence[A, B]): B = ev.variance(underlying) -} diff --git a/contrib/attic/nd4s/src/main/scala/org/nd4s/RowProjectedNDArray.scala b/contrib/attic/nd4s/src/main/scala/org/nd4s/RowProjectedNDArray.scala deleted file 
mode 100644 index 045276700..000000000 --- a/contrib/attic/nd4s/src/main/scala/org/nd4s/RowProjectedNDArray.scala +++ /dev/null @@ -1,54 +0,0 @@ -/* - * ****************************************************************************** - * * Copyright (c) 2021 Deeplearning4j Contributors - * * - * * This program and the accompanying materials are made available under the - * * terms of the Apache License, Version 2.0 which is available at - * * https://www.apache.org/licenses/LICENSE-2.0. - * * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * * License for the specific language governing permissions and limitations - * * under the License. - * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ -package org.nd4s - -import org.nd4j.linalg.api.ndarray.INDArray -import org.nd4j.linalg.indexing.{ NDArrayIndex, SpecifiedIndex } - -class RowProjectedNDArray(val array: INDArray, val filtered: Array[Int]) { - def this(ndarray: INDArray) { - this(ndarray, (0 until ndarray.rows()).toArray) - } - - def mapi(f: INDArray => INDArray): INDArray = { - for { - i <- filtered - } array.putRow(i, f(array.getRow(i))) - array.get(new SpecifiedIndex(filtered: _*), NDArrayIndex.all()) - } - - def map(f: INDArray => INDArray): INDArray = - new RowProjectedNDArray(array.dup(), filtered).mapi(f) - - def flatMap(f: INDArray => INDArray): INDArray = map(f) - - def flatMapi(f: INDArray => INDArray): INDArray = mapi(f) - - def foreach(f: INDArray => Unit): Unit = - for { - i <- filtered - } f(array.getRow(i)) - - def withFilter(f: INDArray => Boolean): RowProjectedNDArray = { - val targets = for { - i <- filtered - if f(array.getRow(i)) - } yield i - new RowProjectedNDArray(array, targets) - } -} diff --git 
a/contrib/attic/nd4s/src/main/scala/org/nd4s/SliceProjectedNDArray.scala b/contrib/attic/nd4s/src/main/scala/org/nd4s/SliceProjectedNDArray.scala deleted file mode 100644 index 28eedf383..000000000 --- a/contrib/attic/nd4s/src/main/scala/org/nd4s/SliceProjectedNDArray.scala +++ /dev/null @@ -1,54 +0,0 @@ -/* - * ****************************************************************************** - * * Copyright (c) 2021 Deeplearning4j Contributors - * * - * * This program and the accompanying materials are made available under the - * * terms of the Apache License, Version 2.0 which is available at - * * https://www.apache.org/licenses/LICENSE-2.0. - * * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * * License for the specific language governing permissions and limitations - * * under the License. - * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ -package org.nd4s - -import org.nd4j.linalg.api.ndarray.INDArray -import org.nd4j.linalg.indexing.{ NDArrayIndex, SpecifiedIndex } - -class SliceProjectedNDArray(val array: INDArray, val filtered: Array[Int]) { - def this(ndarray: INDArray) { - this(ndarray, (0 until ndarray.slices().toInt).toArray) - } - - def mapi(f: INDArray => INDArray): INDArray = { - for { - i <- filtered - } array.putSlice(i, f(array.slice(i))) - array.get(new SpecifiedIndex(filtered: _*) +: NDArrayIndex.allFor(array).init: _*) - } - - def map(f: INDArray => INDArray): INDArray = - new SliceProjectedNDArray(array.dup(), filtered).flatMapi(f) - - def flatMap(f: INDArray => INDArray): INDArray = map(f) - - def flatMapi(f: INDArray => INDArray): INDArray = mapi(f) - - def foreach(f: INDArray => Unit): Unit = - for { - i <- filtered - } f(array.slice(i)) - - def withFilter(f: INDArray => 
Boolean): SliceProjectedNDArray = { - val targets = for { - i <- filtered - if f(array.slice(i)) - } yield i - new SliceProjectedNDArray(array, targets) - } -} diff --git a/contrib/attic/nd4s/src/main/scala/org/nd4s/SliceableNDArray.scala b/contrib/attic/nd4s/src/main/scala/org/nd4s/SliceableNDArray.scala deleted file mode 100644 index da81a78f9..000000000 --- a/contrib/attic/nd4s/src/main/scala/org/nd4s/SliceableNDArray.scala +++ /dev/null @@ -1,358 +0,0 @@ -/* - * ****************************************************************************** - * * Copyright (c) 2021 Deeplearning4j Contributors - * * - * * This program and the accompanying materials are made available under the - * * terms of the Apache License, Version 2.0 which is available at - * * https://www.apache.org/licenses/LICENSE-2.0. - * * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * * License for the specific language governing permissions and limitations - * * under the License. - * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ -package org.nd4s - -import org.nd4s.Implicits._ -import org.nd4j.linalg.api.ndarray.INDArray -import org.nd4j.linalg.factory.Nd4j -import org.nd4j.linalg.indexing.{ INDArrayIndex, NDArrayIndex } -import org.slf4j.LoggerFactory - -import _root_.scala.annotation.tailrec - -package object ops { - case object :: extends IndexRange { - override def hasNegative: Boolean = false - } -} - -trait SliceableNDArray[A <: INDArray] { - lazy val log = LoggerFactory.getLogger(classOf[SliceableNDArray[A]]) - val underlying: A - - def apply[B](target: IndexRange*)(implicit ev: NDArrayEvidence[A, B], ev2: Manifest[B]): A = - subMatrix(target: _*)(ev, ev2) - - /* - Extract subMatrix at given position. 
- */ - def subMatrix[B](target: IndexRange*)(implicit ev: NDArrayEvidence[A, B], ev2: Manifest[B]): A = { - require(target.size <= underlying.shape().length, - "Matrix dimension must be equal or larger than shape's dimension to extract.") - - if (target.exists(_.hasNegative)) { - val SubMatrixIndices(indices, targetShape) = indicesFrom(target: _*) - - val lv = ev.linearView(underlying) - val filtered = indices.map { i => - ev.get(lv, i) - }.toArray - - ev.create(filtered, targetShape, NDOrdering(underlying.ordering()), 0) - - } else { - - ev.get(underlying, getINDArrayIndexfrom(target: _*): _*) - } - } - - def indicesFrom(target: IndexRange*): SubMatrixIndices = { - val originalShape: List[Int] = - if (underlying.isRowVector && underlying.shape().length == 1) - 1 +: underlying.shape().map(_.toInt).toList - else - underlying.shape().map(_.toInt).toList - - val originalTarget = - if (underlying.isRowVector && target.size == 1) - IntRange(0) +: target - else - target - - @tailrec - def modifyTargetIndices(input: List[IndexRange], i: Int, acc: List[DRange]): List[DRange] = input match { - case ops.:: :: t => - modifyTargetIndices(t, i + 1, DRange(0, originalShape(i), 1) :: acc) - case -> :: t => - modifyTargetIndices(t, i + 1, DRange(0, originalShape(i), 1) :: acc) - case ---> :: t => - val ellipsised = List.fill(originalShape.length - i - t.size)(->) - modifyTargetIndices(ellipsised ::: t, i, acc) - case IntRangeFrom(from: Int) :: t => - val max = originalShape(i) - modifyTargetIndices(t, i + 1, DRange(from, max, false, 1, max) :: acc) - case (inr: IndexNumberRange) :: t => - modifyTargetIndices(t, i + 1, inr.asRange(originalShape(i)) :: acc) - - case Nil => - acc.reverse - } - - val modifiedTarget = modifyTargetIndices(originalTarget.toList, 0, Nil) - - val targetShape = modifiedTarget.map(_.length).toArray - - def calcIndices(tgt: List[DRange], stride: List[Int]): List[Int] = { - val indicesOnAxis = (tgt zip stride).collect { - case (range, st) => range.toList.map(_ 
* st) - } - indicesOnAxis - .reduceLeftOption[List[Int]] { - case (l, r) => - if (underlying.ordering() == NDOrdering.C.value) - l.flatMap { i => - r.map(_ + i) - } else - r.flatMap { i => - l.map(_ + i) - } - } - .getOrElse(List.empty) - } - - val indices = - calcIndices(modifiedTarget.toList, Nd4j.getStrides(originalShape.toArray, underlying.ordering()).toList) - log.trace(s"${target.mkString("[", ",", "]")} means $modifiedTarget at ${originalShape - .mkString("[", "x", s"]${underlying.ordering}")} matrix with stride:${underlying.stride - .mkString(",")}. Target shape:${targetShape - .mkString("[", "x", s"]${underlying.ordering}")} indices:$indices") - SubMatrixIndices(indices, targetShape) - } - - def getINDArrayIndexfrom(target: IndexRange*): List[INDArrayIndex] = { - val originalShape: List[Int] = - if (underlying.isRowVector && underlying.shape().length == 1) - 1 +: underlying.shape().map(_.toInt).toList - else - underlying.shape().map(_.toInt).toList - - val originalTarget = - /*if (underlying.isRowVector && target.size == 1) - IntRange(0) +: target - else*/ - target - - @tailrec - def modifyTargetIndices(input: List[IndexRange], i: Int, acc: List[INDArrayIndex]): List[INDArrayIndex] = - input match { - case -> :: t => - val all = NDArrayIndex.all() - all.init(underlying, i) - modifyTargetIndices(t, i + 1, all :: acc) - case ---> :: t => - val ellipsised = List.fill(originalShape.length - i - t.size)(->) - modifyTargetIndices(ellipsised ::: t, i, acc) - case --- :: t => - val ellipsised = List.fill(originalShape.length - i - t.size)(->) - modifyTargetIndices(ellipsised ::: t, i, acc) - case IntRangeFrom(from: Int) :: t => - val max = originalShape(i) - modifyTargetIndices(t, i + 1, IndexNumberRange.toNDArrayIndex(from, max, false, 1, max) :: acc) - case (inr: IndexNumberRange) :: t => - modifyTargetIndices(t, i + 1, inr.asNDArrayIndex(originalShape(i).toInt) :: acc) - - case Nil => - acc.reverse - } - - val modifiedTarget = 
modifyTargetIndices(originalTarget.toList, 0, Nil) - log.trace(s"${target.mkString("[", ",", "]")} means $modifiedTarget at ${originalShape - .mkString("[", "x", s"]${underlying.ordering}")} matrix with stride:${underlying.stride - .mkString(",")}.") - modifiedTarget - } - - def update[T, T1](indices: Array[IndexRange], num: T)(implicit ev: NDArrayEvidence[A, T1], ev2: T => T1): INDArray = - ev.update(underlying, indices, num) - def update[T, T1](i1: IndexRange, num: T)(implicit ev: NDArrayEvidence[A, T1], ev2: T => T1): INDArray = - ev.update(underlying, Array(i1), num) - def update[T, T1](i1: IndexRange, i2: IndexRange, num: T)(implicit ev: NDArrayEvidence[A, T1], - ev2: T => T1): INDArray = - ev.update(underlying, Array(i1, i2), num) - def update[T, T1](i1: IndexRange, i2: IndexRange, i3: IndexRange, num: T)(implicit ev: NDArrayEvidence[A, T1], - ev2: T => T1): INDArray = - ev.update(underlying, Array(i1, i2, i3), num) - def update[T, T1](i1: IndexRange, i2: IndexRange, i3: IndexRange, i4: IndexRange, num: T)( - implicit ev: NDArrayEvidence[A, T1], - ev2: T => T1 - ): INDArray = - ev.update(underlying, Array(i1, i2, i3, i4), num) - def update[T, T1](i1: IndexRange, i2: IndexRange, i3: IndexRange, i4: IndexRange, i5: IndexRange, num: T)( - implicit ev: NDArrayEvidence[A, T1], - ev2: T => T1 - ): INDArray = - ev.update(underlying, Array(i1, i2, i3, i4, i5), num) - def update[T, T1](i1: IndexRange, - i2: IndexRange, - i3: IndexRange, - i4: IndexRange, - i5: IndexRange, - i6: IndexRange, - num: T)(implicit ev: NDArrayEvidence[A, T1], ev2: T => T1): INDArray = - ev.update(underlying, Array(i1, i2, i3, i4, i5, i6), num) - def update[T, T1](i1: IndexRange, - i2: IndexRange, - i3: IndexRange, - i4: IndexRange, - i5: IndexRange, - i6: IndexRange, - i7: IndexRange, - num: T)(implicit ev: NDArrayEvidence[A, T1], ev2: T => T1): INDArray = - ev.update(underlying, Array(i1, i2, i3, i4, i5, i6, i7), num) - def update[T, T1](i1: IndexRange, - i2: IndexRange, - i3: IndexRange, - 
i4: IndexRange, - i5: IndexRange, - i6: IndexRange, - i7: IndexRange, - i8: IndexRange, - num: T)(implicit ev: NDArrayEvidence[A, T1], ev2: T => T1): INDArray = - ev.update(underlying, Array(i1, i2, i3, i4, i5, i6, i7, i8), num) - def update[T, T1](i1: IndexRange, - i2: IndexRange, - i3: IndexRange, - i4: IndexRange, - i5: IndexRange, - i6: IndexRange, - i7: IndexRange, - i8: IndexRange, - i9: IndexRange, - num: T)(implicit ev: NDArrayEvidence[A, T1], ev2: T => T1): INDArray = - ev.update(underlying, Array(i1, i2, i3, i4, i5, i6, i7, i8, i9), num) - def update[T, T1](i1: IndexRange, - i2: IndexRange, - i3: IndexRange, - i4: IndexRange, - i5: IndexRange, - i6: IndexRange, - i7: IndexRange, - i8: IndexRange, - i9: IndexRange, - i10: IndexRange, - num: T)(implicit ev: NDArrayEvidence[A, T1], ev2: T => T1): INDArray = - ev.update(underlying, Array(i1, i2, i3, i4, i5, i6, i7, i8, i9, i10), num) - def update[T, T1](i1: IndexRange, - i2: IndexRange, - i3: IndexRange, - i4: IndexRange, - i5: IndexRange, - i6: IndexRange, - i7: IndexRange, - i8: IndexRange, - i9: IndexRange, - i10: IndexRange, - i11: IndexRange, - num: T)(implicit ev: NDArrayEvidence[A, T1], ev2: T => T1): INDArray = - ev.update(underlying, Array(i1, i2, i3, i4, i5, i6, i7, i8, i9, i10, i11), num) - def update[T, T1](i1: IndexRange, - i2: IndexRange, - i3: IndexRange, - i4: IndexRange, - i5: IndexRange, - i6: IndexRange, - i7: IndexRange, - i8: IndexRange, - i9: IndexRange, - i10: IndexRange, - i11: IndexRange, - i12: IndexRange, - num: T)(implicit ev: NDArrayEvidence[A, T1], ev2: T => T1): INDArray = - ev.update(underlying, Array(i1, i2, i3, i4, i5, i6, i7, i8, i9, i10, i11, i12), num) - - def update(indices: Array[IndexRange], num: A)(implicit ev: NDArrayEvidence[A, _]): INDArray = - ev.update(underlying, indices, num) - def update(i1: IndexRange, num: A)(implicit ev: NDArrayEvidence[A, _]): A = - ev.update(underlying, Array(i1), num) - def update(i1: IndexRange, i2: IndexRange, num: A)(implicit ev: 
NDArrayEvidence[A, _]): A = - ev.update(underlying, Array(i1, i2), num) - def update(i1: IndexRange, i2: IndexRange, i3: IndexRange, num: A)(implicit ev: NDArrayEvidence[A, _]): A = - ev.update(underlying, Array(i1, i2, i3), num) - def update(i1: IndexRange, i2: IndexRange, i3: IndexRange, i4: IndexRange, num: A)( - implicit ev: NDArrayEvidence[A, _] - ): A = - ev.update(underlying, Array(i1, i2, i3, i4), num) - def update(i1: IndexRange, i2: IndexRange, i3: IndexRange, i4: IndexRange, i5: IndexRange, num: A)( - implicit ev: NDArrayEvidence[A, _] - ): A = - ev.update(underlying, Array(i1, i2, i3, i4, i5), num) - def update(i1: IndexRange, i2: IndexRange, i3: IndexRange, i4: IndexRange, i5: IndexRange, i6: IndexRange, num: A)( - implicit ev: NDArrayEvidence[A, _] - ): A = - ev.update(underlying, Array(i1, i2, i3, i4, i5, i6), num) - def update(i1: IndexRange, - i2: IndexRange, - i3: IndexRange, - i4: IndexRange, - i5: IndexRange, - i6: IndexRange, - i7: IndexRange, - num: A)(implicit ev: NDArrayEvidence[A, _]): A = - ev.update(underlying, Array(i1, i2, i3, i4, i5, i6, i7), num) - def update(i1: IndexRange, - i2: IndexRange, - i3: IndexRange, - i4: IndexRange, - i5: IndexRange, - i6: IndexRange, - i7: IndexRange, - i8: IndexRange, - num: A)(implicit ev: NDArrayEvidence[A, _]): A = - ev.update(underlying, Array(i1, i2, i3, i4, i5, i6, i7, i8), num) - def update(i1: IndexRange, - i2: IndexRange, - i3: IndexRange, - i4: IndexRange, - i5: IndexRange, - i6: IndexRange, - i7: IndexRange, - i8: IndexRange, - i9: IndexRange, - num: A)(implicit ev: NDArrayEvidence[A, _]): A = - ev.update(underlying, Array(i1, i2, i3, i4, i5, i6, i7, i8, i9), num) - def update(i1: IndexRange, - i2: IndexRange, - i3: IndexRange, - i4: IndexRange, - i5: IndexRange, - i6: IndexRange, - i7: IndexRange, - i8: IndexRange, - i9: IndexRange, - i10: IndexRange, - num: A)(implicit ev: NDArrayEvidence[A, _]): A = - ev.update(underlying, Array(i1, i2, i3, i4, i5, i6, i7, i8, i9, i10), num) - def 
update(i1: IndexRange, - i2: IndexRange, - i3: IndexRange, - i4: IndexRange, - i5: IndexRange, - i6: IndexRange, - i7: IndexRange, - i8: IndexRange, - i9: IndexRange, - i10: IndexRange, - i11: IndexRange, - num: A)(implicit ev: NDArrayEvidence[A, _]): A = - ev.update(underlying, Array(i1, i2, i3, i4, i5, i6, i7, i8, i9, i10, i11), num) - def update(i1: IndexRange, - i2: IndexRange, - i3: IndexRange, - i4: IndexRange, - i5: IndexRange, - i6: IndexRange, - i7: IndexRange, - i8: IndexRange, - i9: IndexRange, - i10: IndexRange, - i11: IndexRange, - i12: IndexRange, - num: A)(implicit ev: NDArrayEvidence[A, _]): A = - ev.update(underlying, Array(i1, i2, i3, i4, i5, i6, i7, i8, i9, i10, i11, i12), num) -} -case class SubMatrixIndices(indices: List[Int], targetShape: Array[Int]) diff --git a/contrib/attic/nd4s/src/main/scala/org/nd4s/ops/BitFilterOps.scala b/contrib/attic/nd4s/src/main/scala/org/nd4s/ops/BitFilterOps.scala deleted file mode 100644 index cc738d904..000000000 --- a/contrib/attic/nd4s/src/main/scala/org/nd4s/ops/BitFilterOps.scala +++ /dev/null @@ -1,67 +0,0 @@ -/* - * ****************************************************************************** - * * Copyright (c) 2021 Deeplearning4j Contributors - * * - * * This program and the accompanying materials are made available under the - * * terms of the Apache License, Version 2.0 which is available at - * * https://www.apache.org/licenses/LICENSE-2.0. - * * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * * License for the specific language governing permissions and limitations - * * under the License. 
- * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ -package org.nd4s.ops - -import org.nd4j.autodiff.samediff.SDVariable -import org.nd4j.linalg.api.ndarray.INDArray -import org.nd4j.linalg.api.ops.BaseScalarOp -import org.nd4s.Implicits._ - -object BitFilterOps { - def apply(x: INDArray, f: Double => Boolean): BitFilterOps = - new BitFilterOps(x, x.length().toInt, f) -} - -class BitFilterOps(_x: INDArray, len: Int, f: Double => Boolean) - extends BaseScalarOp(_x, null: INDArray, _x, 0) - with LeftAssociativeBinaryOp { - - def this() { - this(0.toScalar, 0, null) - } - - x = _x - - override def opNum(): Int = -1 - - override def opName(): String = "bitfilter_scalar" - - override def onnxName(): String = throw new UnsupportedOperationException - - override def tensorflowName(): String = - throw new UnsupportedOperationException - - override def doDiff(f1: java.util.List[SDVariable]): java.util.List[SDVariable] = - throw new UnsupportedOperationException - -// override def opForDimension(index: Int, dimension: Int): Op = BitFilterOps(x.tensorAlongDimension(index,dimension),f,g) -// -// override def opForDimension(index: Int, dimension: Int*): Op = BitFilterOps(x.tensorAlongDimension(index,dimension:_*),f,g) - - override def op(origin: Double): Double = if (f(origin)) 1 else 0 - - override def op(origin: Float): Float = if (f(origin)) 1 else 0 - - override def op(origin: Short): Short = if (f(origin)) 1 else 0 - - override def op(origin: Int): Int = if (f(origin)) 1 else 0 - - override def op(origin: Long): Long = if (f(origin)) 1 else 0 - - override def op(origin: String): String = ??? 
-} diff --git a/contrib/attic/nd4s/src/main/scala/org/nd4s/ops/FilterOps.scala b/contrib/attic/nd4s/src/main/scala/org/nd4s/ops/FilterOps.scala deleted file mode 100644 index 49df3c118..000000000 --- a/contrib/attic/nd4s/src/main/scala/org/nd4s/ops/FilterOps.scala +++ /dev/null @@ -1,67 +0,0 @@ -/* - * ****************************************************************************** - * * Copyright (c) 2021 Deeplearning4j Contributors - * * - * * This program and the accompanying materials are made available under the - * * terms of the Apache License, Version 2.0 which is available at - * * https://www.apache.org/licenses/LICENSE-2.0. - * * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * * License for the specific language governing permissions and limitations - * * under the License. - * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ -package org.nd4s.ops - -import org.nd4j.autodiff.samediff.SDVariable -import org.nd4j.linalg.api.ndarray.INDArray -import org.nd4j.linalg.api.ops.BaseScalarOp -import org.nd4s.Implicits._ - -object FilterOps { - def apply(x: INDArray, f: Double => Boolean): FilterOps = - new FilterOps(x, x.length().toInt, f) -} -class FilterOps(_x: INDArray, len: Int, f: Double => Boolean) - extends BaseScalarOp(_x, null: INDArray, _x, 0) - with LeftAssociativeBinaryOp { - - def this() { - this(0.toScalar, 0, null) - } - - x = _x - - override def opNum(): Int = -1 - - override def opName(): String = "filter_scalar" - - override def onnxName(): String = throw new UnsupportedOperationException - - override def tensorflowName(): String = - throw new UnsupportedOperationException - - override def doDiff(f1: java.util.List[SDVariable]): java.util.List[SDVariable] = - throw new 
UnsupportedOperationException - -// override def opForDimension(index: Int, dimension: Int): Op = FilterOps(x.tensorAlongDimension(index,dimension),f,g) -// -// override def opForDimension(index: Int, dimension: Int*): Op = FilterOps(x.tensorAlongDimension(index,dimension:_*),f,g) - - override def op(origin: Double): Double = if (f(origin)) origin else 0 - - override def op(origin: Float): Float = if (f(origin)) origin else 0 - - override def op(origin: Short): Short = if (f(origin)) origin else 0 - - override def op(origin: Int): Int = if (f(origin)) origin else 0 - - override def op(origin: Long): Long = if (f(origin)) origin else 0 - - override def op(origin: String): String = ??? - -} diff --git a/contrib/attic/nd4s/src/main/scala/org/nd4s/ops/FunctionalOpExecutioner.scala b/contrib/attic/nd4s/src/main/scala/org/nd4s/ops/FunctionalOpExecutioner.scala deleted file mode 100644 index e501ad8a1..000000000 --- a/contrib/attic/nd4s/src/main/scala/org/nd4s/ops/FunctionalOpExecutioner.scala +++ /dev/null @@ -1,542 +0,0 @@ -/* - * ****************************************************************************** - * * Copyright (c) 2021 Deeplearning4j Contributors - * * - * * This program and the accompanying materials are made available under the - * * terms of the Apache License, Version 2.0 which is available at - * * https://www.apache.org/licenses/LICENSE-2.0. - * * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * * License for the specific language governing permissions and limitations - * * under the License. 
- * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ -package org.nd4s.ops - -import java.util.{ List, Map, Properties } - -import org.bytedeco.javacpp.Pointer -import org.nd4j.linalg.api.buffer.{ DataBuffer, DataType } -import org.nd4j.linalg.api.environment.Nd4jEnvironment -import org.nd4j.linalg.api.ndarray.{ INDArray, INDArrayStatistics } -import org.nd4j.linalg.api.ops.aggregates.{ Aggregate, Batch } -import org.nd4j.linalg.api.ops._ -import org.nd4j.linalg.api.ops.executioner.OpExecutioner -import org.nd4j.linalg.api.ops.impl.scatter.ScatterUpdate -import org.nd4j.linalg.api.ops.impl.summarystats.Variance -import org.nd4j.linalg.api.rng.Random -import org.nd4j.linalg.api.shape.{ LongShapeDescriptor, TadPack } -import org.nd4j.linalg.cache.TADManager -import org.nd4j.linalg.factory.Nd4j -import org.nd4j.linalg.profiler.ProfilerConfig -import org.slf4j.{ Logger, LoggerFactory } - -object FunctionalOpExecutioner { - def apply: FunctionalOpExecutioner = new FunctionalOpExecutioner() -} -class FunctionalOpExecutioner extends OpExecutioner { - - def log: Logger = LoggerFactory.getLogger(FunctionalOpExecutioner.getClass) - - private[this] var verboseEnabled: Boolean = false - - def isVerbose: Boolean = verboseEnabled - - def enableVerboseMode(reallyEnable: Boolean): Unit = - verboseEnabled = reallyEnable - - /** - * This method returns true if debug mode is enabled, false otherwise - * - * @return - */ - private[this] var debugEnabled: Boolean = false - - def isDebug: Boolean = debugEnabled - - def enableDebugMode(reallyEnable: Boolean): Unit = - debugEnabled = reallyEnable - - /** - * This method returns type for this executioner instance - * - * @return - */ - def `type`: OpExecutioner.ExecutionerType = ??? - - /** - * This method returns opName of the last invoked op - * - * @return - */ - def getLastOp: String = ??? 
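The `FilterOps` and `BitFilterOps` classes above both wrap a `Double => Boolean` predicate as an nd4j scalar op: the former keeps matching values and zeroes the rest, the latter emits a 0/1 mask. A minimal, dependency-free sketch of those element-wise semantics, using a plain `Array[Double]` in place of an `INDArray` (illustrative only, not the actual nd4j op machinery):

```scala
// Illustrative sketch of the semantics of filter_scalar and bitfilter_scalar
// from this diff, on a plain Array[Double] instead of an INDArray.
object FilterSemantics {
  // filter_scalar: keep an element where the predicate holds, else emit 0
  def filterOp(xs: Array[Double], f: Double => Boolean): Array[Double] =
    xs.map(v => if (f(v)) v else 0.0)

  // bitfilter_scalar: emit a 0/1 mask instead of the original values
  def bitFilterOp(xs: Array[Double], f: Double => Boolean): Array[Double] =
    xs.map(v => if (f(v)) 1.0 else 0.0)
}
```

For example, filtering `Array(-1.0, 2.0, 0.5)` with `_ > 0` yields `Array(0.0, 2.0, 0.5)` under `filterOp` and `Array(0.0, 1.0, 1.0)` under `bitFilterOp`.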
- - /** - * Execute the operation - * - * @param op the operation to execute - */ - def exec(op: Op): INDArray = - op match { - case op: FilterOps => exec(op.asInstanceOf[FilterOps]) - case op: BitFilterOps => exec(op.asInstanceOf[BitFilterOps]) - case op: MapOps => exec(op.asInstanceOf[MapOps]) - case _ => op.z() - } - - def exec(op: Op, context: OpContext): INDArray = - Nd4j.getExecutioner.exec(op, context) - - def exec(op: FilterOps): INDArray = { - val retVal: INDArray = Nd4j.create(op.x.dataType(), op.x.shape().map(_.toLong): _*) - for (i <- 0 until op.x().length().toInt) { - val filtered = op.x.dataType() match { - case DataType.DOUBLE => op.op(op.x.getDouble(i.toLong)) - case DataType.FLOAT => op.op(op.x.getFloat(i.toLong)) - case DataType.INT => op.op(op.x.getInt(i)) - case DataType.SHORT => op.op(op.x.getInt(i)) - case DataType.LONG => op.op(op.x.getLong(i.toLong)) - } - retVal.putScalar(i, filtered) - } - retVal - } - - def exec(op: BitFilterOps): INDArray = { - val retVal: INDArray = Nd4j.create(op.x.dataType(), op.x.shape().map(_.toLong): _*) - for (i <- 0 until op.x().length().toInt) { - val current = if (op.x.dataType() == DataType.DOUBLE) op.x().getDouble(i.toLong) else op.x().getInt(i) - val filtered = op.op(current) - - retVal.putScalar(i, filtered) - } - retVal - } - - def exec(op: MapOps): INDArray = { - val retVal: INDArray = Nd4j.create(op.x.dataType(), op.x.shape().map(_.toLong): _*) - for (i <- 0 until op.x().length().toInt) { - val current = if (op.x.dataType() == DataType.DOUBLE) op.x().getDouble(i.toLong) else op.x().getInt(i) - val filtered = op.op(current) - - retVal.putScalar(i, filtered) - } - retVal - } - - /** Execute a TransformOp and return the result - * - * @param op the operation to execute - */ - def execAndReturn(op: TransformOp): TransformOp = - Nd4j.getExecutioner.execAndReturn(op) - - /** - * Execute and return the result from an accumulation - * - * @param op the operation to execute - * @return the accumulated result - */ 
- def execAndReturn(op: ReduceOp): ReduceOp = - Nd4j.getExecutioner.execAndReturn(op) - - def execAndReturn(op: Variance): Variance = - Nd4j.getExecutioner.execAndReturn(op) - - /** Execute and return the result from an index accumulation - * - * @param op the index accumulation operation to execute - * @return the accumulated index - */ - def execAndReturn(op: IndexAccumulation): IndexAccumulation = - Nd4j.getExecutioner.execAndReturn(op) - - /** Execute and return the result from a scalar op - * - * @param op the operation to execute - * @return the accumulated result - */ - def execAndReturn(op: ScalarOp): ScalarOp = - Nd4j.getExecutioner.execAndReturn(op) - - /** Execute and return the result from a vector op - * - * @param op */ - def execAndReturn(op: BroadcastOp): BroadcastOp = - Nd4j.getExecutioner.execAndReturn(op) - - /** - * Execute a reduceOp, possibly along one or more dimensions - * - * @param reduceOp the reduceOp - * @return the reduceOp op - */ - def exec(reduceOp: ReduceOp): INDArray = - Nd4j.getExecutioner.exec(reduceOp) - - /** - * Execute a broadcast op, possibly along one or more dimensions - * - * @param broadcast the accumulation - * @return the broadcast op - */ - def exec(broadcast: BroadcastOp): INDArray = - Nd4j.getExecutioner.exec(broadcast) - - /** - * Execute ScalarOp - * - * @param broadcast - * @return - */ - def exec(broadcast: ScalarOp): INDArray = - Nd4j.exec(broadcast) - - /** - * Execute a variance accumulation op, possibly along one or more dimensions - * - * @param accumulation the accumulation - * @return the accumulation op - */ - def exec(accumulation: Variance): INDArray = - Nd4j.getExecutioner.exec(accumulation) - - /** Execute an index accumulation along one or more dimensions - * - * @param indexAccum the index accumulation operation - * @return result - */ - def exec(indexAccum: IndexAccumulation): INDArray = - Nd4j.getExecutioner.exec(indexAccum) - - /** - * - * Execute and return a result - * ndarray from the given 
op - * - * @param op the operation to execute - * @return the result from the operation - */ - def execAndReturn(op: Op): Op = - Nd4j.getExecutioner.execAndReturn(op) - - /** - * Execute MetaOp - * - * @param op - */ - def exec(op: MetaOp): Unit = - Nd4j.getExecutioner.exec(op) - - /** - * Execute GridOp - * - * @param op - */ - def exec(op: GridOp): Unit = - Nd4j.getExecutioner.exec(op) - - /** - * - * @param op - */ - def exec(op: Aggregate): Unit = - Nd4j.getExecutioner.exec(op) - - /** - * This method executes a previously built batch - * - * @param batch - */ - def exec[T <: Aggregate](batch: Batch[T]): Unit = - Nd4j.getExecutioner.exec(batch) - - /** - * This method takes an arbitrarily sized list of aggregates, - * and packs them into batches - * - * @param batch - */ - def exec(batch: java.util.List[Aggregate]): Unit = - Nd4j.getExecutioner.exec(batch) - - /** - * This method executes the specified RandomOp using the default RNG available via Nd4j.getRandom() - * - * @param op - */ - def exec(op: RandomOp): INDArray = - Nd4j.getExecutioner.exec(op) - - /** - * This method executes a specific RandomOp against the specified RNG - * - * @param op - * @param rng - */ - def exec(op: RandomOp, rng: Random): INDArray = - Nd4j.getExecutioner.exec(op, rng) - - /** - * This method returns a set of key/value and - * key/key/value objects, - * describing the current environment - * - * @return - */ - def getEnvironmentInformation: Properties = - Nd4j.getExecutioner.getEnvironmentInformation - - /** - * This method specifies the desired profiling mode - * - * @param mode - */ - @deprecated def setProfilingMode(mode: OpExecutioner.ProfilingMode): Unit = ??? - - /** - * This method stores the specified configuration. - * - * @param config - */ - def setProfilingConfig(config: ProfilerConfig): Unit = - Nd4j.getExecutioner.setProfilingConfig(config) - - /** - * This method returns the current profiling mode - * - * @return - */ - @deprecated def getProfilingMode: OpExecutioner.ProfilingMode = ??? 
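The `exec(FilterOps)`, `exec(BitFilterOps)`, and `exec(MapOps)` overloads earlier in this executioner all follow the same pattern: allocate an output buffer of the input's shape, walk every linear element, apply the op's per-element function, and write the result back. A hedged sketch of that loop, with plain `Array[Double]` values standing in for the `INDArray` buffers:

```scala
// Illustrative sketch of FunctionalOpExecutioner's element loop, without nd4j types.
def execElementwise(x: Array[Double], op: Double => Double): Array[Double] = {
  val out = new Array[Double](x.length) // mirrors Nd4j.create(dataType, shape)
  var i = 0
  while (i < x.length) {                // mirrors `for (i <- 0 until x.length().toInt)`
    out(i) = op(x(i))                   // mirrors op.op(x.getDouble(i)) + retVal.putScalar(i, ...)
    i += 1
  }
  out
}
```

For example, running it with a rectifier-style predicate op zeroes the negative entries while leaving positive ones untouched.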
- - /** - * This method returns the TADManager instance used for this OpExecutioner - * - * @return - */ - def getTADManager: TADManager = - Nd4j.getExecutioner.getTADManager - - /** - * This method prints out the environment information returned by the getEnvironmentInformation() method - */ - def printEnvironmentInformation(): Unit = - Nd4j.getExecutioner.printEnvironmentInformation() - - /** - * This method ensures that all operations that are supposed to be executed at this moment are executed. - */ - def push(): Unit = ??? - - /** - * This method ensures that all operations that are supposed to be executed at this moment are executed and finished. - */ - def commit(): Unit = ??? - - /** - * This method encodes an array as thresholds, updating the input array at the same time - * - * @param input - * @return encoded array is returned - */ - def thresholdEncode(input: INDArray, threshold: Double): INDArray = ??? - - def thresholdEncode(input: INDArray, threshold: Double, boundary: Integer): INDArray = ??? - - /** - * This method decodes a thresholds array and puts the result into the target array - * - * @param encoded - * @param target - * @return target is returned - */ - def thresholdDecode(encoded: INDArray, target: INDArray): INDArray = ??? - - /** - * This method returns the number of elements affected by the encoder - * - * @param indArray - * @param target - * @param threshold - * @return - */ - def bitmapEncode(indArray: INDArray, target: INDArray, threshold: Double): Long = ??? - - /** - * - * @param indArray - * @param threshold - * @return - */ - def bitmapEncode(indArray: INDArray, threshold: Double): INDArray = ??? - - /** - * - * @param encoded - * @param target - * @return - */ - def bitmapDecode(encoded: INDArray, target: INDArray): INDArray = ??? - - /** - * This method returns the names of all custom operations available in the current backend, and their number of input/output arguments - * - * @return - */ - def getCustomOperations: java.util.Map[String, CustomOpDescriptor] = ??? 
- - /** - * This method executes the given CustomOp - * - * PLEASE NOTE: You're responsible for input/output validation - * - * @param op - */ - def execAndReturn(op: CustomOp): CustomOp = ??? - - def exec(op: CustomOp): Array[INDArray] = ??? - - /** - * This method executes the op with the given context - * - * @param op - * @param context - * @return the output arrays defined within the context - */ - def exec(op: CustomOp, context: OpContext): Array[INDArray] = - Nd4j.getExecutioner.exec(op, context) - - def calculateOutputShape(op: CustomOp): java.util.List[LongShapeDescriptor] = - Nd4j.getExecutioner.calculateOutputShape(op) - - def calculateOutputShape(op: CustomOp, ctx: OpContext): java.util.List[LongShapeDescriptor] = - Nd4j.getExecutioner.calculateOutputShape(op, ctx) - - /** - * Equivalent to calli - */ - def allocateOutputArrays(op: CustomOp): Array[INDArray] = - Nd4j.getExecutioner.allocateOutputArrays(op) - - def isExperimentalMode: Boolean = true - - def registerGraph(id: Long, graph: Pointer): Unit = ??? - - def executeGraph(id: Long, - map: java.util.Map[String, INDArray], - reverseMap: java.util.Map[String, Integer]): java.util.Map[String, INDArray] = ??? - - def forgetGraph(id: Long): Unit = ??? - - /** - * This method allows setting the desired number of elements per thread, for performance optimization purposes. - * E.g. if an array contains 2048 elements and the threshold is set to 1024, 2 threads will be used for the given op execution. - * - * Default value: 1024 - * - * @param threshold - */ - def setElementsThreshold(threshold: Int): Unit = ??? - - /** - * This method allows setting the desired number of sub-arrays per thread, for performance optimization purposes. - * E.g. if a matrix has shape 64 x 128 and the threshold is set to 8, each thread will process 8 sub-arrays (assuming an 8-core CPU). 
- * If your CPU has, say, 4 cores, only 4 threads will be spawned, and each will process 16 sub-arrays - * - * Default value: 8 - * - * @param threshold - */ - def setTadThreshold(threshold: Int): Unit = ??? - - /** - * This method extracts a String from a Utf8Buffer - * - * @param buffer - * @param index - * @return - */ - def getString(buffer: DataBuffer, index: Long): String = ??? - - /** - * This method returns an OpContext which can be used (and reused) to execute custom ops - * - * @return - */ - def buildContext: OpContext = ??? - - /** - * - * @param array - */ - def inspectArray(array: INDArray): INDArrayStatistics = ??? - - /** - * This method returns a shapeInfo DataBuffer - * - * @param shape - * @param stride - * @param elementWiseStride - * @param order - * @param dtype - * @return - */ - def createShapeInfo(shape: Array[Long], - stride: Array[Long], - elementWiseStride: Long, - order: Char, - dtype: DataType, - empty: Boolean): DataBuffer = ??? - - /** - * This method returns a shapeInfo DataBuffer - * - * @param shape - * @param stride - * @param elementWiseStride - * @param order - * @param dtype - * @return - */ - def createShapeInfo(shape: Array[Long], - stride: Array[Long], - elementWiseStride: Long, - order: Char, - dtype: DataType, - extras: Long): DataBuffer = ??? - - /** - * This method returns the host/device TAD buffers - */ - def tadShapeInfoAndOffsets(array: INDArray, dimension: Array[Int]): TadPack = ??? - - /** - * This method returns a constant buffer for the given JVM array - * - * @param values - * @return - */ - def createConstantBuffer(values: Array[Long], desiredType: DataType): DataBuffer = ??? - - def createConstantBuffer(values: Array[Int], desiredType: DataType): DataBuffer = ??? - - def createConstantBuffer(values: Array[Float], desiredType: DataType): DataBuffer = ??? - - def createConstantBuffer(values: Array[Double], desiredType: DataType): DataBuffer = ??? 
- - def runFullBenchmarkSuit(x: Boolean): String = - Nd4j.getExecutioner.runFullBenchmarkSuit(x) - - def runLightBenchmarkSuit(x: Boolean): String = - Nd4j.getExecutioner.runLightBenchmarkSuit(x) - - @deprecated def scatterUpdate(op: ScatterUpdate.UpdateOp, - array: INDArray, - indices: INDArray, - updates: INDArray, - axis: Array[Int]): Unit = ??? -} diff --git a/contrib/attic/nd4s/src/main/scala/org/nd4s/ops/LeftAssociativeBinaryOp.scala b/contrib/attic/nd4s/src/main/scala/org/nd4s/ops/LeftAssociativeBinaryOp.scala deleted file mode 100644 index 64bec103e..000000000 --- a/contrib/attic/nd4s/src/main/scala/org/nd4s/ops/LeftAssociativeBinaryOp.scala +++ /dev/null @@ -1,48 +0,0 @@ -/* - * ****************************************************************************** - * * Copyright (c) 2021 Deeplearning4j Contributors - * * - * * This program and the accompanying materials are made available under the - * * terms of the Apache License, Version 2.0 which is available at - * * https://www.apache.org/licenses/LICENSE-2.0. - * * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * * License for the specific language governing permissions and limitations - * * under the License. 
- * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ -package org.nd4s.ops - -import org.nd4s.Implicits._ - -trait LeftAssociativeBinaryOp { - -// def op(origin: IComplexNumber, other: Double): IComplexNumber = op(origin) -// -// def op(origin: IComplexNumber, other: Float): IComplexNumber = op(origin) -// -// def op(origin: IComplexNumber, other: IComplexNumber): IComplexNumber = -// op(origin) - - def op(origin: Float, other: Float): Float = op(origin) - - def op(origin: Double, other: Double): Double = op(origin) - - def op(origin: Double): Double - - def op(origin: Float): Float - - def op(origin: Short): Short - - def op(origin: Int): Int - - def op(origin: Long): Long - - def op(origin: String): String - -// def op(origin: IComplexNumber): IComplexNumber -} diff --git a/contrib/attic/nd4s/src/main/scala/org/nd4s/ops/MapOps.scala b/contrib/attic/nd4s/src/main/scala/org/nd4s/ops/MapOps.scala deleted file mode 100644 index b11e180fa..000000000 --- a/contrib/attic/nd4s/src/main/scala/org/nd4s/ops/MapOps.scala +++ /dev/null @@ -1,61 +0,0 @@ -/* - * ****************************************************************************** - * * Copyright (c) 2021 Deeplearning4j Contributors - * * - * * This program and the accompanying materials are made available under the - * * terms of the Apache License, Version 2.0 which is available at - * * https://www.apache.org/licenses/LICENSE-2.0. - * * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * * License for the specific language governing permissions and limitations - * * under the License. 
- * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ -package org.nd4s.ops - -import org.nd4j.autodiff.samediff.SDVariable -import org.nd4j.linalg.api.ndarray.INDArray -import org.nd4j.linalg.api.ops.BaseScalarOp -import org.nd4s.Implicits._ - -object MapOps { - def apply(x: INDArray, f: Double => Double): MapOps = new MapOps(x, f) -} -class MapOps(_x: INDArray, f: Double => Double) extends BaseScalarOp(_x, null, _x, 0) with LeftAssociativeBinaryOp { - x = _x - def this() { - this(0.toScalar, null) - } - - override def opNum(): Int = -1 - - override def opName(): String = "map_scalar" - - override def onnxName(): String = throw new UnsupportedOperationException - - override def tensorflowName(): String = - throw new UnsupportedOperationException - - override def doDiff(f1: java.util.List[SDVariable]): java.util.List[SDVariable] = - throw new UnsupportedOperationException - -// override def opForDimension(index: Int, dimension: Int): Op = MapOps(x.tensorAlongDimension(index,dimension),f,g) -// -// override def opForDimension(index: Int, dimension: Int*): Op = MapOps(x.tensorAlongDimension(index,dimension:_*),f,g) - - override def op(origin: Double): Double = f(origin) - - override def op(origin: Float): Float = f(origin).toFloat - - override def op(origin: Short): Short = f(origin).toShort - - override def op(origin: Int): Int = f(origin).toInt - - override def op(origin: Long): Long = f(origin).toLong - - override def op(origin: String): String = ??? 
-} diff --git a/contrib/attic/nd4s/src/main/scala/org/nd4s/samediff/SameDiff.scala b/contrib/attic/nd4s/src/main/scala/org/nd4s/samediff/SameDiff.scala deleted file mode 100644 index 78cafd235..000000000 --- a/contrib/attic/nd4s/src/main/scala/org/nd4s/samediff/SameDiff.scala +++ /dev/null @@ -1,191 +0,0 @@ -/* - * ****************************************************************************** - * * Copyright (c) 2021 Deeplearning4j Contributors - * * - * * This program and the accompanying materials are made available under the - * * terms of the Apache License, Version 2.0 which is available at - * * https://www.apache.org/licenses/LICENSE-2.0. - * * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * * License for the specific language governing permissions and limitations - * * under the License. - * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ -package org.nd4s.samediff - -import org.nd4j.linalg.api.ndarray.INDArray -import org.nd4j.autodiff.samediff.{ SDIndex, SDVariable, SameDiff } -import org.nd4j.linalg.api.buffer.DataType -import org.nd4j.linalg.factory.Nd4j - -/** - * Provides wrappers for nd4j SameDiff and related classes. - * - * Wrappers are designed to be used implicitly, client code - * should be similar to nd4j with additional syntactic sugar - * and Scala specific stuff. 
- * - * @author Alexander Stoyakin - */ -class SameDiffWrapper { - - var sd: SameDiff = SameDiff.create() - - def this(sd: SameDiff) { - this() - this.sd = sd - } - - def bind(name: String, data: INDArray): SDVariable = - sd.`var`(name, data) - - def bind(name: String, dataType: DataType, shape: Array[Long]): SDVariable = - sd.`var`(name, dataType, shape: _*) - - def bind(data: INDArray): SDVariable = - sd.`var`("", data) - - def bind(name: String, dataType: DataType, shape: Array[Int]): SDVariable = - sd.`var`(name, dataType, shape: _*) - - def placeHolder(name: String, dataType: DataType, shape: Long*): SDVariable = - sd.placeHolder(name, dataType, shape: _*) -} - -case class SDIndexWrapper(end: Long) { - - def ::(start: Long): SDIndex = - SDIndex.interval(start, end) -} - -case class SDIndexWrapper1(start: Int) { - - def ::(end: Int): SDIndex = - SDIndex.interval(start.toLong, end.toLong) -} - -object --- extends SDIndex { - val thisIndex: SDIndex = SDIndex.all() -} - -class SDVariableWrapper { - - var thisVariable: SDVariable = null - var isScalar: Boolean = false - val --- : SDIndex = SDIndex.all() - - def this(variable: SDVariable) { - this() - thisVariable = variable - } - - // Indexing - def apply(index: Long): SDVariable = thisVariable.get(SDIndex.point(index)) - - def apply(index: SDIndex*): SDVariable = thisVariable.get(index: _*) - - // Arithmetic - def add(other: Double): Unit = thisVariable.add(other) - - def *(other: SDVariable): SDVariable = - thisVariable.mul(other) - - def +(other: SDVariable): SDVariable = - thisVariable.add(other) - - def /(other: SDVariable): SDVariable = - if (isScalar) - thisVariable.rdiv(other) - else - thisVariable.div(other) - - def -(other: SDVariable): SDVariable = - if (isScalar) - thisVariable.rsub(other) - else - thisVariable.sub(other) - - def %(other: SDVariable): SDVariable = thisVariable.mod(null, other) - - def `//`(other: SDVariable): SDVariable = thisVariable.fdiv(null, other) - - def unary_-(): SDVariable = 
thisVariable.neg - - def ^(other: SDVariable)(implicit sameDiff: SameDiff): SDVariable = sameDiff.math.xor(thisVariable, other) - def |(other: SDVariable)(implicit sameDiff: SameDiff): SDVariable = sameDiff.math.or(thisVariable, other) - def &(other: SDVariable)(implicit sameDiff: SameDiff): SDVariable = sameDiff.math.and(thisVariable, other) - - def <<(other: SDVariable)(implicit sameDiff: SameDiff): SDVariable = sameDiff.math.bitShift(null, thisVariable, other) - def >>(other: SDVariable)(implicit sameDiff: SameDiff): SDVariable = - sameDiff.math.bitShiftRight(null, thisVariable, other) - def <<(x: Int)(implicit sameDiff: SameDiff): SDVariable = - sameDiff.math.bitShift(null, thisVariable, sameDiff.constant(x)) - def >>(x: Int)(implicit sameDiff: SameDiff): SDVariable = - sameDiff.math.bitShiftRight(null, thisVariable, sameDiff.constant(x)) - - // Overloads for numeric arguments - // Float - def *(other: Float)(implicit sameDiff: SameDiff): SDVariable = - thisVariable.mul(sameDiff.constant(other)) - - def +(other: Float)(implicit sameDiff: SameDiff): SDVariable = - thisVariable.add(sameDiff.constant(other)) - - def -(other: Float)(implicit sameDiff: SameDiff): SDVariable = - if (isScalar) - thisVariable.rsub(sameDiff.constant(other)) - else - thisVariable.sub(sameDiff.constant(other)) - - def /(other: Float)(implicit sameDiff: SameDiff): SDVariable = - if (isScalar) - thisVariable.rdiv(sameDiff.constant(other)) - else - thisVariable.div(sameDiff.constant(other)) - - def %(other: Float)(implicit sameDiff: SameDiff): SDVariable = - thisVariable.mod(null, sameDiff.constant(other)) - - def `//`(other: Float)(implicit sameDiff: SameDiff): SDVariable = - thisVariable.fdiv(null, sameDiff.constant(other)) - - //Double - def *(other: Double)(implicit sameDiff: SameDiff): SDVariable = - thisVariable.mul(sameDiff.constant(other)) - - def +(other: Double)(implicit sameDiff: SameDiff): SDVariable = - thisVariable.add(sameDiff.constant(other)) - - def -(other: Double)(implicit 
sameDiff: SameDiff): SDVariable = - if (isScalar) - thisVariable.rsub(sameDiff.constant(other)) - else - thisVariable.sub(sameDiff.constant(other)) - - def /(other: Double)(implicit sameDiff: SameDiff): SDVariable = - if (isScalar) - thisVariable.rdiv(sameDiff.constant(other)) - else - thisVariable.div(sameDiff.constant(other)) - - def %(other: Double)(implicit sameDiff: SameDiff): SDVariable = - thisVariable.mod(null, sameDiff.constant(other)) - - def `//`(other: Double)(implicit sameDiff: SameDiff): SDVariable = - thisVariable.fdiv(null, sameDiff.constant(other)) - - // Int - def **(x: Int): SDVariable = - thisVariable.pow(x) - - def ^(other: Boolean)(implicit sameDiff: SameDiff): SDVariable = - sameDiff.math.xor(thisVariable, sameDiff.constant(Nd4j.scalar(other))) - def |(other: Boolean)(implicit sameDiff: SameDiff): SDVariable = - sameDiff.math.or(thisVariable, sameDiff.constant(Nd4j.scalar(other))) - def &(other: Boolean)(implicit sameDiff: SameDiff): SDVariable = - sameDiff.math.and(thisVariable, sameDiff.constant(Nd4j.scalar(other))) -} diff --git a/contrib/attic/nd4s/src/main/scala/org/nd4s/samediff/implicits/Implicits.scala b/contrib/attic/nd4s/src/main/scala/org/nd4s/samediff/implicits/Implicits.scala deleted file mode 100644 index 4e2fb87bd..000000000 --- a/contrib/attic/nd4s/src/main/scala/org/nd4s/samediff/implicits/Implicits.scala +++ /dev/null @@ -1,64 +0,0 @@ -/* - * ****************************************************************************** - * * Copyright (c) 2021 Deeplearning4j Contributors - * * - * * This program and the accompanying materials are made available under the - * * terms of the Apache License, Version 2.0 which is available at - * * https://www.apache.org/licenses/LICENSE-2.0. - * * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. 
See the - * * License for the specific language governing permissions and limitations - * * under the License. - * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ -package org.nd4s.samediff.implicits - -import org.nd4j.autodiff.samediff.{ SDIndex, SDVariable, SameDiff } -import org.nd4j.linalg.factory.Nd4j -import org.nd4s.samediff.{ SDIndexWrapper, SDVariableWrapper, SameDiffWrapper } - -object Implicits { - implicit def SameDiffToWrapper(sd: SameDiff): SameDiffWrapper = - new SameDiffWrapper(sd) - - implicit def SDVariableToWrapper(variable: SDVariable): SDVariableWrapper = - new SDVariableWrapper(variable) - - implicit def FloatToSDVariable(x: Float)(implicit sd: SameDiff): SDVariableWrapper = { - val result = new SDVariableWrapper(sd.constant(x)) - result.isScalar = true - result - } - - implicit def DoubleToSDVariable(x: Double)(implicit sd: SameDiff): SDVariableWrapper = { - val result = new SDVariableWrapper(sd.constant(x)) - result.isScalar = true - result - } - - implicit def BooleanToSDVariable(x: Boolean)(implicit sd: SameDiff): SDVariableWrapper = { - val result = new SDVariableWrapper(sd.constant(Nd4j.scalar(x))) - result.isScalar = true - result - } - - implicit def RangeToWrapper(start: Long): SDIndexWrapper = { - val result = new SDIndexWrapper(start) - result - } - - implicit def LongToPoint(x: Long): SDIndex = - SDIndex.point(x) - - implicit def IntRangeToWrapper(start: Int): SDIndexWrapper = { - val result = new SDIndexWrapper(start) - result - } - - implicit def IntToPoint(x: Int): SDIndex = - SDIndex.point(x) -} diff --git a/contrib/attic/nd4s/src/test/scala/org/nd4s/BreezeCheck.scala b/contrib/attic/nd4s/src/test/scala/org/nd4s/BreezeCheck.scala deleted file mode 100644 index f19871acd..000000000 --- a/contrib/attic/nd4s/src/test/scala/org/nd4s/BreezeCheck.scala +++ /dev/null @@ -1,82 +0,0 @@ -/* - * 
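The implicit conversions in the `Implicits` object above follow the standard Scala enrichment ("pimp my library") pattern: a plain nd4j object is silently wrapped so that operator syntax becomes available on it. A minimal self-contained sketch of the same pattern, using a hypothetical `Tensor` class rather than the real nd4j types:

```scala
import scala.language.implicitConversions

// Hypothetical stand-ins for SDVariable / SDVariableWrapper; not nd4j classes.
case class Tensor(data: Vector[Double])

class TensorWrapper(t: Tensor) {
  // Element-wise operators, analogous to the + and * overloads on SDVariableWrapper.
  def +(other: Tensor): Tensor =
    Tensor(t.data.zip(other.data).map { case (a, b) => a + b })
  def *(other: Tensor): Tensor =
    Tensor(t.data.zip(other.data).map { case (a, b) => a * b })
}

object TensorImplicits {
  // Analogous to SDVariableToWrapper: any Tensor silently gains the operators.
  implicit def tensorToWrapper(t: Tensor): TensorWrapper = new TensorWrapper(t)
}

import TensorImplicits._

val a = Tensor(Vector(1.0, 2.0, 3.0))
val b = Tensor(Vector(10.0, 20.0, 30.0))
val sum = a + b     // Tensor(Vector(11.0, 22.0, 33.0))
val product = a * b // Tensor(Vector(10.0, 40.0, 90.0))
```

The conversion fires automatically at the call site, which is why client code using nd4s only needs `import org.nd4s.samediff.implicits.Implicits._` to get operator syntax on `SDVariable`.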
****************************************************************************** - * * Copyright (c) 2021 Deeplearning4j Contributors - * * - * * This program and the accompanying materials are made available under the - * * terms of the Apache License, Version 2.0 which is available at - * * https://www.apache.org/licenses/LICENSE-2.0. - * * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * * License for the specific language governing permissions and limitations - * * under the License. - * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ -package org.nd4s - -import breeze.linalg._ -import monocle.Prism -import org.nd4j.linalg.api.buffer.DataType -import org.nd4j.linalg.api.ndarray.INDArray -import org.nd4j.linalg.factory.Nd4j -import org.nd4s.Implicits._ -import org.scalacheck.{ Arbitrary, Gen, Prop } -import org.scalatest.FlatSpec -import org.scalatest.prop.Checkers - -/** - * Created by taisukeoe on 16/03/05. 
- */ -class BreezeCheck extends FlatSpec with Checkers { - it should "work the same as NDArray slicing" in { - check { - Prop.forAll { (ndArray: INDArray) => - ndArray.setOrder('f') - val shape = ndArray.shape().map(_.toInt) - val Array(row, col) = shape - Prop.forAll(Gen.choose(0, row - 1), Gen.choose(0, row - 1), Gen.choose(0, col - 1), Gen.choose(0, col - 1)) { - (r1, r2, c1, c2) => - val rowRange = if (r1 > r2) r2 to r1 else r1 to r2 - val columnRange = if (c1 > c2) c2 to c1 else c1 to c2 - val slicedByND4S = ndArray(rowRange, columnRange) - val slicedByBreeze = prism - .getOption(ndArray) - .map(dm => prism.reverseGet(dm(rowRange, columnRange))) - slicedByBreeze.exists(_.shape() sameElements slicedByND4S.castTo(DataType.DOUBLE).shape()) - } - } - - } - } - - //This supports only real values since ND4J has temporarily dropped complex number support. - lazy val prism = Prism[INDArray, DenseMatrix[Double]] { ndArray => - //Breeze DenseMatrix supports neither tensors nor C-ordered matrices. - if (ndArray.rank() > 2 || ndArray.ordering() == 'c') - None - else { - val shape = ndArray.shape() - val linear = ndArray.reshape(-1) - val arr = (0 until ndArray.length().toInt).map(i => linear.getDouble(i.toLong)).toArray - Some(DenseMatrix(arr).reshape(shape(0).toInt, shape(1).toInt)) - } - } { dm => - val shape = Array(dm.rows, dm.cols) - dm.toArray.mkNDArray(shape, NDOrdering.Fortran) - } - - implicit def arbNDArray: Arbitrary[INDArray] = Arbitrary { - for { - rows <- Gen.choose(1, 100) - columns <- Gen.choose(1, 100) - } yield { - val nd = Nd4j.rand(rows, columns) - nd.setOrder('f') - nd - } - } - -} diff --git a/contrib/attic/nd4s/src/test/scala/org/nd4s/DSLSpec.scala b/contrib/attic/nd4s/src/test/scala/org/nd4s/DSLSpec.scala deleted file mode 100644 index 245a1334f..000000000 --- a/contrib/attic/nd4s/src/test/scala/org/nd4s/DSLSpec.scala +++ /dev/null @@ -1,55 +0,0 @@ -/* - * ****************************************************************************** - * * Copyright (c) 2021 
Deeplearning4j Contributors - * * - * * This program and the accompanying materials are made available under the - * * terms of the Apache License, Version 2.0 which is available at - * * https://www.apache.org/licenses/LICENSE-2.0. - * * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * * License for the specific language governing permissions and limitations - * * under the License. - * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ -package org.nd4s - -import org.junit.runner.RunWith -import org.nd4j.linalg.api.ndarray.INDArray -import org.nd4j.linalg.factory.Nd4j -import org.scalatest.junit.JUnitRunner -import org.scalatest.{ FlatSpec, Matchers } -import org.nd4s.Implicits._ - -@RunWith(classOf[JUnitRunner]) -class DSLSpec extends FlatSpec with Matchers { - - "DSL" should "wrap and extend an INDArray" in { - - // This test just verifies that an INDArray gets wrapped with an implicit conversion - - val nd = Nd4j.create(Array[Float](1, 2), Array(2, 1)) - val nd1 = nd + 10L // + creates new array, += modifies in place - - nd.get(0) should equal(1) - nd1.get(0) should equal(11) - - val nd2 = nd += 100 - nd2 should equal(nd) - nd2.get(0) should equal(101) - - // Verify that we are working with regular old INDArray objects - nd2 match { - case i: INDArray => // do nothing - case _ => fail("Expect our object to be an INDArray") - } - - } - - "DSL" should "not prevent Map[Int,T] creation" in { - Map(0 -> "hello") shouldBe a[Map[_, _]] - } -} diff --git a/contrib/attic/nd4s/src/test/scala/org/nd4s/NDArrayCollectionAPITest.scala b/contrib/attic/nd4s/src/test/scala/org/nd4s/NDArrayCollectionAPITest.scala deleted file mode 100644 index 323e7ff54..000000000 --- 
a/contrib/attic/nd4s/src/test/scala/org/nd4s/NDArrayCollectionAPITest.scala +++ /dev/null @@ -1,206 +0,0 @@ -/* - * ****************************************************************************** - * * Copyright (c) 2021 Deeplearning4j Contributors - * * - * * This program and the accompanying materials are made available under the - * * terms of the Apache License, Version 2.0 which is available at - * * https://www.apache.org/licenses/LICENSE-2.0. - * * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * * License for the specific language governing permissions and limitations - * * under the License. - * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ -package org.nd4s - -import org.nd4j.linalg.api.ndarray.INDArray -import org.nd4s.Implicits._ -import org.nd4s.ops.FunctionalOpExecutioner -import org.scalatest.{ FlatSpec, Matchers } - -class NDArrayCollectionAPITest extends FlatSpec with Matchers { - "CollectionLikeNDArray" should "provides filter API" in { - val ndArray = - Array( - Array(1, 2, 3), - Array(4, 5, 6), - Array(7, 8, 9) - ).toNDArray - - val filtered = ndArray.filter(_ > 3) - - assert( - filtered == - Array( - Array(0, 0, 0), - Array(4, 5, 6), - Array(7, 8, 9) - ).toNDArray - ) - } - - "CollectionLikeNDArray from Floats" should "provides filter API" in { - val ndArray = - Array( - Array(1f, 2f, 3f), - Array(4f, 5f, 6f), - Array(7f, 8f, 9f) - ).toNDArray - - val filtered = ndArray.filter(_ > 3) - - assert( - filtered == - Array( - Array(0f, 0f, 0f), - Array(4f, 5f, 6f), - Array(7f, 8f, 9f) - ).toNDArray - ) - } - - "CollectionLikeNDArray from Long " should "provides filter API" in { - val ndArray = - Array( - Array(1L, 2L, 3L), - Array(4L, 5L, 6L), - Array(7L, 8L, 9L) - ).toNDArray - - 
val filtered = ndArray.filter(_ > 3) - - assert( - filtered == - Array( - Array(0L, 0L, 0L), - Array(4L, 5L, 6L), - Array(7L, 8L, 9L) - ).toNDArray - ) - } - - it should "provides filter bitmask API" in { - val ndArray = - Array( - Array(1, 2, 3), - Array(4, 5, 6), - Array(7, 8, 9) - ).toNDArray - - val filterMasked = ndArray.filterBit(_ % 2 == 0) - - assert( - filterMasked == - Array( - Array(0, 1, 0), - Array(1, 0, 1), - Array(0, 1, 0) - ).toNDArray - ) - } - it should "provides map API" in { - val ndArray = - Array( - Array(1, 2, 3), - Array(4, 5, 6), - Array(7, 8, 9) - ).toNDArray - - val mapped = ndArray.map(_ * 2 + 1) - - assert( - mapped == - Array( - Array(3, 5, 7), - Array(9, 11, 13), - Array(15, 17, 19) - ).toNDArray - ) - } - - it should "provides forall checker" in { - val ndArray = - Array( - Array(1, 2, 3), - Array(4, 5, 6), - Array(7, 8, 9) - ).toNDArray - - //check if all elements in nd meet the criteria. - assert(ndArray > 0) - assert(ndArray.forall(_ > 0)) - "ndArray.forallC(_.absoluteValue().doubleValue() > 0)" shouldNot typeCheck - assert(ndArray < 10) - assert(!(ndArray >= 5)) - } - - it should "provides exist API" in { - val ndArray = - Array( - Array(1, 2, 3), - Array(4, 5, 6), - Array(7, 8, 9) - ).toNDArray - - //check if any element in nd meet the criteria. - assert(ndArray.exists(_ > 8)) - } - - it should "provides existTyped API" in { - val ndArray = - Array( - Array(1, 2, 3), - Array(4, 5, 6), - Array(7, 8, 9) - ).toNDArray - - //check if any element in nd meet the criteria. 
- assert(ndArray.existsTyped[Int](_ > 8)(IntNDArrayEvidence)) - } - - "CollectionLikeNDArray" should "provides forAll API" in { - val ndArray = - Array( - Array(1, 2, 3), - Array(4, 5, 6), - Array(7, 8, 9) - ).toNDArray - - val resultFalse = ndArray.forall(_ > 3) - assert(false == resultFalse) - - val resultTrue = ndArray.forall(_ < 10) - assert(true == resultTrue) - } - - "CollectionLikeNDArray" should "provides forAllTyped API" in { - val ndArray = - Array( - Array(1, 2, 3), - Array(4, 5, 6), - Array(7, 8, 9) - ).toNDArray - - val results = ndArray.forallTyped[Int](_ > 3)(IntNDArrayEvidence) - assert(false == results) - } - - "FunctionalOpExecutioner" should "allow debug and verbose" in { - val executioner = new FunctionalOpExecutioner - executioner.enableDebugMode(true) - executioner.enableVerboseMode(true) - - assert(executioner.isDebug) - assert(executioner.isVerbose) - } - - "FunctionalOpExecutioner" should "provide access to environment information" in { - FunctionalOpExecutioner.apply.printEnvironmentInformation() - val environment = FunctionalOpExecutioner.apply.getEnvironmentInformation - assert(environment != null) - } -} diff --git a/contrib/attic/nd4s/src/test/scala/org/nd4s/NDArrayConstructionTest.scala b/contrib/attic/nd4s/src/test/scala/org/nd4s/NDArrayConstructionTest.scala deleted file mode 100644 index 4b82ee685..000000000 --- a/contrib/attic/nd4s/src/test/scala/org/nd4s/NDArrayConstructionTest.scala +++ /dev/null @@ -1,123 +0,0 @@ -/* - * ****************************************************************************** - * * Copyright (c) 2021 Deeplearning4j Contributors - * * - * * This program and the accompanying materials are made available under the - * * terms of the Apache License, Version 2.0 which is available at - * * https://www.apache.org/licenses/LICENSE-2.0. 
- * * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * * License for the specific language governing permissions and limitations - * * under the License. - * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ -package org.nd4s - -import org.nd4j.linalg.api.buffer.DataType -import org.nd4s.Implicits._ -import org.scalatest.FlatSpec - -class NDArrayConstructionTest extends FlatSpec with COrderingForTest { - self: OrderingForTest => - - it should "be able to create 2d matrix filled with integers" in { - val ndArray = - Array( - Array(1, 2), - Array(4, 5), - Array(7, 9) - ).mkNDArray(ordering) - - assert(DataType.INT == ndArray.dataType()) - assert(3 == ndArray.rows()) - assert(2 == ndArray.columns()) - } - - it should "be able to create 2d matrix filled with long integers" in { - val ndArray = - Array( - Array(1L, 2L, 3L), - Array(4L, 5L, 6L), - Array(7L, 8L, 9L) - ).mkNDArray(ordering) - - assert(DataType.LONG == ndArray.dataType()) - assert(3 == ndArray.rows()) - assert(3 == ndArray.columns()) - } - - it should "be able to create 2d matrix filled with float numbers" in { - val ndArray = - Array( - Array(1f, 2f, 3f), - Array(4f, 5f, 6f), - Array(7f, 8f, 9f) - ).mkNDArray(ordering) - - assert(DataType.FLOAT == ndArray.dataType()) - assert(3 == ndArray.rows()) - assert(3 == ndArray.columns()) - } - - it should "be able to create 2d matrix filled with double numbers" in { - val ndArray = - Array( - Array(1d, 2d, 3d), - Array(4d, 5d, 6d), - Array(7d, 8d, 9d) - ).mkNDArray(ordering) - - assert(DataType.DOUBLE == ndArray.dataType()) - assert(3 == ndArray.rows()) - assert(3 == ndArray.columns()) - } - - it should "be able to create vector filled with short integers" in { - val ndArray = Array[Short](1, 2, 
3).toNDArray - - assert(DataType.SHORT == ndArray.dataType()) - assert(1 == ndArray.rows()) - assert(3 == ndArray.columns()) - } - - it should "be able to create vector filled with byte values" in { - val ndArray = Array[Byte](1, 2, 3).toNDArray - - assert(DataType.BYTE == ndArray.dataType()) - assert(1 == ndArray.rows()) - assert(3 == ndArray.columns()) - } - - it should "be able to create vector filled with boolean values" in { - val ndArray = Array(true, false, true).toNDArray - - assert(DataType.BOOL == ndArray.dataType()) - assert(1 == ndArray.rows()) - assert(3 == ndArray.columns()) - } - - it should "be able to create vector from integer range" in { - val list = (0 to 9).toNDArray - assert(DataType.INT == list.dataType()) - - val stepped = list(1 -> 7 by 2) - assert(Array(1, 3, 5).toNDArray == stepped) - assert(DataType.INT == list.dataType()) - } - - it should "be able to create vector from strings" in { - val oneString = "testme".toScalar - assert("testme" == oneString.getString(0)) - assert(DataType.UTF8 == oneString.dataType()) - - val someStrings = Array[String]("one", "two", "three").toNDArray - assert("one" == someStrings.getString(0)) - assert("two" == someStrings.getString(1)) - assert("three" == someStrings.getString(2)) - assert(DataType.UTF8 == someStrings.dataType()) - } -} diff --git a/contrib/attic/nd4s/src/test/scala/org/nd4s/NDArrayExtractionTest.scala b/contrib/attic/nd4s/src/test/scala/org/nd4s/NDArrayExtractionTest.scala deleted file mode 100644 index 147c3b415..000000000 --- a/contrib/attic/nd4s/src/test/scala/org/nd4s/NDArrayExtractionTest.scala +++ /dev/null @@ -1,327 +0,0 @@ -/* - * ****************************************************************************** - * * Copyright (c) 2021 Deeplearning4j Contributors - * * - * * This program and the accompanying materials are made available under the - * * terms of the Apache License, Version 2.0 which is available at - * * https://www.apache.org/licenses/LICENSE-2.0. 
- * * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * * License for the specific language governing permissions and limitations - * * under the License. - * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ -package org.nd4s - -import org.nd4s.Implicits._ -import org.scalatest.FlatSpec - -class NDArrayExtractionInCOrderingTest extends NDArrayExtractionTestBase with COrderingForTest -class NDArrayExtractionInFortranOrderingTest extends NDArrayExtractionTestBase with FortranOrderingForTest - -trait NDArrayExtractionTestBase extends FlatSpec { self: OrderingForTest => - - "org.nd4j.api.Implicits.RichNDArray" should "be able to extract a value in specified indices" in { - val ndArray = Array( - Array(1, 2), - Array(3, 4) - ).mkNDArray(ordering) - } - - it should "be able to extract a part of 2d matrix" in { - val ndArray = - Array( - Array(1, 2, 3), - Array(4, 5, 6), - Array(7, 8, 9) - ).mkNDArray(ordering) - - val extracted = ndArray(1 -> 3, 0 -> 2) - - val expected = - Array( - Array(4, 5), - Array(7, 8) - ).mkNDArray(ordering) - assert(extracted == expected) - } - - it should "be able to extract a part of 2d matrix with alternative syntax" in { - val ndArray = - Array( - Array(1, 2, 3), - Array(4, 5, 6), - Array(7, 8, 9) - ).mkNDArray(ordering) - - val extracted = ndArray(1 :: 3, 0 :: 2) - - val expected = - Array( - Array(4, 5), - Array(7, 8) - ).mkNDArray(ordering) - assert(extracted == expected) - } - - it should "be able to extract a part of 2d matrix with mixed syntax" in { - val ndArray = - Array( - Array(1, 2, 3), - Array(4, 5, 6), - Array(7, 8, 9) - ).mkNDArray(ordering) - - val extracted = ndArray(1 -> 3, 0 :: 2) - - val expected = - Array( - Array(4, 5), - Array(7, 8) - ).mkNDArray(ordering) 
- assert(extracted == expected) - } - - it should "be able to extract a part of 2d matrix with double data" in { - val ndArray = (5 to 8).map(_.toDouble).mkNDArray(Array(2, 2), NDOrdering.C) - - val expectedArray = Array( - Array(5d, 6d), - Array(7d, 8d) - ).mkNDArray(ordering) - assert(ndArray == expectedArray) - - val expectedSlice = Array( - Array(5d), - Array(7d) - ).toNDArray - assert(expectedArray(->, 0 -> 1) == expectedSlice) - } - - it should "be able to extract a part of 2d matrix with integer data" in { - val ndArray = (1 to 9).mkNDArray(Array(2, 2)) - - val expectedArray = Array( - Array(1, 2), - Array(3, 4) - ).mkNDArray(ordering) - assert(ndArray == expectedArray) - - val expectedSlice = Array( - Array(1), - Array(3) - ).toNDArray - val actualSlice = expectedArray(->, 0 -> 1) - assert(actualSlice == expectedSlice) - } - - it should " provide overloaded -> operator providing matrix slices as nd4j" in { - - val expectedArray = (1 to 9).mkNDArray(Array(2, 2)) - val expectedSlice = expectedArray.slice(0) - val actualSlice = expectedArray(0, ->) - -// Console.println(expectedSlice) - - assert(actualSlice == expectedSlice) - } - - it should "be able to extract a part of vertically long matrix in" in { - val ndArray = - Array( - Array(1, 2), - Array(3, 4), - Array(5, 6), - Array(7, 8) - ).mkNDArray(ordering) - - assert( - ndArray(0 -> 2, ->) == - Array( - Array(1, 2), - Array(3, 4) - ).mkNDArray(ordering) - ) - - assert( - ndArray(2 -> 4, ->) == - Array( - Array(5, 6), - Array(7, 8) - ).mkNDArray(ordering) - ) - } - - it should "be able to extract a part of horizontally long matrix" in { - val ndArray = - Array( - Array(1, 2, 3, 4), - Array(5, 6, 7, 8) - ).mkNDArray(ordering) - - assert( - ndArray(->, 0 -> 2) == - Array( - Array(1, 2), - Array(5, 6) - ).mkNDArray(ordering) - ) - - assert( - ndArray(->, 2 -> 4) == - Array( - Array(3, 4), - Array(7, 8) - ).mkNDArray(ordering) - ) - } - - it should "be able to extract a part of 3d matrix" in { - val ndArray = (1 
to 8).mkNDArray(Array(2, 2, 2), ordering) - - val extracted = ndArray(0, ->, ->) - val expected = ndArray.slice(0) - assert(extracted == expected) - } - - it should "return original NDArray if indexRange is all in 2d matrix" in { - val ndArray = - Array( - Array(1, 2, 3), - Array(4, 5, 6), - Array(7, 8, 9) - ).mkNDArray(ordering) - val extracted = ndArray(->, ->) - assert(ndArray == extracted) - - val ellipsised = ndArray(--->) - assert(ellipsised == ndArray) - } - - it should "return original NDArray if indexRange is all in 3d matrix" in { - val ndArray = (1f to 8f by 1).mkNDArray(Array(2, 2, 2), ordering) - val extracted = ndArray(->, ->, ->) - assert(ndArray == extracted) - - val ellipsised = ndArray(--->) - assert(ellipsised == ndArray) - - val ellipsised1 = ndArray(---) - assert(ellipsised1 == ndArray) - } - - it should "accept partially ellipsis indices" in { - val ndArray = (1f to 8f by 1).mkNDArray(Array(2, 2, 2), ordering) - - val ellipsised = ndArray(--->, 0) - val notEllipsised = ndArray(->, ->, 0) - assert(ellipsised == notEllipsised) - - val ellipsisedAtEnd = ndArray(0, --->) - val notEllipsisedAtEnd = ndArray(0, ->, ->) - assert(ellipsisedAtEnd == notEllipsisedAtEnd) - - val ellipsisedOneHand = ndArray(0 ->, ->, ->) - val notEllipsisedOneHand = ndArray(->, ->, ->) - assert(ellipsisedOneHand == notEllipsisedOneHand) - } - - // TODO: fix me. This is about INDArray having to be sliced by LONG indices - // can't find the correct way to fix implicits without breaking other stuff. 
- it should "be able to extract sub-matrix with index range by step" in { - val ndArray = - Array( - Array(1, 2, 3), - Array(4, 5, 6), - Array(7, 8, 9) - ).mkNDArray(ordering).reshape(3, 3) - - val extracted = ndArray(0 -> 3 by 2, ->) - val extractedWithRange = ndArray(0 until 3 by 2, ->) - val extractedWithInclusiveRange = ndArray(0 to 2 by 2, ->) - - val expected = - Array( - Array(1, 2, 3), - Array(7, 8, 9) - ).mkNDArray(ordering) - - assert(extracted == expected) - assert(extractedWithRange == expected) - assert(extractedWithInclusiveRange == expected) - - /* - Equivalent with NumPy document examples. - @see http://docs.scipy.org/doc/numpy/reference/arrays.indexing.html#basic-slicing-and-indexing - */ - val list = (0 to 9).toNDArray - val step = list(1 -> 7 by 2).reshape(-1) - assert(step.length() == 3) - assert(step.getFloat(0: Long) == 1) - assert(step(0) == 1) - assert(step(0, 0) == 1) - assert(step.getFloat(1: Long) == 3) - assert(step.getFloat(2: Long) == 5) - - val filtered = list(-2 -> 10).reshape(-1) - assert(filtered.length() == 2) - assert(filtered.getFloat(0: Long) == 8) - assert(filtered.getFloat(1: Long) == 9) - - val nStep = list(-3 -> 3 by -1).reshape(-1) - assert(nStep.length() == 4) - assert(nStep.getFloat(0: Long) == 7) - assert(nStep.getFloat(1: Long) == 6) - assert(nStep.getFloat(2: Long) == 5) - assert(nStep.getFloat(3: Long) == 4) - } - - it should "be able to update value with specified indices" in { - val ndArray = - Array( - Array(1, 2, 3), - Array(4, 5, 6), - Array(7, 8, 9) - ).mkNDArray(ordering) - - ndArray(0 -> 3 by 2, ->) = 0 - - assert( - ndArray == Array( - Array(0, 0, 0), - Array(4, 5, 6), - Array(0, 0, 0) - ).mkNDArray(ordering) - ) - } - - it should "be able to update INDArray with specified indices" in { - val ndArray = - Array( - Array(1, 2, 3), - Array(4, 5, 6), - Array(7, 8, 9) - ).mkNDArray(ordering) - - ndArray(0 -> 2, 0 -> 2) = Array(Array(0, 1), Array(2, 3)).mkNDArray(ordering) - - assert( - ndArray == Array( - 
Array(0, 1, 3), - Array(2, 3, 6), - Array(7, 8, 9) - ).mkNDArray(ordering) - ) - } - - "num2Scalar" should "convert number to Scalar INDArray" in { - - assert(1.toScalar.reshape(1) == List(1).toNDArray) - assert(2f.toScalar.reshape(1) == List(2f).toNDArray) - assert(3d.toScalar.reshape(1) == List(3d).toNDArray) - } -} diff --git a/contrib/attic/nd4s/src/test/scala/org/nd4s/NDArrayIndexingTest.scala b/contrib/attic/nd4s/src/test/scala/org/nd4s/NDArrayIndexingTest.scala deleted file mode 100644 index c15e19b7c..000000000 --- a/contrib/attic/nd4s/src/test/scala/org/nd4s/NDArrayIndexingTest.scala +++ /dev/null @@ -1,68 +0,0 @@ -/* - * ****************************************************************************** - * * Copyright (c) 2021 Deeplearning4j Contributors - * * - * * This program and the accompanying materials are made available under the - * * terms of the Apache License, Version 2.0 which is available at - * * https://www.apache.org/licenses/LICENSE-2.0. - * * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * * License for the specific language governing permissions and limitations - * * under the License. 
- * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ -package org.nd4s - -import org.nd4s.Implicits._ -import org.nd4j.linalg.indexing.{ IntervalIndex, NDArrayIndexAll, PointIndex } -import org.scalatest.FlatSpec - -class NDArrayIndexingTest extends FlatSpec { - "IndexRange" should "convert -> DSL to indices" in { - val ndArray = - Array( - Array(1, 2, 3), - Array(4, 5, 6), - Array(7, 8, 9) - ).mkNDArray(NDOrdering.C) - - val indices = ndArray.indicesFrom(0 -> 2, ->) - assert(indices.indices == List(0, 1, 2, 3, 4, 5)) - assert(indices.targetShape.toList == List(2, 3)) - } - - it should "convert -> DSL to NDArrayIndex interval with stride 1 or 2" in { - val ndArray = - Array( - Array(1, 2, 3), - Array(4, 5, 6), - Array(7, 8, 9) - ).mkNDArray(NDOrdering.C) - - val indices = ndArray.getINDArrayIndexfrom(0 -> 2, 0 -> 3 by 2) - val rowI = indices(0) - assert(rowI.isInstanceOf[IntervalIndex]) - - val columnI = indices(1) - assert(columnI.isInstanceOf[IntervalIndex]) - } - it should "convert -> DSL to NDArrayIndex point,all" in { - val ndArray = - Array( - Array(1, 2, 3), - Array(4, 5, 6), - Array(7, 8, 9) - ).mkNDArray(NDOrdering.C) - - val indices = ndArray.getINDArrayIndexfrom(0, ->) - val rowI = indices(0) - assert(rowI.isInstanceOf[PointIndex]) - - val columnI = indices(1) - assert(columnI.isInstanceOf[NDArrayIndexAll]) - } -} diff --git a/contrib/attic/nd4s/src/test/scala/org/nd4s/NDArrayProjectionAPITest.scala b/contrib/attic/nd4s/src/test/scala/org/nd4s/NDArrayProjectionAPITest.scala deleted file mode 100644 index 4afdefab5..000000000 --- a/contrib/attic/nd4s/src/test/scala/org/nd4s/NDArrayProjectionAPITest.scala +++ /dev/null @@ -1,312 +0,0 @@ -/* - * ****************************************************************************** - * * Copyright (c) 2021 Deeplearning4j Contributors - * * - * * This program and the accompanying materials are made available under the - * * terms of the 
Apache License, Version 2.0 which is available at - * * https://www.apache.org/licenses/LICENSE-2.0. - * * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * * License for the specific language governing permissions and limitations - * * under the License. - * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ -package org.nd4s - -import org.nd4s.Implicits._ -import org.scalatest.{ FlatSpec, Matchers } - -class NDArrayProjectionAPITest extends FlatSpec { - "ColumnProjectedNDArray" should "map column correctly" in { - val ndArray = - Array( - Array(1d, 2d, 3d), - Array(4d, 5d, 6d), - Array(7d, 8d, 9d) - ).toNDArray - - val result = for { - c <- ndArray.columnP - if c.get(0) % 2 == 0 - } yield c * c - - assert( - result == Array( - Array(4d), - Array(25d), - Array(64d) - ).toNDArray - ) - } - - "ColumnProjectedNDArray" should "map column correctly 2" in { - val ndArray = - Array( - Array(1d, 2d, 3d), - Array(4d, 5d, 6d), - Array(7d, 8d, 9d) - ).toNDArray - - val result = ndArray.columnP map (input => input + 1) - assert( - result == Array( - Array(2d, 3d, 4d), - Array(5d, 6d, 7d), - Array(8d, 9d, 10d) - ).toNDArray - ) - } - - "ColumnProjectedNDArray" should "map column correctly 3" in { - val ndArray = - Array( - Array(1d, 2d, 3d), - Array(4d, 5d, 6d), - Array(7d, 8d, 9d) - ).toNDArray - - val result = ndArray.columnP flatMap (input => input + 1) - assert( - result == Array( - Array(2d, 3d, 4d), - Array(5d, 6d, 7d), - Array(8d, 9d, 10d) - ).toNDArray - ) - } - - "ColumnProjectedNDArray" should "map column correctly in place " in { - val ndArray = - Array( - Array(1d, 2d, 3d), - Array(4d, 5d, 6d), - Array(7d, 8d, 9d) - ).toNDArray - - ndArray.columnP flatMapi (input => input + 1) - assert( - ndArray == 
Array( - Array(2d, 3d, 4d), - Array(5d, 6d, 7d), - Array(8d, 9d, 10d) - ).toNDArray - ) - } - - "ColumnProjectedNDArray" should "map column correctly 4" in { - val ndArray = - Array( - Array(1d, 2d, 3d), - Array(4d, 5d, 6d), - Array(7d, 8d, 9d) - ).toNDArray - - val result = ndArray.columnP map (input => input + 1) - assert( - result == Array( - Array(2d, 3d, 4d), - Array(5d, 6d, 7d), - Array(8d, 9d, 10d) - ).toNDArray - ) - } - - "ColumnProjectedNDArray" should "map column correctly 5" in { - val ndArray = - Array( - Array(1d, 2d, 3d), - Array(4d, 5d, 6d), - Array(7d, 8d, 9d) - ).toNDArray - - ndArray.columnP mapi (input => input + 1) - assert( - ndArray == Array( - Array(2d, 3d, 4d), - Array(5d, 6d, 7d), - Array(8d, 9d, 10d) - ).toNDArray - ) - } - - "ColumnProjectedNDArray" should "flatmap column correctly" in { - val ndArray = - Array( - Array(1d, 2d, 3d), - Array(4d, 5d, 6d), - Array(7d, 8d, 9d) - ).toNDArray - - val result = ndArray.columnP withFilter (input => false) - assert(result.filtered.isEmpty) - } - - "RowProjectedNDArray" should "map row correctly in for loop " in { - val ndArray = - Array( - Array(1d, 2d, 3d), - Array(4d, 5d, 6d), - Array(7d, 8d, 9d) - ).toNDArray - - val result = for { - c <- ndArray.rowP - if c.get(0) % 2 == 0 - } yield c * c - - assert( - result == - Array(Array(16d, 25d, 36d)).toNDArray - ) - } - - "RowProjectedNDArray" should "map row correctly " in { - val ndArray = - Array( - Array(1d, 2d, 3d), - Array(4d, 5d, 6d), - Array(7d, 8d, 9d) - ).toNDArray - - val result = ndArray.rowP map (input => input / 2) - - assert( - result == - Array[Double](0.5000, 1.0000, 1.5000, 2.0000, 2.5000, 3.0000, 3.5000, 4.0000, 4.5000).toNDArray.reshape(3, 3) - ) - } - - "RowProjectedNDArray" should "filter rows correctly " in { - val ndArray = - Array( - Array(1d, 2d, 3d), - Array(4d, 5d, 6d), - Array(7d, 8d, 9d) - ).toNDArray - - val result = ndArray.rowP withFilter (input => false) - assert(result.filtered.isEmpty) - } - - "RowProjectedNDArray" 
should "flatMap rows correctly " in { - val ndArray = - Array( - Array(1d, 2d, 3d), - Array(4d, 5d, 6d), - Array(7d, 8d, 9d) - ).toNDArray - - val result = ndArray.rowP flatMap (input => input + 1) - val expected = - Array( - Array(2d, 3d, 4d), - Array(5d, 6d, 7d), - Array(8d, 9d, 10d) - ).toNDArray - - assert(result == expected) - } - - "RowProjectedNDArray" should "map row correctly 2 " in { - val ndArray = - Array( - Array(1d, 2d, 3d), - Array(4d, 5d, 6d), - Array(7d, 8d, 9d) - ).toNDArray - - val result = ndArray.rowP map (input => input / 2) - - assert( - result == - Array[Double](0.5000, 1.0000, 1.5000, 2.0000, 2.5000, 3.0000, 3.5000, 4.0000, 4.5000).toNDArray.reshape(3, 3) - ) - } - - "RowProjectedNDArray" should "flatMap in place rows correctly " in { - val ndArray = - Array( - Array(1d, 2d, 3d), - Array(4d, 5d, 6d), - Array(7d, 8d, 9d) - ).toNDArray - - ndArray.rowP flatMapi (input => input + 1) - val expected = - Array( - Array(2d, 3d, 4d), - Array(5d, 6d, 7d), - Array(8d, 9d, 10d) - ).toNDArray - - assert(ndArray == expected) - } - - "RowProjectedNDArray" should "map in place rows correctly " in { - val ndArray = - Array( - Array(1d, 2d, 3d), - Array(4d, 5d, 6d), - Array(7d, 8d, 9d) - ).toNDArray - - ndArray.rowP mapi (input => input / 2) - - assert( - ndArray == - Array[Double](0.5000, 1.0000, 1.5000, 2.0000, 2.5000, 3.0000, 3.5000, 4.0000, 4.5000).toNDArray.reshape(3, 3) - ) - } - - "SliceProjectedNDArray" should "map slice correctly" in { - val ndArray = - (1d to 8d by 1).asNDArray(2, 2, 2) - - val result = for { - slice <- ndArray.sliceP - if slice.get(0) > 1 - } yield slice * slice - - assert(result == List(25d, 36d, 49d, 64d).asNDArray(1, 2, 2)) - } - - "SliceProjectedNDArray" should "flatmap slice correctly" in { - val ndArray = - (1d to 8d by 1).asNDArray(2, 2, 2) - - val result = ndArray.sliceP flatMap (input => input * 2) - val expected = - (2d to 16d by 2).asNDArray(2, 2, 2) - assert(result == expected) - } - - "SliceProjectedNDArray" should 
"flatmap slice correctly in place" in { - val ndArray = - (1d to 8d by 1).asNDArray(2, 2, 2) - - ndArray.sliceP flatMapi (input => input * 2) - val expected = - (2d to 16d by 2).asNDArray(2, 2, 2) - assert(ndArray == expected) - } - - "SliceProjectedNDArray" should "map slice correctly in place" in { - val ndArray = - (1d to 8d by 1).asNDArray(2, 2, 2) - - ndArray.sliceP mapi (input => input * 2) - val expected = - (2d to 16d by 2).asNDArray(2, 2, 2) - assert(ndArray == expected) - } - - "SliceProjectedNDArray" should "filter slice correctly" in { - val ndArray = (1d until 9d by 1).asNDArray(2, 2, 2) - val result = ndArray.sliceP withFilter (input => false) - assert(result.filtered.isEmpty) - } -} diff --git a/contrib/attic/nd4s/src/test/scala/org/nd4s/OperatableNDArrayTest.scala b/contrib/attic/nd4s/src/test/scala/org/nd4s/OperatableNDArrayTest.scala deleted file mode 100644 index 98edbf98f..000000000 --- a/contrib/attic/nd4s/src/test/scala/org/nd4s/OperatableNDArrayTest.scala +++ /dev/null @@ -1,283 +0,0 @@ -/* - * ****************************************************************************** - * * Copyright (c) 2021 Deeplearning4j Contributors - * * - * * This program and the accompanying materials are made available under the - * * terms of the Apache License, Version 2.0 which is available at - * * https://www.apache.org/licenses/LICENSE-2.0. - * * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * * License for the specific language governing permissions and limitations - * * under the License. 
- * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ -package org.nd4s - -import org.junit.runner.RunWith -import org.nd4j.linalg.api.ndarray.INDArray -import org.nd4s.Implicits._ -import org.nd4j.linalg.factory.Nd4j -import org.scalatest.junit.JUnitRunner -import org.scalatest.{ FlatSpec, Matchers } - -@RunWith(classOf[JUnitRunner]) -class OperatableNDArrayTest extends FlatSpec with Matchers { - "RichNDArray" should "use the apply method to access values" in { - // -- 2D array - val nd2 = Nd4j.create(Array[Double](1, 2, 3, 4), Array[Int](1, 4)) - - nd2.get(0) should be(1) - nd2.get(0, 3) should be(4) - - // -- 3D array - val nd3 = Nd4j.create(Array[Double](1, 2, 3, 4, 5, 6, 7, 8), Array[Int](2, 2, 2)) - nd3.get(0, 0, 0) should be(1) - nd3.get(1, 1, 1) should be(8) - - } - - it should "use transpose abbreviation" in { - val nd1 = Nd4j.create(Array[Double](1, 2, 3), Array(3, 1)) - nd1.shape should equal(Array(3, 1)) - val nd1t = nd1.T - nd1t.shape should equal(Array(1, 3)) - } - - it should "add correctly" in { - val a = Nd4j.create(Array[Double](1, 2, 3, 4, 5, 6, 7, 8), Array(2, 2, 2)) - val b = a + 100 - a.get(0, 0, 0) should be(1) - b.get(0, 0, 0) should be(101) - a += 1 - a.get(0, 0, 0) should be(2) - } - - it should "subtract correctly" in { - val a = Nd4j.create(Array[Double](1, 2, 3, 4, 5, 6, 7, 8), Array(2, 2, 2)) - val b = a - 100 - a.get(0, 0, 0) should be(1) - b.get(0, 0, 0) should be(-99) - a -= 1 - a.get(0, 0, 0) should be(0) - - val c = Nd4j.create(Array[Double](1, 2)) - val d = c - c - d.get(0) should be(0) - d.get(1) should be(0) - } - - it should "divide correctly" in { - val a = Nd4j.create(Array[Double](1, 2, 3, 4, 5, 6, 7, 8), Array(2, 2, 2)) - val b = a / a - a.get(1, 1, 1) should be(8) - b.get(1, 1, 1) should be(1) - a /= a - a.get(1, 1, 1) should be(1) - } - - it should "element-by-element multiply correctly" in { - val a = Nd4j.create(Array[Double](1, 2, 3, 4), 
Array(4, 1)) - val b = a * a - a.get(3) should be(4) // [1.0, 2.0, 3.0, 4.0 - b.get(3) should be(16) // [1.0 ,4.0 ,9.0 ,16.0] - a *= 5 // [5.0 ,10.0 ,15.0 ,20.0] - a.get(0) should be(5) - } - - it should "use the update method to mutate values" in { - val nd3 = Nd4j.create(Array[Double](1, 2, 3, 4, 5, 6, 7, 8), Array(2, 2, 2)) - nd3(0) = 11 - nd3.get(0) should be(11) - - val idx = Array(1, 1, 1) - nd3(idx) = 100 - nd3.get(idx) should be(100) - } - - it should "use === for equality comparisons" in { - val a = Nd4j.create(Array[Double](1, 2)) - - val b = Nd4j.create(Array[Double](1, 2)) - val c = a === b - c.get(0) should be(1) - c.get(1) should be(1) - - val d = Nd4j.create(Array[Double](10, 20)) - val e = a === d - e.get(0) should be(0) - e.get(1) should be(0) - - val f = a === 1 // === from our DSL - f.get(0) should be(1) - f.get(1) should be(0) - } - - it should "use - prefix for negation" in { - val a = Nd4j.create(Array[Float](1, 3)) - val b = -a - b.get(0) should be(-1) - b.get(1) should be(-3) - } - - it should "not prevent any2stringadd syntax" in { - val s: String = Nd4j.create(2, 2) + "" - } - - "Sum function" should "choose return value depending on INDArray type" in { - val ndArray = - Array( - Array(1, 2), - Array(4, 5) - ).toNDArray - - //return Double in real NDArray at default - ndArray.get(0) shouldBe a[java.lang.Double] - val sumValue = ndArray.sumT - sumValue shouldBe a[java.lang.Double] - - //switch return value with passing corresponding evidence explicitly - val sumValueInFloatExplicit = ndArray.sumT(FloatNDArrayEvidence) - sumValueInFloatExplicit shouldBe a[java.lang.Float] - - //switch return value with declaring implicit value but explicit one would be more readable. 
- import org.nd4s.Evidences.float - val sumValueInFloatImplicit = ndArray.sumT - sumValueInFloatImplicit shouldBe a[java.lang.Float] - } - - it should "provide matrix multiplicaton operations " in { - val a = Nd4j.create(Array[Float](4, 6, 5, 7)).reshape(2, 2) - val b = Nd4j.create(Array[Float](1, 3, 4, 8)).reshape(2, 2) - a **= b - val expected = Array[Float](28.0000f, 60.0000f, 33.0000f, 71.0000f).toNDArray.reshape(2, 2) - a shouldBe expected - } - - it should "provide matrix division operations " in { - val a = Nd4j.create(Array[Float](4, 6, 5, 7)).reshape(2, 2) - a /= 12 - a.get(0) shouldBe (0.3333 +- 0.0001) - a.get(1) shouldBe (0.5 +- 0.0001) - a.get(2) shouldBe (0.4167 +- 0.0001) - a.get(3) shouldBe (0.5833 +- 0.0001) - - val b = Nd4j.create(Array[Float](4, 6, 5, 7)).reshape(2, 2) - b %= 12 - b.get(0) shouldBe (4.0) - b.get(1) shouldBe (6.0) - b.get(2) shouldBe (5.0) - b.get(3) shouldBe (-5.0) - - val c = Nd4j.create(Array[Float](4, 6, 5, 7)).reshape(2, 2) - c \= 12 - c.get(0) shouldBe (3.0) - c.get(1) shouldBe (2.0) - c.get(2) shouldBe (2.4000 +- 0.0001) - c.get(3) shouldBe (1.7143 +- 0.0001) - } - - it should "provide math operations for vectors " in { - val a = Nd4j.create(Array[Float](4, 6)) - val b = Nd4j.create(Array[Float](1, 3)) - a /= b - val expected1 = Nd4j.create(Array[Float](4, 2)) - assert(a == expected1) - - a *= b - val expected2 = Nd4j.create(Array[Float](4, 6)) - assert(a == expected2) - - a += b - val expected3 = Nd4j.create(Array[Float](5, 9)) - assert(a == expected3) - - a -= b - val expected4 = Nd4j.create(Array[Float](4, 6)) - assert(a == expected4) - - a \= b - val expected5 = Array[Float](0.25f, 0.5f).toNDArray - assert(a == expected5) - - val c = a * b - val expected6 = Array[Float](0.25f, 1.5f).toNDArray - assert(c == expected6) - - val d = a + b - val expected7 = Array[Float](1.25f, 3.5f).toNDArray - assert(d == expected7) - - val e = a / b - e.get(0) should be(0.2500 +- 0.0001) - e.get(1) should be(0.1667 +- 0.0001) - - val f = a 
\ b - f.get(0) should be(4.0 +- 0.0001) - f.get(1) should be(6.0 +- 0.0001) - - val g = a ** b - g.get(0) shouldBe 1.7500 - - val h = a dot b - g.get(0) shouldBe 1.7500 - - d.sumT shouldBe 4.75 - - d.meanT shouldBe 2.375 - - d.norm1T shouldBe 4.75 - - d.maxT shouldBe 3.5 - - d.minT shouldBe 1.25 - - d.prodT shouldBe 4.375 - - d.varT shouldBe 2.53125 - - d.norm2T should be(3.7165 +- 0.0001) - - d.stdT should be(1.5909 +- 0.0001) - } - - it should "provide arithmetic ops calls on integers " in { - val ndArray = Array(1, 2).toNDArray - val c = ndArray + 5 - c shouldBe Array(6, 7).toNDArray - - val d = 5 + ndArray - c shouldBe Array(6, 7).toNDArray - } - - it should "broadcast add ops calls on vectors with different length " in { - val x = Array(1f, 1f, 1f, 1f, 1f, 1f, 1f, 1f, 1f, 1f, 1f, 1f, 1f, 1f, 1f).mkNDArray(Array(3, 5)) - val y = Array[Float](1f, 1f, 1f, 1f, 1f).toNDArray - val e = x + 1f.toScalar - assert((x + y) == e) - - val x1 = Array(1f, 1f, 1f, 1f, 1f, 1f).mkNDArray(Array(3, 1, 2)) - val y1 = Array[Float](1f, 1f, 1f, 1f).toNDArray.reshape(2, 2) - val t1 = Array(1f, 1f, 1f, 1f, 1f, 1f, 1f, 1f, 1f, 1f, 1f, 1f).mkNDArray(Array(3, 2, 2)) - val e1 = t1 + 1f - assert((x1 + y1) == e1) - - val e2 = 1f + t1 - assert(e1 == e2) - } - - it should "broadcast multiplication ops " in { - - val x1 = Array(1f, 1f, 1f, 1f, 1f, 1f).mkNDArray(Array(3, 1, 2)) - val y1 = Array[Float](1f, 1f, 1f, 1f).toNDArray.reshape(2, 2) - val t1 = Array(1f, 1f, 1f, 1f, 1f, 1f, 1f, 1f, 1f, 1f, 1f, 1f).mkNDArray(Array(3, 2, 2)) - val e1 = t1 * 1f - assert((x1 * y1) == e1) - - val e2 = 1f * t1 - assert(e1 == e2) - } -} diff --git a/contrib/attic/nd4s/src/test/scala/org/nd4s/OrderingForTest.scala b/contrib/attic/nd4s/src/test/scala/org/nd4s/OrderingForTest.scala deleted file mode 100644 index 563429f88..000000000 --- a/contrib/attic/nd4s/src/test/scala/org/nd4s/OrderingForTest.scala +++ /dev/null @@ -1,30 +0,0 @@ -/* - * 
****************************************************************************** - * * Copyright (c) 2021 Deeplearning4j Contributors - * * - * * This program and the accompanying materials are made available under the - * * terms of the Apache License, Version 2.0 which is available at - * * https://www.apache.org/licenses/LICENSE-2.0. - * * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * * License for the specific language governing permissions and limitations - * * under the License. - * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ -package org.nd4s - -import org.scalatest.{ Suite, SuiteMixin } - -trait OrderingForTest extends SuiteMixin { this: Suite => - val ordering: NDOrdering -} -trait COrderingForTest extends OrderingForTest { this: Suite => - override val ordering: NDOrdering = NDOrdering.C -} -trait FortranOrderingForTest extends OrderingForTest { this: Suite => - override val ordering: NDOrdering = NDOrdering.Fortran -} diff --git a/contrib/attic/nd4s/src/test/scala/org/nd4s/samediff/ConstructionTest.scala b/contrib/attic/nd4s/src/test/scala/org/nd4s/samediff/ConstructionTest.scala deleted file mode 100644 index 6f83d82db..000000000 --- a/contrib/attic/nd4s/src/test/scala/org/nd4s/samediff/ConstructionTest.scala +++ /dev/null @@ -1,180 +0,0 @@ -/* - * ****************************************************************************** - * * Copyright (c) 2021 Deeplearning4j Contributors - * * - * * This program and the accompanying materials are made available under the - * * terms of the Apache License, Version 2.0 which is available at - * * https://www.apache.org/licenses/LICENSE-2.0. 
- * * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * * License for the specific language governing permissions and limitations - * * under the License. - * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ -package org.nd4s.samediff - -import org.nd4j.autodiff.samediff.{ SDVariable, SameDiff, TrainingConfig } -import org.nd4j.linalg.api.buffer.DataType -import org.nd4j.linalg.api.ndarray.INDArray -import org.nd4j.linalg.dataset.MultiDataSet -import org.nd4j.linalg.dataset.adapter.SingletonMultiDataSetIterator -import org.nd4j.linalg.factory.Nd4j -import org.nd4j.linalg.learning.config.Sgd -import org.nd4s.Implicits._ -import org.nd4s.samediff.implicits.Implicits._ -import org.scalatest.{ FlatSpec, Matchers } - -class ConstructionTest extends FlatSpec with Matchers { - - "SameDiff" should "allow composition of arithmetic operations" in { - - val sd = SameDiff.create() - val ph1 = sd.placeHolder("ph1", DataType.FLOAT, 3, 4) - val w1 = sd.bind("w1", Nd4j.rand(DataType.FLOAT, 4, 5)) - val b1 = sd.bind("b1", Nd4j.rand(DataType.FLOAT, 5)) - - val mmul1 = ph1 * w1 - val badd1 = mmul1 + b1 - - val loss1 = badd1.std("loss1", true) - - sd.setLossVariables("loss1") - sd.createGradFunction - for (v <- Array[SDVariable](ph1, w1, b1, mmul1, badd1, loss1)) { - assert(v.getVarName != null && v.gradient != null) - } - } - - "SameDiff" should "provide arithmetic operations for float arguments in arbitrary order" in { - - implicit val sd = SameDiff.create() - val w1 = sd.bind("w1", 4.0f.toScalar) - var evaluated = w1.eval.castTo(DataType.FLOAT) - evaluated.toFloatVector.head shouldBe 4.0f - - val w2 = w1 * 2.0f - w2.eval.toFloatVector.head shouldBe 8.0f - val w3 = w2 + 2.0f - w3.eval.toFloatVector.head shouldBe 10.0f 
- - val w4 = 2.0f * w1 - w4.eval.toFloatVector.head shouldBe 8.0f - val w5 = 2.0f + w2 - w5.eval.toFloatVector.head shouldBe 10.0f - - val w6 = w1 / 2.0f - w6.eval.toFloatVector.head shouldBe 2.0f - val w7 = w2 - 2.0f - w7.eval.toFloatVector.head shouldBe 6.0f - - val w8 = 2.0f / w1 - w8.eval.toFloatVector.head shouldBe 2.0f - - val w9 = 2.0f - w2 - w9.eval.toFloatVector.head shouldBe 6.0f - } - - "SameDiff" should "provide arithmetic operations for double arguments in arbitrary order" in { - implicit val sd = SameDiff.create() - val w1 = sd.bind("w1", 4.0.toScalar) - var evaluated = w1.eval.castTo(DataType.DOUBLE) - evaluated.toFloatVector.head shouldBe 4.0 - - val w2 = w1 * 2.0 - w2.eval.toFloatVector.head shouldBe 8.0 - val w3 = w2 + 2.0 - w3.eval.toFloatVector.head shouldBe 10.0 - - val w4 = 2.0 * w1 - w4.eval.toFloatVector.head shouldBe 8.0 - val w5 = 2.0 + w2 - w5.eval.toFloatVector.head shouldBe 10.0 - - val w6 = w1 / 2.0 - w6.eval.toFloatVector.head shouldBe 2.0 - val w7 = w2 - 2.0 - w7.eval.toFloatVector.head shouldBe 6.0 - - val w8 = 2.0 / w1 - w8.eval.toFloatVector.head shouldBe 2.0 - val w9 = 2.0 - w2 - w9.eval.toFloatVector.head shouldBe 6.0f - } - - "SameDiff" should "provide unary math operators" in { - implicit val sd = SameDiff.create() - val w1 = sd.bind("w1", 4.0.toScalar) - var evaluated = w1.eval.castTo(DataType.DOUBLE) - evaluated.toFloatVector.head shouldBe 4.0 - - val w2 = -w1 - var evaluated2 = w2.eval.castTo(DataType.DOUBLE) - evaluated2.toFloatVector.head shouldBe -4.0 - - val w3 = w1 ** 2 - var evaluated3 = w3.eval.castTo(DataType.DOUBLE) - evaluated3.toFloatVector.head shouldBe 16.0 - } - - "classification example" should "work" in { - val learning_rate = 0.1 - val seed = 7 - - val target = Nd4j.createUninitialized(DataType.DOUBLE, 1000) - val rng = Nd4j.getRandom - rng.setSeed(seed) - val x1_label1 = Nd4j.randn(3.0, 1.0, target, rng) - val target1 = Nd4j.createUninitialized(DataType.DOUBLE, 1000) - val x2_label1 = Nd4j.randn(2.0, 1.0, 
target1, rng) - val target2 = Nd4j.createUninitialized(DataType.DOUBLE, 1000) - val x1_label2 = Nd4j.randn(7.0, 1.0, target2, rng) - val target3 = Nd4j.createUninitialized(DataType.DOUBLE, 1000) - val x2_label2 = Nd4j.randn(6.0, 1.0, target3, rng) - - // np.append, was not able to guess proper method - val x1s = Nd4j.concat(0, x1_label1, x1_label2) - val x2s = Nd4j.concat(0, x2_label1, x2_label2) - - // Must have implicit sd here for some ops - implicit val sd = SameDiff.create - val ys = (Nd4j.scalar(0.0) * x1_label1.length()) + (Nd4j.scalar(1.0) * x1_label2.length()) - - // Empty shape can't be passed vs tf behaviour - val X1 = sd.placeHolder("x1", DataType.DOUBLE, 2000) - val X2 = sd.placeHolder("x2", DataType.DOUBLE, 2000) - val y = sd.placeHolder("y", DataType.DOUBLE) - val w = sd.bind("w", DataType.DOUBLE, Array[Int](3)) - //Sample: -tf.log(y_model * Y + (1 — y_model) * (1 — Y)) - val y_model: SDVariable = - sd.nn.sigmoid(w(2) * X2 + w(1) * X1 + w(0)) - val cost_fun: SDVariable = (sd.math.neg( - sd.math.log(y_model * y + (sd.math.log(sd.constant(1.0) - y_model) * (sd.constant(1.0) - y))) - )) - val loss = sd.mean("loss", cost_fun) - - val updater = new Sgd(learning_rate) - - sd.setLossVariables("loss") - sd.createGradFunction - val conf = new TrainingConfig.Builder() - .updater(updater) - .minimize("loss") - .dataSetFeatureMapping("x1", "x2", "y") - .markLabelsUnused() - .build() - - val mds = new MultiDataSet(Array[INDArray](x1s, x2s, ys), new Array[INDArray](0)) - - sd.setTrainingConfig(conf) - sd.fit(new SingletonMultiDataSetIterator(mds), 1) - - w.getArr.get(0) shouldBe (0.0629 +- 0.0001) - w.getArr.get(1) shouldBe (0.3128 +- 0.0001) - w.getArr.get(2) shouldBe (0.2503 +- 0.0001) - //Console.println(w.eval) - } -} diff --git a/contrib/attic/nd4s/src/test/scala/org/nd4s/samediff/MathTest.scala b/contrib/attic/nd4s/src/test/scala/org/nd4s/samediff/MathTest.scala deleted file mode 100644 index 493c3f11a..000000000 --- 
a/contrib/attic/nd4s/src/test/scala/org/nd4s/samediff/MathTest.scala +++ /dev/null @@ -1,248 +0,0 @@ -/* - * ****************************************************************************** - * * Copyright (c) 2021 Deeplearning4j Contributors - * * - * * This program and the accompanying materials are made available under the - * * terms of the Apache License, Version 2.0 which is available at - * * https://www.apache.org/licenses/LICENSE-2.0. - * * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * * License for the specific language governing permissions and limitations - * * under the License. - * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ -package org.nd4s.samediff - -import org.nd4j.autodiff.samediff.{ SDIndex, SDVariable, SameDiff } -import org.nd4j.linalg.api.buffer.DataType -import org.nd4j.linalg.api.ndarray.INDArray -import org.nd4j.linalg.factory.Nd4j -import org.nd4s.Implicits._ -import org.nd4s.NDOrdering -import org.nd4s.samediff.implicits.Implicits._ -import org.scalatest.{ FlatSpec, Matchers } - -class MathTest extends FlatSpec with Matchers { - - "SameDiff" should "allow composition of arithmetic operations" in { - - val sd = SameDiff.create() - val ph1 = sd.placeHolder("ph1", DataType.FLOAT, 3, 4) - val w1 = sd.bind("w1", Nd4j.rand(DataType.FLOAT, 4, 5)) - val b1 = sd.bind("b1", Nd4j.rand(DataType.FLOAT, 5)) - - val mmul1 = ph1 * w1 - val badd1 = mmul1 + b1 - - val loss1 = badd1.std("loss1", true) - - sd.setLossVariables("loss1") - sd.createGradFunction - for (v <- Array[SDVariable](ph1, w1, b1, mmul1, badd1, loss1)) { - assert(v.getVarName != null && v.gradient != null) - } - } - - "SameDiff" should "provide arithmetic operations for float arguments in arbitrary order" in { - - 
implicit val sd = SameDiff.create() - val w1 = sd.bind("w1", 4.0f.toScalar) - var evaluated = w1.eval.castTo(DataType.FLOAT) - evaluated.toFloatVector.head shouldBe 4.0f - - val w2 = w1 * 2.0f - w2.eval.toFloatVector.head shouldBe 8.0f - val w3 = w2 + 2.0f - w3.eval.toFloatVector.head shouldBe 10.0f - - val w4 = 2.0f * w1 - w4.eval.toFloatVector.head shouldBe 8.0f - val w5 = 2.0f + w2 - w5.eval.toFloatVector.head shouldBe 10.0f - - val w6 = w1 / 2.0f - w6.eval.toFloatVector.head shouldBe 2.0f - val w7 = w2 - 2.0f - w7.eval.toFloatVector.head shouldBe 6.0f - - val w8 = 2.0f / w1 - w8.eval.toFloatVector.head shouldBe 2.0f - - val w9 = 2.0f - w2 - w9.eval.toFloatVector.head shouldBe 6.0f - } - - "SameDiff" should "provide arithmetic operations for double arguments in arbitrary order" in { - implicit val sd = SameDiff.create() - val w1 = sd.bind("w1", 4.0.toScalar) - var evaluated = w1.eval.castTo(DataType.DOUBLE) - evaluated.toFloatVector.head shouldBe 4.0 - - val w2 = w1 * 2.0 - w2.eval.toFloatVector.head shouldBe 8.0 - val w3 = w2 + 2.0 - w3.eval.toFloatVector.head shouldBe 10.0 - - val w4 = 2.0 * w1 - w4.eval.toFloatVector.head shouldBe 8.0 - val w5 = 2.0 + w2 - w5.eval.toFloatVector.head shouldBe 10.0 - - val w6 = w1 / 2.0 - w6.eval.toFloatVector.head shouldBe 2.0 - val w7 = w2 - 2.0 - w7.eval.toFloatVector.head shouldBe 6.0 - - val w8 = 2.0 / w1 - w8.eval.toFloatVector.head shouldBe 2.0 - val w9 = 2.0 - w2 - w9.eval.toFloatVector.head shouldBe 6.0f - } - - "SameDiff" should "provide floor division" in { - implicit val sd = SameDiff.create() - val w1 = sd.bind("w1", 4.0.toScalar) - val w2 = sd.bind("w2", 1.2.toScalar) - val w3 = w1 `//` w2 - w3.eval.toFloatVector.head shouldBe 3.0 - - val w4 = w1 `//` 1.5 - w4.eval.toFloatVector.head shouldBe 2.0 - - val w5 = 9.5 `//` w1 - w5.eval.toFloatVector.head shouldBe 2.0 - } - - "SameDiff" should "provide remainder division" in { - implicit val sd = SameDiff.create() - val w1 = sd.bind("w1", 40.0.toScalar) - val w2 = 
sd.bind("w2", 12.0.toScalar) - val w3 = w2 % w1 - w3.eval.toFloatVector.head shouldBe 12.0 - val w4 = w1 % w2 - w4.eval.toFloatVector.head shouldBe 4.0 - - val w5 = w1 % 15.0 - w5.eval.toFloatVector.head shouldBe 10.0 - - val w6 = 10.0 % w1 - w6.eval.toFloatVector.head shouldBe 10.0 - } - - "SameDiff" should "provide unary math operators" in { - implicit val sd = SameDiff.create() - val w1 = sd.bind("w1", 4.0.toScalar) - var evaluated = w1.eval.castTo(DataType.DOUBLE) - evaluated.toFloatVector.head shouldBe 4.0 - - val w2 = -w1 - var evaluated2 = w2.eval.castTo(DataType.DOUBLE) - evaluated2.toFloatVector.head shouldBe -4.0 - - val w3 = w1 ** 2 - var evaluated3 = w3.eval.castTo(DataType.DOUBLE) - evaluated3.toFloatVector.head shouldBe 16.0 - } - - "SameDiff" should "provide boolean logic operators" in { - implicit val sd = SameDiff.create() - val w1 = sd.constant(Nd4j.scalar(true)) - val w2 = sd.constant(Nd4j.scalar(true)) - - val w3 = w1 | w2 - w3.eval.toIntVector.head shouldBe 1 - - val w4 = w1 & w2 - w4.eval.toIntVector.head shouldBe 1 - - val w5 = w1 ^ w2 - w5.eval.toIntVector.head shouldBe 0 - - val w6 = w1 | false - w6.eval.toIntVector.head shouldBe 1 - - val w7 = w1 & false - w7.eval.toIntVector.head shouldBe 0 - - val w8 = w1 ^ false - w8.eval.toIntVector.head shouldBe 1 - - val w9 = false | w1 - w9.eval.toIntVector.head shouldBe 1 - - val w10 = false & w1 - w10.eval.toIntVector.head shouldBe 0 - - val w11 = false ^ w1 - w11.eval.toIntVector.head shouldBe 1 - } - - "SameDiff" should "provide shifting operations" in { - implicit val sd = SameDiff.create() - val w1 = sd.constant(16) - - val w2 = w1 << 2 - w2.eval.toIntVector.head shouldBe 64 - - val w3 = w1 >> 2 - w3.eval.toIntVector.head shouldBe 4 - } - - "SameDiff" should "provide shifting operations with SDVariable argument" in { - implicit val sd = SameDiff.create() - val w1 = sd.constant(16) - val two = sd.constant(2) - - val w2 = w1 << two - w2.eval.toIntVector.head shouldBe 64 - - val w3 = w1 >> two - 
w3.eval.toIntVector.head shouldBe 4 - } - - "SDVariable " should "be indexable" in { - implicit val sd = SameDiff.create - - val arr = Nd4j.linspace(1, 100, 100).reshape('c', 10L, 10L) - val x = sd.bind(arr) - val y = new SDVariableWrapper(x) - - x.get(SDIndex.point(0)).eval shouldBe y(0).eval - } - - "SDVariable " should "be indexable in 2d" in { - implicit val sd = SameDiff.create - - val arr = Nd4j.linspace(DataType.FLOAT, 1.0, 1.0, 9).reshape(3, 3) - - val x = sd.bind(arr) - - x(0, ---).eval shouldBe x(SDIndex.point(0), SDIndex.all()).eval - - val slice1 = x.get(SDIndex.interval(0L, 2L), SDIndex.all()).eval - val slice2 = x(0 :: 2, ---).eval - slice1 shouldBe slice2 - } - - "SDVariable " should "be indexable in 3d" in { - implicit val sd = SameDiff.create - - val arr = Nd4j.linspace(DataType.FLOAT, 1.0, 1.0, 18).reshape(3, 3, 2) - val x = sd.bind(arr) - - x.get(SDIndex.all(), SDIndex.all(), SDIndex.all()).eval shouldBe x(---, ---, ---).eval - x.get(SDIndex.point(0), SDIndex.all(), SDIndex.all()).eval shouldBe x(0, ---, ---).eval - x.get(SDIndex.point(0), SDIndex.point(0), SDIndex.all()).eval shouldBe x(0, 0, ---).eval - x.get(SDIndex.point(0), SDIndex.point(0), SDIndex.point(0)).eval shouldBe x(0, 0, 0).eval - - x.get(SDIndex.interval(0L, 2L), SDIndex.point(0), SDIndex.point(0)).eval shouldBe x(0 :: 2, 0, 0).eval - x.get(SDIndex.interval(0L, 2L), SDIndex.interval(0L, 1L), SDIndex.interval(0L, 2L)).eval shouldBe x(0 :: 2, - 0 :: 1, - 0 :: 2).eval - x.get(SDIndex.interval(0L, 2L), SDIndex.interval(0L, 1L), SDIndex.all()).eval shouldBe x(0 :: 2, 0 :: 1, ---).eval - } -} diff --git a/contrib/attic/nd4s/src/test/scala/org/nd4s/samediff/SameDiffTest.scala b/contrib/attic/nd4s/src/test/scala/org/nd4s/samediff/SameDiffTest.scala deleted file mode 100644 index c4a5fd0d7..000000000 --- a/contrib/attic/nd4s/src/test/scala/org/nd4s/samediff/SameDiffTest.scala +++ /dev/null @@ -1,144 +0,0 @@ -/* - * 
****************************************************************************** - * * Copyright (c) 2021 Deeplearning4j Contributors - * * - * * This program and the accompanying materials are made available under the - * * terms of the Apache License, Version 2.0 which is available at - * * https://www.apache.org/licenses/LICENSE-2.0. - * * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * * License for the specific language governing permissions and limitations - * * under the License. - * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ - -package org.nd4s.samediff - -import java.lang.reflect.Field -import java.util -import java.util.{ Arrays, Collections, HashMap, List, Map } - -import org.nd4j.shade.guava.collect.{ Lists, Maps } -import org.junit.Assert._ -import org.junit.Assume.assumeNotNull -import org.nd4j.autodiff.samediff._ -import org.nd4j.autodiff.samediff.impl.DefaultSameDiffConditional -import org.nd4j.autodiff.validation.{ OpValidation, TestCase } -import org.nd4j.linalg.activations.Activation -import org.nd4j.linalg.api.blas.params.MMulTranspose -import org.nd4j.linalg.api.buffer.DataType -import org.nd4j.linalg.api.ndarray.INDArray -import org.nd4j.linalg.api.ops.DynamicCustomOp -import org.nd4j.linalg.api.ops.impl.layers.ExternalErrorsFunction -import org.nd4j.linalg.api.ops.impl.layers.convolution.config.{ Conv2DConfig, LocalResponseNormalizationConfig } -import org.nd4j.linalg.api.ops.impl.reduce3.ManhattanDistance -import org.nd4j.linalg.api.ops.impl.shape.tensorops.TensorArray -import org.nd4j.linalg.api.ops.impl.transforms.any.IsMax -import org.nd4j.linalg.api.ops.impl.transforms.custom.{ Max, Min } -import org.nd4j.linalg.api.ops.impl.transforms.custom._ -import 
org.nd4j.linalg.api.ops.random.impl.BernoulliDistribution -import org.nd4j.linalg.api.shape.LongShapeDescriptor -import org.nd4j.linalg.checkutil.NDArrayCreationUtil -import org.nd4j.linalg.dataset.{ DataSet, MultiDataSet } -import org.nd4j.linalg.dataset.adapter.SingletonMultiDataSetIterator -import org.nd4j.linalg.factory.Nd4j -import org.nd4j.linalg.indexing.NDArrayIndex -import org.nd4j.linalg.indexing.NDArrayIndex.all -import org.nd4j.linalg.learning.config.Adam -import org.nd4j.linalg.ops.transforms.Transforms -import org.nd4j.weightinit.impl.{ OneInitScheme, UniformInitScheme, ZeroInitScheme } -import org.nd4s.samediff.implicits.Implicits._ -import org.scalatest.{ FlatSpec, Matchers } -import scala.collection.JavaConversions._ - -class SameDiffTest extends FlatSpec with Matchers { - - "SameDiff" should "allow Mse backwards execution" in { - - implicit val sd: SameDiff = SameDiff.create - - val nOut: Int = 4 - val minibatch: Int = 3 - val input: SDVariable = sd.bind("in", DataType.FLOAT, Array[Long](minibatch, nOut)) - val label: SDVariable = sd.bind("label", DataType.FLOAT, Array[Long](minibatch, nOut)) - - val diff: SDVariable = input - label - val sqDiff: SDVariable = diff * diff - //val sqDiff: SDVariable = diff ** 2 - val msePerEx: SDVariable = sd.mean("msePerEx", sqDiff, 1) - val avgMSE: SDVariable = sd.mean("loss", msePerEx, 0) - - val inputArr: INDArray = Nd4j.rand(DataType.FLOAT, minibatch, nOut) - val labelArr: INDArray = Nd4j.rand(DataType.FLOAT, minibatch, nOut) - - sd.associateArrayWithVariable(inputArr, input) - sd.associateArrayWithVariable(labelArr, label) - - val result = sd.output(null: java.util.Map[String, org.nd4j.linalg.api.ndarray.INDArray], "loss") - assertEquals(1, result.values().size()) - - val emptyMap = new HashMap[String, INDArray]() - sd.output(emptyMap, "loss") - } - - "SameDiff" should "run test dense layer forward pass" in { - Nd4j.getRandom.setSeed(12345) - implicit val sd = SameDiff.create - val iInput = Nd4j.rand(3, 4) - 
val iWeights = Nd4j.rand(4, 5) - val iBias = Nd4j.rand(1, 5) - val input = sd.bind("input", iInput) - val weights = sd.bind("weights", iWeights) - val bias = sd.bind("bias", iBias) - val mmul = sd.mmul("mmul", input, weights) - - val z = mmul + bias - - val out = sd.nn.sigmoid("out", z) - val expMmul = iInput.mmul(iWeights) - val expZ = expMmul.addRowVector(iBias) - val expOut = Transforms.sigmoid(expZ, true) - sd.output(new HashMap[String, INDArray](), "mmul", "out", "bias", "add") - assertEquals(expMmul, mmul.getArr) - assertEquals(expZ, z.getArr) - assertEquals(expOut, out.getArr) - } - - "SameDiff" should "convert placeholder to constant" in { - Nd4j.getRandom.setSeed(12345) - val sd = SameDiff.create - val in = sd.placeHolder("in", DataType.FLOAT, 1, 3) - val in2 = sd.placeHolder("in2", DataType.FLOAT, 3, 4) - val b = sd.bind("b", Nd4j.rand(DataType.FLOAT, 1, 4)) - val mmul = in.mmul(in2) - val add = mmul + b - val tanh = sd.math.tanh(add) - val loss = sd.variance(tanh, true) - val inArr = Nd4j.rand(DataType.FLOAT, 1, 3) - in.setArray(inArr) - val inArr2 = Nd4j.rand(DataType.FLOAT, 3, 4) - val c = TrainingConfig.builder - .updater(new Adam(0.1)) - .weightDecay(0.01, true) - .dataSetFeatureMapping("in", "in2") - .skipBuilderValidation(true) - .build - - val data = new HashMap[String, INDArray]() - data.put("in", Nd4j.randn(1, 3)) - data.put("in2", Nd4j.randn(3, 4)) - in.convertToConstant - val out = sd.output(data, "tanh") - val out2 = sd.output(data, "tanh") - assertEquals(out, out2) - assertEquals(VariableType.CONSTANT, in.getVariableType) - assertEquals(inArr, in.getArr) - //Sanity check on fitting: - sd.setTrainingConfig(c) - sd.fit(new SingletonMultiDataSetIterator(new MultiDataSet(Array[INDArray](inArr, inArr2), null)), 1) - } -} diff --git a/contrib/attic/nd4s/src/test/scala/org/nd4s/samediff/TrainingTest.scala b/contrib/attic/nd4s/src/test/scala/org/nd4s/samediff/TrainingTest.scala deleted file mode 100644 index d38d0aeb5..000000000 --- 
a/contrib/attic/nd4s/src/test/scala/org/nd4s/samediff/TrainingTest.scala +++ /dev/null @@ -1,143 +0,0 @@ -/* - * ****************************************************************************** - * * Copyright (c) 2021 Deeplearning4j Contributors - * * - * * This program and the accompanying materials are made available under the - * * terms of the Apache License, Version 2.0 which is available at - * * https://www.apache.org/licenses/LICENSE-2.0. - * * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * * License for the specific language governing permissions and limitations - * * under the License. - * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ - -package org.nd4s.samediff - -import org.nd4j.autodiff.samediff.{ SDVariable, SameDiff, TrainingConfig } -import org.nd4j.linalg.api.buffer.DataType -import org.nd4j.linalg.api.ndarray.INDArray -import org.nd4j.linalg.dataset.{ DataSet, MultiDataSet } -import org.nd4j.linalg.dataset.adapter.SingletonMultiDataSetIterator -import org.nd4j.linalg.factory.Nd4j -import org.nd4j.linalg.learning.config.Adam -import org.nd4s.Implicits._ -import org.nd4s.samediff.implicits.Implicits._ -import org.scalatest.{ FlatSpec, Matchers } - -class TrainingTest extends FlatSpec with Matchers { - - "SameDiff" should "allow loss calculation" in { - for (i <- 0 until 2) { - implicit val sd = SameDiff.create - val ph = sd.placeHolder("ph", DataType.FLOAT, 3, 4) - val w = sd.bind("w", Nd4j.rand(DataType.FLOAT, 4, 5)) - val b = sd.bind("b", Nd4j.rand(DataType.FLOAT, 5)) - val mmul = ph.mmul(w) - val badd = mmul + b - val add = badd + 1 - val shape = add.shape - val unused1 = ph.mul(2) - val unused2 = ph.sub(4) - val unused3 = unused1.div(unused2) - val loss1 = add.std("l1", true) 
- val loss2 = mmul.mean("l2") -// Console.println(sd.summary) - if (i == 0) { - sd.setLossVariables("l1", "l2") - sd.createGradFunction() - } else { - val tc = TrainingConfig.builder - .updater(new Adam(0.01)) - .minimize("l1", "l2") - .dataSetFeatureMapping("ph") - .markLabelsUnused - .build - sd.setTrainingConfig(tc) - val ds = new DataSet(Nd4j.create(3, 4), null) - sd.fit(ds) - sd.fit(ds) - } - for (s <- Array[String]("w", "b", badd.getVarName, add.getVarName, "l1", "l2")) { - val gradVar = sd.getVariable(s).gradient - assert(gradVar != null) - } - //Unused: - assert(!shape.hasGradient) - try assert(shape.gradient == null) - catch { - case e: IllegalStateException => - assert(e.getMessage.contains("only floating point variables")) - } - for (s <- Array[String](unused1.getVarName, unused2.getVarName, unused3.getVarName)) { - assert(sd.getVariable(s).gradient == null) - } - } - } - - "SameDiff" should "allow creating and running model with 2 losses: train on the first one, then change losses" in { - // TODO: try to get rid of implicit here - implicit val sd = SameDiff.create - val ph1 = sd.placeHolder("ph1", DataType.FLOAT, 3, 4) - val w1 = sd.bind("w1", Nd4j.rand(DataType.FLOAT, 4, 5)) - val b1 = sd.bind("b1", Nd4j.rand(DataType.FLOAT, 5)) - val mmul1 = ph1.mmul(w1) - val badd1 = mmul1 + b1 - - val ph2 = sd.placeHolder("ph2", DataType.FLOAT, 3, 2) - val w2 = sd.bind("w2", Nd4j.rand(DataType.FLOAT, 2, 6)) - val b2 = sd.bind("b2", Nd4j.rand(DataType.FLOAT, 6)) - val mmul2 = ph2.mmul(w2) - val badd2 = mmul2 + b2 - val loss1 = badd1.std("loss1", true) - val loss2 = badd2.std("loss2", true) - //First: create grad function for optimizing loss 1 only - sd.setLossVariables("loss1") - sd.createGradFunction() - for (v <- Array[SDVariable](ph1, w1, b1, mmul1, badd1, loss1)) { - assert(v.gradient != null) - } - for (v <- Array[SDVariable](ph2, w2, b2, mmul2, badd2, loss2)) { - assert(v.gradient == null) - } - //Now, set to other loss function - sd.setLossVariables("loss2") - 
sd.createGradFunction() - for (v <- Array[SDVariable](ph1, w1, b1, mmul1, badd1, loss1)) { - assert(v.gradient == null) - } - for (v <- Array[SDVariable](ph2, w2, b2, mmul2, badd2, loss2)) { - assert(v.gradient != null) - } - //Train the first side of the graph. The other side should remain unmodified! - sd.setLossVariables("loss1") - var w1Before = w1.getArr.dup - var b1Before = b1.getArr.dup - var w2Before = w2.getArr.dup - var b2Before = b2.getArr.dup - val tc = TrainingConfig.builder.updater(new Adam(1e-2)).dataSetFeatureMapping("ph1", "ph2").markLabelsUnused.build - sd.setTrainingConfig(tc) - val mds = new MultiDataSet(Array[INDArray](Nd4j.rand(DataType.FLOAT, 3, 4), Nd4j.rand(DataType.FLOAT, 3, 2)), - new Array[INDArray](0)) - sd.fit(new SingletonMultiDataSetIterator(mds), 3) - assert(w1Before != w1.getArr) - assert(b1Before != b1.getArr) - assert(w2Before == w2.getArr) - assert(b2Before == b2.getArr) - //Train second side of graph; first side should be unmodified - sd.setLossVariables("loss2") - w1Before = w1.getArr.dup - b1Before = b1.getArr.dup - w2Before = w2.getArr.dup - b2Before = b2.getArr.dup - sd.fit(new SingletonMultiDataSetIterator(mds), 3) - assert(w1Before == w1.getArr) - assert(b1Before == b1.getArr) - assert(w2Before != w2.getArr) - assert(b2Before != b2.getArr) - } -} diff --git a/contrib/attic/pydatavec/.gitignore b/contrib/attic/pydatavec/.gitignore deleted file mode 100644 index 6b7a57cff..000000000 --- a/contrib/attic/pydatavec/.gitignore +++ /dev/null @@ -1,65 +0,0 @@ -# Byte-compiled / optimized / DLL files -__pycache__/ -*.py[cod] -*$py.class - -# C extensions -*.so - -# Distribution / packaging -.Python -env/ -build/ -develop-eggs/ -dist/ -downloads/ -eggs/ -.eggs/ -lib/ -lib64/ -parts/ -sdist/ -var/ -*.egg-info/ -.installed.cfg -*.egg - -# PyInstaller -# Usually these files are written by a python script from a template -# before PyInstaller builds the exe, so as to inject date/other infos into it. 
-*.manifest -*.spec - -# Installer logs -pip-log.txt -pip-delete-this-directory.txt - -# Unit test / coverage reports -htmlcov/ -.tox/ -.coverage -.coverage.* -.cache -nosetests.xml -coverage.xml -*,cover -.hypothesis/ - -# Translations -*.mo -*.pot - -# Django stuff: -*.log - -# Sphinx documentation -docs/_build/ - -# PyBuilder -target/ - -#Ipython Notebook -.ipynb_checkpoints - -# IDE settings -.idea/ diff --git a/contrib/attic/pydatavec/README.md b/contrib/attic/pydatavec/README.md deleted file mode 100644 index 4e3253447..000000000 --- a/contrib/attic/pydatavec/README.md +++ /dev/null @@ -1,30 +0,0 @@ -# PyDataVec : Python interface for DataVec - -[![Join the chat at https://gitter.im/deeplearning4j/deeplearning4j](https://badges.gitter.im/Join%20Chat.svg)](https://gitter.im/deeplearning4j/deeplearning4j?utm_source=badge&utm_medium=badge&utm_campaign=pr-badge&utm_content=badge) -[![PyPI version](https://badge.fury.io/py/pydatavec.svg)](https://badge.fury.io/py/pydatavec) - -## Installation - -```bash -pip install pydatavec -``` - -## Examples - -Examples are in the [dl4j-examples repo](https://www.github.com/eclipse/deeplearning4j-examples) - -Clone dl4j-examples: - -```bash -git clone https://www.github.com/eclipse/deeplearning4j-examples.git -``` - -Run examples in `pydatavec-examples` directory - -```bash -cd deeplearning4j-examples/pydatavec-examples -python basic.py -python iris.py -python reduction.py -``` - diff --git a/contrib/attic/pydatavec/pom.xml b/contrib/attic/pydatavec/pom.xml deleted file mode 100644 index c7459a922..000000000 --- a/contrib/attic/pydatavec/pom.xml +++ /dev/null @@ -1,165 +0,0 @@ - - - - - - 4.0.0 - - - org.deeplearning4j - deeplearning4j - 1.0.0-SNAPSHOT - - - pydatavec - - pydatavec - - - false - 0.1 - - - - - - org.apache.maven.plugins - maven-shade-plugin - ${maven-shade-plugin.version} - - - package - - shade - - - - - org.deeplearning4j.example.App - - - - - - - - org.apache.maven.plugins - maven-compiler-plugin - - - 
org.apache.maven.plugins - maven-jar-plugin - ${maven-jar-plugin.version} - - true - - - - empty-javadoc-jar - package - - jar - - - javadoc - ${basedir}/javadoc - - - - empty-sources-jar - package - - jar - - - sources - ${basedir}/src - - - - - - org.codehaus.mojo - exec-maven-plugin - ${exec-maven-plugin.version} - - - python-install-cython - install - - exec - - - pip - ${basedir} - - install - --user - Cython - --install-option=--no-cython-compile - - - - - python-build - install - - exec - - - pip - ${basedir} - - install - --user - -e - .[tests, spark] - - - - - python-test - test - - exec - - - python - ${basedir} - ${pydatavec.test.skip} - - -m - pytest - --pep8 - -m - pep8 - tests/ - - - - - - - - diff --git a/contrib/attic/pydatavec/pydatavec/__init__.py b/contrib/attic/pydatavec/pydatavec/__init__.py deleted file mode 100644 index 2656f3425..000000000 --- a/contrib/attic/pydatavec/pydatavec/__init__.py +++ /dev/null @@ -1,30 +0,0 @@ -# /* ****************************************************************************** -# * -# * -# * This program and the accompanying materials are made available under the -# * terms of the Apache License, Version 2.0 which is available at -# * https://www.apache.org/licenses/LICENSE-2.0. -# * -# * See the NOTICE file distributed with this work for additional -# * information regarding copyright ownership. -# * Unless required by applicable law or agreed to in writing, software -# * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT -# * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the -# * License for the specific language governing permissions and limitations -# * under the License. 
-# * -# * SPDX-License-Identifier: Apache-2.0 -# ******************************************************************************/ - -################################################################################ -# -# -# -################################################################################ - - -from .conditions import * -from .schema import * -from .transform_process import * -from .utils import * -from .executors import SparkExecutor diff --git a/contrib/attic/pydatavec/pydatavec/conditions.py b/contrib/attic/pydatavec/pydatavec/conditions.py deleted file mode 100644 index 2bbfca9b7..000000000 --- a/contrib/attic/pydatavec/pydatavec/conditions.py +++ /dev/null @@ -1,78 +0,0 @@ -# /* ****************************************************************************** -# * -# * -# * This program and the accompanying materials are made available under the -# * terms of the Apache License, Version 2.0 which is available at -# * https://www.apache.org/licenses/LICENSE-2.0. -# * -# * See the NOTICE file distributed with this work for additional -# * information regarding copyright ownership. -# * Unless required by applicable law or agreed to in writing, software -# * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT -# * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the -# * License for the specific language governing permissions and limitations -# * under the License. 
-# * -# * SPDX-License-Identifier: Apache-2.0 -# ******************************************************************************/ - -################################################################################ -# -# -# -################################################################################ - - -class Condition(object): - - @property - def name(self): - return self.__class__.__name__ - - -class InSet(Condition): - def __init__(self, column, set): - self.column = column - self.set = set - - -class NotInSet(Condition): - def __init__(self, column, set): - self.column = column - self.set = set - - -class Equals(Condition): - def __init__(self, column, value): - self.column = column - self.value = value - - -class NotEquals(Condition): - def __init__(self, column, value): - self.column = column - self.value = value - - -class LessThan(Condition): - def __init__(self, column, value): - self.column = column - self.value = value - - -class LessThanOrEqual(Condition): - def __init__(self, column, value): - self.column = column - self.value = value - - -class GreaterThan(Condition): - def __init__(self, column, value): - self.column = column - self.value = value - - -class GreaterThanOrEqual(Condition): - def __init__(self, column, value): - self.column = column - self.value = value diff --git a/contrib/attic/pydatavec/pydatavec/executors/__init__.py b/contrib/attic/pydatavec/pydatavec/executors/__init__.py deleted file mode 100644 index 099ee4381..000000000 --- a/contrib/attic/pydatavec/pydatavec/executors/__init__.py +++ /dev/null @@ -1,27 +0,0 @@ -# /* ****************************************************************************** -# * -# * -# * This program and the accompanying materials are made available under the -# * terms of the Apache License, Version 2.0 which is available at -# * https://www.apache.org/licenses/LICENSE-2.0. -# * -# * See the NOTICE file distributed with this work for additional -# * information regarding copyright ownership. 
-# * Unless required by applicable law or agreed to in writing, software -# * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT -# * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the -# * License for the specific language governing permissions and limitations -# * under the License. -# * -# * SPDX-License-Identifier: Apache-2.0 -# ******************************************************************************/ - -################################################################################ -# -# -# -################################################################################ - - -from .spark import SparkExecutor -from .local import LocalExecutor diff --git a/contrib/attic/pydatavec/pydatavec/executors/local.py b/contrib/attic/pydatavec/pydatavec/executors/local.py deleted file mode 100644 index 114b94ccf..000000000 --- a/contrib/attic/pydatavec/pydatavec/executors/local.py +++ /dev/null @@ -1,86 +0,0 @@ -# /* ****************************************************************************** -# * -# * -# * This program and the accompanying materials are made available under the -# * terms of the Apache License, Version 2.0 which is available at -# * https://www.apache.org/licenses/LICENSE-2.0. -# * -# * See the NOTICE file distributed with this work for additional -# * information regarding copyright ownership. -# * Unless required by applicable law or agreed to in writing, software -# * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT -# * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the -# * License for the specific language governing permissions and limitations -# * under the License. 
-# * -# * SPDX-License-Identifier: Apache-2.0 -# ******************************************************************************/ - -################################################################################ -# -# -# -################################################################################ - -import os - - -class Writable(object): - - def __init__(self, j_w): - self.j_w = j_w - - def save_to_csv(self, path): - from ..java_classes import NumberOfRecordsPartitioner - from ..java_classes import CSVRecordWriter - from ..java_classes import FileSplit, JFile - - output_file = JFile(path) - if output_file.exists(): - output_file.delete() - output_file.createNewFile() - rw = CSVRecordWriter() - rw.initialize(FileSplit(output_file), NumberOfRecordsPartitioner()) - rw.writeBatch(self.j_w) - rw.close() - - def save(self, path): - self.save_to_csv(path) - - def __iter__(self): - rows = [] - nr = self.j_w.size() - nc = self.j_w.get(0).size() if nr else 0 - for i in range(nr): - row = self.j_w.get(i) - cols = [row.get(j).toString() for j in range(nc)] - rows.append(cols) - return iter(rows) - - def iter(self): - return self.__iter__() - - -class LocalExecutor(object): - - def __init__(self): - from ..java_classes import CSVRecordReader - self.rr = CSVRecordReader(0, ',') - - pass - - def __call__(self, tp, source): - from ..java_classes import CSVRecordReader, WritablesToStringFunction, StringToWritablesFunction - from ..java_classes import FileSplit, JFile, ArrayList, LocalTransformExecutor - - tp = tp.to_java() - assert type(source) is str - assert os.path.isfile(source) - f = JFile(source) - rr = self.rr - rr.initialize(FileSplit(f)) - data = ArrayList() - while rr.hasNext(): - data.add(rr.next()) - processed_data = LocalTransformExecutor.execute(data, tp) - return Writable(processed_data) diff --git a/contrib/attic/pydatavec/pydatavec/executors/spark.py b/contrib/attic/pydatavec/pydatavec/executors/spark.py deleted file mode 100644 index 47ad220d4..000000000 --- 
a/contrib/attic/pydatavec/pydatavec/executors/spark.py +++ /dev/null @@ -1,99 +0,0 @@ -# /* ****************************************************************************** -# * -# * -# * This program and the accompanying materials are made available under the -# * terms of the Apache License, Version 2.0 which is available at -# * https://www.apache.org/licenses/LICENSE-2.0. -# * -# * See the NOTICE file distributed with this work for additional -# * information regarding copyright ownership. -# * Unless required by applicable law or agreed to in writing, software -# * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT -# * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the -# * License for the specific language governing permissions and limitations -# * under the License. -# * -# * SPDX-License-Identifier: Apache-2.0 -# ******************************************************************************/ - -################################################################################ -# -# -# -################################################################################ - - -import os -import logging - -_JVM_RUNNING = False - - -class StringRDD(object): - - def __init__(self, java_rdd): - self.java_rdd = java_rdd - - def __iter__(self): - jlist = self.java_rdd.collect() - size = jlist.size() - return iter([jlist.get(i) for i in range(size)]) - - def iter(self): - return self.__iter__() - - def save(self, path): - self.java_rdd.saveAsTextFile(path) - - def save_to_csv(self, path): - l = list(self) - with open(path, 'w') as f: - for x in l: - f.write(x + '\n') - - -class SparkExecutor(object): - - def __init__(self, master='local[*]', app_name='pydatavec'): - global _JVM_RUNNING - if not _JVM_RUNNING: - from ..java_classes import SparkConf, SparkContext, SparkTransformExecutor - from ..java_classes import CSVRecordReader, WritablesToStringFunction, StringToWritablesFunction - _JVM_RUNNING = True - spark_conf = SparkConf() 
- spark_conf.setMaster(master) - spark_conf.setAppName(app_name) - self.spark_context = SparkContext(spark_conf) - self.rr = CSVRecordReader() - self.executor = SparkTransformExecutor - self.str2wf = StringToWritablesFunction - self.w2strf = WritablesToStringFunction - - def __call__(self, tp, source): - source_type = getattr(type(source), '__name__', None) - if source_type == 'str': - if os.path.isfile(source) or os.path.isdir(source): - string_data = self.spark_context.textFile( - source) # JavaRDD - else: - raise ValueError('Invalid source ' + source) - elif source_type == 'org.apache.spark.api.java.JavaRDD': - string_data = source - elif source_type.endswith('RDD'): - tempid = 0 - path = 'temp_0' - while(os.path.isdir(path)): - tempid += 1 - path = 'temp_' + str(tempid) - logging.info('Converting pyspark RDD to JavaRDD...') - source.saveAsTextFile(path) - string_data = self.spark_context.textFile(path) - else: - raise Exception('Unexpected source type: ' + str(type(source))) - parsed_input_data = string_data.map( - self.str2wf(self.rr)) # JavaRDD> - processed_data = self.executor.execute( - parsed_input_data, tp.to_java()) # JavaRDD> - processed_as_string = processed_data.map( - self.w2strf(",")) # JavaRDD - return StringRDD(processed_as_string) # StringRDD diff --git a/contrib/attic/pydatavec/pydatavec/java_classes.py b/contrib/attic/pydatavec/pydatavec/java_classes.py deleted file mode 100644 index 1f11e2982..000000000 --- a/contrib/attic/pydatavec/pydatavec/java_classes.py +++ /dev/null @@ -1,129 +0,0 @@ -# /* ****************************************************************************** -# * -# * -# * This program and the accompanying materials are made available under the -# * terms of the Apache License, Version 2.0 which is available at -# * https://www.apache.org/licenses/LICENSE-2.0. -# * -# * See the NOTICE file distributed with this work for additional -# * information regarding copyright ownership. 
-# * Unless required by applicable law or agreed to in writing, software -# * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT -# * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the -# * License for the specific language governing permissions and limitations -# * under the License. -# * -# * SPDX-License-Identifier: Apache-2.0 -# ******************************************************************************/ - -################################################################################ -# -# -# -################################################################################ - - -import jnius_config -import os -import warnings -import pydl4j - -pydl4j.validate_datavec_jars() - - -# -------------JVM starts here------------- -from jnius import autoclass - -JString = autoclass("java.lang.String") -JSchema = autoclass('org.datavec.api.transform.schema.Schema') -SchemaBuilder = autoclass('org/datavec/api/transform/schema/Schema$Builder') - -JTransformProcess = autoclass('org.datavec.api.transform.TransformProcess') -TransformProcessBuilder = autoclass( - 'org/datavec/api/transform/TransformProcess$Builder') - -ConditionOp = autoclass('org.datavec.api.transform.condition.ConditionOp') -ConditionFilter = autoclass('org.datavec.api.transform.filter.ConditionFilter') - -BooleanColumnCondition = autoclass( - 'org.datavec.api.transform.condition.column.BooleanColumnCondition') -CategoricalColumnCondition = autoclass( - 'org.datavec.api.transform.condition.column.CategoricalColumnCondition') -DoubleColumnCondition = autoclass( - 'org.datavec.api.transform.condition.column.DoubleColumnCondition') -StringColumnCondition = autoclass( - 'org.datavec.api.transform.condition.column.StringColumnCondition') - - -BooleanWritable = autoclass('org.datavec.api.writable.BooleanWritable') -IntegerWritable = autoclass('org.datavec.api.writable.IntWritable') -LongWritable = autoclass('org.datavec.api.writable.LongWritable') -FloatWritable = 
autoclass('org.datavec.api.writable.FloatWritable') -DoubleWritable = autoclass('org.datavec.api.writable.DoubleWritable') - - -DateTimeZone = autoclass('org.joda.time.DateTimeZone') -DateTimeFieldType = autoclass('org.joda.time.DateTimeFieldType') -DeriveColumnsFromTimeTransformBuilder = autoclass( - 'org.datavec.api.transform.transform.time.DeriveColumnsFromTimeTransform$Builder') - - -Arrays = autoclass('java.util.Arrays') -HashSet = autoclass('java.util.HashSet') - - -JDouble = autoclass('java.lang.Double') -JFloat = autoclass('java.lang.Float') - -Arrays = autoclass('java.util.Arrays') -JMap = autoclass('java.util.HashMap') - -try: - SparkConf = autoclass('org.apache.spark.SparkConf') - SparkContext = autoclass('org.apache.spark.api.java.JavaSparkContext') - JavaRDD = autoclass('org.apache.spark.api.java.JavaRDD') - SparkTransformExecutor = autoclass( - 'org.datavec.spark.transform.SparkTransformExecutor') - StringToWritablesFunction = autoclass( - 'org.datavec.spark.transform.misc.StringToWritablesFunction') - WritablesToStringFunction = autoclass( - 'org.datavec.spark.transform.misc.WritablesToStringFunction') - spark_available = True -except: - spark_available = False - -CSVRecordReader = autoclass( - 'org.datavec.api.records.reader.impl.csv.CSVRecordReader') -CSVRecordWriter = autoclass( - 'org.datavec.api.records.writer.impl.csv.CSVRecordWriter') - -LocalTransformExecutor = autoclass( - 'org.datavec.local.transforms.LocalTransformExecutor') - -ChangeCaseStringTransform = autoclass( - 'org.datavec.api.transform.transform.string.ChangeCaseStringTransform') -ChangeCaseStringTransformCaseType = autoclass( - 'org.datavec.api.transform.transform.string.ChangeCaseStringTransform$CaseType') -ConcatenateStringColumns = autoclass( - 'org.datavec.api.transform.transform.string.ConcatenateStringColumns') -RemoveWhiteSpaceTransform = autoclass( - 'org.datavec.api.transform.transform.string.RemoveWhiteSpaceTransform') -ReplaceEmptyStringTransform = autoclass( - 
'org.datavec.api.transform.transform.string.ReplaceEmptyStringTransform') -ReplaceStringTransform = autoclass( - 'org.datavec.api.transform.transform.string.ReplaceStringTransform') -StringMapTransform = autoclass( - 'org.datavec.api.transform.transform.string.StringMapTransform') - - -ReducerBuilder = autoclass('org.datavec.api.transform.reduce.Reducer$Builder') -ReduceOp = autoclass('org.datavec.api.transform.ReduceOp') - - -FileSplit = autoclass('org.datavec.api.split.FileSplit') - -JFile = autoclass('java.io.File') -ArrayList = autoclass('java.util.ArrayList') - -NumberOfRecordsPartitioner = autoclass( - 'org.datavec.api.split.partition.NumberOfRecordsPartitioner') diff --git a/contrib/attic/pydatavec/pydatavec/schema.py b/contrib/attic/pydatavec/pydatavec/schema.py deleted file mode 100644 index fb6f70580..000000000 --- a/contrib/attic/pydatavec/pydatavec/schema.py +++ /dev/null @@ -1,110 +0,0 @@ -# /* ****************************************************************************** -# * -# * -# * This program and the accompanying materials are made available under the -# * terms of the Apache License, Version 2.0 which is available at -# * https://www.apache.org/licenses/LICENSE-2.0. -# * -# * See the NOTICE file distributed with this work for additional -# * information regarding copyright ownership. -# * Unless required by applicable law or agreed to in writing, software -# * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT -# * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the -# * License for the specific language governing permissions and limitations -# * under the License. 
-# * -# * SPDX-License-Identifier: Apache-2.0 -# ******************************************************************************/ - -################################################################################ -# -# -# -################################################################################ - - -from collections import OrderedDict - - -class Schema(object): - - def __init__(self): - self.columns = OrderedDict() - - def add_column(self, column_type, column_name, *args): - if column_name in self.columns: - raise Exception( - "Column names should be unique. Another column with name " + column_name + " already exists.") - self.columns[column_name] = [column_type] + list(args) - - def add_string_column(self, column): - self.add_column("string", column) - - def add_integer_column(self, column, *args): - self.add_column("integer", column, *args) - - def add_long_column(self, column, *args): - self.add_column("long", column, *args) - - def add_float_column(self, column, *args): - self.add_column("float", column, *args) - - def add_double_column(self, column, *args): - self.add_column("double", column, *args) - - def add_categorical_column(self, column, categories): - self.add_column("categorical", column, *categories) - - def get_column_type(self, column): - return self.columns[column][0] - - def serialize(self): - config = {} - meta = [] - col_names = [] - for k in self.columns: - meta.append(self.columns[k]) - col_names.append(k) - config['column_names'] = col_names - config['meta'] = meta - return config - - @classmethod - def deserialize(cls, config): - schema = cls() - col_names = config['column_names'] - meta = config['meta'] - for c, m in zip(col_names, meta): - schema.columns[c] = m - return schema - - def to_java(self): - from .java_classes import SchemaBuilder, JString, JFloat, JDouble - builder = SchemaBuilder() - for c in self.columns: - meta = self.columns[c] - col_type = meta[0] - col_name = c - col_args = meta[1:] - if col_type == "string": - 
builder.addColumnString(JString(col_name)) - elif col_type == "categorical": - col_args = [JString(arg) for arg in col_args] - builder.addColumnCategorical(JString(col_name), *col_args) - else: - # numerical data - num_type = col_type[0].upper() + col_type[1:] - f = getattr(builder, 'addColumn' + num_type) - col_args = list(col_args) - if num_type in ('Float', 'Double'): - java_type = eval('J' + num_type) - for i, a in enumerate(col_args): - if type(a) in [int, float]: - col_args[i] = java_type(a) - f(col_name, *col_args) - return builder.build() - - def copy(self): - config = str(self.serialize()) - clone = Schema.deserialize(eval(config)) - return clone diff --git a/contrib/attic/pydatavec/pydatavec/transform_process.py b/contrib/attic/pydatavec/pydatavec/transform_process.py deleted file mode 100644 index 936415f5e..000000000 --- a/contrib/attic/pydatavec/pydatavec/transform_process.py +++ /dev/null @@ -1,455 +0,0 @@ -# /* ****************************************************************************** -# * -# * -# * This program and the accompanying materials are made available under the -# * terms of the Apache License, Version 2.0 which is available at -# * https://www.apache.org/licenses/LICENSE-2.0. -# * -# * See the NOTICE file distributed with this work for additional -# * information regarding copyright ownership. -# * Unless required by applicable law or agreed to in writing, software -# * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT -# * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the -# * License for the specific language governing permissions and limitations -# * under the License. 
-# * -# * SPDX-License-Identifier: Apache-2.0 -# ******************************************************************************/ - -################################################################################ -# -# -# -################################################################################ - - -from collections import OrderedDict -from .conditions import * -from .schema import Schema -import warnings -import logging -from .java_classes import JString - - -def _dq(x): - return "JString(\"" + x.replace("\"", "\\\"") + "\")" - - -def _to_camel(x, first_upper=False): - tokens = x.split('_') - if first_upper: - y = '' - for t in tokens: - y += t[0].upper() + t[1:] - else: - y = tokens[0] - for t in tokens[1:]: - y += t[0].upper() + t[1:] - return y - - -def _dict_to_jmap(d, JMap): - jmap = JMap() - for k, v in d.items(): - jmap.put(k, v) - return jmap - - -class TransformProcess(object): - - def __init__(self, schema, inplace=True): - self.schema = schema - self.final_schema = schema.copy() - self.steps = [] - self.executors = {} - self.inplace = inplace - - def add_step(self, step, *args): - self.steps.append((step,) + args) - - def remove_column(self, *columns): - if len(columns) == 1: - columns = columns[0] - if type(columns) in (list, tuple): - self.add_step("removeColumns", *columns) - for c in columns: - del self.final_schema.columns[c] - else: - self.add_step("removeColumns", columns) - del self.final_schema.columns[columns] - else: - self.add_step("removeColumns", *columns) - for c in columns: - del self.final_schema.columns[c] - if not self.inplace: - return self - - def remove_columns_except(self, *columns): - if len(columns) == 1: - columns = columns[0] - if type(columns) in (list, tuple): - self.add_step("removeAllColumnsExceptFor", *columns) - to_del = [] - for c in self.final_schema.columns: - if c not in columns: - to_del.append(c) - for c in to_del: - del self.final_schema.columns[c] - else: - self.add_step("removeAllColumnsExceptFor", 
columns) - to_del = [] - for c in self.final_schema.columns: - if c != columns: - to_del.append(c) - for c in to_del: - del self.final_schema.columns[c] - else: - self.add_step("removeAllColumnsExceptFor", *columns) - to_del = [] - for c in self.final_schema.columns: - if c not in columns: - to_del.append(c) - for c in to_del: - del self.final_schema.columns[c] - if not self.inplace: - return self - - def filter(self, condition): - col_name = condition.column - col_type = self.final_schema.get_column_type(col_name) - col_type = col_type[0].upper() + col_type[1:] - if condition.name in ("InSet", "NotInSet"): - code = "filter(ConditionFilter({}ColumnCondition({}, ConditionOp.{}, HashSet(Arrays.asList({})))))" - code = code.format(col_type, _dq(col_name), condition.name, ','.join( - [_dq(x) for x in condition.set])) - else: - code = "filter(ConditionFilter({}ColumnCondition({}, ConditionOp.{}, {})))" - code = code.format(col_type, _dq(col_name), - condition.name, condition.value) - self.add_step("exec", code) - if not self.inplace: - return self - - def replace(self, column, value, condition): - # there are 2 columns involved - # the column whose content we are replacing - # and the column against which the condition is written - column1_type = self.final_schema.get_column_type(column) - column1_type = column1_type[0].upper() + column1_type[1:] - column2 = condition.column - column2_type = self.final_schema.get_column_type(column2) - column2_type = column2_type[0].upper() + column2_type[1:] - if condition.name in ("InSet", "NotInSet"): - code = "conditionalReplaceValueTransform({}, {}Writable({}), {}ColumnCondition({}, ConditionOp.{}, HashSet(Arrays.asList({}))))" - code = code.format(_dq(column), column1_type, value, column2_type, _dq( - column2), condition.name, ','.join([_dq(x) for x in condition.set])) - else: - code = "conditionalReplaceValueTransform({}, {}Writable({}), {}ColumnCondition({}, ConditionOp.{}, {}))" - code = code.format(_dq(column), column1_type,
value, column2_type, _dq( - column2), condition.name, condition.value) - self.add_step("exec", code) - if not self.inplace: - return self - - def rename_column(self, column, new_name): - new_d = OrderedDict() - old_d = self.final_schema.columns - for k in old_d: - if k == column: - new_d[new_name] = old_d[k] - else: - new_d[k] = old_d[k] - self.final_schema.columns = new_d - self.add_step("renameColumn", JString(column), JString(new_name)) - if not self.inplace: - return self - - def string_to_time(self, column, format="YYYY-MM-DD HH:mm:ss.SSS", time_zone="UTC"): - self.final_schema.columns[column][0] = "DateTime" - py_string = "stringToTimeTransform({}, {}, {})".format(_dq(column), _dq(format), "DateTimeZone." + time_zone) - self.add_step("exec", py_string) - if not self.inplace: - return self - - def derive_column_from_time(self, source_column, new_column, field): - code = 'transform(DeriveColumnsFromTimeTransformBuilder({}).addIntegerDerivedColumn({}, DateTimeFieldType.{}()).build())' - code = code.format(_dq(source_column), _dq( - new_column), _to_camel(field)) - self.add_step("exec", code) - self.final_schema.add_column("integer", new_column) - if not self.inplace: - return self - - def categorical_to_integer(self, column): - if self.final_schema.columns[column][0] != 'categorical': - raise Exception('Can not apply categorical_to_integer' - ' transform on column \"{}\" because it is not a categorical column.'.format(column)) - self.final_schema.columns[column][0] = 'integer' - self.add_step('categoricalToInteger', column) - if not self.inplace: - return self - - def append_string(self, column, string): - if self.final_schema.columns[column][0] != 'string': - raise Exception( - 'Can not apply append_string transform to column {} because it is not a string column'.format(column)) - self.add_step('appendStringColumnTransform', JString(column), JString(string)) - if not self.inplace: - return self - - def lower(self, column): - if
self.final_schema.columns[column][0] != 'string': - raise Exception( - 'Can not apply lower transform to column {} because it is not a string column'.format(column)) - self.add_step( - 'exec', 'transform(ChangeCaseStringTransform({}, ChangeCaseStringTransformCaseType.LOWER))'.format(_dq(column))) - if not self.inplace: - return self - - def upper(self, column): - if self.final_schema.columns[column][0] != 'string': - raise Exception( - 'Can not apply upper transform to column {} because it is not a string column'.format(column)) - self.add_step( - 'exec', 'transform(ChangeCaseStringTransform({}, ChangeCaseStringTransformCaseType.UPPER))'.format(_dq(column))) - if not self.inplace: - return self - - def concat(self, columns, new_column=None, delimiter=','): - for column in columns: - if self.final_schema.columns[column][0] != 'string': - raise Exception( - 'Can not apply concat transform to column {} because it is not a string column'.format(column)) - if new_column is None: - new_column = 'concat({})'.format(','.join(columns)) - if new_column in self.final_schema.columns: - raise Exception( - 'Another column with name {} already exists.'.format(new_column)) - columns = [_dq(c) for c in columns] - self.final_schema.add_string_column(new_column) - self.add_step('exec', 'transform(ConcatenateStringColumns({}, {}, Arrays.asList({})))'.format( - _dq(new_column), _dq(delimiter), ', '.join(columns))) - if not self.inplace: - return self - - def remove_white_spaces(self, column): - if self.final_schema.columns[column][0] != 'string': - raise Exception( - 'Can not apply remove_white_spaces transform to column {} because it is not a string column'.format(column)) - self.add_step( - 'exec', 'transform(RemoveWhiteSpaceTransform({}))'.format(_dq(column))) - if not self.inplace: - return self - - def replace_empty_string(self, column, value): - if self.final_schema.columns[column][0] != 'string': - raise Exception( - 'Can not apply replace_empty_string transform to column {} 
because it is not a string column'.format(column)) - self.add_step('exec', 'transform(ReplaceEmptyStringTransform({}, {}))'.format( - _dq(column), _dq(value))) - if not self.inplace: - return self - - def replace_string(self, column, *args): - if self.final_schema.columns[column][0] != 'string': - raise Exception( - 'Can not apply replace_string transform to column {} because it is not a string column'.format(column)) - if len(args) == 1: - args = args[0] - assert type( - args) is dict, 'Invalid argument. Possible signatures are replace(str, str, str) and replace(str, dict)' - elif len(args) == 2: - assert type(args[0]) == str and type( - args[1]) == str, 'Invalid argument. Possible signatures are replace(str, str, str) and replace(str, dict)' - args = {args[0]: args[1]} - else: - raise Exception( - 'Invalid argument. Possible signatures are replace(str, str, str) and replace(str, dict)') - self.add_step('exec', 'transform(ReplaceStringTransform({}, _dict_to_jmap({}, JMap)))'.format( - _dq(column), str(args))) - if not self.inplace: - return self - - def map_string(self, column, mapping): - if self.final_schema.columns[column][0] != 'string': - raise Exception( - 'Can not apply map_string transform to column {} because it is not a string column'.format(column)) - self.add_step('exec', 'transform(StringMapTransform({}, _dict_to_jmap({}, JMap)))'.format( - _dq(column), str(mapping))) - if not self.inplace: - return self - - def one_hot(self, column): - if self.final_schema.columns[column][0] != 'categorical': - raise Exception( - 'Can not apply one_hot transform to column {} because it is not a categorical column'.format(column)) - categories = self.final_schema.columns[column][2:] - new_col_names = [column + '[{}]'.format(cat) for cat in categories] - new_schema = OrderedDict() - for k in self.final_schema.columns: - if k == column: - for c in new_col_names: - new_schema[c] = ['integer'] - else: - new_schema[k] = self.final_schema.columns[k] - 
self.final_schema.columns = new_schema - self.add_step('categoricalToOneHot', column) - if not self.inplace: - return self - - def reduce(self, key, *args, **kwargs): - # possible signatures: - # tp.reduce(column_name, default_reduction) # example: tp.reduce('person', 'sum') # sums all columns - # tp.reduce(column, {'amount' : 'sum', 'hours' : 'mean'}) # Explicit reduction for each column - # tp.reduce(column, 'sum', {'hours' : 'mean'}) # Explicit reduction for some columns, default reduction for others - # tp.reduce(column, 'sum', 'hours'='mean') # kwargs instead of dict - if type(key) is str: - key = [key] - else: - key = list(key) - non_key_columns = [ - x for x in self.final_schema.columns if x not in key] - col_2_reduction = {} - if args: - if type(args[0]) is dict: - default = None - col_2_reduction = args[0] - else: - default = args[0] - if len(args) > 1: - assert type(args[1]) == dict, 'Expected dict' - col_2_reduction = args[1] - else: - col_2_reduction = kwargs - else: - default = None - col_2_reduction = kwargs - reductions = ['min', 'max', 'sum', 'prod', 'mean', 'std', 'uncorrected_std', - 'var', 'pop_var', 'count', 'range', 'count_unique', 'first', 'last', - 'append', 'prepend'] - if default is None: - for k in non_key_columns: - assert k in col_2_reduction, "Reduction not specified for column {}.".format( - k) - else: - assert default in reductions, "Invalid default reduction {}. Valid reductions are {}.".format( - default, reductions) - for k, v in col_2_reduction.items(): - assert v in reductions, "Invalid reduction {} specified for column {}. 
Valid reductions are {}.".format( - v, k, reductions) - reduction_to_function = {'std': 'stdevColumns', 'uncorrected_std': 'uncorrectedStdevColumns', 'var': 'variance', - 'pop_var': 'populationVariance', 'first': 'takeFirstColumns', 'last': 'takeLastColumns', 'max': 'maxColumn'} - if default is None: - default = col_2_reduction[list(col_2_reduction.keys())[0]] - reduction_to_op = {'std': 'Stdev', 'uncorrected_std': 'UncorrectedStdDev', 'var': 'Variance', 'pop_var': 'PopulationVariance', - 'first': 'TakeFirst', 'last': 'TakeLast'} - default_op = reduction_to_op.get(default, _to_camel(default, True)) - col_2_function = {} - for k, v in col_2_reduction.items(): - f = reduction_to_function.get(v, _to_camel(v) + 'Columns') - col_2_function[k] = f - code = 'reduce(ReducerBuilder(ReduceOp.{}).keyColumns({})'.format( - default_op, ','.join([_dq(k) for k in key])) - for c, f in col_2_function.items(): - code += ".{}({})".format(f, _dq(c)) - code += '.build())' - self.add_step('exec', code) - reduction_to_type = {} - for r in ['mean', 'std', 'var', 'pop_var', 'uncorrected_std']: - reduction_to_type[r] = 'double' - for r in ['append', 'prepend']: - reduction_to_type[r] = 'string' - for r in ['count', 'count_unique']: - reduction_to_type[r] = 'long' - new_schema = OrderedDict() - for k, v in self.final_schema.columns.items(): - if k in key: - new_schema[k] = v - else: - reduction = col_2_reduction.get(k, default) - old_type = v[0] - op = reduction_to_op.get(reduction, _to_camel(reduction, True)) - new_name = op.lower() + '(' + k + ')' - new_type = reduction_to_type.get(reduction, old_type) - new_schema[k] = [new_type, new_name] - self.final_schema.columns = new_schema - if not self.inplace: - return self - - def serialize(self): - config = {'steps': self.steps, 'schema': self.schema.serialize()} - return config - - @classmethod - def deserialize(cls, config): - schema = Schema.deserialize(config['schema']) - tp = cls(schema) - tp.steps = config['steps'][:] - return tp - - # TODO 
from_java is used in konduit a lot - def to_java(self): - from .java_classes import TransformProcessBuilder - from .java_classes import ConditionOp - from .java_classes import ConditionFilter - from .java_classes import BooleanColumnCondition - from .java_classes import CategoricalColumnCondition - from .java_classes import DoubleColumnCondition - #from .java_classes import FloatColumnCondition - from .java_classes import StringColumnCondition - from .java_classes import DateTimeZone - from .java_classes import DeriveColumnsFromTimeTransformBuilder - from .java_classes import Arrays, HashSet - from .java_classes import BooleanWritable - from .java_classes import IntegerWritable - from .java_classes import LongWritable - from .java_classes import FloatWritable - from .java_classes import DoubleWritable - from .java_classes import DateTimeFieldType - from .java_classes import ChangeCaseStringTransform - from .java_classes import ChangeCaseStringTransformCaseType - from .java_classes import ConcatenateStringColumns - from .java_classes import RemoveWhiteSpaceTransform - from .java_classes import ReplaceEmptyStringTransform - from .java_classes import ReplaceStringTransform - from .java_classes import StringMapTransform - from .java_classes import JMap - from .java_classes import Arrays - from .java_classes import ReducerBuilder - from .java_classes import ReduceOp - from .java_classes import JString - - jschema = self.schema.to_java() - builder = TransformProcessBuilder(jschema) - for step in self.steps: - if step[0] == "exec": - code = step[1] - logging.info(code) - exec("builder." + code) - else: - f = getattr(builder, step[0]) - f(*step[1:]) - return builder.build() - - def __call__(self, csv, executor='spark'): - try: - executor = self.executors[executor] - except: - if executor == 'spark': - from .java_classes import spark_available - if not spark_available: - warnings.warn( - 'Spark not available. 
Running local executor instead.') - from .executors import LocalExecutor - executor = LocalExecutor() - self.executors['local'] = executor - self.executors['spark'] = executor - else: - from .executors import SparkExecutor - executor = SparkExecutor() - self.executors['spark'] = executor - if executor == 'local': - from .executors import LocalExecutor - executor = LocalExecutor() - self.executors['local'] = executor - return executor(self, csv) diff --git a/contrib/attic/pydatavec/pydatavec/utils.py b/contrib/attic/pydatavec/pydatavec/utils.py deleted file mode 100644 index 1a2d0413b..000000000 --- a/contrib/attic/pydatavec/pydatavec/utils.py +++ /dev/null @@ -1,194 +0,0 @@ -# /* ****************************************************************************** -# * -# * -# * This program and the accompanying materials are made available under the -# * terms of the Apache License, Version 2.0 which is available at -# * https://www.apache.org/licenses/LICENSE-2.0. -# * -# * See the NOTICE file distributed with this work for additional -# * information regarding copyright ownership. -# * Unless required by applicable law or agreed to in writing, software -# * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT -# * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the -# * License for the specific language governing permissions and limitations -# * under the License. -# * -# * SPDX-License-Identifier: Apache-2.0 -# ******************************************************************************/ - -################################################################################ -# -# -# -################################################################################ - - -import os -import requests -import sys -import time -import math -import logging -import warnings - -def _mean(x): - s = float(sum(x)) - s /= len(x) - return s - - -class ProgressBar(object): - """Displays a progress bar. 
- - # Arguments - target: Total number of steps expected, None if unknown. - interval: Minimum visual progress update interval (in seconds). - """ - - def __init__(self, target, width=30, verbose=1, interval=0.05): - self.width = width - if target is None: - target = -1 - self.target = target - self.sum_values = {} - self.unique_values = [] - self.start = time.time() - self.last_update = 0 - self.interval = interval - self.total_width = 0 - self.seen_so_far = 0 - self.verbose = verbose - - def set_value(self, current, values=None, force=False): - values = values or [] - for k, v in values: - if k not in self.sum_values: - self.sum_values[k] = [v * (current - self.seen_so_far), - current - self.seen_so_far] - self.unique_values.append(k) - else: - self.sum_values[k][0] += v * (current - self.seen_so_far) - self.sum_values[k][1] += (current - self.seen_so_far) - self.seen_so_far = current - - now = time.time() - if self.verbose == 1: - if not force and (now - self.last_update) < self.interval: - return - - prev_total_width = self.total_width - sys.stdout.write('\b' * prev_total_width) - sys.stdout.write('\r') - - if self.target != -1: - numdigits = int(math.floor(math.log(self.target, 10))) + 1 - barstr = '%%%dd/%%%dd [' % (numdigits, numdigits) - bar = barstr % (current, self.target) - prog = float(current) / self.target - prog_width = int(self.width * prog) - if prog_width > 0: - bar += ('=' * (prog_width - 1)) - if current < self.target: - bar += '>' - else: - bar += '=' - bar += ('.'
* (self.width - prog_width)) - bar += ']' - sys.stdout.write(bar) - self.total_width = len(bar) - - if current: - time_per_unit = (now - self.start) / current - else: - time_per_unit = 0 - eta = time_per_unit * (self.target - current) - perc = float(current) * 100 / self.target - info = '' - if current < self.target and self.target != -1: - info += ' - %f%%' % perc - info += ' - ETA: %ds' % eta - else: - info += ' - %ds' % (now - self.start) - for k in self.unique_values: - info += ' - %s:' % k - if isinstance(self.sum_values[k], list): - # Running average = accumulated total / count. - avg = self.sum_values[k][0] / max(1, self.sum_values[k][1]) - if abs(avg) > 1e-3: - info += ' %.4f' % avg - else: - info += ' %.4e' % avg - else: - info += ' %s' % self.sum_values[k] - - self.total_width += len(info) - if prev_total_width > self.total_width: - info += ((prev_total_width - self.total_width) * ' ') - - sys.stdout.write(info) - sys.stdout.flush() - - if current >= self.target: - sys.stdout.write('\n') - - if self.verbose == 2: - if current >= self.target: - info = '%ds' % (now - self.start) - for k in self.unique_values: - info += ' - %s:' % k - avg = self.sum_values[k][0] / max(1, self.sum_values[k][1]) - if abs(avg) > 1e-3: - info += ' %.4f' % avg - else: - info += ' %.4e' % avg - sys.stdout.write(info + "\n") - - self.last_update = now - - def update(self, n=1, values=None): - self.set_value(self.seen_so_far + n, values) - - -def download_file(url, file_name): - r = requests.get(url, stream=True) - file_size = int(r.headers['Content-length']) - file_exists = False - if os.path.isfile(file_name): - local_file_size = os.path.getsize(file_name) - if local_file_size == file_size: - file_exists = True - else: - warnings.warn("File corrupt.
Downloading again.") - os.remove(file_name) - if not file_exists: - factor = int(math.floor(math.log(file_size)/math.log(1024))) - display_file_size = str(file_size / 1024 ** factor) + \ - ['B', 'KB', 'MB', 'GB', 'TB', 'PB'][factor] - logging.info("Source: " + url) - logging.info("Destination " + file_name) - logging.info("Size: " + display_file_size) - file_size_dl = 0 - block_sz = 8192 - f = open(file_name, 'wb') - pbar = ProgressBar(file_size) - for chunk in r.iter_content(chunk_size=block_sz): - if not chunk: - continue - chunk_size = len(chunk) - file_size_dl += chunk_size - f.write(chunk) - pbar.update(chunk_size) - #status = r"%10d [%3.2f%%]" % (file_size_dl, file_size_dl * 100. / file_size) - #status = status + chr(8)*(len(status)+1) - # print(status) - f.close() - else: - logging.info("File already exists - " + file_name) - return True diff --git a/contrib/attic/pydatavec/pytest.ini b/contrib/attic/pydatavec/pytest.ini deleted file mode 100644 index bca914c69..000000000 --- a/contrib/attic/pydatavec/pytest.ini +++ /dev/null @@ -1,10 +0,0 @@ -[pytest] - -norecursedirs= build - -# PEP-8 The following are ignored: -# E501 line too long (82 > 79 characters) -# W503 line break occurred before a binary operator - -pep8ignore=* E501 \ - * W503 \ No newline at end of file diff --git a/contrib/attic/pydatavec/release.sh b/contrib/attic/pydatavec/release.sh deleted file mode 100644 index 32f4a2be5..000000000 --- a/contrib/attic/pydatavec/release.sh +++ /dev/null @@ -1,35 +0,0 @@ -#!/bin/bash - -# -# /* ****************************************************************************** -# * -# * -# * This program and the accompanying materials are made available under the -# * terms of the Apache License, Version 2.0 which is available at -# * https://www.apache.org/licenses/LICENSE-2.0. -# * -# * See the NOTICE file distributed with this work for additional -# * information regarding copyright ownership. 
-# * Unless required by applicable law or agreed to in writing, software -# * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT -# * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the -# * License for the specific language governing permissions and limitations -# * under the License. -# * -# * SPDX-License-Identifier: Apache-2.0 -# ******************************************************************************/ -# - - - - - -# remove old wheels -sudo rm -rf dist/* - -# Build Python 2 & 3 wheels for current version -sudo python2 setup.py sdist bdist_wheel -sudo python3 setup.py sdist bdist_wheel - -# Upload to PyPI with twine. Needs full "skymind" credentials in ~/.pypirc -twine upload dist/* \ No newline at end of file diff --git a/contrib/attic/pydatavec/setup.py b/contrib/attic/pydatavec/setup.py deleted file mode 100644 index a4358c53e..000000000 --- a/contrib/attic/pydatavec/setup.py +++ /dev/null @@ -1,58 +0,0 @@ -# /* ****************************************************************************** -# * -# * -# * This program and the accompanying materials are made available under the -# * terms of the Apache License, Version 2.0 which is available at -# * https://www.apache.org/licenses/LICENSE-2.0. -# * -# * See the NOTICE file distributed with this work for additional -# * information regarding copyright ownership. -# * Unless required by applicable law or agreed to in writing, software -# * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT -# * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the -# * License for the specific language governing permissions and limitations -# * under the License. 
-# * -# * SPDX-License-Identifier: Apache-2.0 -# ******************************************************************************/ - -################################################################################ -# -# -# -################################################################################ - -from setuptools import setup -from setuptools import find_packages - -setup(name='pydatavec', - version='0.1.2', - description='Python interface for DataVec', - long_description='Python interface for DataVec', - - classifiers=[ - 'Development Status :: 3 - Alpha', - 'License :: OSI Approved :: Apache Software License', - 'Operating System :: OS Independent', - 'Programming Language :: Python', - 'Programming Language :: Python :: 2', - 'Programming Language :: Python :: 3', - 'Topic :: Software Development :: Libraries' - ], - keywords='python java datavec etl deeplearning4j', - url='https://github.com/eclipse/deeplearning4j.git', - license='Apache', - setup_requires=['Cython', 'pytest-runner'], - install_requires=[ - 'Cython', - 'requests', - 'pydl4j', - 'numpy<=1.16.4', # For compatibility with python 2 - ], - extras_require={ - 'spark': ['pyspark'], - 'tests': ['pytest', - 'pytest-pep8', - 'mock'], - }, - packages=find_packages()) diff --git a/contrib/attic/pydatavec/tests/basic_example.csv b/contrib/attic/pydatavec/tests/basic_example.csv deleted file mode 100644 index 4373a1491..000000000 --- a/contrib/attic/pydatavec/tests/basic_example.csv +++ /dev/null @@ -1,7 +0,0 @@ -2016-01-01 17:00:00.000,830a7u3,u323fy8902,1,USA,100.00,Legit -2016-01-01 18:03:01.256,830a7u3,9732498oeu,3,FR,73.20,Legit -2016-01-03 02:53:32.231,78ueoau32,w234e989,1,USA,1621.00,Fraud -2016-01-03 09:30:16.832,t842uocd,9732498oeu,4,USA,43.19,Legit -2016-01-04 23:01:52.920,t842uocd,cza8873bm,10,MX,159.65,Legit -2016-01-05 02:28:10.648,t842uocd,fgcq9803,6,CAN,26.33,Fraud -2016-01-05 10:15:36.483,rgc707ke3,tn342v7,2,USA,-0.90,Legit diff --git a/contrib/attic/pydatavec/tests/test_reduce.py 
b/contrib/attic/pydatavec/tests/test_reduce.py deleted file mode 100644 index 66e53b525..000000000 --- a/contrib/attic/pydatavec/tests/test_reduce.py +++ /dev/null @@ -1,119 +0,0 @@ -# /* ****************************************************************************** -# * -# * -# * This program and the accompanying materials are made available under the -# * terms of the Apache License, Version 2.0 which is available at -# * https://www.apache.org/licenses/LICENSE-2.0. -# * -# * See the NOTICE file distributed with this work for additional -# * information regarding copyright ownership. -# * Unless required by applicable law or agreed to in writing, software -# * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT -# * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the -# * License for the specific language governing permissions and limitations -# * under the License. -# * -# * SPDX-License-Identifier: Apache-2.0 -# ******************************************************************************/ - -################################################################################ -# -# -# -################################################################################ - -import pytest -from pydatavec import Schema, TransformProcess - - -def test_reduce_1(): - reductions = ['sum', 'mean', 'std', 'var', 'prod'] - for red in reductions: - schema = Schema() - schema.add_string_column('name') - schema.add_double_column('amount') - schema.add_integer_column('hours') - - tp = TransformProcess(schema) - tp.reduce('name', red) - - tp.to_java() - - -def test_reduce_2(): - reductions = ['sum', 'mean', 'std', 'var', 'prod'] - for red1 in reductions: - for red2 in reductions: - schema = Schema() - schema.add_string_column('name') - schema.add_double_column('amount') - schema.add_integer_column('hours') - - tp = TransformProcess(schema) - tp.reduce('name', red1, {'amount': red2}) - - tp.to_java() - - -def test_reduce_3(): - reductions = 
['sum', 'mean', 'std', 'var', 'prod'] - for red1 in reductions: - for red2 in reductions: - schema = Schema() - schema.add_string_column('name') - schema.add_double_column('amount') - schema.add_integer_column('hours') - - tp = TransformProcess(schema) - tp.reduce('name', {'amount': red1, 'hours': red2}) - - tp.to_java() - - -def test_reduce_4(): - reductions = ['first', 'last', 'append', - 'prepend', 'count', 'count_unique'] - for red in reductions: - schema = Schema() - schema.add_string_column('col1') - schema.add_string_column('col2') - - tp = TransformProcess(schema) - tp.reduce('col1', red) - - tp.to_java() - - -def test_reduce_5(): - reductions = ['first', 'last', 'append', - 'prepend', 'count', 'count_unique'] - for red1 in reductions: - for red2 in reductions: - schema = Schema() - schema.add_string_column('col1') - schema.add_string_column('col2') - schema.add_string_column('col3') - - tp = TransformProcess(schema) - tp.reduce('col1', red1, {'col3': red2}) - tp.to_java() - - -def test_reduce_6(): - reductions = ['first', 'last', 'append', - 'prepend', 'count', 'count_unique'] - for red1 in reductions: - for red2 in reductions: - schema = Schema() - schema.add_string_column('col1') - schema.add_string_column('col2') - schema.add_string_column('col3') - - tp = TransformProcess(schema) - tp.reduce('col1', {'col2': red1, 'col3': red2}) - - tp.to_java() - - -if __name__ == '__main__': - pytest.main([__file__]) diff --git a/contrib/attic/pydatavec/tests/test_schema.py b/contrib/attic/pydatavec/tests/test_schema.py deleted file mode 100644 index bcc9908c1..000000000 --- a/contrib/attic/pydatavec/tests/test_schema.py +++ /dev/null @@ -1,45 +0,0 @@ -# /* ****************************************************************************** -# * -# * -# * This program and the accompanying materials are made available under the -# * terms of the Apache License, Version 2.0 which is available at -# * https://www.apache.org/licenses/LICENSE-2.0. 
-# * -# * See the NOTICE file distributed with this work for additional -# * information regarding copyright ownership. -# * Unless required by applicable law or agreed to in writing, software -# * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT -# * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the -# * License for the specific language governing permissions and limitations -# * under the License. -# * -# * SPDX-License-Identifier: Apache-2.0 -# ******************************************************************************/ - -################################################################################ -# -# -# -################################################################################ - -import pytest -from pydatavec import Schema - - -def test_schema(): - schema = Schema() - schema.add_string_column('str1') - schema.add_string_column('str2') - schema.add_integer_column('int1') - schema.add_integer_column('int2') - schema.add_double_column('dbl1') - schema.add_double_column('dbl2') - schema.add_float_column('flt1') - schema.add_float_column('flt2') - schema.add_categorical_column('cat1', ['A', 'B', 'C']) - schema.add_categorical_column('cat2', ['A', 'B', 'C']) - schema.to_java() - - -if __name__ == '__main__': - pytest.main([__file__]) diff --git a/contrib/attic/pydatavec/tests/test_transform_process.py b/contrib/attic/pydatavec/tests/test_transform_process.py deleted file mode 100644 index 76081b01d..000000000 --- a/contrib/attic/pydatavec/tests/test_transform_process.py +++ /dev/null @@ -1,174 +0,0 @@ -# /* ****************************************************************************** -# * -# * -# * This program and the accompanying materials are made available under the -# * terms of the Apache License, Version 2.0 which is available at -# * https://www.apache.org/licenses/LICENSE-2.0. 
-# * -# * See the NOTICE file distributed with this work for additional -# * information regarding copyright ownership. -# * Unless required by applicable law or agreed to in writing, software -# * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT -# * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the -# * License for the specific language governing permissions and limitations -# * under the License. -# * -# * SPDX-License-Identifier: Apache-2.0 -# ******************************************************************************/ - -################################################################################ -# -# -# -################################################################################ - -import pytest -from pydatavec import Schema, TransformProcess - - -def test_rename(): - schema = Schema() - schema.add_string_column('str1') - - tp = TransformProcess(schema) - tp.rename_column('str1', 'str2') - - assert 'str1' not in tp.final_schema.columns - assert 'str2' in tp.final_schema.columns - - tp.to_java() - - -def test_remove(): - schema = Schema() - schema.add_string_column('str1') - schema.add_string_column('str2') - - tp = TransformProcess(schema) - tp.remove_column('str1') - - assert list(tp.final_schema.columns.keys()) == ['str2'] - - tp.to_java() - - -def test_remove_except(): - schema = Schema() - schema.add_string_column('str1') - schema.add_string_column('str2') - schema.add_string_column('str3') - - tp = TransformProcess(schema) - tp.remove_columns_except('str2') - - assert list(tp.final_schema.columns.keys()) == ['str2'] - - tp.to_java() - - -def test_str_to_time(): - schema = Schema() - schema.add_string_column('str1') - schema.add_string_column('str2') - - tp = TransformProcess(schema) - - tp.string_to_time('str1') - - assert tp.final_schema.get_column_type('str1') == 'DateTime' - - tp.to_java() - - -def test_derive_col_from_time(): - schema = Schema() - schema.add_string_column('str1') - 
schema.add_string_column('str2') - - tp = TransformProcess(schema) - - tp.string_to_time('str1') - tp.derive_column_from_time('str1', 'hour', 'hour_of_day') - - assert 'hour' in tp.final_schema.columns - - tp.to_java() - - -def test_cat_to_int(): - schema = Schema() - schema.add_categorical_column('cat', ['A', 'B', 'C']) - - tp = TransformProcess(schema) - tp.categorical_to_integer('cat') - - assert tp.final_schema.get_column_type('cat') == 'integer' - - tp.to_java() - - -def test_append_string(): - schema = Schema() - schema.add_string_column('str1') - - tp = TransformProcess(schema) - tp.append_string('str1', 'xxx') - - tp.to_java() - - -def test_lower(): - schema = Schema() - schema.add_string_column('str1') - - tp = TransformProcess(schema) - tp.lower('str1') - - tp.to_java() - - -def test_upper(): - schema = Schema() - schema.add_string_column('str1') - - tp = TransformProcess(schema) - tp.upper('str1') - - tp.to_java() - - -def test_concat(): - schema = Schema() - schema.add_string_column('str1') - schema.add_string_column('str2') - - tp = TransformProcess(schema) - tp.concat(['str1', 'str2'], 'str3') - - assert 'str3' in tp.final_schema.columns - - tp.to_java() - - -def test_remove_white_spaces(): - schema = Schema() - schema.add_string_column('str1') - - tp = TransformProcess(schema) - tp.remove_white_spaces('str1') - - tp.to_java() - - -def test_replace_empty(): - schema = Schema() - schema.add_string_column('str1') - - tp = TransformProcess(schema) - tp.replace_empty_string('str1', 'xx') - - tp.to_java() - - -if __name__ == '__main__': - pytest.main([__file__]) diff --git a/contrib/attic/pydl4j/.gitignore b/contrib/attic/pydl4j/.gitignore deleted file mode 100644 index f173b9aba..000000000 --- a/contrib/attic/pydl4j/.gitignore +++ /dev/null @@ -1,107 +0,0 @@ -# Byte-compiled / optimized / DLL files -__pycache__/ -*.py[cod] -*$py.class - -# C extensions -*.so - -# Distribution / packaging -.Python -build/ -develop-eggs/ -dist/ -downloads/ -eggs/ -.eggs/ 
-lib/ -lib64/ -parts/ -sdist/ -var/ -wheels/ -*.egg-info/ -.installed.cfg -*.egg -MANIFEST - -# PyInstaller -# Usually these files are written by a python script from a template -# before PyInstaller builds the exe, so as to inject date/other infos into it. -*.manifest -*.spec - -# Installer logs -pip-log.txt -pip-delete-this-directory.txt - -# Unit test / coverage reports -htmlcov/ -.tox/ -.coverage -.coverage.* -.cache -nosetests.xml -coverage.xml -*.cover -.hypothesis/ -.pytest_cache/ - -# Translations -*.mo -*.pot - -# Django stuff: -*.log -local_settings.py -db.sqlite3 - -# Flask stuff: -instance/ -.webassets-cache - -# Scrapy stuff: -.scrapy - -# Sphinx documentation -docs/_build/ - -# PyBuilder -target/ - -# Jupyter Notebook -.ipynb_checkpoints - -# pyenv -.python-version - -# celery beat schedule file -celerybeat-schedule - -# SageMath parsed files -*.sage.py - -# Environments -.env -.venv -env/ -venv/ -ENV/ -env.bak/ -venv.bak/ - -# Spyder project settings -.spyderproject -.spyproject - -# Rope project settings -.ropeproject - -# mkdocs documentation -/site - -# mypy -.mypy_cache/ - -# Intellij -pydl4j.iml \ No newline at end of file diff --git a/contrib/attic/pydl4j/README.md b/contrib/attic/pydl4j/README.md deleted file mode 100644 index 2211b019f..000000000 --- a/contrib/attic/pydl4j/README.md +++ /dev/null @@ -1,188 +0,0 @@ -# PyDL4J - Java dependency management for Python applications - -[![Join the chat at https://gitter.im/deeplearning4j/deeplearning4j](https://badges.gitter.im/Join%20Chat.svg)](https://gitter.im/deeplearning4j/deeplearning4j?utm_source=badge&utm_medium=badge&utm_campaign=pr-badge&utm_content=badge) -[![PyPI version](https://badge.fury.io/py/pydl4j.svg)](https://badge.fury.io/py/pydl4j) - -PyDL4J is a lightweight package manager for the DL4J ecosystem which allows you to focus -on building Python applications on top of `pyjnius` without worrying about the details. 
You -can use PyDL4J for the following tasks: - -- Automatically manage JARs for your Python projects, such as `jumpy` or `pydatavec`. -- Configure your Python DL4J environment through the PyDL4J command line interface. -- Use PyDL4J as a replacement for Maven for basic tasks from Python. - ---------- - - -# Installation - -PyDL4J is on PyPI, so you can install it with `pip`: - -```bash -pip install pydl4j -``` - -Alternatively, you can build the project locally as follows: - -```bash -git clone https://www.github.com/eclipse/deeplearning4j.git -cd deeplearning4j/pydl4j -python setup.py install -``` - -As a regular user, this will likely be enough for your needs. In fact, most of the time you -will not interact with PyDL4J directly at all. All other Python projects maintained by -Skymind use PyDL4J under the hood and will install this dependency for you. - -# PyDL4J command line interface (CLI) - -Installing PyDL4J exposes a command line tool called `pydl4j`. You can use this tool to configure -your PyDL4J environment. If you don't use the CLI, a default configuration will be used instead. - -**Note:** If you intend to use the CLI, make sure to have [`docker` installed](https://docs.docker.com/install/) -on your machine. - -To initialize a new PyDL4J configuration, type - -```bash -pydl4j init - - -██████╗ ██╗ ██╗██████╗ ██╗██╗ ██╗ ██╗ -██╔══██╗╚██╗ ██╔╝██╔══██╗██║██║ ██║ ██║ -██████╔╝ ╚████╔╝ ██║ ██║██║███████║ ██║ -██╔═══╝ ╚██╔╝ ██║ ██║██║╚════██║██ ██║ -██║ ██║ ██████╔╝███████╗██║╚█████╔╝ -╚═╝ ╚═╝ ╚═════╝ ╚══════╝╚═╝ ╚════╝ - -pydl4j is a system to manage your DL4J dependencies from Python! - -Which DL4J version do you want to use for your Python projects? (default '1.0.0-beta2'): -``` - -Follow the instructions provided by the CLI. At the end of this process you'll see a -JSON object carrying your configuration.
- -```bash -This is your current settings file config.json: - -{ - "dl4j_core": true, - "nd4j_backend": "cpu", - "spark_version": "2", - "datavec": false, - "spark": true, - "scala_version": "2.11", - "dl4j_version": "1.0.0-beta2" -} - -Does this look good? (default 'y') [y/n]: - -``` - -If not configured otherwise, this configuration file will be stored at `~/.deeplearning4j/pydl4j/config.json`. This -configuration file is a lightweight alternative that spares Python users the cognitive load of the -Project Object Model (POM) widely used in Java. PyDL4J will translate your configuration into the right format -internally to provide you with the tools you need. - -Finally, to install the Java dependencies configured in your `config.json`, use the following command: - -```bash -pydl4j install -``` - -This tool will install all necessary JARs into `~/.deeplearning4j/pydl4j` for you by running `mvn` in a -Docker container, and set your classpath so that your `pyjnius` Python applications can access them.
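Since the settings file is plain JSON, it can also be generated or inspected without the CLI. The sketch below round-trips a sample configuration with the standard library; it writes to a temporary path rather than the default `~/.deeplearning4j/pydl4j/config.json`, and the sample values are just the defaults shown earlier:

```python
import json
import os
import tempfile

# Sample values matching the config.json example above (assumed defaults).
config = {
    "dl4j_core": True,
    "nd4j_backend": "cpu",
    "spark_version": "2",
    "datavec": False,
    "spark": True,
    "scala_version": "2.11",
    "dl4j_version": "1.0.0-beta2",
}

# The CLI would store this under ~/.deeplearning4j/pydl4j/;
# a temporary directory stands in for that location here.
path = os.path.join(tempfile.mkdtemp(), "config.json")
with open(path, "w") as f:
    json.dump(config, f, indent=2)

# Reading it back yields the same dict the CLI confirmation step displays.
with open(path) as f:
    loaded = json.load(f)

print(loaded["dl4j_version"])  # -> 1.0.0-beta2
```

Editing the file by hand and re-running `pydl4j install` should have the same effect as answering the `pydl4j init` prompts again.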
- -# PyDL4J API - -# Example - -```python -import pydl4j -import jnius_config -from pydl4j import mvn - -pydl4j.set_context('my_python_app_name') - -# Fetch latest version of datavec.datavec-api from Maven central -pydl4j.mvn_install(group='datavec', artifact='datavec-api') - -# Or fetch a specific version: -pydl4j.mvn_install(group='datavec', artifact='datavec-api', - version='1.0.0-beta') - -jnius_config.set_classpath(pydl4j.get_dir()) -``` - -# List all artifacts in a group - -```python -mvn.get_artifacts(group_id) -``` - -# Example - -```python -mvn.get_artifacts('datavec') -``` - -```bash -['datavec-api', 'datavec-arrow', 'datavec-camel', 'datavec-cli', 'datavec-data', 'datavec-data-audio', 'datavec-data-codec', - 'datavec-data-image', 'datavec-data-nlp', 'datavec-dataframe', 'datavec-excel', 'datavec-geo', 'datavec-hadoop', 'datavec-jdbc', - 'datavec-local', 'datavec-nd4j-common', 'datavec-parent', 'datavec-perf', 'datavec-spark-inference-client', - 'datavec-spark-inference-model', 'datavec-spark-inference-parent', 'datavec-spark-inference-server_2.10', - 'datavec-spark-inference-server_2.11', 'datavec-spark_2.10', 'datavec-spark_2.11'] -``` - -# List all versions of an artifact - -```python -mvn.get_versions(group_id, artifact_id) -``` - -# Example - -```python -mvn.get_versions('datavec', 'datavec-api') -``` - -```bash -['0.4.0', '0.5.0', '0.6.0', '0.7.0', '0.7.1', '0.7.2', '0.8.0', - '0.9.0', '0.9.1', '1.0.0-alpha', '1.0.0-beta', '1.0.0-beta2'] -``` - -# Get latest version of an artifact - -```python -mvn.get_latest_version(group_id, artifact_id) -``` - -# Example - -```python -mvn.get_latest_version('datavec', 'datavec-api') -``` - -```bash -'1.0.0-beta2' -``` - -# List all installed jars - -```python -pydl4j.get_jars() -``` - -# Uninstall a jar - -```python -# Find jar name from pydl4j.get_jars() -pydl4j.uninstall(jar_name) -``` - -# Uninstall all jars - -```python -pydl4j.clear_context() -``` diff --git a/contrib/attic/pydl4j/pom.xml
b/contrib/attic/pydl4j/pom.xml deleted file mode 100644 index 645629988..000000000 --- a/contrib/attic/pydl4j/pom.xml +++ /dev/null @@ -1,188 +0,0 @@ - - - - - - 4.0.0 - - - org.deeplearning4j - deeplearning4j - 1.0.0-SNAPSHOT - - - pydl4j - - pydl4j - - - false - 0.1.3 - nd4j-native - - - - - org.nd4j - ${nd4j.backend} - ${dl4j.version} - - - - - - - org.apache.maven.plugins - maven-shade-plugin - ${maven-shade-plugin.version} - - - package - - shade - - - - - org.deeplearning4j.example.App - - - - - - - - org.apache.maven.plugins - maven-compiler-plugin - - - org.apache.maven.plugins - maven-jar-plugin - ${maven-jar-plugin.version} - - true - - - - empty-javadoc-jar - package - - jar - - - javadoc - ${basedir}/javadoc - - - - empty-sources-jar - package - - jar - - - sources - ${basedir}/src - - - - - - org.codehaus.mojo - exec-maven-plugin - ${exec-maven-plugin.version} - - - python-install-cython - install - - exec - - - pip - ${basedir} - - install - --user - Cython - --install-option=--no-cython-compile - - - - - python-build - install - - exec - - - pip - ${basedir} - - install - --user - -e - .[tests] - - - - - python-test - test - - exec - - - python - ${basedir} - ${pydl4j.test.skip} - - -m - pytest - --pep8 - -m - pep8 - tests/ - - - - - - - - - - - nd4j-backend - - - libnd4j.cuda - - - - nd4j-cuda-11.0 - - - - diff --git a/contrib/attic/pydl4j/pydl4j/__init__.py b/contrib/attic/pydl4j/pydl4j/__init__.py deleted file mode 100644 index 344600568..000000000 --- a/contrib/attic/pydl4j/pydl4j/__init__.py +++ /dev/null @@ -1,27 +0,0 @@ -# /* ****************************************************************************** -# * -# * -# * This program and the accompanying materials are made available under the -# * terms of the Apache License, Version 2.0 which is available at -# * https://www.apache.org/licenses/LICENSE-2.0. -# * -# * See the NOTICE file distributed with this work for additional -# * information regarding copyright ownership. 
-# * Unless required by applicable law or agreed to in writing, software -# * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT -# * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the -# * License for the specific language governing permissions and limitations -# * under the License. -# * -# * SPDX-License-Identifier: Apache-2.0 -# ******************************************************************************/ - -################################################################################ -# -# -# -################################################################################ - -from .pydl4j import * -from .jarmgr import * -from .mvn import * diff --git a/contrib/attic/pydl4j/pydl4j/cli.py b/contrib/attic/pydl4j/pydl4j/cli.py deleted file mode 100644 index ce1423134..000000000 --- a/contrib/attic/pydl4j/pydl4j/cli.py +++ /dev/null @@ -1,216 +0,0 @@ -#!/usr/bin python - -# -*- coding: utf-8 -*- - -# /* ****************************************************************************** -# * -# * -# * This program and the accompanying materials are made available under the -# * terms of the Apache License, Version 2.0 which is available at -# * https://www.apache.org/licenses/LICENSE-2.0. -# * -# * See the NOTICE file distributed with this work for additional -# * information regarding copyright ownership. -# * Unless required by applicable law or agreed to in writing, software -# * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT -# * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the -# * License for the specific language governing permissions and limitations -# * under the License. 
-# * -# * SPDX-License-Identifier: Apache-2.0 -# ******************************************************************************/ - -################################################################################ -# -# -# -################################################################################ - -import argparse -import json -import os -import sys -import pkg_resources -import argcomplete -import traceback -import subprocess -import click -from click.exceptions import ClickException -from dateutil import parser - -from .pydl4j import set_config, get_config -from .pydl4j import validate_config, is_docker_available -from .pydl4j import _maven_build - - -if sys.version_info[0] == 2: - input = raw_input - - -_CONFIG = get_config() - -DEFAULT_DL4J_VERSION = _CONFIG['dl4j_version'] -DEFAULT_BACKEND = _CONFIG['nd4j_backend'] -DEFAULT_DATAVEC = _CONFIG['datavec'] -DEFAULT_SPARK = _CONFIG['spark'] -DEFAULT_SPARK_MAJOR = _CONFIG['spark_version'] -DEFAULT_SCALA_VERSION = _CONFIG['scala_version'] -DEFAULT_SPARK_DETAILS = 'y' - - -def to_bool(string): - if type(string) is bool: - return string - return True if string[0] in ["Y", "y"] else False - - -class CLI(object): - - def __init__(self): - self.var_args = None - self.command = None - - def command_dispatcher(self, args=None): - desc = ('pydl4j, a system to manage your DL4J dependencies from Python.\n') - parser = argparse.ArgumentParser(description=desc) - parser.add_argument( - '-v', '--version', action='version', - version=pkg_resources.get_distribution("pydl4j").version, - help='Print pydl4j version' - ) - - subparsers = parser.add_subparsers(title='subcommands', dest='command') - subparsers.add_parser('init', help='Initialize pydl4j') - subparsers.add_parser('install', help='Install jars for pydl4j') - - argcomplete.autocomplete(parser) - args = parser.parse_args(args) - self.var_args = vars(args) - - if not args.command: - parser.print_help() - return - - self.command = args.command - - if self.command == 
'init': - self.init() - return - - if self.command == 'install': - self.install() - return - - def init(self): - - click.echo(click.style(u"""\n██████╗ ██╗ ██╗██████╗ ██╗██╗ ██╗ ██╗ -██╔══██╗╚██╗ ██╔╝██╔══██╗██║██║ ██║ ██║ -██████╔╝ ╚████╔╝ ██║ ██║██║███████║ ██║ -██╔═══╝ ╚██╔╝ ██║ ██║██║╚════██║██ ██║ -██║ ██║ ██████╔╝███████╗██║╚█████╔╝ -╚═╝ ╚═╝ ╚═════╝ ╚══════╝╚═╝ ╚════╝ \n""", fg='blue', bold=True)) - - click.echo(click.style("pydl4j", bold=True) + - " is a system to manage your DL4J dependencies from Python!\n") - - # DL4J version - dl4j_version = input("Which DL4J version do you want to use for your Python projects? (default '%s'): " % - DEFAULT_DL4J_VERSION) or DEFAULT_DL4J_VERSION - # TODO: check if input is valid - - # ND4J backend - backend = input("Which backend would you like to use ('cpu' or 'gpu')? (default '%s'): " % - DEFAULT_BACKEND) or DEFAULT_BACKEND - backend = backend.lower() - - # DataVec usage - datavec = input( - "Do you need DL4J DataVec for ETL? (default 'y') [y/n]: ") or DEFAULT_DATAVEC - datavec = to_bool(datavec) - - # DL4J core usage - DEFAULT_DL4J = 'y' - dl4j_core = input( - "Do you want to work with DeepLearning4J from Python? (default 'y') [y/n]: ") or DEFAULT_DL4J - dl4j_core = to_bool(dl4j_core) - - # Spark - spark = input( - "Do you need Spark for distributed computation in your application? (default 'y') [y/n]: ") or DEFAULT_SPARK - spark = to_bool(spark) - spark_version = DEFAULT_SPARK_MAJOR - scala_version = DEFAULT_SCALA_VERSION - if spark: - spark_details = input("We use Spark {} and Scala {} by default, is this OK for you? (default 'y') [y/n]: ".format(DEFAULT_SPARK_MAJOR, - DEFAULT_SCALA_VERSION)) or DEFAULT_SPARK_DETAILS - if spark_details[0] not in ["Y", "y"]: - spark_version = input("Which major Spark release would you like to use? (default '%s'): " % - DEFAULT_SPARK_MAJOR) or DEFAULT_SPARK_MAJOR - scala_version = input("Which Scala version would you like to use?
(default '%s'): " % - DEFAULT_SCALA_VERSION) or DEFAULT_SCALA_VERSION - - cli_out = { - 'dl4j_version': dl4j_version, - 'nd4j_backend': backend, - 'dl4j_core': dl4j_core, - 'datavec': datavec, - 'spark': spark, - 'spark_version': spark_version, - 'scala_version': scala_version - } - - validate_config(cli_out) - formatted_json = json.dumps(cli_out, sort_keys=False, indent=2) - - click.echo("\nThis is your current settings file " + - click.style("config.json", bold=True) + ":\n") - click.echo(click.style(formatted_json, fg="green", bold=True)) - - confirm = input( - "\nDoes this look good? (default 'y') [y/n]: ") or 'yes' - if not to_bool(confirm): - click.echo( - "" + click.style("Please initialize pydl4j once again", fg="red", bold=True)) - return - - set_config(cli_out) - - def install(self): - if is_docker_available(): - use_docker = input( - "Docker is available on your system. Would you like to use docker for installation? (default 'y') [y/n]: ") or 'yes' - if to_bool(use_docker): - click.echo(click.style( - "Docker is running, starting installation.", fg="green", bold=True)) - click.echo(click.style("========\n\nNote that this might take some time to complete.\n" + - "We will first pull a docker container with Maven, then install all dependencies selected with 'pydl4j init'.\n" + - "After completion you can start using DL4J from Python.\n\n========", fg="green", bold=False)) - _maven_build(use_docker=True) - else: - click.echo(click.style("========\n\nNote that this might take some time to complete.\n" + - "After completion you can start using DL4J from Python.\n\n========", fg="green", bold=False)) - - _maven_build(use_docker=False) - else: - click.echo( - "" + click.style("Could not detect docker on your system.", fg="red", bold=True)) - click.echo(click.style("========\n\nNote that this might take some time to complete.\n" + - "After completion you can start using DL4J from Python.\n\n========", fg="green", bold=False)) - - _maven_build(use_docker=False) - - 
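Every y/n answer the prompts above collect is funnelled through the module's `to_bool` helper, which treats any reply beginning with 'Y' or 'y' as yes. A minimal standalone sketch of that behaviour (a hypothetical copy, not the original module's code; it adds an empty-string guard the original lacks, since the original would raise `IndexError` on an empty reply):

```python
def to_bool(answer):
    """Interpret an interactive y/n answer.

    Booleans pass through unchanged; any string starting with 'Y' or 'y'
    counts as yes; anything else, including an empty reply, counts as no.
    """
    if isinstance(answer, bool):
        return answer
    return bool(answer) and answer[0] in ("Y", "y")


print(to_bool("Yes"))  # a 'yes'-style reply -> True
print(to_bool("n"))    # anything else -> False
```

Note that this is why the prompts pair `input(...) or 'yes'` with `to_bool`: an empty reply falls back to the default string before the first-character check runs.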
-def handle(): - try: - cli = CLI() - sys.exit(cli.command_dispatcher()) - except KeyboardInterrupt: - sys.exit() - except Exception as e: - click.echo(click.style("Error: ", fg='red', bold=True)) - traceback.print_exc() - sys.exit() - - -if __name__ == '__main__': - handle() diff --git a/contrib/attic/pydl4j/pydl4j/docker.py b/contrib/attic/pydl4j/pydl4j/docker.py deleted file mode 100644 index ebb4b4965..000000000 --- a/contrib/attic/pydl4j/pydl4j/docker.py +++ /dev/null @@ -1,41 +0,0 @@ -# /* ****************************************************************************** -# * -# * -# * This program and the accompanying materials are made available under the -# * terms of the Apache License, Version 2.0 which is available at -# * https://www.apache.org/licenses/LICENSE-2.0. -# * -# * See the NOTICE file distributed with this work for additional -# * information regarding copyright ownership. -# * Unless required by applicable law or agreed to in writing, software -# * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT -# * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the -# * License for the specific language governing permissions and limitations -# * under the License. 
-# * -# * SPDX-License-Identifier: Apache-2.0 -# ******************************************************************************/ - -################################################################################ -# -# -# -################################################################################ - - -def docker_file(): - return """FROM java:openjdk-8-jdk -ENV MAVEN_VERSION 3.3.9 - -RUN curl -fsSL http://archive.apache.org/dist/maven/maven-3/$MAVEN_VERSION/binaries/apache-maven-$MAVEN_VERSION-bin.tar.gz | tar xzf - -C /usr/share \ - && mv /usr/share/apache-maven-$MAVEN_VERSION /usr/share/maven \ - && ln -s /usr/share/maven/bin/mvn /usr/bin/mvn - -ENV MAVEN_HOME /usr/share/maven - -# Copy application to container -RUN mkdir -p app -WORKDIR /app - -CMD ["mvn", "package"] -""" diff --git a/contrib/attic/pydl4j/pydl4j/downloader.py b/contrib/attic/pydl4j/pydl4j/downloader.py deleted file mode 100644 index 724a2a550..000000000 --- a/contrib/attic/pydl4j/pydl4j/downloader.py +++ /dev/null @@ -1,91 +0,0 @@ -# /* ****************************************************************************** -# * -# * -# * This program and the accompanying materials are made available under the -# * terms of the Apache License, Version 2.0 which is available at -# * https://www.apache.org/licenses/LICENSE-2.0. -# * -# * See the NOTICE file distributed with this work for additional -# * information regarding copyright ownership. -# * Unless required by applicable law or agreed to in writing, software -# * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT -# * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the -# * License for the specific language governing permissions and limitations -# * under the License. 
-# * -# * SPDX-License-Identifier: Apache-2.0 -# ******************************************************************************/ - -################################################################################ - -from .progressbar import ProgressBar -import requests -import math -import os -import hashlib - - -def download(url, file_name): - r = requests.get(url, stream=True) - file_size = int(r.headers['Content-length']) - file_exists = False - if os.path.isfile(file_name): - local_file_size = os.path.getsize(file_name) - if local_file_size == file_size: - sha1_file = file_name + '.sha1' - if os.path.isfile(sha1_file): - print('sha1 found') - with open(sha1_file) as f: - expected_sha1 = f.read() - BLOCKSIZE = 65536 - sha1 = hashlib.sha1() - # hash in binary mode; text mode would fail on arbitrary jar bytes - with open(file_name, 'rb') as f: - buff = f.read(BLOCKSIZE) - while len(buff) > 0: - sha1.update(buff) - buff = f.read(BLOCKSIZE) - # compare hex digests, not the hash object itself - if expected_sha1.strip() == sha1.hexdigest(): - file_exists = True - else: - print("File corrupt. Downloading again.") - os.remove(file_name) - else: - file_exists = True - else: - print("File corrupt. Downloading again.") - os.remove(file_name) - if not file_exists: - factor = int(math.floor(math.log(file_size) / math.log(1024))) - display_file_size = str(file_size / 1024 ** factor) + \ - ['B', 'KB', 'MB', 'GB', 'TB', 'PB'][factor] - print("Source: " + url) - print("Destination: " + file_name) - print("Size: " + display_file_size) - file_size_dl = 0 - block_sz = 8192 - f = open(file_name, 'wb') - pbar = ProgressBar(file_size) - for chunk in r.iter_content(chunk_size=block_sz): - if not chunk: - continue - chunk_size = len(chunk) - file_size_dl += chunk_size - f.write(chunk) - pbar.update(chunk_size) - # status = r"%10d [%3.2f%%]" % (file_size_dl, file_size_dl * 100. 
/ file_size) - # status = status + chr(8)*(len(status)+1) - # print(status) - f.close() - else: - print("File already exists - " + file_name) - return True diff --git a/contrib/attic/pydl4j/pydl4j/jarmgr.py b/contrib/attic/pydl4j/pydl4j/jarmgr.py deleted file mode 100644 index 213ad74e7..000000000 --- a/contrib/attic/pydl4j/pydl4j/jarmgr.py +++ /dev/null @@ -1,193 +0,0 @@ -# /* ****************************************************************************** -# * -# * -# * This program and the accompanying materials are made available under the -# * terms of the Apache License, Version 2.0 which is available at -# * https://www.apache.org/licenses/LICENSE-2.0. -# * -# * See the NOTICE file distributed with this work for additional -# * information regarding copyright ownership. -# * Unless required by applicable law or agreed to in writing, software -# * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT -# * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the -# * License for the specific language governing permissions and limitations -# * under the License. 
-# * -# * SPDX-License-Identifier: Apache-2.0 -# ******************************************************************************/ - -################################################################################ -# -# -# -################################################################################ - -from .downloader import download as download_file -from .mvn import * -import requests -import json -import os - - -def mkdir(x): - if not os.path.isdir(x): - os.mkdir(x) - - -_CONTEXT_NAME = None -_CONTEXT_DIR = None -_USER_PATH = os.path.expanduser('~') -_DL4J_DIR = os.path.join(_USER_PATH, '.deeplearning4j') -mkdir(_DL4J_DIR) -_MY_DIR = os.path.join(_DL4J_DIR, 'pydl4j') -mkdir(_MY_DIR) - -_URLS_FILE = os.path.join(_MY_DIR, 'urls.json') -if os.path.isfile(_URLS_FILE): - with open(_URLS_FILE, 'r') as f: - _URLS = json.load(f) -else: - _URLS = {} - - -def _write_urls(): - with open(_URLS_FILE, 'w') as f: - json.dump(_URLS, f) - - -_cache = {} - - -def _read(url): - text = _cache.get(url) - if text is None: - text = requests.get(url).text - if not text: - raise Exception('Empty response. Check connectivity.') - _cache[url] = text - return text - - -def _parse_contents(text): - contents = text.split('
')[1]
-    contents = contents.split('
')[0] - contents = contents.split('')[1] - contents = contents.split('')[0] - contents = contents.split(' - org.deeplearning4j - deeplearning4j-core - ${project.version} - """ - - -def spark_dependencies(): - return """ - org.deeplearning4j - dl4j-spark-parameterserver_{scala.binary.version} - {dl4j.spark.version} - """ - - -def datavec_dependencies(): - return """ - org.datavec - datavec-api - ${project.version} - - - org.datavec - datavec-local - ${project.version} - - - org.datavec - datavec-arrow - ${project.version} - - - org.datavec - datavec-camel - ${project.version} - - - org.datavec - datavec-excel - ${project.version} - - - org.datavec - datavec-geo - ${project.version} - - - org.datavec - datavec-hadoop - ${project.version} - - - org.datavec - datavec-jdbc - ${project.version} - - - org.datavec - datavec-perf - ${project.version} - - - org.datavec - datavec-spark_{scala.binary.version} - {dl4j.spark.version} - -""" - - -def pom_template(): - return """ - - - - - - 4.0.0 - org.deeplearning4j - pydl4j - {dl4j.version} - jar - - pydl4j - - - 3.0.0 - 1.5.4 - 1.5.4 - 0.3.10 - - - - - Apache License, Version 2.0 - http://www.apache.org/licenses/LICENSE-2.0.txt - repo - - - - - - org.bytedeco - openblas-platform - ${openblas.version}-${javacpp-presets.version} - - - org.nd4j - {nd4j.platform.backend} - ${project.version} - - - org.nd4j - {nd4j.backend} - ${project.version} - linux-x86_64 - - - org.nd4j - {nd4j.backend} - ${project.version} - windows-x86_64 - - {dl4j.core.dependencies} - {spark.dependencies} - {datavec.dependencies} - - ch.qos.logback - logback-classic - 1.2.3 - - - - - - snapshots-repo - https://oss.sonatype.org/content/repositories/snapshots - - false - - - true - daily - - - - - - - - org.apache.maven.plugins - maven-shade-plugin - ${maven-shade-plugin.version} - - true - bin - true - - - *:* - - org/datanucleus/** - META-INF/*.SF - META-INF/*.DSA - META-INF/*.RSA - - - - - - - package - - shade - - - - - reference.conf - - - - - - - - - - - 
org.apache.maven.plugins - maven-compiler-plugin - 3.1 - - 1.8 - 1.8 - - - - org.apache.maven.plugins - maven-jar-plugin - - true - - - - empty-javadoc-jar - package - - jar - - - false - javadoc - ${basedir}/javadoc - - - - empty-sources-jar - package - - jar - - - sources - ${basedir}/src - - - - - - - - - - """ diff --git a/contrib/attic/pydl4j/pydl4j/progressbar.py b/contrib/attic/pydl4j/pydl4j/progressbar.py deleted file mode 100644 index 53bf36807..000000000 --- a/contrib/attic/pydl4j/pydl4j/progressbar.py +++ /dev/null @@ -1,144 +0,0 @@ -# /* ****************************************************************************** -# * -# * -# * This program and the accompanying materials are made available under the -# * terms of the Apache License, Version 2.0 which is available at -# * https://www.apache.org/licenses/LICENSE-2.0. -# * -# * See the NOTICE file distributed with this work for additional -# * information regarding copyright ownership. -# * Unless required by applicable law or agreed to in writing, software -# * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT -# * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the -# * License for the specific language governing permissions and limitations -# * under the License. -# * -# * SPDX-License-Identifier: Apache-2.0 -# ******************************************************************************/ - -################################################################################ -# -# -# -################################################################################ - -import sys -import time -import math - - -def _mean(x): - return float(sum(x)) / len(x) - - -class ProgressBar(object): - """Displays a progress bar. - - # Arguments - target: Total number of steps expected, None if unknown. - interval: Minimum visual progress update interval (in seconds). 
- """ - - def __init__(self, target, width=30, verbose=1, interval=0.05): - self.width = width - if target is None: - target = -1 - self.target = target - self.sum_values = {} - self.unique_values = [] - self.start = time.time() - self.last_update = 0 - self.interval = interval - self.total_width = 0 - self.seen_so_far = 0 - self.verbose = verbose - - def set_value(self, current, values=None, force=False): - values = values or [] - for k, v in values: - if k not in self.sum_values: - self.sum_values[k] = [v * (current - self.seen_so_far), - current - self.seen_so_far] - self.unique_values.append(k) - else: - self.sum_values[k][0] += v * (current - self.seen_so_far) - self.sum_values[k][1] += (current - self.seen_so_far) - self.seen_so_far = current - - now = time.time() - if self.verbose == 1: - if not force and (now - self.last_update) < self.interval: - return - - prev_total_width = self.total_width - sys.stdout.write('\b' * prev_total_width) - sys.stdout.write('\r') - - if self.target != -1: - numdigits = int(math.floor(math.log(self.target, 10))) + 1 - barstr = '%%%dd/%%%dd [' % (numdigits, numdigits) - bar = barstr % (current, self.target) - prog = float(current) / self.target - prog_width = int(self.width * prog) - if prog_width > 0: - bar += ('=' * (prog_width - 1)) - if current < self.target: - bar += '>' - else: - bar += '=' - bar += ('.' 
* (self.width - prog_width)) - bar += ']' - sys.stdout.write(bar) - self.total_width = len(bar) - - if current: - time_per_unit = (now - self.start) / current - else: - time_per_unit = 0 - eta = time_per_unit * (self.target - current) - perc = float(current) * 100 / self.target - info = '' - if current < self.target and self.target != -1: - info += ' - %f%%' % perc - info += ' - ETA: %ds' % eta - else: - info += ' - %ds' % (now - self.start) - for k in self.unique_values: - info += ' - %s:' % k - if isinstance(self.sum_values[k], list): - avg = self.sum_values[k][0] / max(1, self.sum_values[k][1]) - if abs(avg) > 1e-3: - info += ' %.4f' % avg - else: - info += ' %.4e' % avg - else: - info += ' %s' % self.sum_values[k] - - self.total_width += len(info) - if prev_total_width > self.total_width: - info += ((prev_total_width - self.total_width) * ' ') - - sys.stdout.write(info) - sys.stdout.flush() - - if current >= self.target: - sys.stdout.write('\n') - - if self.verbose == 2: - if current >= self.target: - info = '%ds' % (now - self.start) - for k in self.unique_values: - info += ' - %s:' % k - avg = self.sum_values[k][0] / max(1, self.sum_values[k][1]) - if avg > 1e-3: - info += ' %.4f' % avg - else: - info += ' %.4e' % avg - sys.stdout.write(info + "\n") - - self.last_update = now - - def update(self, n=1, values=None): - self.set_value(self.seen_so_far + n, values) diff --git a/contrib/attic/pydl4j/pydl4j/pydl4j.py b/contrib/attic/pydl4j/pydl4j/pydl4j.py deleted file mode 100644 index 6a885d7b8..000000000 --- a/contrib/attic/pydl4j/pydl4j/pydl4j.py +++ /dev/null @@ -1,361 +0,0 @@ -# /* ****************************************************************************** -# * -# * -# * This program and the accompanying materials are made available under the -# * terms of the Apache License, Version 2.0 which is available at -# * https://www.apache.org/licenses/LICENSE-2.0. 
-# * -# * See the NOTICE file distributed with this work for additional -# * information regarding copyright ownership. -# * Unless required by applicable law or agreed to in writing, software -# * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT -# * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the -# * License for the specific language governing permissions and limitations -# * under the License. -# * -# * SPDX-License-Identifier: Apache-2.0 -# ******************************************************************************/ - -################################################################################ - -from .jarmgr import * -from .jarmgr import _MY_DIR -from .pom import * -from .docker import docker_file -import platform -import os -import warnings -from subprocess import call as py_call -import json - - -def call(arglist): - error = py_call(arglist) - if error: - raise Exception('Subprocess error for command: ' + str(arglist)) - - -_CONFIG_FILE = os.path.join(_MY_DIR, 'config.json') - - -# Default config -_CONFIG = { - 'dl4j_version': '1.0.0-SNAPSHOT', - 'dl4j_core': True, - 'datavec': True, - 'spark': True, - 'spark_version': '2', - 'scala_version': '2.11', - 'nd4j_backend': 'cpu', - 'validate_jars': True -} - - -def _is_sub_set(config1, config2): - # check if config1 is a subset of config2 - # if config1 < config2, then we can use config2 jar - # for config1 as well - if config1['dl4j_version'] != config2['dl4j_version']: - return False - if config1['dl4j_core'] > config2['dl4j_core']: - return False - if config1['nd4j_backend'] != config2['nd4j_backend']: - return False - if config1['datavec']: - if not config2['datavec']: - return False - if config1['spark'] > config2['spark']: - return False - if config1['spark_version'] != config2['spark_version']: - return False - if config1['scala_version'] != 
config2['scala_version']: - return False - return True - - -def _write_config(filepath=None): - if not filepath: - filepath = _CONFIG_FILE - with open(filepath, 'w') as f: - json.dump(_CONFIG, f) - - -if os.path.isfile(_CONFIG_FILE): - with open(_CONFIG_FILE, 'r') as f: - _CONFIG.update(json.load(f)) -else: - _write_config() - - -def set_config(config): - _CONFIG.update(config) - _write_config() - - -def get_config(): - return _CONFIG - - -def validate_config(config=None): - if config is None: - config = _CONFIG - valid_options = { - 'spark_version': ['1', '2'], - 'scala_version': ['2.10', '2.11'], - 'nd4j_backend': ['cpu', 'gpu'] - } - for k, vs in valid_options.items(): - v = config.get(k) - if v is None: - raise KeyError('Key not found in config : {}.'.format(k)) - if v not in vs: - raise ValueError( - 'Invalid value {} for key {} in config. Valid values are: {}.'.format(v, k, vs)) - - # spark 2 does not work with scala 2.10 - if config['spark_version'] == '2' and config['scala_version'] == '2.10': - raise ValueError( - 'Scala 2.10 does not work with spark 2. Set scala_version to 2.11 in pydl4j config. 
') - - -def _get_context_from_config(config=None): - if not config: - config = _CONFIG - # e.g pydl4j-1.0.0-SNAPSHOT-cpu-core-datavec-spark2-2.11 - - context = 'pydl4j-{}'.format(config['dl4j_version']) - context += '-' + config['nd4j_backend'] - if config['dl4j_core']: - context += '-core' - if config['datavec']: - context += '-datavec' - if config['spark']: - spark_version = config['spark_version'] - scala_version = config['scala_version'] - context += '-spark' + spark_version + '-' + scala_version - return context - - -def _get_config_from_context(context): - config = {} - backends = ['cpu', 'gpu'] - for b in backends: - if '-' + b in context: - config['nd4j_backend'] = b - config['dl4j_version'] = context.split('-' + b)[0][len('pydl4j-'):] - break - config['dl4j_core'] = '-core' in context - set_defs = False - if '-datavec' in context: - config['datavec'] = True - if '-spark' in context: - config['spark'] = True - sp_sc_ver = context.split('-spark')[1] - sp_ver, sc_ver = sp_sc_ver.split('-') - config['spark_version'] = sp_ver - config['scala_version'] = sc_ver - else: - config['spark'] = False - set_defs = True - else: - config['datavec'] = False - set_defs = True - if set_defs: - config['spark_version'] = '2' - config['scala_version'] = '2.11' - validate_config(config) - return config - - -set_context(_get_context_from_config()) - - -def create_pom_from_config(): - config = get_config() - pom = pom_template() - dl4j_version = config['dl4j_version'] - nd4j_backend = config['nd4j_backend'] - use_spark = config['spark'] - scala_version = config['scala_version'] - spark_version = config['spark_version'] - use_dl4j_core = config['dl4j_core'] - use_datavec = config['datavec'] - - datavec_deps = datavec_dependencies() if use_datavec else "" - pom = pom.replace('{datavec.dependencies}', datavec_deps) - - core_deps = dl4j_core_dependencies() if use_dl4j_core else "" - pom = pom.replace('{dl4j.core.dependencies}', core_deps) - - spark_deps = spark_dependencies() if 
use_spark else "" - pom = pom.replace('{spark.dependencies}', spark_deps) - - pom = pom.replace('{dl4j.version}', dl4j_version) - - if nd4j_backend == 'cpu': - platform_backend = "nd4j-native-platform" - backend = "nd4j-native" - else: - platform_backend = "nd4j-cuda-9.2-platform" - backend = "nd4j-cuda-9.2" - - pom = pom.replace('{nd4j.backend}', backend) - pom = pom.replace('{nd4j.platform.backend}', platform_backend) - - if use_spark: - pom = pom.replace('{scala.binary.version}', scala_version) - # this naming convention seems a little off - if "SNAPSHOT" in dl4j_version: - dl4j_version = dl4j_version.replace("-SNAPSHOT", "") - dl4j_spark_version = dl4j_version + "_spark_" + spark_version + "-SNAPSHOT" - else: - dl4j_spark_version = dl4j_version + "_spark_" + spark_version - pom = pom.replace('{dl4j.spark.version}', dl4j_spark_version) - - # TODO replace if exists - pom_xml = os.path.join(_MY_DIR, 'pom.xml') - with open(pom_xml, 'w') as pom_file: - pom_file.write(pom) - - -def docker_build(): - docker_path = os.path.join(_MY_DIR, 'Dockerfile') - docker_string = docker_file() - with open(docker_path, 'w') as f: - f.write(docker_string) - - call(["docker", "build", _MY_DIR, "-t", "pydl4j"]) - - -def docker_run(): - create_pom_from_config() - py_call(["docker", "run", "--mount", "src=" + - _MY_DIR + ",target=/app,type=bind", "pydl4j"]) - # docker will build into /target, need to move to context dir - context_dir = get_dir() - config = get_config() - dl4j_version = config['dl4j_version'] - jar_name = "pydl4j-{}-bin.jar".format(dl4j_version) - base_target_dir = os.path.join(_MY_DIR, "target") - source = os.path.join(base_target_dir, jar_name) - target = os.path.join(context_dir, jar_name) - _write_config(os.path.join(context_dir, 'config.json')) - if os.path.isfile(target): - os.remove(target) - os.rename(source, target) - - -def is_docker_available(): - devnull = open(os.devnull, 'w') - try: - py_call(["docker", "--help"], stdout=devnull, stderr=devnull) - 
return True - except Exception: - return False - - -def _maven_build(use_docker): - if use_docker: - docker_build() - docker_run() - else: - create_pom_from_config() - pom_xml = os.path.join(_MY_DIR, 'pom.xml') - command = 'mvn clean install -f ' + pom_xml - os.system(command) - version = _CONFIG['dl4j_version'] - jar_name = "pydl4j-{}-bin.jar".format(version) - source = os.path.join(_MY_DIR, 'target', jar_name) - target = os.path.join(get_dir(), jar_name) - if os.path.isfile(target): - os.remove(target) - os.rename(source, target) - - -def maven_build(): - if is_docker_available(): - print("Docker available. Starting build...") - _maven_build(use_docker=True) - else: - warnings.warn( - "Docker unavailable. Falling back to a local Maven build.") - _maven_build(use_docker=False) - - -def validate_jars(): - if not _CONFIG['validate_jars']: - return - # builds jar if not available for given context - jars = get_jars() - dl4j_version = _CONFIG['dl4j_version'] - jar = "pydl4j-{}-bin.jar".format(dl4j_version) - if jar not in jars: - # jar not found - # but it's possible a jar exists in a different - # context. If that context is a "super set" - # of the current one, we can use its jar! 
- original_context = context() - contexts = _get_all_contexts() - found_super_set_jar = False - for c in contexts: - config = _get_config_from_context(c) - if _is_sub_set(_CONFIG, config): - set_context(c) - jars = get_jars() - if jar in jars: - found_super_set_jar = True - break - if not found_super_set_jar: - set_context(original_context) - print("pydl4j: required uberjar not found, building with docker...") - maven_build() - - -def validate_nd4j_jars(): - validate_jars() - - -def validate_datavec_jars(): - if not _CONFIG['datavec']: - _CONFIG['datavec'] = True - _write_config() - context = _get_context_from_config() - set_context(context) - validate_jars() - - -def _get_all_contexts(): - c = os.listdir(_MY_DIR) - return [x for x in c if x.startswith('pydl4j')] - - -def set_jnius_config(): - try: - import jnius_config - path = get_dir() - if path[-1] == '*': - jnius_config.add_classpath(path) - elif os.path.isfile(path): - jnius_config.add_classpath(path) - else: - path = os.path.join(path, '*') - jnius_config.add_classpath(path) - # Further options can be set by individual projects - except ImportError: - warnings.warn('Pyjnius not installed.') - - -def add_classpath(path): - try: - import jnius_config - jnius_config.add_classpath(path) - except ImportError: - warnings.warn('Pyjnius not installed.') - - -set_jnius_config() diff --git a/contrib/attic/pydl4j/pytest.ini b/contrib/attic/pydl4j/pytest.ini deleted file mode 100644 index d4f1ab57b..000000000 --- a/contrib/attic/pydl4j/pytest.ini +++ /dev/null @@ -1,10 +0,0 @@ -[pytest] - -norecursedirs = build - -# PEP-8 The following are ignored: -# E501 line too long (82 > 79 characters) -# W503 line break occurred before a binary operator - -pep8ignore = * E501 \ - * W503 diff --git a/contrib/attic/pydl4j/release.sh b/contrib/attic/pydl4j/release.sh deleted file mode 100644 index 497acf4b5..000000000 --- a/contrib/attic/pydl4j/release.sh +++ /dev/null @@ -1,35 +0,0 @@ -#!/bin/bash - -# -# /* 
****************************************************************************** -# * -# * -# * This program and the accompanying materials are made available under the -# * terms of the Apache License, Version 2.0 which is available at -# * https://www.apache.org/licenses/LICENSE-2.0. -# * -# * See the NOTICE file distributed with this work for additional -# * information regarding copyright ownership. -# * Unless required by applicable law or agreed to in writing, software -# * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT -# * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the -# * License for the specific language governing permissions and limitations -# * under the License. -# * -# * SPDX-License-Identifier: Apache-2.0 -# ******************************************************************************/ -# - -# remove old wheels -sudo rm -rf dist/* - -# Build Python 2 & 3 wheels for current version -sudo python2 setup.py sdist bdist_wheel -sudo python3 setup.py sdist bdist_wheel - -# Upload to PyPI with twine. Needs full "skymind" credentials in ~/.pypirc -twine upload dist/* diff --git a/contrib/attic/pydl4j/setup.py b/contrib/attic/pydl4j/setup.py deleted file mode 100644 index 7797c8bda..000000000 --- a/contrib/attic/pydl4j/setup.py +++ /dev/null @@ -1,56 +0,0 @@ -# /* ****************************************************************************** -# * -# * -# * This program and the accompanying materials are made available under the -# * terms of the Apache License, Version 2.0 which is available at -# * https://www.apache.org/licenses/LICENSE-2.0. -# * -# * See the NOTICE file distributed with this work for additional -# * information regarding copyright ownership. -# * Unless required by applicable law or agreed to in writing, software -# * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT -# * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. 
See the -# * License for the specific language governing permissions and limitations -# * under the License. -# * -# * SPDX-License-Identifier: Apache-2.0 -# ******************************************************************************/ - -################################################################################ -# -# -# -################################################################################ - -from setuptools import setup -from setuptools import find_packages - -setup( - name='pydl4j', - version='0.1.5', - packages=find_packages(), - install_requires=['Cython', 'pyjnius', 'requests', - 'click', 'argcomplete', 'python-dateutil'], - extras_require={ - 'tests': ['pytest', 'pytest-pep8', 'pytest-cov'] - }, - include_package_data=True, - license='Apache', - description='Java dependency management for Python projects using DL4J', - url='https://github.com/deeplearning4j/pydl4j', - entry_points={ - 'console_scripts': [ - 'pydl4j=pydl4j.cli:handle' - ] - }, - classifiers=[ - 'Development Status :: 3 - Alpha', - 'Intended Audience :: Developers', - 'Environment :: Console', - 'License :: OSI Approved :: Apache Software License', - 'Operating System :: OS Independent', - 'Programming Language :: Python', - 'Programming Language :: Python :: 2', - 'Programming Language :: Python :: 3' - ] -) diff --git a/contrib/attic/pydl4j/tests/basic_example.csv b/contrib/attic/pydl4j/tests/basic_example.csv deleted file mode 100644 index fed539b30..000000000 --- a/contrib/attic/pydl4j/tests/basic_example.csv +++ /dev/null @@ -1,7 +0,0 @@ -2016-01-01 17:00:00.000,830a7u3,u323fy8902,1,USA,100.00,Legit -2016-01-01 18:03:01.256,830a7u3,9732498oeu,3,FR,73.20,Legit -2016-01-03 02:53:32.231,78ueoau32,w234e989,1,USA,1621.00,Fraud -2016-01-03 09:30:16.832,t842uocd,9732498oeu,4,USA,43.19,Legit -2016-01-04 23:01:52.920,t842uocd,cza8873bm,10,MX,159.65,Legit -2016-01-05 02:28:10.648,t842uocd,fgcq9803,6,CAN,26.33,Fraud -2016-01-05 10:15:36.483,rgc707ke3,tn342v7,2,USA,-0.90,Legit diff 
--git a/contrib/attic/pydl4j/tests/build_tests/basic_example.csv b/contrib/attic/pydl4j/tests/build_tests/basic_example.csv deleted file mode 100644 index 4373a1491..000000000 --- a/contrib/attic/pydl4j/tests/build_tests/basic_example.csv +++ /dev/null @@ -1,7 +0,0 @@ -2016-01-01 17:00:00.000,830a7u3,u323fy8902,1,USA,100.00,Legit -2016-01-01 18:03:01.256,830a7u3,9732498oeu,3,FR,73.20,Legit -2016-01-03 02:53:32.231,78ueoau32,w234e989,1,USA,1621.00,Fraud -2016-01-03 09:30:16.832,t842uocd,9732498oeu,4,USA,43.19,Legit -2016-01-04 23:01:52.920,t842uocd,cza8873bm,10,MX,159.65,Legit -2016-01-05 02:28:10.648,t842uocd,fgcq9803,6,CAN,26.33,Fraud -2016-01-05 10:15:36.483,rgc707ke3,tn342v7,2,USA,-0.90,Legit diff --git a/contrib/attic/pydl4j/tests/build_tests/test_build_1.py b/contrib/attic/pydl4j/tests/build_tests/test_build_1.py deleted file mode 100644 index ed99467b5..000000000 --- a/contrib/attic/pydl4j/tests/build_tests/test_build_1.py +++ /dev/null @@ -1,82 +0,0 @@ -# /* ****************************************************************************** -# * -# * -# * This program and the accompanying materials are made available under the -# * terms of the Apache License, Version 2.0 which is available at -# * https://www.apache.org/licenses/LICENSE-2.0. -# * -# * See the NOTICE file distributed with this work for additional -# * information regarding copyright ownership. -# * Unless required by applicable law or agreed to in writing, software -# * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT -# * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the -# * License for the specific language governing permissions and limitations -# * under the License. 
-# * -# * SPDX-License-Identifier: Apache-2.0 -# ******************************************************************************/ - -################################################################################ -# -# -# -################################################################################ - -import pytest -import pydl4j -import os - - -def _spark_test(): - from jnius import autoclass - SparkConf = autoclass('org.apache.spark.SparkConf') - SparkContext = autoclass('org.apache.spark.api.java.JavaSparkContext') - JavaRDD = autoclass('org.apache.spark.api.java.JavaRDD') - SparkTransformExecutor = autoclass('org.datavec.spark.' - 'transform.SparkTransformExecutor') - StringToWritablesFunction = autoclass('org.datavec.spark.' - 'transform.misc.' - 'StringToWritablesFunction') - WritablesToStringFunction = autoclass('org.datavec.spark.' - 'transform.misc.' - 'WritablesToStringFunction') - - spark_conf = SparkConf() - spark_conf.setMaster('local[*]') - spark_conf.setAppName('test') - - spark_context = SparkContext(spark_conf) - source = 'basic_example.csv' - assert os.path.isfile(source) - string_data = spark_context.textFile(source) - - -def test_build(): - _CONFIG = { - 'dl4j_version': '1.0.0-SNAPSHOT', - 'dl4j_core': True, - 'datavec': True, - 'spark': True, - 'spark_version': '2', - 'scala_version': '2.11', - 'nd4j_backend': 'cpu' - } - - my_dir = pydl4j.jarmgr._MY_DIR - - if os.path.isdir(my_dir): - os.remove(my_dir) - - pydl4j.set_config(_CONFIG) - - pydl4j.maven_build() - - import jumpy as jp - - assert jp.zeros((3, 2)).numpy().sum() == 0 - - _spark_test() - - -if __name__ == '__main__': - pytest.main([__file__]) diff --git a/contrib/attic/pydl4j/tests/build_tests/test_build_2.py b/contrib/attic/pydl4j/tests/build_tests/test_build_2.py deleted file mode 100644 index 068808df5..000000000 --- a/contrib/attic/pydl4j/tests/build_tests/test_build_2.py +++ /dev/null @@ -1,82 +0,0 @@ -# /* 
****************************************************************************** -# * -# * -# * This program and the accompanying materials are made available under the -# * terms of the Apache License, Version 2.0 which is available at -# * https://www.apache.org/licenses/LICENSE-2.0. -# * -# * See the NOTICE file distributed with this work for additional -# * information regarding copyright ownership. -# * Unless required by applicable law or agreed to in writing, software -# * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT -# * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the -# * License for the specific language governing permissions and limitations -# * under the License. -# * -# * SPDX-License-Identifier: Apache-2.0 -# ******************************************************************************/ - -################################################################################ -# -# -# -################################################################################ - -import pytest -import pydl4j -import os - - -def _spark_test(): - from jnius import autoclass - SparkConf = autoclass('org.apache.spark.SparkConf') - SparkContext = autoclass('org.apache.spark.api.java.JavaSparkContext') - JavaRDD = autoclass('org.apache.spark.api.java.JavaRDD') - SparkTransformExecutor = autoclass('org.datavec.spark.' - 'transform.SparkTransformExecutor') - StringToWritablesFunction = autoclass('org.datavec.spark.' - 'transform.misc.' - 'StringToWritablesFunction') - WritablesToStringFunction = autoclass('org.datavec.spark.' - 'transform.misc.' 
- 'WritablesToStringFunction') - - spark_conf = SparkConf() - spark_conf.setMaster('local[*]') - spark_conf.setAppName('test') - - spark_context = SparkContext(spark_conf) - source = 'basic_example.csv' - assert os.path.isfile(source) - string_data = spark_context.textFile(source) - - -def test_build(): - _CONFIG = { - 'dl4j_version': '1.0.0-SNAPSHOT', - 'dl4j_core': False, - 'datavec': True, - 'spark': True, - 'spark_version': '2', - 'scala_version': '2.11', - 'nd4j_backend': 'cpu' - } - - my_dir = pydl4j.jarmgr._MY_DIR - - if os.path.isdir(my_dir): - os.remove(my_dir) - - pydl4j.set_config(_CONFIG) - - pydl4j.maven_build() - - import jumpy as jp - - assert jp.zeros((3, 2)).numpy().sum() == 0 - - _spark_test() - - -if __name__ == '__main__': - pytest.main([__file__]) diff --git a/contrib/attic/pydl4j/tests/build_tests/test_build_3.py b/contrib/attic/pydl4j/tests/build_tests/test_build_3.py deleted file mode 100644 index bcfdf9faa..000000000 --- a/contrib/attic/pydl4j/tests/build_tests/test_build_3.py +++ /dev/null @@ -1,82 +0,0 @@ -# /* ****************************************************************************** -# * -# * -# * This program and the accompanying materials are made available under the -# * terms of the Apache License, Version 2.0 which is available at -# * https://www.apache.org/licenses/LICENSE-2.0. -# * -# * See the NOTICE file distributed with this work for additional -# * information regarding copyright ownership. -# * Unless required by applicable law or agreed to in writing, software -# * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT -# * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the -# * License for the specific language governing permissions and limitations -# * under the License. 
-# * -# * SPDX-License-Identifier: Apache-2.0 -# ******************************************************************************/ - -################################################################################ -# -# -# -################################################################################ - -import pytest -import pydl4j -import os - - -def _spark_test(): - from jnius import autoclass - SparkConf = autoclass('org.apache.spark.SparkConf') - SparkContext = autoclass('org.apache.spark.api.java.JavaSparkContext') - JavaRDD = autoclass('org.apache.spark.api.java.JavaRDD') - SparkTransformExecutor = autoclass('org.datavec.spark.' - 'transform.SparkTransformExecutor') - StringToWritablesFunction = autoclass('org.datavec.spark.' - 'transform.misc.' - 'StringToWritablesFunction') - WritablesToStringFunction = autoclass('org.datavec.spark.' - 'transform.misc.' - 'WritablesToStringFunction') - - spark_conf = SparkConf() - spark_conf.setMaster('local[*]') - spark_conf.setAppName('test') - - spark_context = SparkContext(spark_conf) - source = 'basic_example.csv' - assert os.path.isfile(source) - string_data = spark_context.textFile(source) - - -def test_build(): - _CONFIG = { - 'dl4j_version': '1.0.0-SNAPSHOT', - 'dl4j_core': True, - 'datavec': True, - 'spark': True, - 'spark_version': '1', - 'scala_version': '2.10', - 'nd4j_backend': 'cpu' - } - - my_dir = pydl4j.jarmgr._MY_DIR - - if os.path.isdir(my_dir): - os.remove(my_dir) - - pydl4j.set_config(_CONFIG) - - pydl4j.maven_build() - - import jumpy as jp - - assert jp.zeros((3, 2)).numpy().sum() == 0 - - _spark_test() - - -if __name__ == '__main__': - pytest.main([__file__]) diff --git a/contrib/attic/pydl4j/tests/build_tests/test_build_4.py b/contrib/attic/pydl4j/tests/build_tests/test_build_4.py deleted file mode 100644 index 25ec30f31..000000000 --- a/contrib/attic/pydl4j/tests/build_tests/test_build_4.py +++ /dev/null @@ -1,82 +0,0 @@ -# /* 
****************************************************************************** -# * -# * -# * This program and the accompanying materials are made available under the -# * terms of the Apache License, Version 2.0 which is available at -# * https://www.apache.org/licenses/LICENSE-2.0. -# * -# * See the NOTICE file distributed with this work for additional -# * information regarding copyright ownership. -# * Unless required by applicable law or agreed to in writing, software -# * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT -# * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the -# * License for the specific language governing permissions and limitations -# * under the License. -# * -# * SPDX-License-Identifier: Apache-2.0 -# ******************************************************************************/ - -################################################################################ -# -# -# -################################################################################ - -import pytest -import pydl4j -import os - - -def _spark_test(): - from jnius import autoclass - SparkConf = autoclass('org.apache.spark.SparkConf') - SparkContext = autoclass('org.apache.spark.api.java.JavaSparkContext') - JavaRDD = autoclass('org.apache.spark.api.java.JavaRDD') - SparkTransformExecutor = autoclass('org.datavec.spark.' - 'transform.SparkTransformExecutor') - StringToWritablesFunction = autoclass('org.datavec.spark.' - 'transform.misc.' - 'StringToWritablesFunction') - WritablesToStringFunction = autoclass('org.datavec.spark.' - 'transform.misc.' 
- 'WritablesToStringFunction') - - spark_conf = SparkConf() - spark_conf.setMaster('local[*]') - spark_conf.setAppName('test') - - spark_context = SparkContext(spark_conf) - source = 'basic_example.csv' - assert os.path.isfile(source) - string_data = spark_context.textFile(source) - - -def test_build(): - _CONFIG = { - 'dl4j_version': '1.0.0-SNAPSHOT', - 'dl4j_core': True, - 'datavec': True, - 'spark': True, - 'spark_version': '1', - 'scala_version': '2.11', - 'nd4j_backend': 'cpu' - } - - my_dir = pydl4j.jarmgr._MY_DIR - - if os.path.isdir(my_dir): - os.remove(my_dir) - - pydl4j.set_config(_CONFIG) - - pydl4j.maven_build() - - import jumpy as jp - - assert jp.zeros((3, 2)).numpy().sum() == 0 - - _spark_test() - - -if __name__ == '__main__': - pytest.main([__file__]) diff --git a/contrib/attic/pydl4j/tests/build_tests/test_build_5.py b/contrib/attic/pydl4j/tests/build_tests/test_build_5.py deleted file mode 100644 index dfcc5f15e..000000000 --- a/contrib/attic/pydl4j/tests/build_tests/test_build_5.py +++ /dev/null @@ -1,65 +0,0 @@ -# /* ****************************************************************************** -# * -# * -# * This program and the accompanying materials are made available under the -# * terms of the Apache License, Version 2.0 which is available at -# * https://www.apache.org/licenses/LICENSE-2.0. -# * -# * See the NOTICE file distributed with this work for additional -# * information regarding copyright ownership. -# * Unless required by applicable law or agreed to in writing, software -# * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT -# * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the -# * License for the specific language governing permissions and limitations -# * under the License. 
-# * -# * SPDX-License-Identifier: Apache-2.0 -# ******************************************************************************/ - -################################################################################ -# -# -# -################################################################################ - -import pytest -import pydl4j -import os - - -def _datavec_test(): - from jnius import autoclass - SparkConf = autoclass('org.apache.spark.SparkConf') - SparkContext = autoclass('org.apache.spark.api.java.JavaSparkContext') - JavaRDD = autoclass('org.apache.spark.api.java.JavaRDD') - - -def test_build(): - _CONFIG = { - 'dl4j_version': '1.0.0-SNAPSHOT', - 'dl4j_core': True, - 'datavec': True, - 'spark': False, - 'spark_version': '2', - 'scala_version': '2.11', - 'nd4j_backend': 'cpu' - } - - my_dir = pydl4j.jarmgr._MY_DIR - - if os.path.isdir(my_dir): - os.remove(my_dir) - - pydl4j.set_config(_CONFIG) - - pydl4j.maven_build() - - import jumpy as jp - - assert jp.zeros((3, 2)).numpy().sum() == 0 - - _datavec_test() - - -if __name__ == '__main__': - pytest.main([__file__]) diff --git a/contrib/attic/pydl4j/tests/build_tests/test_build_6.py b/contrib/attic/pydl4j/tests/build_tests/test_build_6.py deleted file mode 100644 index 74527ab15..000000000 --- a/contrib/attic/pydl4j/tests/build_tests/test_build_6.py +++ /dev/null @@ -1,56 +0,0 @@ -# /* ****************************************************************************** -# * -# * -# * This program and the accompanying materials are made available under the -# * terms of the Apache License, Version 2.0 which is available at -# * https://www.apache.org/licenses/LICENSE-2.0. -# * -# * See the NOTICE file distributed with this work for additional -# * information regarding copyright ownership. -# * Unless required by applicable law or agreed to in writing, software -# * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT -# * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. 
See the
-# * License for the specific language governing permissions and limitations
-# * under the License.
-# *
-# * SPDX-License-Identifier: Apache-2.0
-# ******************************************************************************/
-
-################################################################################
-#
-#
-#
-################################################################################
-
-import pytest
-import pydl4j
-import os
-
-
-def test_build():
-    _CONFIG = {
-        'dl4j_version': '1.0.0-SNAPSHOT',
-        'dl4j_core': True,
-        'datavec': False,
-        'spark': True,
-        'spark_version': '2',
-        'scala_version': '2.11',
-        'nd4j_backend': 'cpu'
-    }
-
-    my_dir = pydl4j.jarmgr._MY_DIR
-
-    if os.path.isdir(my_dir):
-        os.remove(my_dir)
-
-    pydl4j.set_config(_CONFIG)
-
-    pydl4j.maven_build()
-
-    import jumpy as jp
-
-    assert jp.zeros((3, 2)).numpy().sum() == 0
-
-
-if __name__ == '__main__':
-    pytest.main([__file__])
diff --git a/contrib/attic/pydl4j/tests/mvn_test.py b/contrib/attic/pydl4j/tests/mvn_test.py
deleted file mode 100644
index cacc1a1d5..000000000
--- a/contrib/attic/pydl4j/tests/mvn_test.py
+++ /dev/null
@@ -1,59 +0,0 @@
-# /* ******************************************************************************
-# *
-# *
-# * This program and the accompanying materials are made available under the
-# * terms of the Apache License, Version 2.0 which is available at
-# * https://www.apache.org/licenses/LICENSE-2.0.
-# *
-# * See the NOTICE file distributed with this work for additional
-# * information regarding copyright ownership.
-# * Unless required by applicable law or agreed to in writing, software
-# * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
-# * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
-# * License for the specific language governing permissions and limitations
-# * under the License.
-# *
-# * SPDX-License-Identifier: Apache-2.0
-# ******************************************************************************/
-
-################################################################################
-#
-#
-#
-################################################################################
-
-import pytest
-from pydl4j import *
-
-
-def test_get_artifacts():
-    artifacts = get_artifacts('datavec')
-    expected = ['datavec-api', 'datavec-local', 'datavec-parent']
-    for e in expected:
-        assert e in artifacts
-
-
-def test_get_versions():
-    versions = get_versions('datavec', 'datavec-api')
-    assert len(versions) >= 12
-
-
-def test_get_latest_version():
-    v = get_latest_version('datavec', 'datavec-api')
-    assert len(v) > 0
-
-
-def test_install():
-    set_context('test')
-    clear_context()
-    mvn_install('datavec', 'datavec-api')
-    mvn_install('datavec', 'datavec-local')
-    assert len(get_jars()) == 2
-    for jar in get_jars():
-        uninstall(jar)
-    assert len(get_jars()) == 0
-    clear_context()
-
-
-if __name__ == '__main__':
-    pytest.main([__file__])
diff --git a/contrib/attic/pydl4j/tests/spark_test.py b/contrib/attic/pydl4j/tests/spark_test.py
deleted file mode 100644
index 0db7ee4d8..000000000
--- a/contrib/attic/pydl4j/tests/spark_test.py
+++ /dev/null
@@ -1,62 +0,0 @@
-# /* ******************************************************************************
-# *
-# *
-# * This program and the accompanying materials are made available under the
-# * terms of the Apache License, Version 2.0 which is available at
-# * https://www.apache.org/licenses/LICENSE-2.0.
-# *
-# * See the NOTICE file distributed with this work for additional
-# * information regarding copyright ownership.
-# * Unless required by applicable law or agreed to in writing, software
-# * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
-# * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the
-# * License for the specific language governing permissions and limitations
-# * under the License.
-# *
-# * SPDX-License-Identifier: Apache-2.0
-# ******************************************************************************/
-
-################################################################################
-#
-#
-#
-################################################################################
-
-import pytest
-import jnius_config
-import os
-import warnings
-import pydl4j
-
-
-def test_spark():
-    # skip test in travis
-    if "TRAVIS" in os.environ and os.environ["TRAVIS"] == "true":
-        return
-
-    pydl4j.validate_datavec_jars()
-
-    from jnius import autoclass
-
-    SparkConf = autoclass('org.apache.spark.SparkConf')
-    SparkContext = autoclass('org.apache.spark.api.java.JavaSparkContext')
-    JavaRDD = autoclass('org.apache.spark.api.java.JavaRDD')
-    SparkTransformExecutor = autoclass(
-        'org.datavec.spark.transform.SparkTransformExecutor')
-    StringToWritablesFunction = autoclass(
-        'org.datavec.spark.transform.misc.StringToWritablesFunction')
-    WritablesToStringFunction = autoclass(
-        'org.datavec.spark.transform.misc.WritablesToStringFunction')
-
-    spark_conf = SparkConf()
-    spark_conf.setMaster('local[*]')
-    spark_conf.setAppName('test')
-
-    spark_context = SparkContext(spark_conf)
-    source = 'basic_example.csv'
-    assert os.path.isfile(source)
-    string_data = spark_context.textFile(source)
-
-
-if __name__ == '__main__':
-    pytest.main([__file__])
diff --git a/contrib/attic/scalnet/.gitignore b/contrib/attic/scalnet/.gitignore
deleted file mode 100644
index 26f6d0514..000000000
--- a/contrib/attic/scalnet/.gitignore
+++ /dev/null
@@ -1,21 +0,0 @@
-*.class
-*.log
-*.iml
-lib
-
-# sbt specific
-.cache
-.history
-.lib/
-dist/*
-target/
-lib_managed/
-src_managed/
-project/boot/
-project/plugins/project/
-
-# Scala-IDE specific
-.scala_dependencies
-.worksheet
-.idea
-
diff --git a/contrib/attic/scalnet/.scalafmt.conf b/contrib/attic/scalnet/.scalafmt.conf
deleted file mode 100644 index 00e9ced59..000000000 --- a/contrib/attic/scalnet/.scalafmt.conf +++ /dev/null @@ -1,9 +0,0 @@ -align = some -danglingParentheses = true -indentOperator = spray -maxColumn = 120 -lineEndings = unix -project.excludeFilters = [".*\\.sbt"] -rewrite.rules = [AsciiSortImports, RedundantBraces, RedundantParens] -spaces.inImportCurlyBraces = true -unindentTopLevelOperators = true diff --git a/contrib/attic/scalnet/LICENSE b/contrib/attic/scalnet/LICENSE deleted file mode 100644 index 8dada3eda..000000000 --- a/contrib/attic/scalnet/LICENSE +++ /dev/null @@ -1,201 +0,0 @@ - Apache License - Version 2.0, January 2004 - http://www.apache.org/licenses/ - - TERMS AND CONDITIONS FOR USE, REPRODUCTION, AND DISTRIBUTION - - 1. Definitions. - - "License" shall mean the terms and conditions for use, reproduction, - and distribution as defined by Sections 1 through 9 of this document. - - "Licensor" shall mean the copyright owner or entity authorized by - the copyright owner that is granting the License. - - "Legal Entity" shall mean the union of the acting entity and all - other entities that control, are controlled by, or are under common - control with that entity. For the purposes of this definition, - "control" means (i) the power, direct or indirect, to cause the - direction or management of such entity, whether by contract or - otherwise, or (ii) ownership of fifty percent (50%) or more of the - outstanding shares, or (iii) beneficial ownership of such entity. - - "You" (or "Your") shall mean an individual or Legal Entity - exercising permissions granted by this License. - - "Source" form shall mean the preferred form for making modifications, - including but not limited to software source code, documentation - source, and configuration files. 
- - "Object" form shall mean any form resulting from mechanical - transformation or translation of a Source form, including but - not limited to compiled object code, generated documentation, - and conversions to other media types. - - "Work" shall mean the work of authorship, whether in Source or - Object form, made available under the License, as indicated by a - copyright notice that is included in or attached to the work - (an example is provided in the Appendix below). - - "Derivative Works" shall mean any work, whether in Source or Object - form, that is based on (or derived from) the Work and for which the - editorial revisions, annotations, elaborations, or other modifications - represent, as a whole, an original work of authorship. For the purposes - of this License, Derivative Works shall not include works that remain - separable from, or merely link (or bind by name) to the interfaces of, - the Work and Derivative Works thereof. - - "Contribution" shall mean any work of authorship, including - the original version of the Work and any modifications or additions - to that Work or Derivative Works thereof, that is intentionally - submitted to Licensor for inclusion in the Work by the copyright owner - or by an individual or Legal Entity authorized to submit on behalf of - the copyright owner. For the purposes of this definition, "submitted" - means any form of electronic, verbal, or written communication sent - to the Licensor or its representatives, including but not limited to - communication on electronic mailing lists, source code control systems, - and issue tracking systems that are managed by, or on behalf of, the - Licensor for the purpose of discussing and improving the Work, but - excluding communication that is conspicuously marked or otherwise - designated in writing by the copyright owner as "Not a Contribution." 
- - "Contributor" shall mean Licensor and any individual or Legal Entity - on behalf of whom a Contribution has been received by Licensor and - subsequently incorporated within the Work. - - 2. Grant of Copyright License. Subject to the terms and conditions of - this License, each Contributor hereby grants to You a perpetual, - worldwide, non-exclusive, no-charge, royalty-free, irrevocable - copyright license to reproduce, prepare Derivative Works of, - publicly display, publicly perform, sublicense, and distribute the - Work and such Derivative Works in Source or Object form. - - 3. Grant of Patent License. Subject to the terms and conditions of - this License, each Contributor hereby grants to You a perpetual, - worldwide, non-exclusive, no-charge, royalty-free, irrevocable - (except as stated in this section) patent license to make, have made, - use, offer to sell, sell, import, and otherwise transfer the Work, - where such license applies only to those patent claims licensable - by such Contributor that are necessarily infringed by their - Contribution(s) alone or by combination of their Contribution(s) - with the Work to which such Contribution(s) was submitted. If You - institute patent litigation against any entity (including a - cross-claim or counterclaim in a lawsuit) alleging that the Work - or a Contribution incorporated within the Work constitutes direct - or contributory patent infringement, then any patent licenses - granted to You under this License for that Work shall terminate - as of the date such litigation is filed. - - 4. Redistribution. 
You may reproduce and distribute copies of the - Work or Derivative Works thereof in any medium, with or without - modifications, and in Source or Object form, provided that You - meet the following conditions: - - (a) You must give any other recipients of the Work or - Derivative Works a copy of this License; and - - (b) You must cause any modified files to carry prominent notices - stating that You changed the files; and - - (c) You must retain, in the Source form of any Derivative Works - that You distribute, all copyright, patent, trademark, and - attribution notices from the Source form of the Work, - excluding those notices that do not pertain to any part of - the Derivative Works; and - - (d) If the Work includes a "NOTICE" text file as part of its - distribution, then any Derivative Works that You distribute must - include a readable copy of the attribution notices contained - within such NOTICE file, excluding those notices that do not - pertain to any part of the Derivative Works, in at least one - of the following places: within a NOTICE text file distributed - as part of the Derivative Works; within the Source form or - documentation, if provided along with the Derivative Works; or, - within a display generated by the Derivative Works, if and - wherever such third-party notices normally appear. The contents - of the NOTICE file are for informational purposes only and - do not modify the License. You may add Your own attribution - notices within Derivative Works that You distribute, alongside - or as an addendum to the NOTICE text from the Work, provided - that such additional attribution notices cannot be construed - as modifying the License. 
- - You may add Your own copyright statement to Your modifications and - may provide additional or different license terms and conditions - for use, reproduction, or distribution of Your modifications, or - for any such Derivative Works as a whole, provided Your use, - reproduction, and distribution of the Work otherwise complies with - the conditions stated in this License. - - 5. Submission of Contributions. Unless You explicitly state otherwise, - any Contribution intentionally submitted for inclusion in the Work - by You to the Licensor shall be under the terms and conditions of - this License, without any additional terms or conditions. - Notwithstanding the above, nothing herein shall supersede or modify - the terms of any separate license agreement you may have executed - with Licensor regarding such Contributions. - - 6. Trademarks. This License does not grant permission to use the trade - names, trademarks, service marks, or product names of the Licensor, - except as required for reasonable and customary use in describing the - origin of the Work and reproducing the content of the NOTICE file. - - 7. Disclaimer of Warranty. Unless required by applicable law or - agreed to in writing, Licensor provides the Work (and each - Contributor provides its Contributions) on an "AS IS" BASIS, - WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or - implied, including, without limitation, any warranties or conditions - of TITLE, NON-INFRINGEMENT, MERCHANTABILITY, or FITNESS FOR A - PARTICULAR PURPOSE. You are solely responsible for determining the - appropriateness of using or redistributing the Work and assume any - risks associated with Your exercise of permissions under this License. - - 8. Limitation of Liability. 
In no event and under no legal theory, - whether in tort (including negligence), contract, or otherwise, - unless required by applicable law (such as deliberate and grossly - negligent acts) or agreed to in writing, shall any Contributor be - liable to You for damages, including any direct, indirect, special, - incidental, or consequential damages of any character arising as a - result of this License or out of the use or inability to use the - Work (including but not limited to damages for loss of goodwill, - work stoppage, computer failure or malfunction, or any and all - other commercial damages or losses), even if such Contributor - has been advised of the possibility of such damages. - - 9. Accepting Warranty or Additional Liability. While redistributing - the Work or Derivative Works thereof, You may choose to offer, - and charge a fee for, acceptance of support, warranty, indemnity, - or other liability obligations and/or rights consistent with this - License. However, in accepting such obligations, You may act only - on Your own behalf and on Your sole responsibility, not on behalf - of any other Contributor, and only if You agree to indemnify, - defend, and hold each Contributor harmless for any liability - incurred by, or claims asserted against, such Contributor by reason - of your accepting any such warranty or additional liability. - - END OF TERMS AND CONDITIONS - - APPENDIX: How to apply the Apache License to your work. - - To apply the Apache License to your work, attach the following - boilerplate notice, with the fields enclosed by brackets "{}" - replaced with your own identifying information. (Don't include - the brackets!) The text should be enclosed in the appropriate - comment syntax for the file format. We also recommend that a - file or class name and description of purpose be included on the - same "printed page" as the copyright notice for easier - identification within third-party archives. 
- - Copyright {yyyy} {name of copyright owner} - - Licensed under the Apache License, Version 2.0 (the "License"); - you may not use this file except in compliance with the License. - You may obtain a copy of the License at - - http://www.apache.org/licenses/LICENSE-2.0 - - Unless required by applicable law or agreed to in writing, software - distributed under the License is distributed on an "AS IS" BASIS, - WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. - See the License for the specific language governing permissions and - limitations under the License. diff --git a/contrib/attic/scalnet/README.md b/contrib/attic/scalnet/README.md deleted file mode 100644 index ab898b113..000000000 --- a/contrib/attic/scalnet/README.md +++ /dev/null @@ -1,94 +0,0 @@ -# ScalNet - -ScalNet is a wrapper around Deeplearning4J emulating a [Keras](https://github.com/fchollet/keras) like API for deep learning. - -ScalNet is released under an Apache 2.0 license. By contributing code to this repository, you agree to make your contribution available under an Apache 2.0 license. - -ScalNet is STILL ALPHA and we are open sourcing this in an attempt to get feedback. - -Come in to [gitter](https://gitter.im/deeplearning4j/deeplearning4j) if you are interested in learning more. - - -# Prerequisites - -* JDK 8 -* Scala 2.11.+ or 2.10.x -* SBT and Maven - - -# How to build - -ScalNet depends on Deeplearning4j and ND4J - -- [deeplearning4j and nd4j](https://github.com/eclipse/deeplearning4j) - -### sbt - -ScalNet uses sbt, but due to [resolving issues](https://nd4j.org/dependencies), you must have Maven available to copy some nd4j-native dependencies in your classpath, in order to run the examples. - -This is automatically done in `build.sbt` and you don't need to do anything besides having maven installed. - -If you use sbt in your own project, you will probably have to proceed the same way. 
-Once ScalNet uses releases instead of snapshots, this will no longer be necessary.
-
-To build, use:
-
-```bash
-$ sbt package
-```
-
-Alternatively, for quick testing or usage in Jupyter, for example, run:
-
-```bash
-$ sbt assembly
-```
-
-to obtain a JAR file with all needed dependencies.
-
-See the [official sbt documentation](http://www.scala-sbt.org/documentation.html) for more on how to use sbt.
-
-### Maven
-
-Although Maven is mainly used for release management, you can use the provided pom.xml to import ScalNet as a Maven project.
-
-Target for Scala 2.11:
-
-```bash
-$ change-scala-versions.sh "2.11"
-$ mvn package
-```
-
-Target for Scala 2.10:
-
-```bash
-$ change-scala-versions.sh "2.10"
-$ mvn package
-```
-
-# How to use
-
-### sbt
-
-```scala
-libraryDependencies += "org.deeplearning4j" % "scalnet_2.11" % "0.9.2-SNAPSHOT"
-```
-
-### Maven
-
-```xml
-<dependency>
-    <groupId>org.deeplearning4j</groupId>
-    <artifactId>scalnet_2.11</artifactId>
-    <version>0.9.2-SNAPSHOT</version>
-</dependency>
-```
-
-
-# Getting started
-
-ScalNet uses a Keras-like API, wrapping Deeplearning4j to make it easier to get started.
-
-Also, since you can call Java code from Scala, you can still use everything from Deeplearning4j.
-
-To see what ScalNet has to offer, run one of the [examples](https://github.com/eclipse/deeplearning4j/tree/master/scalnet/src/test/scala/org/deeplearning4j/scalnet/examples) it ships with.
-
-Please note that those examples are not state-of-the-art in any way; they're just enough to get you started.
diff --git a/contrib/attic/scalnet/build.sbt b/contrib/attic/scalnet/build.sbt
deleted file mode 100644
index cdfb11dd8..000000000
--- a/contrib/attic/scalnet/build.sbt
+++ /dev/null
@@ -1,80 +0,0 @@
-import scala.sys.process._
-
-name := "ScalNet"
-version := "1.0.0-SNAPSHOT"
-description := "A Scala wrapper for Deeplearning4j, inspired by Keras. Scala + DL + Spark + GPUs"
-
-scalaVersion := "2.11.12"
-
-resolvers in ThisBuild ++= Seq(
-  Resolver.sonatypeRepo("snapshots")
-)
-
-cleanFiles += baseDirectory.value / "lib"
-val mvnInstall = Seq("mvn", "install", "-q", "-f", "sbt-pom.xml")
-val operatingSystem = sys.props("os.name").toLowerCase.substring(0, 3)
-update := {
-  operatingSystem match {
-    case "win" => { Seq("cmd", "/C") ++ mvnInstall !; update.value }
-    case _     => { mvnInstall !; update.value }
-  }
-}
-
-libraryDependencies ++= {
-
-  val dl4j = "1.0.0-SNAPSHOT"
-  val logback = "1.2.3"
-  val scalaCheck = "1.13.5"
-  val scalaTest = "3.0.5"
-
-  Seq(
-    "org.deeplearning4j" % "deeplearning4j-core" % dl4j,
-    "org.slf4j" % "slf4j-api" % "1.7.25",
-    "ch.qos.logback" % "logback-classic" % logback,
-    "org.nd4j" % "nd4j-native" % dl4j % "test",
-    "org.scalacheck" %% "scalacheck" % scalaCheck % "test",
-    "org.scalatest" %% "scalatest" % scalaTest % "test"
-  )
-}
-
-scalacOptions in ThisBuild ++= Seq("-language:postfixOps",
-                                   "-language:implicitConversions",
-                                   "-language:existentials",
-                                   "-feature",
-                                   "-deprecation")
-
-lazy val standardSettings = Seq(
-  organization := "org.deeplearning4j",
-  organizationName := "Skymind",
-  startYear := Some(2016),
-  licenses += ("Apache-2.0", url("http://www.apache.org/licenses/LICENSE-2.0.html")),
-  homepage := Some(url("https://github.com/deeplearning4j/ScalNet")),
-  crossScalaVersions := Seq("2.11.12", "2.10.7"),
-  scalacOptions ++= Seq(
-    "-encoding",
-    "UTF-8",
-    "-Xlint",
-    "-deprecation",
-    "-Xfatal-warnings",
-    "-feature",
-    "-language:postfixOps",
-    "-unchecked"
-  )
-)
-
-parallelExecution in Test := false
-scalafmtOnCompile in ThisBuild := true
-scalafmtTestOnCompile in ThisBuild := true
-test in assembly := {}
-assemblyMergeStrategy in assembly := {
-  case PathList("META-INF", xs @ _*) => MergeStrategy.discard
-  case x                             => MergeStrategy.first
-}
-
-lazy val root = (project in file("."))
  .enablePlugins(AutomateHeaderPlugin)
  .settings(standardSettings)
  .settings(
- name := "ScalNet", - fork := true - ) diff --git a/contrib/attic/scalnet/buildmultiplescalaversions.sh b/contrib/attic/scalnet/buildmultiplescalaversions.sh deleted file mode 100644 index 87482f719..000000000 --- a/contrib/attic/scalnet/buildmultiplescalaversions.sh +++ /dev/null @@ -1,57 +0,0 @@ -#! /bin/bash -# -# /* ****************************************************************************** -# * -# * -# * This program and the accompanying materials are made available under the -# * terms of the Apache License, Version 2.0 which is available at -# * https://www.apache.org/licenses/LICENSE-2.0. -# * -# * See the NOTICE file distributed with this work for additional -# * information regarding copyright ownership. -# * Unless required by applicable law or agreed to in writing, software -# * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT -# * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the -# * License for the specific language governing permissions and limitations -# * under the License. -# * -# * SPDX-License-Identifier: Apache-2.0 -# ******************************************************************************/ -# - -BASEDIR="$( cd "$( dirname "${BASH_SOURCE[0]}" )" && pwd )" - -function echoError() { - (>&2 echo "$1") -} - -function scalaError() { - echoError "Changing Scala major version to 2.10 in the build did not change the state of your working copy; is Scala 2.11 still the default?"
- exit 2 -} - -function whatchanged() { - cd "$BASEDIR" - for i in $(git status -s --porcelain -- $(find ./ -mindepth 2 -name pom.xml)|awk '{print $2}'); do - echo "$(dirname $i)" - cd "$BASEDIR" - done -} - -set -eu -./change-scala-versions.sh 2.11 # should be idempotent, this is the default -mvn "$@" -./change-scala-versions.sh 2.10 -if [ -z "$(whatchanged)" ]; then - scalaError; -else - if [[ "${@#-pl}" = "$@" ]]; then - mvn -Dmaven.clean.skip=true -pl $(whatchanged| tr '\n' ',') -amd "$@" - else - # the arguments already tweak the project list ! don't tweak them more - # as this can lead to conflicts (excluding a project that's not part of - # the reactor) - mvn "$@" - fi -fi -./change-scala-versions.sh 2.11 # back to the default diff --git a/contrib/attic/scalnet/pom.xml b/contrib/attic/scalnet/pom.xml deleted file mode 100644 index 1f0132785..000000000 --- a/contrib/attic/scalnet/pom.xml +++ /dev/null @@ -1,241 +0,0 @@ - - - - - - 4.0.0 - - - org.deeplearning4j - deeplearning4j - 1.0.0-SNAPSHOT - - - scalnet_2.11 - - ScalNet - A Scala wrapper for Deeplearning4j, inspired by Keras. 
Scala + DL + Spark + GPUs - - - - - turambar - Dave Kale - - - maxpumperla - Max Pumperla - - - - - - 2.11.12 - 2.11 - - - - - org.scala-lang - scala-library - ${scala.version} - - - org.deeplearning4j - deeplearning4j-core - ${dl4j.version} - - - ch.qos.logback - logback-classic - ${logback.version} - - - org.scalatest - scalatest_${scala.binary.version} - ${scalatest.version} - test - - - org.scalacheck - scalacheck_${scala.binary.version} - ${scalacheck.version} - test - - - com.google.code.gson - gson - ${gson.version} - - - - - src/main/scala - src/test/scala - - - net.alchim31.maven - scala-maven-plugin - ${maven-scala-plugin.version} - - - - compile - testCompile - doc-jar - - - - - ${scala.version} - - -deprecation - -explaintypes - -nobootcp - - - - - org.apache.maven.plugins - maven-eclipse-plugin - 2.10 - - true - - ch.epfl.lamp.sdt.core.scalabuilder - - - ch.epfl.lamp.sdt.core.scalanature - - - org.eclipse.jdt.launching.JRE_CONTAINER - - ch.epfl.lamp.sdt.launching.SCALA_CONTAINER - - - - - - org.antipathy - mvn-scalafmt - 0.7_${scalafmt.version} - - ${project.basedir}/.scalafmt.conf - - - - validate - - format - - - - - - org.apache.maven.plugins - maven-surefire-plugin - ${maven-surefire-plugin.version} - - true - - - - org.scalatest - scalatest-maven-plugin - 1.0 - - ${project.build.directory}/surefire-reports - . 
- WDF TestSuite.txt - ${scala.test.skip} - - - - test - - test - - - - - - pl.project13.maven - git-commit-id-plugin - 2.2.2 - - - - revision - - package - - - - true - ${project.build.outputDirectory}/git.properties - - - true - - - - - - - - - - org.scala-tools - maven-scala-plugin - - ${scala.version} - - - - - - - - nd4j-tests-cpu - - - org.nd4j - nd4j-native - ${project.version} - test - - - - - nd4j-tests-cuda - - - org.nd4j - nd4j-cuda-11.0 - ${project.version} - test - - - - - diff --git a/contrib/attic/scalnet/project/assembly.sbt b/contrib/attic/scalnet/project/assembly.sbt deleted file mode 100644 index 652a3b93b..000000000 --- a/contrib/attic/scalnet/project/assembly.sbt +++ /dev/null @@ -1 +0,0 @@ -addSbtPlugin("com.eed3si9n" % "sbt-assembly" % "0.14.6") diff --git a/contrib/attic/scalnet/project/build.properties b/contrib/attic/scalnet/project/build.properties deleted file mode 100644 index bcf691bb1..000000000 --- a/contrib/attic/scalnet/project/build.properties +++ /dev/null @@ -1,21 +0,0 @@ -# -# /* ****************************************************************************** -# * -# * -# * This program and the accompanying materials are made available under the -# * terms of the Apache License, Version 2.0 which is available at -# * https://www.apache.org/licenses/LICENSE-2.0. -# * -# * See the NOTICE file distributed with this work for additional -# * information regarding copyright ownership. -# * Unless required by applicable law or agreed to in writing, software -# * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT -# * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the -# * License for the specific language governing permissions and limitations -# * under the License. 
-# * -# * SPDX-License-Identifier: Apache-2.0 -# ******************************************************************************/ -# - -sbt.version=1.1.1 diff --git a/contrib/attic/scalnet/project/plugins.sbt b/contrib/attic/scalnet/project/plugins.sbt deleted file mode 100644 index bc4756a6c..000000000 --- a/contrib/attic/scalnet/project/plugins.sbt +++ /dev/null @@ -1,5 +0,0 @@ -resolvers += Classpaths.sbtPluginReleases - -addSbtPlugin("com.lucidchart" % "sbt-scalafmt" % "1.15") - -addSbtPlugin("de.heikoseeberger" % "sbt-header" % "5.0.0") diff --git a/contrib/attic/scalnet/sbt-pom.xml b/contrib/attic/scalnet/sbt-pom.xml deleted file mode 100644 index e12d1a29c..000000000 --- a/contrib/attic/scalnet/sbt-pom.xml +++ /dev/null @@ -1,89 +0,0 @@ - - - - 4.0.0 - org.deeplearning4j - nd4j-native-dependencies - 1.0.0-SNAPSHOT - - Minimal POM to install nd4j-native dependencies - - - 1.0.0-SNAPSHOT - - - - - sonatype-nexus-snapshots - Sonatype Nexus Snapshots - https://oss.sonatype.org/content/repositories/snapshots - - false - - - true - - - - - - - org.nd4j - nd4j-native - ${nd4j.version} - - - - - - - maven-dependency-plugin - 3.0.2 - - - install - - copy-dependencies - - - ${project.basedir}/lib - compile - false - false - true - - - - - - org.apache.maven.plugins - maven-surefire-plugin - 2.7 - - true - - - - - - - diff --git a/contrib/attic/scalnet/src/main/resources/logback.xml b/contrib/attic/scalnet/src/main/resources/logback.xml deleted file mode 100644 index 36b0cc70f..000000000 --- a/contrib/attic/scalnet/src/main/resources/logback.xml +++ /dev/null @@ -1,31 +0,0 @@ - - - - - - %d{HH:mm:ss.SSS} [%thread] %-5level %logger{36} - %msg%n - - - - - - - diff --git a/contrib/attic/scalnet/src/main/scala/org/deeplearning4j/scalnet/layers/advanced/activations/ELU.scala b/contrib/attic/scalnet/src/main/scala/org/deeplearning4j/scalnet/layers/advanced/activations/ELU.scala deleted file mode 100644 index d93a3dd17..000000000 --- 
a/contrib/attic/scalnet/src/main/scala/org/deeplearning4j/scalnet/layers/advanced/activations/ELU.scala +++ /dev/null @@ -1,48 +0,0 @@ -/* - * ****************************************************************************** - * * Copyright (c) 2021 Deeplearning4j Contributors - * * - * * This program and the accompanying materials are made available under the - * * terms of the Apache License, Version 2.0 which is available at - * * https://www.apache.org/licenses/LICENSE-2.0. - * * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * * License for the specific language governing permissions and limitations - * * under the License. - * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ -package org.deeplearning4j.scalnet.layers.advanced.activations - -import org.deeplearning4j.nn.conf.layers.{ ActivationLayer => JActivationLayer } -import org.deeplearning4j.scalnet.layers.core.Layer -import org.nd4j.linalg.activations.impl.{ ActivationELU } - -/** - * ELU layer - * - * @author Max Pumperla - */ -class ELU(alpha: Double, nOut: Option[List[Int]], nIn: Option[List[Int]], override val name: String = "") - extends Layer { - - override def compile: org.deeplearning4j.nn.conf.layers.Layer = - new JActivationLayer.Builder() - .activation(new ActivationELU(alpha)) - .name(name) - .build() - - override val outputShape: List[Int] = nOut.getOrElse(List(0)) - override val inputShape: List[Int] = nIn.getOrElse(List(0)) - - override def reshapeInput(newIn: List[Int]): ELU = - new ELU(alpha, Some(newIn), Some(newIn), name) -} - -object ELU { - def apply(alpha: Double, nOut: Int = 0, nIn: Int = 0, name: String = ""): ELU = - new ELU(alpha, Some(List(nOut)), Some(List(nIn)), name) -} diff --git 
a/contrib/attic/scalnet/src/main/scala/org/deeplearning4j/scalnet/layers/advanced/activations/LeakyReLU.scala b/contrib/attic/scalnet/src/main/scala/org/deeplearning4j/scalnet/layers/advanced/activations/LeakyReLU.scala deleted file mode 100644 index f39d58ebb..000000000 --- a/contrib/attic/scalnet/src/main/scala/org/deeplearning4j/scalnet/layers/advanced/activations/LeakyReLU.scala +++ /dev/null @@ -1,48 +0,0 @@ -/* - * ****************************************************************************** - * * Copyright (c) 2021 Deeplearning4j Contributors - * * - * * This program and the accompanying materials are made available under the - * * terms of the Apache License, Version 2.0 which is available at - * * https://www.apache.org/licenses/LICENSE-2.0. - * * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * * License for the specific language governing permissions and limitations - * * under the License. 
- * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ -package org.deeplearning4j.scalnet.layers.advanced.activations - -import org.deeplearning4j.nn.conf.layers.{ ActivationLayer => JActivationLayer } -import org.deeplearning4j.scalnet.layers.core.Layer -import org.nd4j.linalg.activations.impl.ActivationLReLU - -/** - * LeakyReLU layer - * - * @author Max Pumperla - */ -class LeakyReLU(alpha: Double, nOut: Option[List[Int]], nIn: Option[List[Int]], override val name: String = "") - extends Layer { - - override def compile: org.deeplearning4j.nn.conf.layers.Layer = - new JActivationLayer.Builder() - .activation(new ActivationLReLU(alpha)) - .name(name) - .build() - - override val outputShape: List[Int] = nOut.getOrElse(List(0)) - override val inputShape: List[Int] = nIn.getOrElse(List(0)) - - override def reshapeInput(newIn: List[Int]): LeakyReLU = - new LeakyReLU(alpha, Some(newIn), Some(newIn), name) -} - -object LeakyReLU { - def apply(alpha: Double, nOut: Int = 0, nIn: Int = 0, name: String = ""): LeakyReLU = - new LeakyReLU(alpha, Some(List(nOut)), Some(List(nIn)), name) -} diff --git a/contrib/attic/scalnet/src/main/scala/org/deeplearning4j/scalnet/layers/advanced/activations/ReLU.scala b/contrib/attic/scalnet/src/main/scala/org/deeplearning4j/scalnet/layers/advanced/activations/ReLU.scala deleted file mode 100644 index d54b9abc3..000000000 --- a/contrib/attic/scalnet/src/main/scala/org/deeplearning4j/scalnet/layers/advanced/activations/ReLU.scala +++ /dev/null @@ -1,47 +0,0 @@ -/* - * ****************************************************************************** - * * Copyright (c) 2021 Deeplearning4j Contributors - * * - * * This program and the accompanying materials are made available under the - * * terms of the Apache License, Version 2.0 which is available at - * * https://www.apache.org/licenses/LICENSE-2.0. 
- * * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * * License for the specific language governing permissions and limitations - * * under the License. - * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ -package org.deeplearning4j.scalnet.layers.advanced.activations - -import org.deeplearning4j.nn.conf.layers.{ ActivationLayer => JActivationLayer } -import org.deeplearning4j.scalnet.layers.core.Layer -import org.nd4j.linalg.activations.impl.ActivationReLU - -/** - * ReLU layer - * - * @author Max Pumperla - */ -class ReLU(nOut: Option[List[Int]], nIn: Option[List[Int]], override val name: String = "") extends Layer { - - override def compile: org.deeplearning4j.nn.conf.layers.Layer = - new JActivationLayer.Builder() - .activation(new ActivationReLU()) - .name(name) - .build() - - override val outputShape: List[Int] = nOut.getOrElse(List(0)) - override val inputShape: List[Int] = nIn.getOrElse(List(0)) - - override def reshapeInput(newIn: List[Int]): ReLU = - new ReLU(Some(newIn), Some(newIn), name) -} - -object ReLU { - def apply(nOut: Int = 0, nIn: Int = 0, name: String = ""): ReLU = - new ReLU(Some(List(nOut)), Some(List(nIn)), name) -} diff --git a/contrib/attic/scalnet/src/main/scala/org/deeplearning4j/scalnet/layers/advanced/activations/Softmax.scala b/contrib/attic/scalnet/src/main/scala/org/deeplearning4j/scalnet/layers/advanced/activations/Softmax.scala deleted file mode 100644 index 9de4954e4..000000000 --- a/contrib/attic/scalnet/src/main/scala/org/deeplearning4j/scalnet/layers/advanced/activations/Softmax.scala +++ /dev/null @@ -1,47 +0,0 @@ -/* - * ****************************************************************************** - * * Copyright (c) 2021 Deeplearning4j Contributors - * * 
- * * This program and the accompanying materials are made available under the - * * terms of the Apache License, Version 2.0 which is available at - * * https://www.apache.org/licenses/LICENSE-2.0. - * * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * * License for the specific language governing permissions and limitations - * * under the License. - * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ -package org.deeplearning4j.scalnet.layers.advanced.activations - -import org.deeplearning4j.nn.conf.layers.{ ActivationLayer => JActivationLayer } -import org.deeplearning4j.scalnet.layers.core.Layer -import org.nd4j.linalg.activations.impl.ActivationSoftmax - -/** - * Softmax layer - * - * @author Max Pumperla - */ -class Softmax(nOut: Option[List[Int]], nIn: Option[List[Int]], override val name: String = "") extends Layer { - - override def compile: org.deeplearning4j.nn.conf.layers.Layer = - new JActivationLayer.Builder() - .activation(new ActivationSoftmax()) - .name(name) - .build() - - override val outputShape: List[Int] = nOut.getOrElse(List(0)) - override val inputShape: List[Int] = nIn.getOrElse(List(0)) - - override def reshapeInput(newIn: List[Int]): Softmax = - new Softmax(Some(newIn), Some(newIn), name) -} - -object Softmax { - def apply(nOut: Int = 0, nIn: Int = 0, name: String = ""): Softmax = - new Softmax(Some(List(nOut)), Some(List(nIn)), name) -} diff --git a/contrib/attic/scalnet/src/main/scala/org/deeplearning4j/scalnet/layers/convolutional/Convolution.scala b/contrib/attic/scalnet/src/main/scala/org/deeplearning4j/scalnet/layers/convolutional/Convolution.scala deleted file mode 100644 index 0fb6d046a..000000000 --- 
a/contrib/attic/scalnet/src/main/scala/org/deeplearning4j/scalnet/layers/convolutional/Convolution.scala +++ /dev/null @@ -1,86 +0,0 @@ -/* - * ****************************************************************************** - * * Copyright (c) 2021 Deeplearning4j Contributors - * * - * * This program and the accompanying materials are made available under the - * * terms of the Apache License, Version 2.0 which is available at - * * https://www.apache.org/licenses/LICENSE-2.0. - * * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * * License for the specific language governing permissions and limitations - * * under the License. - * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ -package org.deeplearning4j.scalnet.layers.convolutional - -import org.deeplearning4j.nn.conf.inputs.InvalidInputTypeException -import org.deeplearning4j.scalnet.layers.core.Node -import org.deeplearning4j.util.ConvolutionUtils - -/** - * Base class for convolutional layers. 
- * - * @author David Kale, Max Pumperla - */ -abstract class Convolution(protected val dimension: Int, - protected val kernelSize: List[Int], - protected val stride: List[Int], - protected val padding: List[Int], - protected val dilation: List[Int], - protected val nChannels: Int = 0, - protected val nIn: Option[List[Int]] = None, - protected val nFilter: Int = 0) - extends Node { - - override def inputShape: List[Int] = nIn.getOrElse(List(nChannels)) - - if (kernelSize.lengthCompare(dimension) != 0 - || kernelSize.lengthCompare(stride.length) != 0 - || kernelSize.lengthCompare(padding.length) != 0 - || kernelSize.lengthCompare(dilation.length) != 0) { - throw new IllegalArgumentException("Kernel, stride, dilation and padding must all have the same length.") - } - - private def validateShapes(dimension: Int, - inShape: List[Int], - kernelSize: List[Int], - stride: List[Int], - padding: List[Int], - dilation: List[Int]): Unit = - for (i <- 0 until dimension) { - if (kernelSize(i) > (inShape(i) + 2 * padding(i))) - throw new InvalidInputTypeException( - s"Invalid input: activations into layer are $inShape but kernel size is $kernelSize with padding $padding" - ) - - if (stride(i) <= 0) - throw new InvalidInputTypeException( - s"Invalid stride: all $stride elements should be greater than 0" - ) - - if (dilation(i) <= 0) - throw new InvalidInputTypeException( - s"Invalid dilation: all $dilation elements should be greater than 0" - ) - } - - override def outputShape: List[Int] = { - val nOutChannels: Int = - if (nFilter > 0) nFilter - else if (inputShape.nonEmpty) inputShape.last - else 0 - if (inputShape.lengthCompare(dimension + 1) == 0) { - validateShapes(dimension, inputShape, kernelSize, stride, padding, dilation) - val effectiveKernel: Array[Int] = ConvolutionUtils.effectiveKernelSize(kernelSize.toArray, dilation.toArray) - - // TODO: border modes - List[List[Int]](inputShape.init, effectiveKernel.toList, padding, stride, dilation).transpose - .map(x => (x.head - x(1) + 2 * 
x(2)) / x(3) + 1) :+ nOutChannels - } else if (nOutChannels > 0) List(nOutChannels) - else List() - } -} diff --git a/contrib/attic/scalnet/src/main/scala/org/deeplearning4j/scalnet/layers/convolutional/Convolution1D.scala b/contrib/attic/scalnet/src/main/scala/org/deeplearning4j/scalnet/layers/convolutional/Convolution1D.scala deleted file mode 100644 index 973be8486..000000000 --- a/contrib/attic/scalnet/src/main/scala/org/deeplearning4j/scalnet/layers/convolutional/Convolution1D.scala +++ /dev/null @@ -1,100 +0,0 @@ -/* - * ****************************************************************************** - * * Copyright (c) 2021 Deeplearning4j Contributors - * * - * * This program and the accompanying materials are made available under the - * * terms of the Apache License, Version 2.0 which is available at - * * https://www.apache.org/licenses/LICENSE-2.0. - * * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * * License for the specific language governing permissions and limitations - * * under the License. - * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ -package org.deeplearning4j.scalnet.layers.convolutional - -import org.deeplearning4j.nn.conf.layers.{ Convolution1DLayer } -import org.deeplearning4j.nn.weights.WeightInit -import org.deeplearning4j.scalnet.layers.core.Layer -import org.deeplearning4j.scalnet.regularizers.{ NoRegularizer, WeightRegularizer } -import org.nd4j.linalg.activations.Activation - -/** - * 1D convolution for structured image-like inputs. Input should have - * two dimensions: height and number of channels. Convolution is over height only. 
- * - * @author Max Pumperla - */ -class Convolution1D(nFilter: Int, - kernelSize: List[Int], - nChannels: Int = 0, - stride: List[Int] = List(1), - padding: List[Int] = List(0), - dilation: List[Int] = List(1), - nIn: Option[List[Int]] = None, - val weightInit: WeightInit = WeightInit.XAVIER_UNIFORM, - val activation: Activation = Activation.IDENTITY, - val regularizer: WeightRegularizer = NoRegularizer(), - val dropOut: Double = 0.0, - override val name: String = "") - extends Convolution(dimension = 1, kernelSize, stride, padding, dilation, nChannels, nIn, nFilter) - with Layer { - - override def reshapeInput(nIn: List[Int]): Convolution1D = - new Convolution1D(nFilter, - kernelSize, - nChannels, - stride, - padding, - dilation, - Some(nIn), - weightInit, - activation, - regularizer, - dropOut, - name) - - override def compile: org.deeplearning4j.nn.conf.layers.Layer = - new Convolution1DLayer.Builder(kernelSize.head, kernelSize.last) - .nIn(inputShape.last) - .nOut(outputShape.last) - .stride(stride.head) - .padding(padding.head) - .dilation(dilation.head) - .weightInit(weightInit) - .activation(activation) - .l1(regularizer.l1) - .l2(regularizer.l2) - .dropOut(dropOut) - .name(name) - .build() -} - -object Convolution1D { - def apply(nFilter: Int, - kernelSize: List[Int], - nChannels: Int = 0, - stride: List[Int] = List(1), - padding: List[Int] = List(0), - dilation: List[Int] = List(1), - nIn: Option[List[Int]] = None, - weightInit: WeightInit = WeightInit.XAVIER_UNIFORM, - activation: Activation = Activation.IDENTITY, - regularizer: WeightRegularizer = NoRegularizer(), - dropOut: Double = 0.0): Convolution1D = - new Convolution1D(nFilter, - kernelSize, - nChannels, - stride, - padding, - dilation, - nIn, - weightInit, - activation, - regularizer, - dropOut) -} diff --git a/contrib/attic/scalnet/src/main/scala/org/deeplearning4j/scalnet/layers/convolutional/Convolution2D.scala 
b/contrib/attic/scalnet/src/main/scala/org/deeplearning4j/scalnet/layers/convolutional/Convolution2D.scala deleted file mode 100644 index b787a0152..000000000 --- a/contrib/attic/scalnet/src/main/scala/org/deeplearning4j/scalnet/layers/convolutional/Convolution2D.scala +++ /dev/null @@ -1,101 +0,0 @@ -/* - * ****************************************************************************** - * * Copyright (c) 2021 Deeplearning4j Contributors - * * - * * This program and the accompanying materials are made available under the - * * terms of the Apache License, Version 2.0 which is available at - * * https://www.apache.org/licenses/LICENSE-2.0. - * * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * * License for the specific language governing permissions and limitations - * * under the License. - * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ -package org.deeplearning4j.scalnet.layers.convolutional - -import org.deeplearning4j.nn.conf.layers.ConvolutionLayer -import org.deeplearning4j.nn.weights.WeightInit -import org.deeplearning4j.scalnet.layers.core.Layer -import org.deeplearning4j.scalnet.regularizers.{ NoRegularizer, WeightRegularizer } -import org.nd4j.linalg.activations.Activation - -/** - * 2D convolution for structured image-like inputs. Input should have - * three dimensions: height (number of rows), width (number of columns), - * and number of channels. Convolution is over height and width. 
- * - * @author David Kale, Max Pumperla - */ -class Convolution2D(nFilter: Int, - kernelSize: List[Int], - nChannels: Int = 0, - stride: List[Int] = List(1, 1), - padding: List[Int] = List(0, 0), - dilation: List[Int] = List(1, 1), - nIn: Option[List[Int]] = None, - val weightInit: WeightInit = WeightInit.XAVIER_UNIFORM, - val activation: Activation = Activation.IDENTITY, - val regularizer: WeightRegularizer = NoRegularizer(), - val dropOut: Double = 0.0, - override val name: String = "") - extends Convolution(dimension = 2, kernelSize, stride, padding, dilation, nChannels, nIn, nFilter) - with Layer { - - override def reshapeInput(nIn: List[Int]): Convolution2D = - new Convolution2D(nFilter, - kernelSize, - nChannels, - stride, - padding, - dilation, - Some(nIn), - weightInit, - activation, - regularizer, - dropOut, - name) - - override def compile: org.deeplearning4j.nn.conf.layers.Layer = - new ConvolutionLayer.Builder(kernelSize.head, kernelSize.last) - .nIn(inputShape.last) - .nOut(outputShape.last) - .stride(stride.head, stride.last) - .padding(padding.head, padding.last) - .dilation(dilation.head, dilation.last) - .weightInit(weightInit) - .activation(activation) - .l1(regularizer.l1) - .l2(regularizer.l2) - .dropOut(dropOut) - .name(name) - .build() -} - -object Convolution2D { - def apply(nFilter: Int, - kernelSize: List[Int], - nChannels: Int = 0, - stride: List[Int] = List(1, 1), - padding: List[Int] = List(0, 0), - dilation: List[Int] = List(1, 1), - nIn: Option[List[Int]] = None, - weightInit: WeightInit = WeightInit.XAVIER_UNIFORM, - activation: Activation = Activation.IDENTITY, - regularizer: WeightRegularizer = NoRegularizer(), - dropOut: Double = 0.0): Convolution2D = - new Convolution2D(nFilter, - kernelSize, - nChannels, - stride, - padding, - dilation, - nIn, - weightInit, - activation, - regularizer, - dropOut) -} diff --git a/contrib/attic/scalnet/src/main/scala/org/deeplearning4j/scalnet/layers/convolutional/Convolution3D.scala 
b/contrib/attic/scalnet/src/main/scala/org/deeplearning4j/scalnet/layers/convolutional/Convolution3D.scala deleted file mode 100644 index 79dbc42d2..000000000 --- a/contrib/attic/scalnet/src/main/scala/org/deeplearning4j/scalnet/layers/convolutional/Convolution3D.scala +++ /dev/null @@ -1,104 +0,0 @@ -/* - * ****************************************************************************** - * * Copyright (c) 2021 Deeplearning4j Contributors - * * - * * This program and the accompanying materials are made available under the - * * terms of the Apache License, Version 2.0 which is available at - * * https://www.apache.org/licenses/LICENSE-2.0. - * * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * * License for the specific language governing permissions and limitations - * * under the License. - * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ -package org.deeplearning4j.scalnet.layers.convolutional - -import org.deeplearning4j.nn.conf.layers.{ Convolution3D => JConvolution3D } -import org.deeplearning4j.nn.conf.layers.Convolution3D.DataFormat -import org.deeplearning4j.nn.weights.WeightInit -import org.deeplearning4j.scalnet.layers.core.Layer -import org.deeplearning4j.scalnet.regularizers.{ NoRegularizer, WeightRegularizer } -import org.nd4j.linalg.activations.Activation - -/** - * 3D convolution for structured image-like inputs. Input should have - * four dimensions: depth, height, width - * and number of channels. Convolution is over depth, height and width. - * For simplicity we assume NDHWC data format, i.e. channels last. 
- * - * @author Max Pumperla - */ -class Convolution3D(nFilter: Int, - kernelSize: List[Int], - nChannels: Int = 0, - stride: List[Int] = List(1, 1, 1), - padding: List[Int] = List(0, 0, 0), - dilation: List[Int] = List(1, 1, 1), - nIn: Option[List[Int]] = None, - val weightInit: WeightInit = WeightInit.XAVIER_UNIFORM, - val activation: Activation = Activation.IDENTITY, - val regularizer: WeightRegularizer = NoRegularizer(), - val dropOut: Double = 0.0, - override val name: String = "") - extends Convolution(dimension = 3, kernelSize, stride, padding, dilation, nChannels, nIn, nFilter) - with Layer { - - override def reshapeInput(nIn: List[Int]): Convolution3D = - new Convolution3D(nFilter, - kernelSize, - nChannels, - stride, - padding, - dilation, - Some(nIn), - weightInit, - activation, - regularizer, - dropOut, - name) - - override def compile: org.deeplearning4j.nn.conf.layers.Layer = - new JConvolution3D.Builder(kernelSize.head, kernelSize(1), kernelSize(2)) - .nIn(inputShape.last) - .nOut(outputShape.last) - .dataFormat(DataFormat.NDHWC) - .stride(stride.head, stride(1), stride(2)) - .padding(padding.head, padding(1), padding(2)) - .dilation(dilation.head, dilation(1), dilation(2)) - .weightInit(weightInit) - .activation(activation) - .l1(regularizer.l1) - .l2(regularizer.l2) - .dropOut(dropOut) - .name(name) - .build() -} - -object Convolution3D { - def apply(nFilter: Int, - kernelSize: List[Int], - nChannels: Int = 0, - stride: List[Int] = List(1, 1, 1), - padding: List[Int] = List(0, 0, 0), - dilation: List[Int] = List(1, 1, 1), - nIn: Option[List[Int]] = None, - weightInit: WeightInit = WeightInit.XAVIER_UNIFORM, - activation: Activation = Activation.IDENTITY, - regularizer: WeightRegularizer = NoRegularizer(), - dropOut: Double = 0.0): Convolution3D = - new Convolution3D(nFilter, - kernelSize, - nChannels, - stride, - padding, - dilation, - nIn, - weightInit, - activation, - regularizer, - dropOut) -} diff --git 
a/contrib/attic/scalnet/src/main/scala/org/deeplearning4j/scalnet/layers/convolutional/Cropping1D.scala b/contrib/attic/scalnet/src/main/scala/org/deeplearning4j/scalnet/layers/convolutional/Cropping1D.scala deleted file mode 100644 index d54f0d486..000000000 --- a/contrib/attic/scalnet/src/main/scala/org/deeplearning4j/scalnet/layers/convolutional/Cropping1D.scala +++ /dev/null @@ -1,56 +0,0 @@ -/* - * ****************************************************************************** - * * Copyright (c) 2021 Deeplearning4j Contributors - * * - * * This program and the accompanying materials are made available under the - * * terms of the Apache License, Version 2.0 which is available at - * * https://www.apache.org/licenses/LICENSE-2.0. - * * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * * License for the specific language governing permissions and limitations - * * under the License. 
- * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ -package org.deeplearning4j.scalnet.layers.convolutional - -import org.deeplearning4j.nn.conf.layers.convolutional.{ Cropping1D => JCropping1D } -import org.deeplearning4j.scalnet.layers.core.{ Layer, Node } - -/** - * 1D cropping layer - * - * @author Max Pumperla - */ -class Cropping1D(cropLeftH: Int, cropRightH: Int, nIn: List[Int], override val name: String = "") - extends Node - with Layer { - - override def inputShape: List[Int] = nIn - - override def outputShape: List[Int] = { - val nOutChannels: Int = - if (inputShape.nonEmpty) inputShape.last - else 0 - if (inputShape.lengthCompare(2) == 0) { - List[Int](inputShape.head - cropLeftH - cropRightH, nOutChannels) - } else if (nOutChannels > 0) List(nOutChannels) - else List() - } - - override def reshapeInput(nIn: List[Int]): Cropping1D = - new Cropping1D(cropLeftH, cropRightH, nIn, name) - - override def compile: org.deeplearning4j.nn.conf.layers.Layer = - new JCropping1D.Builder(cropLeftH, cropRightH) - .name(name) - .build() -} - -object Cropping1D { - def apply(cropLeftH: Int, cropRightH: Int, nIn: List[Int], name: String): Cropping1D = - new Cropping1D(cropLeftH, cropRightH, nIn, name) -} diff --git a/contrib/attic/scalnet/src/main/scala/org/deeplearning4j/scalnet/layers/convolutional/Cropping2D.scala b/contrib/attic/scalnet/src/main/scala/org/deeplearning4j/scalnet/layers/convolutional/Cropping2D.scala deleted file mode 100644 index ba986619a..000000000 --- a/contrib/attic/scalnet/src/main/scala/org/deeplearning4j/scalnet/layers/convolutional/Cropping2D.scala +++ /dev/null @@ -1,66 +0,0 @@ -/* - * ****************************************************************************** - * * Copyright (c) 2021 Deeplearning4j Contributors - * * - * * This program and the accompanying materials are made available under the - * * terms of the Apache License, Version 2.0 which is 
available at - * * https://www.apache.org/licenses/LICENSE-2.0. - * * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * * License for the specific language governing permissions and limitations - * * under the License. - * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ -package org.deeplearning4j.scalnet.layers.convolutional - -import org.deeplearning4j.nn.conf.layers.convolutional.{ Cropping2D => JCropping2D } -import org.deeplearning4j.scalnet.layers.core.{ Layer, Node } - -/** - * 2D cropping layer - * - * @author Max Pumperla - */ -class Cropping2D(cropLeftH: Int, - cropRightH: Int, - cropLeftW: Int, - cropRightW: Int, - nIn: List[Int], - override val name: String = "") - extends Node - with Layer { - - override def inputShape: List[Int] = nIn - - override def outputShape: List[Int] = { - val nOutChannels: Int = - if (inputShape.nonEmpty) inputShape.last - else 0 - if (inputShape.lengthCompare(3) == 0) { - List[Int](inputShape.head - cropLeftH - cropRightH, inputShape(1) - cropLeftW - cropRightW, nOutChannels) - } else if (nOutChannels > 0) List(nOutChannels) - else List() - } - - override def reshapeInput(nIn: List[Int]): Cropping2D = - new Cropping2D(cropLeftH, cropRightH, cropLeftW, cropRightW, nIn, name) - - override def compile: org.deeplearning4j.nn.conf.layers.Layer = - new JCropping2D.Builder(cropLeftH, cropRightH, cropLeftW, cropRightW) - .name(name) - .build() -} - -object Cropping2D { - def apply(cropLeftH: Int, - cropRightH: Int, - cropLeftW: Int, - cropRightW: Int, - nIn: List[Int], - name: String): Cropping2D = - new Cropping2D(cropLeftH, cropRightH, cropLeftW, cropRightW, nIn, name) -} diff --git 
a/contrib/attic/scalnet/src/main/scala/org/deeplearning4j/scalnet/layers/convolutional/Cropping3D.scala b/contrib/attic/scalnet/src/main/scala/org/deeplearning4j/scalnet/layers/convolutional/Cropping3D.scala deleted file mode 100644 index d47e39dc1..000000000 --- a/contrib/attic/scalnet/src/main/scala/org/deeplearning4j/scalnet/layers/convolutional/Cropping3D.scala +++ /dev/null @@ -1,73 +0,0 @@ -/* - * ****************************************************************************** - * * Copyright (c) 2021 Deeplearning4j Contributors - * * - * * This program and the accompanying materials are made available under the - * * terms of the Apache License, Version 2.0 which is available at - * * https://www.apache.org/licenses/LICENSE-2.0. - * * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * * License for the specific language governing permissions and limitations - * * under the License. 
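The cropping layers above all compute `outputShape` the same way: each spatial dimension shrinks by its left and right crop amounts, and the trailing channel count is carried through unchanged. A minimal Python sketch of that arithmetic (illustrative only, not the scalnet API):

```python
def cropped_shape(input_shape, crops):
    """Channels-last output shape after cropping.

    crops: one (left, right) pair per spatial dimension, matching the
    cropLeft*/cropRight* constructor arguments in the Scala classes.
    """
    *spatial, channels = input_shape
    assert len(spatial) == len(crops), "one crop pair per spatial dim"
    return [dim - left - right for dim, (left, right) in zip(spatial, crops)] + [channels]
```

For a 28x28x3 input, cropping 2 from each height edge and 1 from each width edge gives 24x26x3, mirroring the Cropping2D case.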
- * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ -package org.deeplearning4j.scalnet.layers.convolutional - -import org.deeplearning4j.nn.conf.layers.convolutional.{ Cropping3D => JCropping3D } -import org.deeplearning4j.scalnet.layers.core.{ Layer, Node } - -/** - * 3D cropping layer - * - * @author Max Pumperla - */ -class Cropping3D(cropLeftD: Int, - cropRightD: Int, - cropLeftH: Int, - cropRightH: Int, - cropLeftW: Int, - cropRightW: Int, - nIn: List[Int], - override val name: String = "") - extends Node - with Layer { - - override def inputShape: List[Int] = nIn - - override def outputShape: List[Int] = { - val nOutChannels: Int = - if (inputShape.nonEmpty) inputShape.last - else 0 - if (inputShape.lengthCompare(4) == 0) { - List[Int](inputShape.head - cropLeftD - cropRightD, - inputShape(1) - cropLeftH - cropRightH, - inputShape(2) - cropLeftW - cropRightW, - nOutChannels) - } else if (nOutChannels > 0) List(nOutChannels) - else List() - } - - override def reshapeInput(nIn: List[Int]): Cropping3D = - new Cropping3D(cropLeftD, cropRightD, cropLeftH, cropRightH, cropLeftW, cropRightW, nIn, name) - - override def compile: org.deeplearning4j.nn.conf.layers.Layer = - new JCropping3D.Builder(cropLeftD, cropRightD, cropLeftH, cropRightH, cropLeftW, cropRightW) - .name(name) - .build() -} - -object Cropping3D { - def apply(cropLeftD: Int, - cropRightD: Int, - cropLeftH: Int, - cropRightH: Int, - cropLeftW: Int, - cropRightW: Int, - nIn: List[Int], - name: String): Cropping3D = - new Cropping3D(cropLeftD, cropRightD, cropLeftH, cropRightH, cropLeftW, cropRightW, nIn, name) -} diff --git a/contrib/attic/scalnet/src/main/scala/org/deeplearning4j/scalnet/layers/convolutional/Upsampling.scala b/contrib/attic/scalnet/src/main/scala/org/deeplearning4j/scalnet/layers/convolutional/Upsampling.scala deleted file mode 100644 index ecd2e30b5..000000000 --- 
a/contrib/attic/scalnet/src/main/scala/org/deeplearning4j/scalnet/layers/convolutional/Upsampling.scala +++ /dev/null @@ -1,47 +0,0 @@ -/* - * ****************************************************************************** - * * Copyright (c) 2021 Deeplearning4j Contributors - * * - * * This program and the accompanying materials are made available under the - * * terms of the Apache License, Version 2.0 which is available at - * * https://www.apache.org/licenses/LICENSE-2.0. - * * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * * License for the specific language governing permissions and limitations - * * under the License. - * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ -package org.deeplearning4j.scalnet.layers.convolutional - -import org.deeplearning4j.scalnet.layers.core.Node - -/** - * Base upsampling layer - * - * @author Max Pumperla - */ -class Upsampling(protected val dimension: Int, - protected val size: List[Int], - protected val nChannels: Int = 0, - protected val nIn: Option[List[Int]] = None, - override val name: String = "") - extends Node { - - override def inputShape: List[Int] = nIn.getOrElse(List(nChannels)) - - override def outputShape: List[Int] = { - val nOutChannels: Int = - if (inputShape.nonEmpty) inputShape.last - else 0 - if (inputShape.lengthCompare(dimension + 1) == 0) { - List[List[Int]](inputShape.init, size).transpose - .map(x => x.head * x(1)) :+ nOutChannels - } else if (nOutChannels > 0) List(nOutChannels) - else List() - } - -} diff --git a/contrib/attic/scalnet/src/main/scala/org/deeplearning4j/scalnet/layers/convolutional/Upsampling1D.scala b/contrib/attic/scalnet/src/main/scala/org/deeplearning4j/scalnet/layers/convolutional/Upsampling1D.scala deleted 
file mode 100644 index 5db56a9cb..000000000 --- a/contrib/attic/scalnet/src/main/scala/org/deeplearning4j/scalnet/layers/convolutional/Upsampling1D.scala +++ /dev/null @@ -1,48 +0,0 @@ -/* - * ****************************************************************************** - * * Copyright (c) 2021 Deeplearning4j Contributors - * * - * * This program and the accompanying materials are made available under the - * * terms of the Apache License, Version 2.0 which is available at - * * https://www.apache.org/licenses/LICENSE-2.0. - * * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * * License for the specific language governing permissions and limitations - * * under the License. - * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ -package org.deeplearning4j.scalnet.layers.convolutional - -import org.deeplearning4j.nn.conf.layers.{ Upsampling1D => JUpsampling1D } -import org.deeplearning4j.scalnet.layers.core.Layer - -/** - * 1D upsampling layer - * - * @author Max Pumperla - */ -class Upsampling1D(size: List[Int], nChannels: Int = 0, nIn: Option[List[Int]] = None, override val name: String = "") - extends Upsampling(dimension = 1, size, nChannels, nIn, name) - with Layer { - if (size.length != 1) { - throw new IllegalArgumentException("Size must be length 1.") - } - - override def reshapeInput(nIn: List[Int]): Upsampling1D = - new Upsampling1D(size, nChannels, Some(nIn), name) - - override def compile: org.deeplearning4j.nn.conf.layers.Layer = - new JUpsampling1D.Builder() - .size(size.toArray) - .name(name) - .build() -} - -object Upsampling1D { - def apply(size: List[Int], nChannels: Int = 0, nIn: Option[List[Int]] = None): Upsampling1D = - new Upsampling1D(size, nChannels, nIn) -} diff --git 
a/contrib/attic/scalnet/src/main/scala/org/deeplearning4j/scalnet/layers/convolutional/Upsampling2D.scala b/contrib/attic/scalnet/src/main/scala/org/deeplearning4j/scalnet/layers/convolutional/Upsampling2D.scala deleted file mode 100644 index 1d36f4dac..000000000 --- a/contrib/attic/scalnet/src/main/scala/org/deeplearning4j/scalnet/layers/convolutional/Upsampling2D.scala +++ /dev/null @@ -1,48 +0,0 @@ -/* - * ****************************************************************************** - * * Copyright (c) 2021 Deeplearning4j Contributors - * * - * * This program and the accompanying materials are made available under the - * * terms of the Apache License, Version 2.0 which is available at - * * https://www.apache.org/licenses/LICENSE-2.0. - * * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * * License for the specific language governing permissions and limitations - * * under the License. 
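The base Upsampling class above uses a transpose-and-multiply trick: it pairs each spatial input dimension with its scale factor, multiplies them, and re-appends the channel count. The same logic reads more directly as a zip; a Python sketch (illustrative, not the scalnet API):

```python
def upsampled_shape(input_shape, size):
    """Channels-last output shape after upsampling.

    Mirrors the Scala `List(inputShape.init, size).transpose.map(...)`:
    each spatial dim is scaled by the matching factor in `size`.
    """
    *spatial, channels = input_shape
    assert len(spatial) == len(size), "one scale factor per spatial dim"
    return [dim * factor for dim, factor in zip(spatial, size)] + [channels]
```

So a 14x14x8 input with size (2, 2) upsamples to 28x28x8, as Upsampling2D would produce.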
- * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ -package org.deeplearning4j.scalnet.layers.convolutional - -import org.deeplearning4j.nn.conf.layers.{ Upsampling2D => JUpsampling2D } -import org.deeplearning4j.scalnet.layers.core.Layer - -/** - * 2D upsampling layer - * - * @author Max Pumperla - */ -class Upsampling2D(size: List[Int], nChannels: Int = 0, nIn: Option[List[Int]] = None, override val name: String = "") - extends Upsampling(dimension = 2, size, nChannels, nIn, name) - with Layer { - if (size.length != 2) { - throw new IllegalArgumentException("Size must be length 2.") - } - - override def reshapeInput(nIn: List[Int]): Upsampling2D = - new Upsampling2D(size, nChannels, Some(nIn), name) - - override def compile: org.deeplearning4j.nn.conf.layers.Layer = - new JUpsampling2D.Builder() - .size(size.toArray) - .name(name) - .build() -} - -object Upsampling2D { - def apply(size: List[Int], nChannels: Int = 0, nIn: Option[List[Int]] = None): Upsampling2D = - new Upsampling2D(size, nChannels, nIn) -} diff --git a/contrib/attic/scalnet/src/main/scala/org/deeplearning4j/scalnet/layers/convolutional/Upsampling3D.scala b/contrib/attic/scalnet/src/main/scala/org/deeplearning4j/scalnet/layers/convolutional/Upsampling3D.scala deleted file mode 100644 index 35a25b233..000000000 --- a/contrib/attic/scalnet/src/main/scala/org/deeplearning4j/scalnet/layers/convolutional/Upsampling3D.scala +++ /dev/null @@ -1,48 +0,0 @@ -/* - * ****************************************************************************** - * * Copyright (c) 2021 Deeplearning4j Contributors - * * - * * This program and the accompanying materials are made available under the - * * terms of the Apache License, Version 2.0 which is available at - * * https://www.apache.org/licenses/LICENSE-2.0. 
- * * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * * License for the specific language governing permissions and limitations - * * under the License. - * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ -package org.deeplearning4j.scalnet.layers.convolutional - -import org.deeplearning4j.nn.conf.layers.{ Upsampling3D => JUpsampling3D } -import org.deeplearning4j.scalnet.layers.core.Layer - -/** - * 3D upsampling layer - * - * @author Max Pumperla - */ -class Upsampling3D(size: List[Int], nChannels: Int = 0, nIn: Option[List[Int]] = None, override val name: String = "") - extends Upsampling(dimension = 3, size, nChannels, nIn, name) - with Layer { - if (size.length != 3) { - throw new IllegalArgumentException("Size must be length 3.") - } - - override def reshapeInput(nIn: List[Int]): Upsampling3D = - new Upsampling3D(size, nChannels, Some(nIn), name) - - override def compile: org.deeplearning4j.nn.conf.layers.Layer = - new JUpsampling3D.Builder() - .size(size.toArray) - .name(name) - .build() -} - -object Upsampling3D { - def apply(size: List[Int], nChannels: Int = 0, nIn: Option[List[Int]] = None): Upsampling3D = - new Upsampling3D(size, nChannels, nIn) -} diff --git a/contrib/attic/scalnet/src/main/scala/org/deeplearning4j/scalnet/layers/convolutional/ZeroPadding1D.scala b/contrib/attic/scalnet/src/main/scala/org/deeplearning4j/scalnet/layers/convolutional/ZeroPadding1D.scala deleted file mode 100644 index 37fc0777b..000000000 --- a/contrib/attic/scalnet/src/main/scala/org/deeplearning4j/scalnet/layers/convolutional/ZeroPadding1D.scala +++ /dev/null @@ -1,56 +0,0 @@ -/* - * ****************************************************************************** - * * Copyright (c) 2021 Deeplearning4j 
Contributors - * * - * * This program and the accompanying materials are made available under the - * * terms of the Apache License, Version 2.0 which is available at - * * https://www.apache.org/licenses/LICENSE-2.0. - * * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * * License for the specific language governing permissions and limitations - * * under the License. - * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ -package org.deeplearning4j.scalnet.layers.convolutional - -import org.deeplearning4j.nn.conf.layers.ZeroPadding1DLayer -import org.deeplearning4j.scalnet.layers.core.{ Layer, Node } - -/** - * 1D zero padding layer - * - * @author Max Pumperla - */ -class ZeroPadding1D(padLeftH: Int, padRightH: Int, nIn: List[Int], override val name: String = "") - extends Node - with Layer { - - override def inputShape: List[Int] = nIn - - override def outputShape: List[Int] = { - val nOutChannels: Int = - if (inputShape.nonEmpty) inputShape.last - else 0 - if (inputShape.lengthCompare(2) == 0) { - List[Int](inputShape.head + padLeftH + padRightH, nOutChannels) - } else if (nOutChannels > 0) List(nOutChannels) - else List() - } - - override def reshapeInput(nIn: List[Int]): ZeroPadding1D = - new ZeroPadding1D(padLeftH, padRightH, nIn, name) - - override def compile: org.deeplearning4j.nn.conf.layers.Layer = - new ZeroPadding1DLayer.Builder(padLeftH, padRightH) - .name(name) - .build() -} - -object ZeroPadding1D { - def apply(padLeftH: Int, padRightH: Int, nIn: List[Int], name: String): ZeroPadding1D = - new ZeroPadding1D(padLeftH, padRightH, nIn, name) -} diff --git a/contrib/attic/scalnet/src/main/scala/org/deeplearning4j/scalnet/layers/convolutional/ZeroPadding2D.scala 
b/contrib/attic/scalnet/src/main/scala/org/deeplearning4j/scalnet/layers/convolutional/ZeroPadding2D.scala deleted file mode 100644 index 07a06a601..000000000 --- a/contrib/attic/scalnet/src/main/scala/org/deeplearning4j/scalnet/layers/convolutional/ZeroPadding2D.scala +++ /dev/null @@ -1,61 +0,0 @@ -/* - * ****************************************************************************** - * * Copyright (c) 2021 Deeplearning4j Contributors - * * - * * This program and the accompanying materials are made available under the - * * terms of the Apache License, Version 2.0 which is available at - * * https://www.apache.org/licenses/LICENSE-2.0. - * * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * * License for the specific language governing permissions and limitations - * * under the License. - * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ -package org.deeplearning4j.scalnet.layers.convolutional - -import org.deeplearning4j.nn.conf.layers.ZeroPaddingLayer -import org.deeplearning4j.scalnet.layers.core.{ Layer, Node } - -/** - * 2D zero padding layer - * - * @author Max Pumperla - */ -class ZeroPadding2D(padLeftH: Int, - padRightH: Int, - padLeftW: Int, - padRightW: Int, - nIn: List[Int], - override val name: String = "") - extends Node - with Layer { - - override def inputShape: List[Int] = nIn - - override def outputShape: List[Int] = { - val nOutChannels: Int = - if (inputShape.nonEmpty) inputShape.last - else 0 - if (inputShape.lengthCompare(3) == 0) { - List[Int](inputShape.head + padLeftH + padRightH, inputShape(1) + padLeftW + padRightW, nOutChannels) - } else if (nOutChannels > 0) List(nOutChannels) - else List() - } - - override def reshapeInput(nIn: List[Int]): ZeroPadding2D = - new 
ZeroPadding2D(padLeftH, padRightH, padLeftW, padRightW, nIn, name) - - override def compile: org.deeplearning4j.nn.conf.layers.Layer = - new ZeroPaddingLayer.Builder(padLeftH, padRightH, padLeftW, padRightW) - .name(name) - .build() -} - -object ZeroPadding2D { - def apply(padLeftH: Int, padRightH: Int, padLeftW: Int, padRightW: Int, nIn: List[Int], name: String): ZeroPadding2D = - new ZeroPadding2D(padLeftH, padRightH, padLeftW, padRightW, nIn, name) -} diff --git a/contrib/attic/scalnet/src/main/scala/org/deeplearning4j/scalnet/layers/convolutional/ZeroPadding3D.scala b/contrib/attic/scalnet/src/main/scala/org/deeplearning4j/scalnet/layers/convolutional/ZeroPadding3D.scala deleted file mode 100644 index 1c3450020..000000000 --- a/contrib/attic/scalnet/src/main/scala/org/deeplearning4j/scalnet/layers/convolutional/ZeroPadding3D.scala +++ /dev/null @@ -1,73 +0,0 @@ -/* - * ****************************************************************************** - * * Copyright (c) 2021 Deeplearning4j Contributors - * * - * * This program and the accompanying materials are made available under the - * * terms of the Apache License, Version 2.0 which is available at - * * https://www.apache.org/licenses/LICENSE-2.0. - * * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * * License for the specific language governing permissions and limitations - * * under the License. 
- * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ -package org.deeplearning4j.scalnet.layers.convolutional - -import org.deeplearning4j.nn.conf.layers.ZeroPadding3DLayer -import org.deeplearning4j.scalnet.layers.core.{ Layer, Node } - -/** - * 3D zero padding layer - * - * @author Max Pumperla - */ -class ZeroPadding3D(padLeftD: Int, - padRightD: Int, - padLeftH: Int, - padRightH: Int, - padLeftW: Int, - padRightW: Int, - nIn: List[Int], - override val name: String = "") - extends Node - with Layer { - - override def inputShape: List[Int] = nIn - - override def outputShape: List[Int] = { - val nOutChannels: Int = - if (inputShape.nonEmpty) inputShape.last - else 0 - if (inputShape.lengthCompare(4) == 0) { - List[Int](inputShape.head + padLeftD + padRightD, - inputShape(1) + padLeftH + padRightH, - inputShape(2) + padLeftW + padRightW, - nOutChannels) - } else if (nOutChannels > 0) List(nOutChannels) - else List() - } - - override def reshapeInput(nIn: List[Int]): ZeroPadding3D = - new ZeroPadding3D(padLeftD, padRightD, padLeftH, padRightH, padLeftW, padRightW, nIn, name) - - override def compile: org.deeplearning4j.nn.conf.layers.Layer = - new ZeroPadding3DLayer.Builder(padLeftD, padRightD, padLeftH, padRightH, padLeftW, padRightW) - .name(name) - .build() -} - -object ZeroPadding3D { - def apply(padLeftD: Int, - padRightD: Int, - padLeftH: Int, - padRightH: Int, - padLeftW: Int, - padRightW: Int, - nIn: List[Int], - name: String): ZeroPadding3D = - new ZeroPadding3D(padLeftD, padRightD, padLeftH, padRightH, padLeftW, padRightW, nIn, name) -} diff --git a/contrib/attic/scalnet/src/main/scala/org/deeplearning4j/scalnet/layers/core/ActivationLayer.scala b/contrib/attic/scalnet/src/main/scala/org/deeplearning4j/scalnet/layers/core/ActivationLayer.scala deleted file mode 100644 index aaa48d406..000000000 --- 
a/contrib/attic/scalnet/src/main/scala/org/deeplearning4j/scalnet/layers/core/ActivationLayer.scala +++ /dev/null @@ -1,50 +0,0 @@ -/* - * ****************************************************************************** - * * Copyright (c) 2021 Deeplearning4j Contributors - * * - * * This program and the accompanying materials are made available under the - * * terms of the Apache License, Version 2.0 which is available at - * * https://www.apache.org/licenses/LICENSE-2.0. - * * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * * License for the specific language governing permissions and limitations - * * under the License. - * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ -package org.deeplearning4j.scalnet.layers.core - -import org.deeplearning4j.nn.conf.layers.{ ActivationLayer => JActivationLayer } -import org.nd4j.linalg.activations.Activation - -/** - * Activation layer - * - * @author Max Pumperla - */ -class ActivationLayer(activation: Activation, - nOut: Option[List[Int]], - nIn: Option[List[Int]], - override val name: String = "") - extends Layer { - - override def compile: org.deeplearning4j.nn.conf.layers.Layer = - new JActivationLayer.Builder() - .activation(activation) - .name(name) - .build() - - override val outputShape: List[Int] = nOut.getOrElse(List(0)) - override val inputShape: List[Int] = nIn.getOrElse(List(0)) - - override def reshapeInput(newIn: List[Int]): ActivationLayer = - new ActivationLayer(activation, Some(newIn), Some(newIn), name) -} - -object ActivationLayer { - def apply(activation: Activation, nOut: Int = 0, nIn: Int = 0, name: String = ""): ActivationLayer = - new ActivationLayer(activation, Some(List(nOut)), Some(List(nIn)), name) -} diff --git 
a/contrib/attic/scalnet/src/main/scala/org/deeplearning4j/scalnet/layers/core/Dense.scala b/contrib/attic/scalnet/src/main/scala/org/deeplearning4j/scalnet/layers/core/Dense.scala deleted file mode 100644 index 122f6fde9..000000000 --- a/contrib/attic/scalnet/src/main/scala/org/deeplearning4j/scalnet/layers/core/Dense.scala +++ /dev/null @@ -1,92 +0,0 @@ -/* - * ****************************************************************************** - * * Copyright (c) 2021 Deeplearning4j Contributors - * * - * * This program and the accompanying materials are made available under the - * * terms of the Apache License, Version 2.0 which is available at - * * https://www.apache.org/licenses/LICENSE-2.0. - * * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * * License for the specific language governing permissions and limitations - * * under the License. - * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ -package org.deeplearning4j.scalnet.layers.core - -import org.deeplearning4j.nn.conf.layers.{ DenseLayer, OutputLayer => JOutputLayer } -import org.deeplearning4j.nn.weights.WeightInit -import org.deeplearning4j.scalnet.regularizers.{ NoRegularizer, WeightRegularizer } -import org.nd4j.linalg.activations.Activation -import org.nd4j.linalg.lossfunctions.LossFunctions -import org.nd4j.linalg.lossfunctions.LossFunctions.LossFunction - -/** - * Fully connected neural net layer. - * - * @author David Kale - */ -class Dense(nOut: List[Int], - nIn: List[Int], - weightInit: WeightInit, - activation: Activation, - regularizer: WeightRegularizer, - dropOut: Double, - override val name: String, - lossFunction: Option[LossFunction]) - extends OutputLayer { - - // Make this an output layer if lossFunction is defined. 
- override def compile: org.deeplearning4j.nn.conf.layers.Layer = - if (output.isOutput) { - new JOutputLayer.Builder(output.lossFunction) - .nIn(inputShape.last) - .nOut(outputShape.last) - .weightInit(weightInit) - .activation(activation) - .l1(regularizer.l1) - .l2(regularizer.l2) - .dropOut(dropOut) - .name(name) - .build() - } else { - new DenseLayer.Builder() - .nIn(inputShape.last) - .nOut(outputShape.last) - .weightInit(weightInit) - .activation(activation) - .l1(regularizer.l1) - .l2(regularizer.l2) - .dropOut(dropOut) - .name(name) - .build() - } - - override val outputShape: List[Int] = nOut - - override val inputShape: List[Int] = nIn - - override val output: Output = - Output(isOutput = lossFunction.isDefined, lossFunction = lossFunction.orNull) - - override def reshapeInput(newIn: List[Int]): Dense = - new Dense(nOut, newIn, weightInit, activation, regularizer, dropOut, name, lossFunction) - - override def toOutputLayer(lossFunction: LossFunctions.LossFunction): OutputLayer = - new Dense(nOut, nIn, weightInit, activation, regularizer, dropOut, name, Some(lossFunction)) -} - -object Dense { - def apply(nOut: Int, - nIn: Int = 0, - weightInit: WeightInit = WeightInit.XAVIER_UNIFORM, - activation: Activation = Activation.IDENTITY, - regularizer: WeightRegularizer = NoRegularizer(), - dropOut: Double = 0.0, - name: String = "", - lossFunction: Option[LossFunction] = None): Dense = - new Dense(List(nOut), List(nIn), weightInit, activation, regularizer, dropOut, name, lossFunction) -} diff --git a/contrib/attic/scalnet/src/main/scala/org/deeplearning4j/scalnet/layers/core/Dropout.scala b/contrib/attic/scalnet/src/main/scala/org/deeplearning4j/scalnet/layers/core/Dropout.scala deleted file mode 100644 index 3d885cfa5..000000000 --- a/contrib/attic/scalnet/src/main/scala/org/deeplearning4j/scalnet/layers/core/Dropout.scala +++ /dev/null @@ -1,46 +0,0 @@ -/* - * ****************************************************************************** - * * Copyright (c) 
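The Dense layer above dispatches on whether a loss function is defined: with one, `compile` builds a DL4J OutputLayer; without, a plain DenseLayer. A minimal Python sketch of that dispatch, using illustrative names rather than the DL4J builder API:

```python
from typing import Optional


def build_dense(n_in: int, n_out: int, loss: Optional[str] = None) -> dict:
    """Sketch of Dense.compile's branch: a defined loss function turns
    the layer into an output layer (dict keys are illustrative only)."""
    layer = {
        "type": "OutputLayer" if loss is not None else "DenseLayer",
        "nIn": n_in,
        "nOut": n_out,
    }
    if loss is not None:
        layer["loss"] = loss
    return layer
```

This matches the `toOutputLayer` pattern: the same hyperparameters are reused, with only the loss function added to promote the layer.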
2021 Deeplearning4j Contributors
- * *
- * * This program and the accompanying materials are made available under the
- * * terms of the Apache License, Version 2.0 which is available at
- * * https://www.apache.org/licenses/LICENSE-2.0.
- * *
- * * Unless required by applicable law or agreed to in writing, software
- * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
- * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
- * * License for the specific language governing permissions and limitations
- * * under the License.
- * *
- * * SPDX-License-Identifier: Apache-2.0
- * *****************************************************************************
- */
-package org.deeplearning4j.scalnet.layers.core
-import org.deeplearning4j.nn.conf.layers.DropoutLayer
-
-/**
- * Dropout layer
- *
- * @author Max Pumperla
- */
-class Dropout(nOut: List[Int], nIn: List[Int], rate: Double, override val name: String) extends Layer {
-
-  override def compile: org.deeplearning4j.nn.conf.layers.Layer =
-    new DropoutLayer.Builder(rate)
-      .nIn(inputShape.last)
-      .nOut(outputShape.last)
-      .name(name)
-      .build()
-
-  override val outputShape: List[Int] = nOut
-
-  override val inputShape: List[Int] = nIn
-
-  override def reshapeInput(newIn: List[Int]): Dropout =
-    new Dropout(nOut, newIn, rate, name)
-}
-
-object Dropout {
-  def apply(nOut: Int, nIn: Int = 0, rate: Double, name: String = ""): Dropout =
-    new Dropout(List(nOut), List(nIn), rate, name)
-}
diff --git a/contrib/attic/scalnet/src/main/scala/org/deeplearning4j/scalnet/layers/core/Layer.scala b/contrib/attic/scalnet/src/main/scala/org/deeplearning4j/scalnet/layers/core/Layer.scala
deleted file mode 100644
index d0b037cfb..000000000
--- a/contrib/attic/scalnet/src/main/scala/org/deeplearning4j/scalnet/layers/core/Layer.scala
+++ /dev/null
@@ -1,30 +0,0 @@
-/*
- * ******************************************************************************
- * * Copyright (c) 2021 Deeplearning4j Contributors
- * *
- * * This program and the accompanying materials are made available under the
- * * terms of the Apache License, Version 2.0 which is available at
- * * https://www.apache.org/licenses/LICENSE-2.0.
- * *
- * * Unless required by applicable law or agreed to in writing, software
- * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
- * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
- * * License for the specific language governing permissions and limitations
- * * under the License.
- * *
- * * SPDX-License-Identifier: Apache-2.0
- * *****************************************************************************
- */
-package org.deeplearning4j.scalnet.layers.core
-
-import org.deeplearning4j.nn.conf.layers.{ Layer => JLayer }
-
-/**
- * Trait for proper "layer" in DL4J neural networks and computational
- * graphs. Compiles out to DL4J layer.
- *
- * @author David Kale
- */
-trait Layer extends Node {
-  def compile: JLayer
-}
diff --git a/contrib/attic/scalnet/src/main/scala/org/deeplearning4j/scalnet/layers/core/Node.scala b/contrib/attic/scalnet/src/main/scala/org/deeplearning4j/scalnet/layers/core/Node.scala
deleted file mode 100644
index 723c04631..000000000
--- a/contrib/attic/scalnet/src/main/scala/org/deeplearning4j/scalnet/layers/core/Node.scala
+++ /dev/null
@@ -1,38 +0,0 @@
-/*
- * ******************************************************************************
- * * Copyright (c) 2021 Deeplearning4j Contributors
- * *
- * * This program and the accompanying materials are made available under the
- * * terms of the Apache License, Version 2.0 which is available at
- * * https://www.apache.org/licenses/LICENSE-2.0.
- * *
- * * Unless required by applicable law or agreed to in writing, software
- * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
- * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
- * * License for the specific language governing permissions and limitations
- * * under the License.
- * *
- * * SPDX-License-Identifier: Apache-2.0
- * *****************************************************************************
- */
-package org.deeplearning4j.scalnet.layers.core
-
-/**
- * Trait for node in DL4J neural networks and computational graphs.
- * Nodes are assumed to have inputs and outputs with "shapes."
- *
- * @author David Kale
- */
-trait Node {
-
-  def name: String
-
-  def inputShape: List[Int]
-
-  def outputShape: List[Int]
-
-  def reshapeInput(nIn: List[Int]): Node = this
-
-  def describe(): String = "in=" + inputShape + " out=" + outputShape
-
-}
diff --git a/contrib/attic/scalnet/src/main/scala/org/deeplearning4j/scalnet/layers/core/Output.scala b/contrib/attic/scalnet/src/main/scala/org/deeplearning4j/scalnet/layers/core/Output.scala
deleted file mode 100644
index 2f2b33343..000000000
--- a/contrib/attic/scalnet/src/main/scala/org/deeplearning4j/scalnet/layers/core/Output.scala
+++ /dev/null
@@ -1,27 +0,0 @@
-/*
- * ******************************************************************************
- * * Copyright (c) 2021 Deeplearning4j Contributors
- * *
- * * This program and the accompanying materials are made available under the
- * * terms of the Apache License, Version 2.0 which is available at
- * * https://www.apache.org/licenses/LICENSE-2.0.
- * *
- * * Unless required by applicable law or agreed to in writing, software
- * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
- * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
- * * License for the specific language governing permissions and limitations
- * * under the License.
- * *
- * * SPDX-License-Identifier: Apache-2.0
- * *****************************************************************************
- */
-package org.deeplearning4j.scalnet.layers.core
-
-import org.nd4j.linalg.lossfunctions.LossFunctions.LossFunction
-
-/**
- * Trait for output layers in DL4J neural networks and computational graphs.
- *
- * @author David Kale
- */
-case class Output(isOutput: Boolean, lossFunction: LossFunction)
diff --git a/contrib/attic/scalnet/src/main/scala/org/deeplearning4j/scalnet/layers/core/OutputLayer.scala b/contrib/attic/scalnet/src/main/scala/org/deeplearning4j/scalnet/layers/core/OutputLayer.scala
deleted file mode 100644
index 1dec7ecb0..000000000
--- a/contrib/attic/scalnet/src/main/scala/org/deeplearning4j/scalnet/layers/core/OutputLayer.scala
+++ /dev/null
@@ -1,33 +0,0 @@
-/*
- * ******************************************************************************
- * * Copyright (c) 2021 Deeplearning4j Contributors
- * *
- * * This program and the accompanying materials are made available under the
- * * terms of the Apache License, Version 2.0 which is available at
- * * https://www.apache.org/licenses/LICENSE-2.0.
- * *
- * * Unless required by applicable law or agreed to in writing, software
- * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
- * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
- * * License for the specific language governing permissions and limitations
- * * under the License.
- * *
- * * SPDX-License-Identifier: Apache-2.0
- * *****************************************************************************
- */
-package org.deeplearning4j.scalnet.layers.core
-
-import org.deeplearning4j.nn.conf.layers.{ OutputLayer => JOutputLayer }
-import org.nd4j.linalg.lossfunctions.LossFunctions
-
-/**
- * Extension of base layer, used to construct a DL4J OutputLayer after compilation.
- * OutputLayer has an output object and the ability to return an OutputLayer version
- * of itself, by providing a loss function.
- *
- * @author Max Pumperla
- */
-trait OutputLayer extends Layer {
-  def output: Output
-  def toOutputLayer(lossFunction: LossFunctions.LossFunction): OutputLayer
-}
diff --git a/contrib/attic/scalnet/src/main/scala/org/deeplearning4j/scalnet/layers/core/Preprocessor.scala b/contrib/attic/scalnet/src/main/scala/org/deeplearning4j/scalnet/layers/core/Preprocessor.scala
deleted file mode 100644
index 4d8fd670d..000000000
--- a/contrib/attic/scalnet/src/main/scala/org/deeplearning4j/scalnet/layers/core/Preprocessor.scala
+++ /dev/null
@@ -1,30 +0,0 @@
-/*
- * ******************************************************************************
- * * Copyright (c) 2021 Deeplearning4j Contributors
- * *
- * * This program and the accompanying materials are made available under the
- * * terms of the Apache License, Version 2.0 which is available at
- * * https://www.apache.org/licenses/LICENSE-2.0.
- * *
- * * Unless required by applicable law or agreed to in writing, software
- * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
- * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
- * * License for the specific language governing permissions and limitations
- * * under the License.
- * *
- * * SPDX-License-Identifier: Apache-2.0
- * *****************************************************************************
- */
-package org.deeplearning4j.scalnet.layers.core
-
-import org.deeplearning4j.nn.conf.InputPreProcessor
-
-/**
- * Trait for preprocessing layers in DL4J neural networks and computational
- * graphs. Compiles out to DL4J InputPreProcessor.
- *
- * @author David Kale
- */
-trait Preprocessor extends Node {
-  def compile: InputPreProcessor
-}
diff --git a/contrib/attic/scalnet/src/main/scala/org/deeplearning4j/scalnet/layers/core/SpatialDropout.scala b/contrib/attic/scalnet/src/main/scala/org/deeplearning4j/scalnet/layers/core/SpatialDropout.scala
deleted file mode 100644
index 431b1336e..000000000
--- a/contrib/attic/scalnet/src/main/scala/org/deeplearning4j/scalnet/layers/core/SpatialDropout.scala
+++ /dev/null
@@ -1,49 +0,0 @@
-/*
- * ******************************************************************************
- * * Copyright (c) 2021 Deeplearning4j Contributors
- * *
- * * This program and the accompanying materials are made available under the
- * * terms of the Apache License, Version 2.0 which is available at
- * * https://www.apache.org/licenses/LICENSE-2.0.
- * *
- * * Unless required by applicable law or agreed to in writing, software
- * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
- * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
- * * License for the specific language governing permissions and limitations
- * * under the License.
- * *
- * * SPDX-License-Identifier: Apache-2.0
- * *****************************************************************************
- */
-package org.deeplearning4j.scalnet.layers.core
-
-import org.deeplearning4j.nn.conf.dropout.{ SpatialDropout => JSpatialDropout }
-import org.deeplearning4j.nn.conf.layers.DropoutLayer
-
-/**
- * Spatial Dropout layer
- *
- * @author Max Pumperla
- */
-class SpatialDropout(nOut: List[Int], nIn: List[Int], rate: Double, override val name: String) extends Layer {
-
-  override def compile: org.deeplearning4j.nn.conf.layers.Layer =
-    new DropoutLayer.Builder()
-      .dropOut(new JSpatialDropout(rate))
-      .nIn(inputShape.last)
-      .nOut(outputShape.last)
-      .name(name)
-      .build()
-
-  override val outputShape: List[Int] = nOut
-
-  override val inputShape: List[Int] = nIn
-
-  override def reshapeInput(newIn: List[Int]): SpatialDropout =
-    new SpatialDropout(nOut, newIn, rate, name)
-}
-
-object SpatialDropout {
-  def apply(nOut: Int, nIn: Int = 0, rate: Double, name: String = ""): SpatialDropout =
-    new SpatialDropout(List(nOut), List(nIn), rate, name)
-}
diff --git a/contrib/attic/scalnet/src/main/scala/org/deeplearning4j/scalnet/layers/core/WrapperLayer.scala b/contrib/attic/scalnet/src/main/scala/org/deeplearning4j/scalnet/layers/core/WrapperLayer.scala
deleted file mode 100644
index 77d316176..000000000
--- a/contrib/attic/scalnet/src/main/scala/org/deeplearning4j/scalnet/layers/core/WrapperLayer.scala
+++ /dev/null
@@ -1,28 +0,0 @@
-/*
- * ******************************************************************************
- * * Copyright (c) 2021 Deeplearning4j Contributors
- * *
- * * This program and the accompanying materials are made available under the
- * * terms of the Apache License, Version 2.0 which is available at
- * * https://www.apache.org/licenses/LICENSE-2.0.
- * *
- * * Unless required by applicable law or agreed to in writing, software
- * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
- * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
- * * License for the specific language governing permissions and limitations
- * * under the License.
- * *
- * * SPDX-License-Identifier: Apache-2.0
- * *****************************************************************************
- */
-package org.deeplearning4j.scalnet.layers.core
-
-trait WrapperLayer extends Layer {
-
-  def underlying: Layer
-
-  override def inputShape: List[Int] = underlying.inputShape
-
-  override def outputShape: List[Int] = underlying.outputShape
-
-}
diff --git a/contrib/attic/scalnet/src/main/scala/org/deeplearning4j/scalnet/layers/embeddings/EmbeddingLayer.scala b/contrib/attic/scalnet/src/main/scala/org/deeplearning4j/scalnet/layers/embeddings/EmbeddingLayer.scala
deleted file mode 100644
index 64588d8a3..000000000
--- a/contrib/attic/scalnet/src/main/scala/org/deeplearning4j/scalnet/layers/embeddings/EmbeddingLayer.scala
+++ /dev/null
@@ -1,67 +0,0 @@
-/*
- * ******************************************************************************
- * * Copyright (c) 2021 Deeplearning4j Contributors
- * *
- * * This program and the accompanying materials are made available under the
- * * terms of the Apache License, Version 2.0 which is available at
- * * https://www.apache.org/licenses/LICENSE-2.0.
- * *
- * * Unless required by applicable law or agreed to in writing, software
- * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
- * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
- * * License for the specific language governing permissions and limitations
- * * under the License.
- * *
- * * SPDX-License-Identifier: Apache-2.0
- * *****************************************************************************
- */
-package org.deeplearning4j.scalnet.layers.embeddings
-
-import org.deeplearning4j.nn.conf.layers
-import org.deeplearning4j.nn.weights.WeightInit
-import org.deeplearning4j.scalnet.layers.core.Layer
-import org.deeplearning4j.scalnet.regularizers.{ NoRegularizer, WeightRegularizer }
-import org.nd4j.linalg.activations.Activation
-
-class EmbeddingLayer(nIn: Int,
-                     nOut: Int,
-                     activation: Activation,
-                     weightInit: WeightInit,
-                     regularizer: WeightRegularizer,
-                     dropOut: Double = 0.0,
-                     override val name: String = "")
-    extends Layer {
-
-  override def compile: org.deeplearning4j.nn.conf.layers.Layer =
-    new layers.EmbeddingLayer.Builder()
-      .nIn(nIn)
-      .nOut(nOut)
-      .activation(activation)
-      .weightInit(weightInit)
-      .l1(regularizer.l1)
-      .l2(regularizer.l2)
-      .dropOut(dropOut)
-      .name(name)
-      .build()
-
-  override def inputShape: List[Int] = List(nIn, nOut)
-
-  override def outputShape: List[Int] = List(nOut, nIn)
-}
-
-object EmbeddingLayer {
-  def apply(nIn: Int,
-            nOut: Int,
-            activation: Activation = Activation.IDENTITY,
-            weightInit: WeightInit = WeightInit.XAVIER,
-            regularizer: WeightRegularizer = NoRegularizer(),
-            dropOut: Double = 0.0): EmbeddingLayer =
-    new EmbeddingLayer(
-      nIn,
-      nOut,
-      activation,
-      weightInit,
-      regularizer,
-      dropOut
-    )
-}
diff --git a/contrib/attic/scalnet/src/main/scala/org/deeplearning4j/scalnet/layers/noise/AlphaDropout.scala b/contrib/attic/scalnet/src/main/scala/org/deeplearning4j/scalnet/layers/noise/AlphaDropout.scala
deleted file mode 100644
index 8931b9751..000000000
--- a/contrib/attic/scalnet/src/main/scala/org/deeplearning4j/scalnet/layers/noise/AlphaDropout.scala
+++ /dev/null
@@ -1,50 +0,0 @@
-/*
- * ******************************************************************************
- * * Copyright (c) 2021 Deeplearning4j Contributors
- * *
- * * This program and the accompanying materials are made available under the
- * * terms of the Apache License, Version 2.0 which is available at
- * * https://www.apache.org/licenses/LICENSE-2.0.
- * *
- * * Unless required by applicable law or agreed to in writing, software
- * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
- * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
- * * License for the specific language governing permissions and limitations
- * * under the License.
- * *
- * * SPDX-License-Identifier: Apache-2.0
- * *****************************************************************************
- */
-package org.deeplearning4j.scalnet.layers.noise
-
-import org.deeplearning4j.nn.conf.dropout.{ AlphaDropout => JAlphaDropout }
-import org.deeplearning4j.nn.conf.layers.DropoutLayer
-import org.deeplearning4j.scalnet.layers.core.Layer
-
-/**
- * AlphaDropout layer
- *
- * @author Max Pumperla
- */
-class AlphaDropout(nOut: List[Int], nIn: List[Int], rate: Double, override val name: String) extends Layer {
-
-  override def compile: org.deeplearning4j.nn.conf.layers.Layer =
-    new DropoutLayer.Builder()
-      .dropOut(new JAlphaDropout(rate))
-      .nIn(inputShape.last)
-      .nOut(outputShape.last)
-      .name(name)
-      .build()
-
-  override val outputShape: List[Int] = nOut
-
-  override val inputShape: List[Int] = nIn
-
-  override def reshapeInput(newIn: List[Int]): AlphaDropout =
-    new AlphaDropout(nOut, newIn, rate, name)
-}
-
-object AlphaDropout {
-  def apply(nOut: Int, nIn: Int = 0, rate: Double, name: String = ""): AlphaDropout =
-    new AlphaDropout(List(nOut), List(nIn), rate, name)
-}
diff --git a/contrib/attic/scalnet/src/main/scala/org/deeplearning4j/scalnet/layers/noise/GaussianDropout.scala b/contrib/attic/scalnet/src/main/scala/org/deeplearning4j/scalnet/layers/noise/GaussianDropout.scala
deleted file mode 100644
index a1faef97f..000000000
--- a/contrib/attic/scalnet/src/main/scala/org/deeplearning4j/scalnet/layers/noise/GaussianDropout.scala
+++ /dev/null
@@ -1,50 +0,0 @@
-/*
- * ******************************************************************************
- * * Copyright (c) 2021 Deeplearning4j Contributors
- * *
- * * This program and the accompanying materials are made available under the
- * * terms of the Apache License, Version 2.0 which is available at
- * * https://www.apache.org/licenses/LICENSE-2.0.
- * *
- * * Unless required by applicable law or agreed to in writing, software
- * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
- * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
- * * License for the specific language governing permissions and limitations
- * * under the License.
- * *
- * * SPDX-License-Identifier: Apache-2.0
- * *****************************************************************************
- */
-package org.deeplearning4j.scalnet.layers.noise
-
-import org.deeplearning4j.nn.conf.dropout.{ GaussianDropout => JGaussianDropout }
-import org.deeplearning4j.nn.conf.layers.DropoutLayer
-import org.deeplearning4j.scalnet.layers.core.Layer
-
-/**
- * GaussianDropout layer
- *
- * @author Max Pumperla
- */
-class GaussianDropout(nOut: List[Int], nIn: List[Int], rate: Double, override val name: String) extends Layer {
-
-  override def compile: org.deeplearning4j.nn.conf.layers.Layer =
-    new DropoutLayer.Builder()
-      .dropOut(new JGaussianDropout(rate))
-      .nIn(inputShape.last)
-      .nOut(outputShape.last)
-      .name(name)
-      .build()
-
-  override val outputShape: List[Int] = nOut
-
-  override val inputShape: List[Int] = nIn
-
-  override def reshapeInput(newIn: List[Int]): GaussianDropout =
-    new GaussianDropout(nOut, newIn, rate, name)
-}
-
-object GaussianDropout {
-  def apply(nOut: Int, nIn: Int = 0, rate: Double, name: String = ""): GaussianDropout =
-    new GaussianDropout(List(nOut), List(nIn), rate, name)
-}
diff --git a/contrib/attic/scalnet/src/main/scala/org/deeplearning4j/scalnet/layers/noise/GaussianNoise.scala b/contrib/attic/scalnet/src/main/scala/org/deeplearning4j/scalnet/layers/noise/GaussianNoise.scala
deleted file mode 100644
index e29dd23ad..000000000
--- a/contrib/attic/scalnet/src/main/scala/org/deeplearning4j/scalnet/layers/noise/GaussianNoise.scala
+++ /dev/null
@@ -1,50 +0,0 @@
-/*
- * ******************************************************************************
- * * Copyright (c) 2021 Deeplearning4j Contributors
- * *
- * * This program and the accompanying materials are made available under the
- * * terms of the Apache License, Version 2.0 which is available at
- * * https://www.apache.org/licenses/LICENSE-2.0.
- * *
- * * Unless required by applicable law or agreed to in writing, software
- * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
- * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
- * * License for the specific language governing permissions and limitations
- * * under the License.
- * *
- * * SPDX-License-Identifier: Apache-2.0
- * *****************************************************************************
- */
-package org.deeplearning4j.scalnet.layers.noise
-
-import org.deeplearning4j.nn.conf.dropout.{ GaussianNoise => JGaussianNoise }
-import org.deeplearning4j.nn.conf.layers.DropoutLayer
-import org.deeplearning4j.scalnet.layers.core.Layer
-
-/**
- * GaussianNoise layer
- *
- * @author Max Pumperla
- */
-class GaussianNoise(nOut: List[Int], nIn: List[Int], stddev: Double, override val name: String) extends Layer {
-
-  override def compile: org.deeplearning4j.nn.conf.layers.Layer =
-    new DropoutLayer.Builder()
-      .dropOut(new JGaussianNoise(stddev))
-      .nIn(inputShape.last)
-      .nOut(outputShape.last)
-      .name(name)
-      .build()
-
-  override val outputShape: List[Int] = nOut
-
-  override val inputShape: List[Int] = nIn
-
-  override def reshapeInput(newIn: List[Int]): GaussianNoise =
-    new GaussianNoise(nOut, newIn, stddev, name)
-}
-
-object GaussianNoise {
-  def apply(nOut: Int, nIn: Int = 0, stddev: Double, name: String = ""): GaussianNoise =
-    new GaussianNoise(List(nOut), List(nIn), stddev, name)
-}
diff --git a/contrib/attic/scalnet/src/main/scala/org/deeplearning4j/scalnet/layers/pooling/AvgPooling1D.scala b/contrib/attic/scalnet/src/main/scala/org/deeplearning4j/scalnet/layers/pooling/AvgPooling1D.scala
deleted file mode 100644
index b02f56e66..000000000
--- a/contrib/attic/scalnet/src/main/scala/org/deeplearning4j/scalnet/layers/pooling/AvgPooling1D.scala
+++ /dev/null
@@ -1,60 +0,0 @@
-/*
- * ******************************************************************************
- * * Copyright (c) 2021 Deeplearning4j Contributors
- * *
- * * This program and the accompanying materials are made available under the
- * * terms of the Apache License, Version 2.0 which is available at
- * * https://www.apache.org/licenses/LICENSE-2.0.
- * *
- * * Unless required by applicable law or agreed to in writing, software
- * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
- * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
- * * License for the specific language governing permissions and limitations
- * * under the License.
- * *
- * * SPDX-License-Identifier: Apache-2.0
- * *****************************************************************************
- */
-package org.deeplearning4j.scalnet.layers.pooling
-
-import org.deeplearning4j.nn.conf.layers.{ Subsampling1DLayer, SubsamplingLayer }
-import org.deeplearning4j.scalnet.layers.convolutional.Convolution
-import org.deeplearning4j.scalnet.layers.core.Layer
-
-/**
- * 1D average pooling layer in neural net architectures.
- *
- * @author Max Pumperla
- */
-class AvgPooling1D(kernelSize: List[Int],
-                   stride: List[Int] = List(1),
-                   padding: List[Int] = List(0),
-                   dilation: List[Int] = List(1),
-                   nIn: Option[List[Int]] = None,
-                   override val name: String = "")
-    extends Convolution(dimension = 1, kernelSize, stride, padding, dilation, 0, nIn, 0)
-    with Layer {
-  if (kernelSize.length != 1 || stride.length != 1 || padding.length != 1 || dilation.length != 1) {
-    throw new IllegalArgumentException("Kernel, stride, padding and dilation lists must all be length 1.")
-  }
-
-  override def reshapeInput(nIn: List[Int]): AvgPooling1D =
-    new AvgPooling1D(kernelSize, stride, padding, dilation, Some(nIn), name)
-
-  override def compile: org.deeplearning4j.nn.conf.layers.Layer =
-    new Subsampling1DLayer.Builder(SubsamplingLayer.PoolingType.AVG)
-      .kernelSize(kernelSize.head)
-      .stride(stride.head)
-      .name(name)
-      .build()
-}
-
-object AvgPooling1D {
-  def apply(kernelSize: List[Int],
-            stride: List[Int] = List(1, 1),
-            padding: List[Int] = List(0, 0),
-            dilation: List[Int] = List(1, 1),
-            nIn: Option[List[Int]] = None,
-            name: String = null): AvgPooling1D =
-    new AvgPooling1D(kernelSize, stride, padding, dilation, nIn, name)
-}
diff --git a/contrib/attic/scalnet/src/main/scala/org/deeplearning4j/scalnet/layers/pooling/AvgPooling2D.scala b/contrib/attic/scalnet/src/main/scala/org/deeplearning4j/scalnet/layers/pooling/AvgPooling2D.scala
deleted file mode 100644
index 2913afbbe..000000000
--- a/contrib/attic/scalnet/src/main/scala/org/deeplearning4j/scalnet/layers/pooling/AvgPooling2D.scala
+++ /dev/null
@@ -1,61 +0,0 @@
-/*
- * ******************************************************************************
- * * Copyright (c) 2021 Deeplearning4j Contributors
- * *
- * * This program and the accompanying materials are made available under the
- * * terms of the Apache License, Version 2.0 which is available at
- * * https://www.apache.org/licenses/LICENSE-2.0.
- * *
- * * Unless required by applicable law or agreed to in writing, software
- * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
- * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
- * * License for the specific language governing permissions and limitations
- * * under the License.
- * *
- * * SPDX-License-Identifier: Apache-2.0
- * *****************************************************************************
- */
-package org.deeplearning4j.scalnet.layers.pooling
-
-import org.deeplearning4j.nn.conf.layers.SubsamplingLayer
-import org.deeplearning4j.scalnet.layers.convolutional.Convolution
-import org.deeplearning4j.scalnet.layers.core.Layer
-
-/**
- * 2D average pooling layer in neural net architectures.
- *
- * @author Max Pumperla
- */
-class AvgPooling2D(kernelSize: List[Int],
-                   stride: List[Int] = List(1, 1),
-                   padding: List[Int] = List(0, 0),
-                   dilation: List[Int] = List(1, 1),
-                   nIn: Option[List[Int]] = None,
-                   override val name: String = "")
-    extends Convolution(dimension = 2, kernelSize, stride, padding, dilation, 0, nIn, 0)
-    with Layer {
-  if (kernelSize.length != 2 || stride.length != 2 || padding.length != 2 || dilation.length != 2) {
-    throw new IllegalArgumentException("Kernel, stride, padding and dilation lists must all be length 2.")
-  }
-
-  override def reshapeInput(nIn: List[Int]): AvgPooling2D =
-    new AvgPooling2D(kernelSize, stride, padding, dilation, Some(nIn), name)
-
-  override def compile: org.deeplearning4j.nn.conf.layers.Layer =
-    new SubsamplingLayer.Builder(SubsamplingLayer.PoolingType.AVG)
-      .kernelSize(kernelSize.head, kernelSize.last)
-      .dilation(dilation.head, dilation.last)
-      .stride(stride.head, stride.last)
-      .name(name)
-      .build()
-}
-
-object AvgPooling2D {
-  def apply(kernelSize: List[Int],
-            stride: List[Int] = List(1, 1),
-            padding: List[Int] = List(0, 0),
-            dilation: List[Int] = List(1, 1),
-            nIn: Option[List[Int]] = None,
-            name: String = null): AvgPooling2D =
-    new AvgPooling2D(kernelSize, stride, padding, dilation, nIn, name)
-}
diff --git a/contrib/attic/scalnet/src/main/scala/org/deeplearning4j/scalnet/layers/pooling/AvgPooling3D.scala b/contrib/attic/scalnet/src/main/scala/org/deeplearning4j/scalnet/layers/pooling/AvgPooling3D.scala
deleted file mode 100644
index 8e1e35961..000000000
--- a/contrib/attic/scalnet/src/main/scala/org/deeplearning4j/scalnet/layers/pooling/AvgPooling3D.scala
+++ /dev/null
@@ -1,61 +0,0 @@
-/*
- * ******************************************************************************
- * * Copyright (c) 2021 Deeplearning4j Contributors
- * *
- * * This program and the accompanying materials are made available under the
- * * terms of the Apache License, Version 2.0 which is available at
- * * https://www.apache.org/licenses/LICENSE-2.0.
- * *
- * * Unless required by applicable law or agreed to in writing, software
- * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
- * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
- * * License for the specific language governing permissions and limitations
- * * under the License.
- * *
- * * SPDX-License-Identifier: Apache-2.0
- * *****************************************************************************
- */
-package org.deeplearning4j.scalnet.layers.pooling
-
-import org.deeplearning4j.nn.conf.layers.Subsampling3DLayer
-import org.deeplearning4j.scalnet.layers.convolutional.Convolution
-import org.deeplearning4j.scalnet.layers.core.Layer
-
-/**
- * 3D average pooling layer in neural net architectures.
- *
- * @author Max Pumperla
- */
-class AvgPooling3D(kernelSize: List[Int],
-                   stride: List[Int] = List(1, 1, 1),
-                   padding: List[Int] = List(0, 0, 0),
-                   dilation: List[Int] = List(1, 1, 1),
-                   nIn: Option[List[Int]] = None,
-                   override val name: String = "")
-    extends Convolution(dimension = 3, kernelSize, stride, padding, dilation, 0, nIn, 0)
-    with Layer {
-  if (kernelSize.length != 3 || stride.length != 3 || padding.length != 3 || dilation.length != 3) {
-    throw new IllegalArgumentException("Kernel, stride, padding and dilation lists must all be length 3.")
-  }
-
-  override def reshapeInput(nIn: List[Int]): AvgPooling3D =
-    new AvgPooling3D(kernelSize, stride, padding, dilation, Some(nIn), name)
-
-  override def compile: org.deeplearning4j.nn.conf.layers.Layer =
-    new Subsampling3DLayer.Builder()
-      .poolingType(Subsampling3DLayer.PoolingType.AVG)
-      .kernelSize(kernelSize.head, kernelSize(1), kernelSize(2))
-      .stride(stride.head, stride(1), stride(2))
-      .name(name)
-      .build()
-}
-
-object AvgPooling3D {
-  def apply(kernelSize: List[Int],
-            stride: List[Int] = List(1, 1, 1),
-            padding: List[Int] = List(0, 0, 0),
-            dilation: List[Int] = List(1, 1, 1),
-            nIn: Option[List[Int]] = None,
-            name: String = null): AvgPooling3D =
-    new AvgPooling3D(kernelSize, stride, padding, dilation, nIn, name)
-}
diff --git a/contrib/attic/scalnet/src/main/scala/org/deeplearning4j/scalnet/layers/pooling/GlobalAvgPooling1D.scala b/contrib/attic/scalnet/src/main/scala/org/deeplearning4j/scalnet/layers/pooling/GlobalAvgPooling1D.scala
deleted file mode 100644
index bf75698ae..000000000
--- a/contrib/attic/scalnet/src/main/scala/org/deeplearning4j/scalnet/layers/pooling/GlobalAvgPooling1D.scala
+++ /dev/null
@@ -1,55 +0,0 @@
-/*
- * ******************************************************************************
- * * Copyright (c) 2021 Deeplearning4j Contributors
- * *
- * * This program and the accompanying materials are made available under the
- * * terms of the Apache License, Version 2.0 which is available at
- * * https://www.apache.org/licenses/LICENSE-2.0.
- * *
- * * Unless required by applicable law or agreed to in writing, software
- * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
- * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
- * * License for the specific language governing permissions and limitations
- * * under the License.
- * *
- * * SPDX-License-Identifier: Apache-2.0
- * *****************************************************************************
- */
-package org.deeplearning4j.scalnet.layers.pooling
-
-import org.deeplearning4j.nn.conf.layers.{ GlobalPoolingLayer, PoolingType }
-import org.deeplearning4j.scalnet.layers.core.{ Layer, Node }
-
-/**
- * 1D global avg pooling layer.
- *
- * @author Max Pumperla
- */
-class GlobalAvgPooling1D(nIn: Option[List[Int]] = None, override val name: String = null) extends Node with Layer {
-
-  override def inputShape: List[Int] = nIn.getOrElse(List(0))
-
-  override def outputShape: List[Int] = {
-    val nOutChannels: Int =
-      if (inputShape.nonEmpty) inputShape.last
-      else 0
-    if (inputShape.lengthCompare(2) == 0) {
-      List[Int](inputShape.head, nOutChannels)
-    } else if (nOutChannels > 0) List(nOutChannels)
-    else List()
-  }
-
-  override def reshapeInput(nIn: List[Int]): GlobalAvgPooling1D =
-    new GlobalAvgPooling1D(Some(nIn), name)
-
-  override def compile: org.deeplearning4j.nn.conf.layers.Layer =
-    new GlobalPoolingLayer.Builder()
-      .poolingType(PoolingType.AVG)
-      .name(name)
-      .build()
-}
-
-object GlobalAvgPooling1D {
-  def apply(nIn: Option[List[Int]] = None, name: String = null): GlobalAvgPooling1D =
-    new GlobalAvgPooling1D(nIn, name)
-}
diff --git a/contrib/attic/scalnet/src/main/scala/org/deeplearning4j/scalnet/layers/pooling/GlobalAvgPooling2D.scala b/contrib/attic/scalnet/src/main/scala/org/deeplearning4j/scalnet/layers/pooling/GlobalAvgPooling2D.scala
deleted file mode 100644
index 830e1dbc7..000000000
--- a/contrib/attic/scalnet/src/main/scala/org/deeplearning4j/scalnet/layers/pooling/GlobalAvgPooling2D.scala
+++ /dev/null
@@ -1,55 +0,0 @@
-/*
- * ******************************************************************************
- * * Copyright (c) 2021 Deeplearning4j Contributors
- * *
- * * This program and the accompanying materials are made available under the
- * * terms of the Apache License, Version 2.0 which is available at
- * * https://www.apache.org/licenses/LICENSE-2.0.
- * *
- * * Unless required by applicable law or agreed to in writing, software
- * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
- * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
- * * License for the specific language governing permissions and limitations
- * * under the License.
- * *
- * * SPDX-License-Identifier: Apache-2.0
- * *****************************************************************************
- */
-package org.deeplearning4j.scalnet.layers.pooling
-
-import org.deeplearning4j.nn.conf.layers.{ GlobalPoolingLayer, PoolingType }
-import org.deeplearning4j.scalnet.layers.core.{ Layer, Node }
-
-/**
- * 2D global avg pooling layer.
- *
- * @author Max Pumperla
- */
-class GlobalAvgPooling2D(nIn: Option[List[Int]] = None, override val name: String = null) extends Node with Layer {
-
-  override def inputShape: List[Int] = nIn.getOrElse(List(0))
-
-  override def outputShape: List[Int] = {
-    val nOutChannels: Int =
-      if (inputShape.nonEmpty) inputShape.last
-      else 0
-    if (inputShape.lengthCompare(3) == 0) {
-      List[Int](inputShape.head, nOutChannels)
-    } else if (nOutChannels > 0) List(nOutChannels)
-    else List()
-  }
-
-  override def reshapeInput(nIn: List[Int]): GlobalAvgPooling2D =
-    new GlobalAvgPooling2D(Some(nIn), name)
-
-  override def compile: org.deeplearning4j.nn.conf.layers.Layer =
-    new GlobalPoolingLayer.Builder()
-      .poolingType(PoolingType.AVG)
-      .name(name)
-      .build()
-}
-
-object GlobalAvgPooling2D {
-  def apply(nIn: Option[List[Int]] = None, name: String = null): GlobalAvgPooling2D =
-    new GlobalAvgPooling2D(nIn, name)
-}
diff --git a/contrib/attic/scalnet/src/main/scala/org/deeplearning4j/scalnet/layers/pooling/GlobalAvgPooling3D.scala b/contrib/attic/scalnet/src/main/scala/org/deeplearning4j/scalnet/layers/pooling/GlobalAvgPooling3D.scala
deleted file mode 100644
index ec62b9d62..000000000
--- a/contrib/attic/scalnet/src/main/scala/org/deeplearning4j/scalnet/layers/pooling/GlobalAvgPooling3D.scala
+++ /dev/null
@@ -1,55 +0,0 @@
-/*
- * ******************************************************************************
- * * Copyright (c) 2021 Deeplearning4j Contributors
- * *
- * * This program and the accompanying materials are made available under the
- * * terms of the Apache License, Version 2.0 which is available at
- * * https://www.apache.org/licenses/LICENSE-2.0.
- * *
- * * Unless required by applicable law or agreed to in writing, software
- * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
- * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
- * * License for the specific language governing permissions and limitations
- * * under the License.
- * *
- * * SPDX-License-Identifier: Apache-2.0
- * *****************************************************************************
- */
-package org.deeplearning4j.scalnet.layers.pooling
-
-import org.deeplearning4j.nn.conf.layers.{ GlobalPoolingLayer, PoolingType }
-import org.deeplearning4j.scalnet.layers.core.{ Layer, Node }
-
-/**
- * 3D global avg pooling layer.
- *
- * @author Max Pumperla
- */
-class GlobalAvgPooling3D(nIn: Option[List[Int]] = None, override val name: String = null) extends Node with Layer {
-
-  override def inputShape: List[Int] = nIn.getOrElse(List(0))
-
-  override def outputShape: List[Int] = {
-    val nOutChannels: Int =
-      if (inputShape.nonEmpty) inputShape.last
-      else 0
-    if (inputShape.lengthCompare(4) == 0) {
-      List[Int](inputShape.head, nOutChannels)
-    } else if (nOutChannels > 0) List(nOutChannels)
-    else List()
-  }
-
-  override def reshapeInput(nIn: List[Int]): GlobalAvgPooling3D =
-    new GlobalAvgPooling3D(Some(nIn), name)
-
-  override def compile: org.deeplearning4j.nn.conf.layers.Layer =
-    new GlobalPoolingLayer.Builder()
-      .poolingType(PoolingType.AVG)
-      .name(name)
-      .build()
-}
-
-object GlobalAvgPooling3D {
-  def apply(nIn: Option[List[Int]] = None, name: String = null): GlobalAvgPooling3D =
-    new GlobalAvgPooling3D(nIn, name)
-}
diff --git a/contrib/attic/scalnet/src/main/scala/org/deeplearning4j/scalnet/layers/pooling/GlobalMaxPooling1D.scala b/contrib/attic/scalnet/src/main/scala/org/deeplearning4j/scalnet/layers/pooling/GlobalMaxPooling1D.scala
deleted file mode 100644
index d29aca6f6..000000000
--- a/contrib/attic/scalnet/src/main/scala/org/deeplearning4j/scalnet/layers/pooling/GlobalMaxPooling1D.scala
+++ /dev/null
@@ -1,55 +0,0 @@
-/*
- * ******************************************************************************
- * * Copyright (c) 2021 Deeplearning4j Contributors
- * *
- * * This program and the
accompanying materials are made available under the - * * terms of the Apache License, Version 2.0 which is available at - * * https://www.apache.org/licenses/LICENSE-2.0. - * * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * * License for the specific language governing permissions and limitations - * * under the License. - * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ -package org.deeplearning4j.scalnet.layers.pooling - -import org.deeplearning4j.nn.conf.layers.{ GlobalPoolingLayer, PoolingType } -import org.deeplearning4j.scalnet.layers.core.{ Layer, Node } - -/** - * 1D global max pooling layer. - * - * @author Max Pumperla - */ -class GlobalMaxPooling1D(nIn: Option[List[Int]] = None, override val name: String = null) extends Node with Layer { - - override def inputShape: List[Int] = nIn.getOrElse(List(0)) - - override def outputShape: List[Int] = { - val nOutChannels: Int = - if (inputShape.nonEmpty) inputShape.last - else 0 - if (inputShape.lengthCompare(2) == 0) { - List[Int](inputShape.head, nOutChannels) - } else if (nOutChannels > 0) List(nOutChannels) - else List() - } - - override def reshapeInput(nIn: List[Int]): GlobalMaxPooling1D = - new GlobalMaxPooling1D(Some(nIn), name) - - override def compile: org.deeplearning4j.nn.conf.layers.Layer = - new GlobalPoolingLayer.Builder() - .poolingType(PoolingType.MAX) - .name(name) - .build() -} - -object GlobalMaxPooling1D { - def apply(nIn: Option[List[Int]] = None, name: String = null): GlobalMaxPooling1D = - new GlobalMaxPooling1D(nIn, name) -} diff --git a/contrib/attic/scalnet/src/main/scala/org/deeplearning4j/scalnet/layers/pooling/GlobalMaxPooling2D.scala 
b/contrib/attic/scalnet/src/main/scala/org/deeplearning4j/scalnet/layers/pooling/GlobalMaxPooling2D.scala deleted file mode 100644 index d6e36d0c1..000000000 --- a/contrib/attic/scalnet/src/main/scala/org/deeplearning4j/scalnet/layers/pooling/GlobalMaxPooling2D.scala +++ /dev/null @@ -1,55 +0,0 @@ -/* - * ****************************************************************************** - * * Copyright (c) 2021 Deeplearning4j Contributors - * * - * * This program and the accompanying materials are made available under the - * * terms of the Apache License, Version 2.0 which is available at - * * https://www.apache.org/licenses/LICENSE-2.0. - * * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * * License for the specific language governing permissions and limitations - * * under the License. - * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ -package org.deeplearning4j.scalnet.layers.pooling - -import org.deeplearning4j.nn.conf.layers.{ GlobalPoolingLayer, PoolingType } -import org.deeplearning4j.scalnet.layers.core.{ Layer, Node } - -/** - * 2D global max pooling layer. 
- * - * @author Max Pumperla - */ -class GlobalMaxPooling2D(nIn: Option[List[Int]] = None, override val name: String = null) extends Node with Layer { - - override def inputShape: List[Int] = nIn.getOrElse(List(0)) - - override def outputShape: List[Int] = { - val nOutChannels: Int = - if (inputShape.nonEmpty) inputShape.last - else 0 - if (inputShape.lengthCompare(3) == 0) { - List[Int](inputShape.head, nOutChannels) - } else if (nOutChannels > 0) List(nOutChannels) - else List() - } - - override def reshapeInput(nIn: List[Int]): GlobalMaxPooling2D = - new GlobalMaxPooling2D(Some(nIn), name) - - override def compile: org.deeplearning4j.nn.conf.layers.Layer = - new GlobalPoolingLayer.Builder() - .poolingType(PoolingType.MAX) - .name(name) - .build() -} - -object GlobalMaxPooling2D { - def apply(nIn: Option[List[Int]] = None, name: String = null): GlobalMaxPooling2D = - new GlobalMaxPooling2D(nIn, name) -} diff --git a/contrib/attic/scalnet/src/main/scala/org/deeplearning4j/scalnet/layers/pooling/GlobalMaxPooling3D.scala b/contrib/attic/scalnet/src/main/scala/org/deeplearning4j/scalnet/layers/pooling/GlobalMaxPooling3D.scala deleted file mode 100644 index 570d5672c..000000000 --- a/contrib/attic/scalnet/src/main/scala/org/deeplearning4j/scalnet/layers/pooling/GlobalMaxPooling3D.scala +++ /dev/null @@ -1,55 +0,0 @@ -/* - * ****************************************************************************** - * * Copyright (c) 2021 Deeplearning4j Contributors - * * - * * This program and the accompanying materials are made available under the - * * terms of the Apache License, Version 2.0 which is available at - * * https://www.apache.org/licenses/LICENSE-2.0. - * * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. 
See the - * * License for the specific language governing permissions and limitations - * * under the License. - * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ -package org.deeplearning4j.scalnet.layers.pooling - -import org.deeplearning4j.nn.conf.layers.{ GlobalPoolingLayer, PoolingType } -import org.deeplearning4j.scalnet.layers.core.{ Layer, Node } - -/** - * 3D global max pooling layer. - * - * @author Max Pumperla - */ -class GlobalMaxPooling3D(nIn: Option[List[Int]] = None, override val name: String = null) extends Node with Layer { - - override def inputShape: List[Int] = nIn.getOrElse(List(0)) - - override def outputShape: List[Int] = { - val nOutChannels: Int = - if (inputShape.nonEmpty) inputShape.last - else 0 - if (inputShape.lengthCompare(4) == 0) { - List[Int](inputShape.head, nOutChannels) - } else if (nOutChannels > 0) List(nOutChannels) - else List() - } - - override def reshapeInput(nIn: List[Int]): GlobalMaxPooling3D = - new GlobalMaxPooling3D(Some(nIn), name) - - override def compile: org.deeplearning4j.nn.conf.layers.Layer = - new GlobalPoolingLayer.Builder() - .poolingType(PoolingType.MAX) - .name(name) - .build() -} - -object GlobalMaxPooling3D { - def apply(nIn: Option[List[Int]] = None, name: String = null): GlobalMaxPooling3D = - new GlobalMaxPooling3D(nIn, name) -} diff --git a/contrib/attic/scalnet/src/main/scala/org/deeplearning4j/scalnet/layers/pooling/MaxPooling1D.scala b/contrib/attic/scalnet/src/main/scala/org/deeplearning4j/scalnet/layers/pooling/MaxPooling1D.scala deleted file mode 100644 index 56d756752..000000000 --- a/contrib/attic/scalnet/src/main/scala/org/deeplearning4j/scalnet/layers/pooling/MaxPooling1D.scala +++ /dev/null @@ -1,60 +0,0 @@ -/* - * ****************************************************************************** - * * Copyright (c) 2021 Deeplearning4j Contributors - * * - * * This program and the accompanying materials are 
made available under the - * * terms of the Apache License, Version 2.0 which is available at - * * https://www.apache.org/licenses/LICENSE-2.0. - * * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * * License for the specific language governing permissions and limitations - * * under the License. - * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ -package org.deeplearning4j.scalnet.layers.pooling - -import org.deeplearning4j.nn.conf.layers.{ Subsampling1DLayer, SubsamplingLayer } -import org.deeplearning4j.scalnet.layers.convolutional.Convolution -import org.deeplearning4j.scalnet.layers.core.Layer - -/** - * 1D max pooling layer in neural net architectures. - * - * @author Max Pumperla - */ -class MaxPooling1D(kernelSize: List[Int], - stride: List[Int] = List(1), - padding: List[Int] = List(0), - dilation: List[Int] = List(1), - nIn: Option[List[Int]] = None, - override val name: String = "") - extends Convolution(dimension = 1, kernelSize, stride, padding, dilation, 0, nIn, 0) - with Layer { - if (kernelSize.length != 1 || stride.length != 1 || padding.length != 1 || dilation.length != 1) { - throw new IllegalArgumentException("Kernel, stride, padding and dilation lists must all be length 1.") - } - - override def reshapeInput(nIn: List[Int]): MaxPooling1D = - new MaxPooling1D(kernelSize, stride, padding, dilation, Some(nIn), name) - - override def compile: org.deeplearning4j.nn.conf.layers.Layer = - new Subsampling1DLayer.Builder(SubsamplingLayer.PoolingType.MAX) - .kernelSize(kernelSize.head) - .stride(stride.head) - .name(name) - .build() -} - -object MaxPooling1D { - // Defaults here must be length-1 lists to match the class constructor's length check; - // the former length-2 defaults (copied from MaxPooling2D) made every default invocation throw. - def apply(kernelSize: List[Int], - stride: List[Int] = List(1), - padding: List[Int] = List(0), - dilation: List[Int] = List(1), - nIn: Option[List[Int]] = None, - name: String = null): MaxPooling1D = - new MaxPooling1D(kernelSize, stride, padding, dilation, nIn, name) -} diff --git a/contrib/attic/scalnet/src/main/scala/org/deeplearning4j/scalnet/layers/pooling/MaxPooling2D.scala b/contrib/attic/scalnet/src/main/scala/org/deeplearning4j/scalnet/layers/pooling/MaxPooling2D.scala deleted file mode 100644 index 1a1456356..000000000 --- a/contrib/attic/scalnet/src/main/scala/org/deeplearning4j/scalnet/layers/pooling/MaxPooling2D.scala +++ /dev/null @@ -1,61 +0,0 @@ -/* - * ****************************************************************************** - * * Copyright (c) 2021 Deeplearning4j Contributors - * * - * * This program and the accompanying materials are made available under the - * * terms of the Apache License, Version 2.0 which is available at - * * https://www.apache.org/licenses/LICENSE-2.0. - * * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * * License for the specific language governing permissions and limitations - * * under the License. - * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ -package org.deeplearning4j.scalnet.layers.pooling - -import org.deeplearning4j.nn.conf.layers.SubsamplingLayer -import org.deeplearning4j.scalnet.layers.convolutional.Convolution -import org.deeplearning4j.scalnet.layers.core.Layer - -/** - * 2D max pooling in neural net architectures.
- * - * @author David Kale, Max Pumperla - */ -class MaxPooling2D(kernelSize: List[Int], - stride: List[Int] = List(1, 1), - padding: List[Int] = List(0, 0), - dilation: List[Int] = List(1, 1), - nIn: Option[List[Int]] = None, - override val name: String = null) - extends Convolution(dimension = 2, kernelSize, stride, padding, dilation, 0, nIn, 0) - with Layer { - if (kernelSize.length != 2 || stride.length != 2 || padding.length != 2 || dilation.length != 2) { - throw new IllegalArgumentException("Kernel, stride, padding and dilation lists must all be length 2.") - } - - override def reshapeInput(nIn: List[Int]): MaxPooling2D = - new MaxPooling2D(kernelSize, stride, padding, dilation, Some(nIn), name) - - override def compile: org.deeplearning4j.nn.conf.layers.Layer = - new SubsamplingLayer.Builder(SubsamplingLayer.PoolingType.MAX) - .kernelSize(kernelSize.head, kernelSize.last) - .dilation(dilation.head, dilation.last) - .stride(stride.head, stride.last) - .name(name) - .build() -} - -object MaxPooling2D { - def apply(kernelSize: List[Int], - stride: List[Int] = List(1, 1), - padding: List[Int] = List(0, 0), - dilation: List[Int] = List(1, 1), - nIn: Option[List[Int]] = None, - name: String = null): MaxPooling2D = - new MaxPooling2D(kernelSize, stride, padding, dilation, nIn, name) -} diff --git a/contrib/attic/scalnet/src/main/scala/org/deeplearning4j/scalnet/layers/pooling/MaxPooling3D.scala b/contrib/attic/scalnet/src/main/scala/org/deeplearning4j/scalnet/layers/pooling/MaxPooling3D.scala deleted file mode 100644 index be461c182..000000000 --- a/contrib/attic/scalnet/src/main/scala/org/deeplearning4j/scalnet/layers/pooling/MaxPooling3D.scala +++ /dev/null @@ -1,61 +0,0 @@ -/* - * ****************************************************************************** - * * Copyright (c) 2021 Deeplearning4j Contributors - * * - * * This program and the accompanying materials are made available under the - * * terms of the Apache License, Version 2.0 which is available at 
- * * https://www.apache.org/licenses/LICENSE-2.0. - * * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * * License for the specific language governing permissions and limitations - * * under the License. - * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ -package org.deeplearning4j.scalnet.layers.pooling - -import org.deeplearning4j.nn.conf.layers.Subsampling3DLayer -import org.deeplearning4j.scalnet.layers.convolutional.Convolution -import org.deeplearning4j.scalnet.layers.core.Layer - -/** - * 3D max pooling layer in neural net architectures. - * - * @author Max Pumperla - */ -class MaxPooling3D(kernelSize: List[Int], - stride: List[Int] = List(1, 1, 1), - padding: List[Int] = List(0, 0, 0), - dilation: List[Int] = List(1, 1, 1), - nIn: Option[List[Int]] = None, - override val name: String = "") - extends Convolution(dimension = 3, kernelSize, stride, padding, dilation, 0, nIn, 0) - with Layer { - if (kernelSize.length != 3 || stride.length != 3 || padding.length != 3 || dilation.length != 3) { - throw new IllegalArgumentException("Kernel, stride, padding and dilation lists must all be length 3.") - } - - override def reshapeInput(nIn: List[Int]): MaxPooling3D = - new MaxPooling3D(kernelSize, stride, padding, dilation, Some(nIn), name) - - override def compile: org.deeplearning4j.nn.conf.layers.Layer = - new Subsampling3DLayer.Builder() - .poolingType(Subsampling3DLayer.PoolingType.MAX) - .kernelSize(kernelSize.head, kernelSize(1), kernelSize(2)) - .stride(stride.head, stride(1), stride(2)) - .name(name) - .build() -} - -object MaxPooling3D { - def apply(kernelSize: List[Int], - stride: List[Int] = List(1, 1, 1), - padding: List[Int] = List(0, 0, 0), - dilation: List[Int] = List(1, 1, 1), - nIn: 
Option[List[Int]] = None, - name: String = null): MaxPooling3D = - new MaxPooling3D(kernelSize, stride, padding, dilation, nIn, name) -} diff --git a/contrib/attic/scalnet/src/main/scala/org/deeplearning4j/scalnet/layers/recurrent/Bidirectional.scala b/contrib/attic/scalnet/src/main/scala/org/deeplearning4j/scalnet/layers/recurrent/Bidirectional.scala deleted file mode 100644 index 58e69b67f..000000000 --- a/contrib/attic/scalnet/src/main/scala/org/deeplearning4j/scalnet/layers/recurrent/Bidirectional.scala +++ /dev/null @@ -1,40 +0,0 @@ -/* - * ****************************************************************************** - * * Copyright (c) 2021 Deeplearning4j Contributors - * * - * * This program and the accompanying materials are made available under the - * * terms of the Apache License, Version 2.0 which is available at - * * https://www.apache.org/licenses/LICENSE-2.0. - * * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * * License for the specific language governing permissions and limitations - * * under the License. 
- * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ -package org.deeplearning4j.scalnet.layers.recurrent - -import org.deeplearning4j.nn.conf.layers -import org.deeplearning4j.nn.conf.layers.recurrent.Bidirectional.Mode -import org.deeplearning4j.scalnet.layers.core.{ Layer, WrapperLayer } - -class Bidirectional(layer: Layer, mode: Mode, override val name: String = "") extends WrapperLayer { - - val underlying: Layer = layer - - override def compile: layers.Layer = new layers.recurrent.Bidirectional(mode, underlying.compile) - -} - -object Bidirectional { - - val CONCAT = Mode.CONCAT - val ADD = Mode.ADD - val MUL = Mode.MUL - val AVERAGE = Mode.AVERAGE - - def apply(layer: Layer, mode: Mode = Mode.CONCAT): Bidirectional = new Bidirectional(layer, mode) -} diff --git a/contrib/attic/scalnet/src/main/scala/org/deeplearning4j/scalnet/layers/recurrent/GravesLSTM.scala b/contrib/attic/scalnet/src/main/scala/org/deeplearning4j/scalnet/layers/recurrent/GravesLSTM.scala deleted file mode 100644 index 472b81c37..000000000 --- a/contrib/attic/scalnet/src/main/scala/org/deeplearning4j/scalnet/layers/recurrent/GravesLSTM.scala +++ /dev/null @@ -1,76 +0,0 @@ -/* - * ****************************************************************************** - * * Copyright (c) 2021 Deeplearning4j Contributors - * * - * * This program and the accompanying materials are made available under the - * * terms of the Apache License, Version 2.0 which is available at - * * https://www.apache.org/licenses/LICENSE-2.0. - * * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * * License for the specific language governing permissions and limitations - * * under the License. 
- * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ -package org.deeplearning4j.scalnet.layers.recurrent - -import org.deeplearning4j.nn.conf.layers -import org.deeplearning4j.nn.weights.WeightInit -import org.deeplearning4j.scalnet.layers.core.Layer -import org.deeplearning4j.scalnet.regularizers.{ NoRegularizer, WeightRegularizer } -import org.nd4j.linalg.activations.Activation - -class GravesLSTM(nIn: Int, - nOut: Int, - activation: Activation, - forgetGateBiasInit: Double, - gateActivation: Activation, - weightInit: WeightInit, - regularizer: WeightRegularizer, - dropOut: Double, - override val name: String = "") - extends Layer { - - override def compile: org.deeplearning4j.nn.conf.layers.Layer = - new layers.GravesLSTM.Builder() - .nIn(nIn) - .nOut(nOut) - .activation(activation) - .forgetGateBiasInit(forgetGateBiasInit) - .gateActivationFunction(gateActivation) - .weightInit(weightInit) - .l1(regularizer.l1) - .l2(regularizer.l2) - .dropOut(dropOut) - .name(name) - .build() - - override def inputShape: List[Int] = List(nIn, nOut) - - override def outputShape: List[Int] = List(nOut, nIn) - -} - -object GravesLSTM { - def apply(nIn: Int, - nOut: Int, - activation: Activation = Activation.IDENTITY, - forgetGateBiasInit: Double = 1.0, - gateActivationFn: Activation = Activation.SIGMOID, - weightInit: WeightInit = WeightInit.XAVIER, - regularizer: WeightRegularizer = NoRegularizer(), - dropOut: Double = 0.0): GravesLSTM = - new GravesLSTM( - nIn, - nOut, - activation, - forgetGateBiasInit, - gateActivationFn, - weightInit, - regularizer, - dropOut - ) -} diff --git a/contrib/attic/scalnet/src/main/scala/org/deeplearning4j/scalnet/layers/recurrent/LSTM.scala b/contrib/attic/scalnet/src/main/scala/org/deeplearning4j/scalnet/layers/recurrent/LSTM.scala deleted file mode 100644 index a3c43a8c2..000000000 --- 
a/contrib/attic/scalnet/src/main/scala/org/deeplearning4j/scalnet/layers/recurrent/LSTM.scala +++ /dev/null @@ -1,76 +0,0 @@ -/* - * ****************************************************************************** - * * Copyright (c) 2021 Deeplearning4j Contributors - * * - * * This program and the accompanying materials are made available under the - * * terms of the Apache License, Version 2.0 which is available at - * * https://www.apache.org/licenses/LICENSE-2.0. - * * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * * License for the specific language governing permissions and limitations - * * under the License. - * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ -package org.deeplearning4j.scalnet.layers.recurrent - -import org.deeplearning4j.nn.conf.layers -import org.deeplearning4j.nn.weights.WeightInit -import org.deeplearning4j.scalnet.layers.core.Layer -import org.deeplearning4j.scalnet.regularizers.{ NoRegularizer, WeightRegularizer } -import org.nd4j.linalg.activations.Activation - -class LSTM(nIn: Int, - nOut: Int, - activation: Activation, - forgetGateBiasInit: Double, - gateActivation: Activation, - weightInit: WeightInit, - regularizer: WeightRegularizer, - dropOut: Double, - override val name: String = "") - extends Layer { - - override def compile: org.deeplearning4j.nn.conf.layers.Layer = - new layers.LSTM.Builder() - .nIn(nIn) - .nOut(nOut) - .activation(activation) - .forgetGateBiasInit(forgetGateBiasInit) - .gateActivationFunction(gateActivation) - .weightInit(weightInit) - .l1(regularizer.l1) - .l2(regularizer.l2) - .dropOut(dropOut) - .name(name) - .build() - - override def inputShape: List[Int] = List(nIn, nOut) - - override def outputShape: List[Int] = List(nOut, nIn) - -} 
- -object LSTM { - def apply(nIn: Int, - nOut: Int, - activation: Activation = Activation.IDENTITY, - forgetGateBiasInit: Double = 1.0, - gateActivationFn: Activation = Activation.SIGMOID, - weightInit: WeightInit = WeightInit.XAVIER, - regularizer: WeightRegularizer = NoRegularizer(), - dropOut: Double = 0.0): LSTM = - new LSTM( - nIn, - nOut, - activation, - forgetGateBiasInit, - gateActivationFn, - weightInit, - regularizer, - dropOut - ) -} diff --git a/contrib/attic/scalnet/src/main/scala/org/deeplearning4j/scalnet/layers/recurrent/RnnOutputLayer.scala b/contrib/attic/scalnet/src/main/scala/org/deeplearning4j/scalnet/layers/recurrent/RnnOutputLayer.scala deleted file mode 100644 index 693481e15..000000000 --- a/contrib/attic/scalnet/src/main/scala/org/deeplearning4j/scalnet/layers/recurrent/RnnOutputLayer.scala +++ /dev/null @@ -1,103 +0,0 @@ -/* - * ****************************************************************************** - * * Copyright (c) 2021 Deeplearning4j Contributors - * * - * * This program and the accompanying materials are made available under the - * * terms of the Apache License, Version 2.0 which is available at - * * https://www.apache.org/licenses/LICENSE-2.0. - * * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * * License for the specific language governing permissions and limitations - * * under the License. 
- * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ -package org.deeplearning4j.scalnet.layers.recurrent - -import org.deeplearning4j.nn.conf.layers -import org.deeplearning4j.nn.conf.layers.{ OutputLayer => JOutputLayer } -import org.deeplearning4j.nn.weights.WeightInit -import org.deeplearning4j.scalnet.layers.core.{ Layer, Output, OutputLayer } -import org.deeplearning4j.scalnet.regularizers.{ NoRegularizer, WeightRegularizer } -import org.nd4j.linalg.activations.Activation -import org.nd4j.linalg.lossfunctions.LossFunctions -import org.nd4j.linalg.lossfunctions.LossFunctions.LossFunction - -class RnnOutputLayer(nIn: Int, - nOut: Int, - activation: Activation, - loss: Option[LossFunction], - weightInit: WeightInit, - regularizer: WeightRegularizer, - dropOut: Double, - override val name: String = "") - extends Layer - with OutputLayer { - - override def compile: org.deeplearning4j.nn.conf.layers.Layer = - Option(output.lossFunction) match { - case None => - new layers.RnnOutputLayer.Builder() - .nIn(nIn) - .nOut(nOut) - .activation(activation) - .weightInit(weightInit) - .l1(regularizer.l1) - .l2(regularizer.l2) - .dropOut(dropOut) - .name(name) - .build() - case _ => - new layers.RnnOutputLayer.Builder(output.lossFunction) - .nIn(nIn) - .nOut(nOut) - .activation(activation) - .weightInit(weightInit) - .l1(regularizer.l1) - .l2(regularizer.l2) - .lossFunction(output.lossFunction) - .dropOut(dropOut) - .name(name) - .build() - } - - override val inputShape: List[Int] = List(nIn, nOut) - - override val outputShape: List[Int] = List(nOut, nIn) - - override val output: Output = Output(isOutput = loss.isDefined, lossFunction = loss.orNull) - - override def toOutputLayer(lossFunction: LossFunctions.LossFunction): OutputLayer = - new RnnOutputLayer( - nIn, - nOut, - activation, - Some(lossFunction), - weightInit, - regularizer, - dropOut - ) -} - -object RnnOutputLayer { - def apply(nIn: 
Int, - nOut: Int, - activation: Activation, - loss: Option[LossFunction] = None, - weightInit: WeightInit = WeightInit.XAVIER, - regularizer: WeightRegularizer = NoRegularizer(), - dropOut: Double = 0.0): RnnOutputLayer = - new RnnOutputLayer( - nIn, - nOut, - activation, - loss, - weightInit, - regularizer, - dropOut - ) - -} diff --git a/contrib/attic/scalnet/src/main/scala/org/deeplearning4j/scalnet/layers/reshaping/Flatten3D.scala b/contrib/attic/scalnet/src/main/scala/org/deeplearning4j/scalnet/layers/reshaping/Flatten3D.scala deleted file mode 100644 index 46680292f..000000000 --- a/contrib/attic/scalnet/src/main/scala/org/deeplearning4j/scalnet/layers/reshaping/Flatten3D.scala +++ /dev/null @@ -1,50 +0,0 @@ -/* - * ****************************************************************************** - * * Copyright (c) 2021 Deeplearning4j Contributors - * * - * * This program and the accompanying materials are made available under the - * * terms of the Apache License, Version 2.0 which is available at - * * https://www.apache.org/licenses/LICENSE-2.0. - * * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * * License for the specific language governing permissions and limitations - * * under the License. - * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ -package org.deeplearning4j.scalnet.layers.reshaping - -import org.deeplearning4j.nn.conf.InputPreProcessor -import org.deeplearning4j.nn.conf.preprocessor.CnnToFeedForwardPreProcessor -import org.deeplearning4j.scalnet.layers.core.Preprocessor - -/** - * Flattens structured image-like inputs into vector. Input should have - * three dimensions: height (number of rows), width (number of columns), - * and number of channels. 
- * - * @author David Kale - */ -class Flatten3D(nIn: List[Int] = List(0, 0, 0)) extends Preprocessor { - - override val inputShape: List[Int] = nIn - override val outputShape = List(inputShape.product) - override val name = "Flatten3D" - - override def reshapeInput(newIn: List[Int]): Flatten3D = - new Flatten3D(newIn) - - override def compile: InputPreProcessor = { - if (inputShape.length != 3) { - throw new IllegalArgumentException("Input shape must be length 3.") - } - new CnnToFeedForwardPreProcessor(inputShape.head, inputShape.tail.head, inputShape.last) - } -} - -object Flatten3D { - def apply(nIn: List[Int] = List(0, 0, 0)): Flatten3D = new Flatten3D(nIn) -} diff --git a/contrib/attic/scalnet/src/main/scala/org/deeplearning4j/scalnet/layers/reshaping/Reshape.scala b/contrib/attic/scalnet/src/main/scala/org/deeplearning4j/scalnet/layers/reshaping/Reshape.scala deleted file mode 100644 index 2e4272fda..000000000 --- a/contrib/attic/scalnet/src/main/scala/org/deeplearning4j/scalnet/layers/reshaping/Reshape.scala +++ /dev/null @@ -1,88 +0,0 @@ -/* - * ****************************************************************************** - * * Copyright (c) 2021 Deeplearning4j Contributors - * * - * * This program and the accompanying materials are made available under the - * * terms of the Apache License, Version 2.0 which is available at - * * https://www.apache.org/licenses/LICENSE-2.0. - * * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * * License for the specific language governing permissions and limitations - * * under the License. 
- * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ -package org.deeplearning4j.scalnet.layers.reshaping - -import org.deeplearning4j.nn.conf.InputPreProcessor -import org.deeplearning4j.nn.conf.inputs.InputType -import org.deeplearning4j.nn.conf.preprocessor.BaseInputPreProcessor -import org.deeplearning4j.nn.workspace.LayerWorkspaceMgr -import org.deeplearning4j.scalnet.layers.core.Preprocessor -import org.nd4j.linalg.api.ndarray.INDArray - -/** - * Generic reshaping layer. - * - * @author David Kale - */ -class Reshape(newOutputShape: List[Int], oldInputShape: List[Int] = List()) extends Preprocessor { - override val inputShape: List[Int] = oldInputShape - override val outputShape: List[Int] = newOutputShape - override val name = "Reshape" - - override def reshapeInput(newIn: List[Int]): Reshape = - new Reshape(newOutputShape, newIn) - - private class ReshapePreProcessor(private var fromShape: Option[Array[Int]], - private val toShape: Array[Int], - private val dynamic: Boolean = true) - extends BaseInputPreProcessor - with Cloneable { - - override def preProcess(input: INDArray, miniBatchSize: Int, workspace: LayerWorkspaceMgr): INDArray = { - if (dynamic && fromShape != None) fromShape.get(0) = input.shape()(0).intValue() - if (input.shape().length == toShape.length) input else input.reshape(toShape) - } - - override def backprop(output: INDArray, miniBatchSize: Int, workspace: LayerWorkspaceMgr): INDArray = - if (fromShape == None || outputShape.length == fromShape.get.length) { - output - } else if (output.length() != fromShape.get.product) { - throw new IllegalStateException("Illegal shape") - } else { - output.reshape(fromShape.get) - } - - override def getOutputType(inputType: InputType): InputType = - toShape.length match { - case 2 | 3 => InputType.feedForward(toShape(1)) - case 4 => InputType.convolutional(toShape(3), toShape(2), toShape(1)) - case _ => throw new 
IllegalStateException("Output shape not understood.") - } - - } - - private object ReshapePreProcessor { - def apply(toShape: Int*): ReshapePreProcessor = - new ReshapePreProcessor(None, toShape.toArray, true) - } - - override def compile: InputPreProcessor = { - if (PartialFunction.cond(inputShape) { case Nil => true; case 0 :: Nil => true }) { - throw new IllegalArgumentException("Input shape must be nonempty and nonzero.") - } - if (inputShape.product != outputShape.product) { - throw new IllegalArgumentException("Overall input shape must equal overall output shape.") - } - new ReshapePreProcessor(Some(inputShape.toArray), outputShape.toArray) - } -} - -object Reshape { - def apply(newOutputShape: List[Int], oldInputShape: List[Int] = List()): Reshape = - new Reshape(newOutputShape, oldInputShape) -} diff --git a/contrib/attic/scalnet/src/main/scala/org/deeplearning4j/scalnet/layers/reshaping/Unflatten3D.scala b/contrib/attic/scalnet/src/main/scala/org/deeplearning4j/scalnet/layers/reshaping/Unflatten3D.scala deleted file mode 100644 index 4072cde61..000000000 --- a/contrib/attic/scalnet/src/main/scala/org/deeplearning4j/scalnet/layers/reshaping/Unflatten3D.scala +++ /dev/null @@ -1,56 +0,0 @@ -/* - * ****************************************************************************** - * * Copyright (c) 2021 Deeplearning4j Contributors - * * - * * This program and the accompanying materials are made available under the - * * terms of the Apache License, Version 2.0 which is available at - * * https://www.apache.org/licenses/LICENSE-2.0. - * * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * * License for the specific language governing permissions and limitations - * * under the License. 
- * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ -package org.deeplearning4j.scalnet.layers.reshaping - -import org.deeplearning4j.nn.conf.InputPreProcessor -import org.deeplearning4j.nn.conf.preprocessor.FeedForwardToCnnPreProcessor -import org.deeplearning4j.scalnet.layers.core.Preprocessor - -/** - * Unflattens a vector into structured image-like output. Input must be a - * vector, while output should have three dimensions: height (number of rows), - * width (number of columns), and number of channels. - * - * @author David Kale - */ -class Unflatten3D(newOutputShape: List[Int], nIn: Int = 0) extends Preprocessor { - if (newOutputShape.length != 3) { - throw new IllegalArgumentException("New output shape must be length 3.") - } - override val outputShape: List[Int] = newOutputShape - override val inputShape: List[Int] = List(nIn) - override val name = "Unflatten3D" - - override def reshapeInput(newIn: List[Int]): Unflatten3D = - new Unflatten3D(newOutputShape, newIn.head) - - override def compile: InputPreProcessor = { - if (PartialFunction.cond(inputShape) { case Nil => true; case 0 :: Nil => true }) { - throw new IllegalArgumentException("Input shape must be nonempty and nonzero.") - } - if (inputShape.last != outputShape.product) { - throw new IllegalStateException("Overall output shape must be equal to original input shape.") - } - new FeedForwardToCnnPreProcessor(outputShape.head, outputShape.tail.head, outputShape.last) - } -} - -object Unflatten3D { - def apply(newOutputShape: List[Int], nIn: Int = 0): Unflatten3D = - new Unflatten3D(newOutputShape, nIn) -} diff --git a/contrib/attic/scalnet/src/main/scala/org/deeplearning4j/scalnet/logging/Logging.scala b/contrib/attic/scalnet/src/main/scala/org/deeplearning4j/scalnet/logging/Logging.scala deleted file mode 100644 index 5d7f78d33..000000000 ---
a/contrib/attic/scalnet/src/main/scala/org/deeplearning4j/scalnet/logging/Logging.scala +++ /dev/null @@ -1,27 +0,0 @@ -/* - * ****************************************************************************** - * * Copyright (c) 2021 Deeplearning4j Contributors - * * - * * This program and the accompanying materials are made available under the - * * terms of the Apache License, Version 2.0 which is available at - * * https://www.apache.org/licenses/LICENSE-2.0. - * * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * * License for the specific language governing permissions and limitations - * * under the License. - * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ -package org.deeplearning4j.scalnet.logging - -import org.slf4j.{ Logger, LoggerFactory } - -/** - * A trait to mix into any class where you want to provide a logger. - */ -trait Logging { - protected lazy val logger: Logger = { LoggerFactory.getLogger(getClass.getName) } -} diff --git a/contrib/attic/scalnet/src/main/scala/org/deeplearning4j/scalnet/models/Model.scala b/contrib/attic/scalnet/src/main/scala/org/deeplearning4j/scalnet/models/Model.scala deleted file mode 100644 index fb56ec54e..000000000 --- a/contrib/attic/scalnet/src/main/scala/org/deeplearning4j/scalnet/models/Model.scala +++ /dev/null @@ -1,206 +0,0 @@ -/* - * ****************************************************************************** - * * Copyright (c) 2021 Deeplearning4j Contributors - * * - * * This program and the accompanying materials are made available under the - * * terms of the Apache License, Version 2.0 which is available at - * * https://www.apache.org/licenses/LICENSE-2.0.
- * * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * * License for the specific language governing permissions and limitations - * * under the License. - * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ -package org.deeplearning4j.scalnet.models - -import org.deeplearning4j.eval.Evaluation -import org.deeplearning4j.nn.api.OptimizationAlgorithm -import org.deeplearning4j.nn.conf.layers.{ OutputLayer => JOutputLayer } -import org.deeplearning4j.nn.conf.{ NeuralNetConfiguration, Updater } -import org.deeplearning4j.nn.multilayer.MultiLayerNetwork -import org.deeplearning4j.optimize.api.TrainingListener -import org.deeplearning4j.scalnet.layers.core.{ Node, OutputLayer } -import org.deeplearning4j.scalnet.logging.Logging -import org.nd4j.linalg.api.ndarray.INDArray -import org.nd4j.linalg.dataset.api.DataSet -import org.nd4j.linalg.dataset.api.iterator.DataSetIterator -import org.nd4j.linalg.lossfunctions.LossFunctions.LossFunction - -import scala.collection.JavaConverters._ - -/** - * Abstract base trait for neural net architectures. - * - * @author David Kale - */ -trait Model extends Logging { - - protected var layers: List[Node] = List() - protected var model: MultiLayerNetwork = _ - - def getLayers: List[Node] = layers - - /** - * Build model configuration from optimizer, updater, and training options.
- * - * @param optimizer optimization algorithm to use in model - * @param seed seed to use - * @return NeuralNetConfiguration.Builder - */ - def buildModelConfig(optimizer: OptimizationAlgorithm, - updater: Updater, - miniBatch: Boolean, - biasInit: Double, - seed: Long): NeuralNetConfiguration.Builder = { - var builder: NeuralNetConfiguration.Builder = new NeuralNetConfiguration.Builder() - if (seed != 0) { - builder = builder.seed(seed) - } - builder - .optimizationAlgo(optimizer) - .updater(updater.getIUpdaterWithDefaultConfig) - .miniBatch(miniBatch) - .biasInit(biasInit) - } - - /** - * Make last layer of architecture an output layer using - * the provided loss function. - * - * @param lossFunction loss function to use - */ - def buildOutput(lossFunction: LossFunction): Unit = - layers.lastOption match { - case Some(l) if !l.isInstanceOf[OutputLayer] => - throw new IllegalArgumentException("Last layer must have Output trait") - case Some(l) if !l.asInstanceOf[OutputLayer].output.isOutput => - val last: OutputLayer = layers.last.asInstanceOf[OutputLayer].toOutputLayer(lossFunction) - layers = layers.updated(layers.length - 1, last) - case _ => - throw new IllegalArgumentException("Last layer must be an output layer with a valid loss function") - } - - /** - * Compile neural net architecture. Call immediately - * before training. - * - * @param lossFunction loss function to use - * @param optimizer optimization algorithm to use - */ - def compile(lossFunction: LossFunction, optimizer: OptimizationAlgorithm, updater: Updater): Unit - - /** - * Fit neural net to data. 
- * - * @param iter iterator over data set - * @param nbEpoch number of epochs to train - * @param listeners callbacks for monitoring training - */ - def fit(iter: DataSetIterator, nbEpoch: Int, listeners: List[TrainingListener]): Unit = { - model.setListeners(listeners.asJavaCollection) - for (epoch <- 0 until nbEpoch) { - logger.info("Epoch " + epoch) - model.fit(iter) - } - } - - /** - * Fit neural net to data. - * @param dataset data set - * @param nbEpoch number of epochs to train - * @param listeners callbacks for monitoring training - */ - def fit(dataset: DataSet, nbEpoch: Int, listeners: List[TrainingListener]): Unit = { - model.setListeners(listeners.asJavaCollection) - for (epoch <- 0 until nbEpoch) { - logger.info("Epoch " + epoch) - model.fit(dataset) - } - } - - /** - * Use neural net to make prediction on input x - * - * @param x input represented as INDArray - */ - def predict(x: INDArray): INDArray = model.output(x, false) - - /** - * Use neural net to make prediction on input x. 
- * - * @param x input represented as DataSet - */ - def predict(x: DataSet): INDArray = predict(x.getFeatures) - - /** - * Evaluate model against an iterator over data set - * - * @param iter iterator over data set - * @return Evaluation instance - */ - def evaluate(iter: DataSetIterator): Evaluation = { - val evaluator = new Evaluation(layers.last.outputShape.last) - iter.reset() - for (dataset <- iter.asScala) { - val output = predict(dataset) - evaluator.eval(dataset.getLabels, output) - } - evaluator - } - - /** - * Evaluate model against an iterator over data set - * - * @param iter iterator over data set - * @param numClasses output size - * @return Evaluation instance - */ - def evaluate(iter: DataSetIterator, numClasses: Int): Evaluation = { - val evaluator = new Evaluation(numClasses) - iter.reset() - for (dataset <- iter.asScala) { - val output = predict(dataset) - evaluator.eval(dataset.getLabels, output) - } - evaluator - } - - /** - * Evaluate model against a single data set - * - * @param dataset data set - * @return Evaluation instance - */ - def evaluate(dataset: DataSet): Evaluation = { - val evaluator = new Evaluation(layers.last.outputShape.last) - val output = predict(dataset) - evaluator.eval(dataset.getLabels, output) - evaluator - } - - /** - * Evaluate model against a single data set - * - * @param dataset data set - * @param numClasses output size - * @return Evaluation instance - */ - def evaluate(dataset: DataSet, numClasses: Int): Evaluation = { - val evaluator = new Evaluation(numClasses) - val output = predict(dataset) - evaluator.eval(dataset.getLabels, output) - evaluator - } - - override def toString: String = model.getLayerWiseConfigurations.toString - - def toJson: String = model.getLayerWiseConfigurations.toJson - - def toYaml: String = model.getLayerWiseConfigurations.toYaml - - def getNetwork: MultiLayerNetwork = model -} diff --git
a/contrib/attic/scalnet/src/main/scala/org/deeplearning4j/scalnet/models/NeuralNet.scala b/contrib/attic/scalnet/src/main/scala/org/deeplearning4j/scalnet/models/NeuralNet.scala deleted file mode 100644 index 2c61de611..000000000 --- a/contrib/attic/scalnet/src/main/scala/org/deeplearning4j/scalnet/models/NeuralNet.scala +++ /dev/null @@ -1,71 +0,0 @@ -/* - * ****************************************************************************** - * * Copyright (c) 2021 Deeplearning4j Contributors - * * - * * This program and the accompanying materials are made available under the - * * terms of the Apache License, Version 2.0 which is available at - * * https://www.apache.org/licenses/LICENSE-2.0. - * * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * * License for the specific language governing permissions and limitations - * * under the License. - * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ -package org.deeplearning4j.scalnet.models - -import org.deeplearning4j.nn.api.OptimizationAlgorithm -import org.deeplearning4j.nn.conf.inputs.InputType -import org.deeplearning4j.nn.conf.{ MultiLayerConfiguration, NeuralNetConfiguration, Updater } -import org.deeplearning4j.nn.multilayer.MultiLayerNetwork -import org.deeplearning4j.scalnet.layers.core.{ Layer, Node } -import org.deeplearning4j.scalnet.logging.Logging -import org.nd4j.linalg.lossfunctions.LossFunctions.LossFunction - -/** - * Simple DL4J-style sequential neural net architecture with one input - * node and one output node for each node in computational graph. - * - * Wraps DL4J MultiLayerNetwork. Enforces DL4J model construction - * pattern: adds pre-processing layers automatically but requires - * user to specify output layer explicitly. 
- * - * @author David Kale - */ -class NeuralNet(inputType: Option[InputType], miniBatch: Boolean, biasInit: Double, rngSeed: Long) - extends Model - with Logging { - - def add(layer: Node): Unit = layers = layers :+ layer - - override def compile(lossFunction: LossFunction, - optimizer: OptimizationAlgorithm = OptimizationAlgorithm.STOCHASTIC_GRADIENT_DESCENT, - updater: Updater = Updater.SGD): Unit = { - val builder = buildModelConfig(optimizer, updater, miniBatch, biasInit, rngSeed) - buildOutput(lossFunction) - - var listBuilder: NeuralNetConfiguration.ListBuilder = builder.list() - inputType foreach (i => listBuilder.setInputType(i)) - - for ((layer, layerIndex) <- layers.zipWithIndex) { - logger.info("Layer " + layerIndex + ": " + layer.getClass.getSimpleName) - listBuilder.layer(layerIndex, layer.asInstanceOf[Layer].compile) - } - - val conf: MultiLayerConfiguration = listBuilder.build() - model = new MultiLayerNetwork(conf) - model.init() - } - -} - -object NeuralNet { - def apply(inputType: InputType = null, - miniBatch: Boolean = true, - biasInit: Double = 0.0, - rngSeed: Long = 0): NeuralNet = - new NeuralNet(Option(inputType), miniBatch, biasInit, rngSeed) -} diff --git a/contrib/attic/scalnet/src/main/scala/org/deeplearning4j/scalnet/models/Sequential.scala b/contrib/attic/scalnet/src/main/scala/org/deeplearning4j/scalnet/models/Sequential.scala deleted file mode 100644 index 3ae02a20e..000000000 --- a/contrib/attic/scalnet/src/main/scala/org/deeplearning4j/scalnet/models/Sequential.scala +++ /dev/null @@ -1,101 +0,0 @@ -/* - * ****************************************************************************** - * * Copyright (c) 2021 Deeplearning4j Contributors - * * - * * This program and the accompanying materials are made available under the - * * terms of the Apache License, Version 2.0 which is available at - * * https://www.apache.org/licenses/LICENSE-2.0. 
- * * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * * License for the specific language governing permissions and limitations - * * under the License. - * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ -package org.deeplearning4j.scalnet.models - -import org.deeplearning4j.nn.api.OptimizationAlgorithm -import org.deeplearning4j.nn.conf.{ NeuralNetConfiguration, Updater } -import org.deeplearning4j.nn.multilayer.MultiLayerNetwork -import org.deeplearning4j.scalnet.layers.core.{ Layer, Node, Preprocessor } -import org.deeplearning4j.scalnet.logging.Logging -import org.nd4j.linalg.lossfunctions.LossFunctions.LossFunction - -/** - * Class for Keras-style simple sequential neural net architectures - * with one input node and one output node for each node - * in the computational graph. - * - * Wraps DL4J MultiLayerNetwork. Enforces Keras model construction - * pattern: preprocessing (reshaping) layers should be explicitly - * provided by the user, while the last layer is treated implicitly as - * an output layer.
- * - * @author David Kale - */ -class Sequential(miniBatch: Boolean, biasInit: Double, rngSeed: Long) extends Model with Logging { - - private var _preprocessors: Map[Int, Node] = Map() - private var _inputShape: List[Int] = List() - - def inputShape: List[Int] = _inputShape - def getPreprocessors: Map[Int, Node] = _preprocessors - - // Must be a def: a val would be evaluated once at construction (always true) - // and _inputShape would then be overwritten on every add. - private def noLayers: Boolean = inputShape.isEmpty && layers.isEmpty && _preprocessors.isEmpty - private def emptyShape(layer: Node): Boolean = - !(_preprocessors.contains(layers.length) || layers.nonEmpty) && - layer.inputShape.lengthCompare(1) == 0 && layer.inputShape.head == 0 - - def inferInputShape(layer: Node): List[Int] = - if (_preprocessors.contains(layers.length)) { - _preprocessors(layers.length).outputShape - } else layers.lastOption.map(_.outputShape).getOrElse(layer.inputShape) - - def checkShape(layer: Node): Unit = - if (emptyShape(layer)) { - throw new IllegalArgumentException("Input layer must have non-empty inputShape") - } else if (noLayers) { - _inputShape = layer.inputShape - } - - def add(layer: Node): Unit = { - val inferredInput: List[Int] = inferInputShape(layer) - checkShape(layer) - val inferredLayer = layer.reshapeInput(inferredInput) - inferredLayer match { - case _: Preprocessor => _preprocessors = _preprocessors + (layers.length -> inferredLayer) - case _ => layers = layers :+ inferredLayer - } - } - - override def compile(lossFunction: LossFunction, - optimizer: OptimizationAlgorithm = OptimizationAlgorithm.STOCHASTIC_GRADIENT_DESCENT, - updater: Updater = Updater.SGD): Unit = { - val builder = buildModelConfig(optimizer, updater, miniBatch, biasInit, rngSeed) - buildOutput(lossFunction) - - var listBuilder: NeuralNetConfiguration.ListBuilder = builder.list() - for ((layer, layerIndex) <- layers.zipWithIndex) { - logger.info("Layer " + layerIndex + ": " + layer.getClass.getSimpleName) - logger.info(" size: " + layer.describe()) - listBuilder.layer(layerIndex, layer.asInstanceOf[Layer].compile) - } - for
((layerIndex, preprocessor) <- _preprocessors) { - logger.info("Preprocessor " + layerIndex + ": " + preprocessor.getClass.getSimpleName) - logger.info(" size: " + preprocessor.describe()) - listBuilder.inputPreProcessor(layerIndex, preprocessor.asInstanceOf[Preprocessor].compile) - } - - model = new MultiLayerNetwork(listBuilder.build()) - model.init() - } - -} - -object Sequential { - def apply(miniBatch: Boolean = true, biasInit: Double = 0.0, rngSeed: Long = 0): Sequential = - new Sequential(miniBatch, biasInit, rngSeed) -} diff --git a/contrib/attic/scalnet/src/main/scala/org/deeplearning4j/scalnet/regularizers/weightRegularizer.scala b/contrib/attic/scalnet/src/main/scala/org/deeplearning4j/scalnet/regularizers/weightRegularizer.scala deleted file mode 100644 index f9aae52d6..000000000 --- a/contrib/attic/scalnet/src/main/scala/org/deeplearning4j/scalnet/regularizers/weightRegularizer.scala +++ /dev/null @@ -1,30 +0,0 @@ -/* - * ****************************************************************************** - * * Copyright (c) 2021 Deeplearning4j Contributors - * * - * * This program and the accompanying materials are made available under the - * * terms of the Apache License, Version 2.0 which is available at - * * https://www.apache.org/licenses/LICENSE-2.0. - * * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * * License for the specific language governing permissions and limitations - * * under the License. - * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ -package org.deeplearning4j.scalnet.regularizers - -/** - * Weight regularizers. 
- * - * @author David Kale - */ -sealed class WeightRegularizer(val l1: Double = Double.NaN, val l2: Double = Double.NaN) - -case class NoRegularizer() extends WeightRegularizer() -case class L1(l: Double = 0.01) extends WeightRegularizer(l1 = l) -case class L2(l: Double = 0.01) extends WeightRegularizer(l2 = l) -case class L1L2(override val l1: Double = 0.01, override val l2: Double = 0.01) extends WeightRegularizer diff --git a/contrib/attic/scalnet/src/main/scala/org/deeplearning4j/scalnet/utils/Implicits.scala b/contrib/attic/scalnet/src/main/scala/org/deeplearning4j/scalnet/utils/Implicits.scala deleted file mode 100644 index 30525391e..000000000 --- a/contrib/attic/scalnet/src/main/scala/org/deeplearning4j/scalnet/utils/Implicits.scala +++ /dev/null @@ -1,48 +0,0 @@ -/* - * ****************************************************************************** - * * Copyright (c) 2021 Deeplearning4j Contributors - * * - * * This program and the accompanying materials are made available under the - * * terms of the Apache License, Version 2.0 which is available at - * * https://www.apache.org/licenses/LICENSE-2.0. - * * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * * License for the specific language governing permissions and limitations - * * under the License. - * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ -package org.deeplearning4j.scalnet.utils - -import scala.reflect.ClassTag - -/** - * Created by maxpumperla on 17/07/17. - */ -object Implicits { - - implicit class WithAsInstanceOfOpt(obj: AnyRef) { - - /** - * Half type-safe cast. It uses erasure semantics (like Java casts). 
For example: - * - * `xs: List[Int]` - * - * `xs.asInstanceOfOpt[List[Int]] == xs.asInstanceOfOpt[List[Double]] == xs.asInstanceOfOpt[Seq[Int]] == Some(xs)` - * - * and - * - * `xs.asInstanceOfOpt[String] == xs.asInstanceOfOpt[Set[Int]] == None` - * - * @return None if the cast fails or the object is `null`, `Some[B]` otherwise - */ - def asInstanceOfOpt[B: ClassTag]: Option[B] = obj match { - case b: B => Some(b) - case _ => None - } - } - -} diff --git a/contrib/attic/scalnet/src/test/resources/iris.txt b/contrib/attic/scalnet/src/test/resources/iris.txt deleted file mode 100644 index 8b4511f8b..000000000 --- a/contrib/attic/scalnet/src/test/resources/iris.txt +++ /dev/null @@ -1,150 +0,0 @@ -5.1,3.5,1.4,0.2,0 -4.9,3.0,1.4,0.2,0 -4.7,3.2,1.3,0.2,0 -4.6,3.1,1.5,0.2,0 -5.0,3.6,1.4,0.2,0 -5.4,3.9,1.7,0.4,0 -4.6,3.4,1.4,0.3,0 -5.0,3.4,1.5,0.2,0 -4.4,2.9,1.4,0.2,0 -4.9,3.1,1.5,0.1,0 -5.4,3.7,1.5,0.2,0 -4.8,3.4,1.6,0.2,0 -4.8,3.0,1.4,0.1,0 -4.3,3.0,1.1,0.1,0 -5.8,4.0,1.2,0.2,0 -5.7,4.4,1.5,0.4,0 -5.4,3.9,1.3,0.4,0 -5.1,3.5,1.4,0.3,0 -5.7,3.8,1.7,0.3,0 -5.1,3.8,1.5,0.3,0 -5.4,3.4,1.7,0.2,0 -5.1,3.7,1.5,0.4,0 -4.6,3.6,1.0,0.2,0 -5.1,3.3,1.7,0.5,0 -4.8,3.4,1.9,0.2,0 -5.0,3.0,1.6,0.2,0 -5.0,3.4,1.6,0.4,0 -5.2,3.5,1.5,0.2,0 -5.2,3.4,1.4,0.2,0 -4.7,3.2,1.6,0.2,0 -4.8,3.1,1.6,0.2,0 -5.4,3.4,1.5,0.4,0 -5.2,4.1,1.5,0.1,0 -5.5,4.2,1.4,0.2,0 -4.9,3.1,1.5,0.1,0 -5.0,3.2,1.2,0.2,0 -5.5,3.5,1.3,0.2,0 -4.9,3.1,1.5,0.1,0 -4.4,3.0,1.3,0.2,0 -5.1,3.4,1.5,0.2,0 -5.0,3.5,1.3,0.3,0 -4.5,2.3,1.3,0.3,0 -4.4,3.2,1.3,0.2,0 -5.0,3.5,1.6,0.6,0 -5.1,3.8,1.9,0.4,0 -4.8,3.0,1.4,0.3,0 -5.1,3.8,1.6,0.2,0 -4.6,3.2,1.4,0.2,0 -5.3,3.7,1.5,0.2,0 -5.0,3.3,1.4,0.2,0 -7.0,3.2,4.7,1.4,1 -6.4,3.2,4.5,1.5,1 -6.9,3.1,4.9,1.5,1 -5.5,2.3,4.0,1.3,1 -6.5,2.8,4.6,1.5,1 -5.7,2.8,4.5,1.3,1 -6.3,3.3,4.7,1.6,1 -4.9,2.4,3.3,1.0,1 -6.6,2.9,4.6,1.3,1 -5.2,2.7,3.9,1.4,1 -5.0,2.0,3.5,1.0,1 -5.9,3.0,4.2,1.5,1 -6.0,2.2,4.0,1.0,1 -6.1,2.9,4.7,1.4,1 -5.6,2.9,3.6,1.3,1 -6.7,3.1,4.4,1.4,1 -5.6,3.0,4.5,1.5,1 -5.8,2.7,4.1,1.0,1 
-6.2,2.2,4.5,1.5,1 -5.6,2.5,3.9,1.1,1 -5.9,3.2,4.8,1.8,1 -6.1,2.8,4.0,1.3,1 -6.3,2.5,4.9,1.5,1 -6.1,2.8,4.7,1.2,1 -6.4,2.9,4.3,1.3,1 -6.6,3.0,4.4,1.4,1 -6.8,2.8,4.8,1.4,1 -6.7,3.0,5.0,1.7,1 -6.0,2.9,4.5,1.5,1 -5.7,2.6,3.5,1.0,1 -5.5,2.4,3.8,1.1,1 -5.5,2.4,3.7,1.0,1 -5.8,2.7,3.9,1.2,1 -6.0,2.7,5.1,1.6,1 -5.4,3.0,4.5,1.5,1 -6.0,3.4,4.5,1.6,1 -6.7,3.1,4.7,1.5,1 -6.3,2.3,4.4,1.3,1 -5.6,3.0,4.1,1.3,1 -5.5,2.5,4.0,1.3,1 -5.5,2.6,4.4,1.2,1 -6.1,3.0,4.6,1.4,1 -5.8,2.6,4.0,1.2,1 -5.0,2.3,3.3,1.0,1 -5.6,2.7,4.2,1.3,1 -5.7,3.0,4.2,1.2,1 -5.7,2.9,4.2,1.3,1 -6.2,2.9,4.3,1.3,1 -5.1,2.5,3.0,1.1,1 -5.7,2.8,4.1,1.3,1 -6.3,3.3,6.0,2.5,2 -5.8,2.7,5.1,1.9,2 -7.1,3.0,5.9,2.1,2 -6.3,2.9,5.6,1.8,2 -6.5,3.0,5.8,2.2,2 -7.6,3.0,6.6,2.1,2 -4.9,2.5,4.5,1.7,2 -7.3,2.9,6.3,1.8,2 -6.7,2.5,5.8,1.8,2 -7.2,3.6,6.1,2.5,2 -6.5,3.2,5.1,2.0,2 -6.4,2.7,5.3,1.9,2 -6.8,3.0,5.5,2.1,2 -5.7,2.5,5.0,2.0,2 -5.8,2.8,5.1,2.4,2 -6.4,3.2,5.3,2.3,2 -6.5,3.0,5.5,1.8,2 -7.7,3.8,6.7,2.2,2 -7.7,2.6,6.9,2.3,2 -6.0,2.2,5.0,1.5,2 -6.9,3.2,5.7,2.3,2 -5.6,2.8,4.9,2.0,2 -7.7,2.8,6.7,2.0,2 -6.3,2.7,4.9,1.8,2 -6.7,3.3,5.7,2.1,2 -7.2,3.2,6.0,1.8,2 -6.2,2.8,4.8,1.8,2 -6.1,3.0,4.9,1.8,2 -6.4,2.8,5.6,2.1,2 -7.2,3.0,5.8,1.6,2 -7.4,2.8,6.1,1.9,2 -7.9,3.8,6.4,2.0,2 -6.4,2.8,5.6,2.2,2 -6.3,2.8,5.1,1.5,2 -6.1,2.6,5.6,1.4,2 -7.7,3.0,6.1,2.3,2 -6.3,3.4,5.6,2.4,2 -6.4,3.1,5.5,1.8,2 -6.0,3.0,4.8,1.8,2 -6.9,3.1,5.4,2.1,2 -6.7,3.1,5.6,2.4,2 -6.9,3.1,5.1,2.3,2 -5.8,2.7,5.1,1.9,2 -6.8,3.2,5.9,2.3,2 -6.7,3.3,5.7,2.5,2 -6.7,3.0,5.2,2.3,2 -6.3,2.5,5.0,1.9,2 -6.5,3.0,5.2,2.0,2 -6.2,3.4,5.4,2.3,2 -5.9,3.0,5.1,1.8,2 diff --git a/contrib/attic/scalnet/src/test/resources/logback-test.xml b/contrib/attic/scalnet/src/test/resources/logback-test.xml deleted file mode 100644 index 36b0cc70f..000000000 --- a/contrib/attic/scalnet/src/test/resources/logback-test.xml +++ /dev/null @@ -1,31 +0,0 @@ - - - - - - %d{HH:mm:ss.SSS} [%thread] %-5level %logger{36} - %msg%n - - - - - - - diff --git 
a/contrib/attic/scalnet/src/test/scala/org/deeplearning4j/scalnet/examples/dl4j/convolution/LeNetMnistExample.scala b/contrib/attic/scalnet/src/test/scala/org/deeplearning4j/scalnet/examples/dl4j/convolution/LeNetMnistExample.scala deleted file mode 100644 index e5704a55f..000000000 --- a/contrib/attic/scalnet/src/test/scala/org/deeplearning4j/scalnet/examples/dl4j/convolution/LeNetMnistExample.scala +++ /dev/null @@ -1,74 +0,0 @@ -/* - * ****************************************************************************** - * * Copyright (c) 2021 Deeplearning4j Contributors - * * - * * This program and the accompanying materials are made available under the - * * terms of the Apache License, Version 2.0 which is available at - * * https://www.apache.org/licenses/LICENSE-2.0. - * * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * * License for the specific language governing permissions and limitations - * * under the License. 
- * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ -package org.deeplearning4j.scalnet.examples.dl4j.convolution - -import org.deeplearning4j.datasets.iterator.impl.MnistDataSetIterator -import org.deeplearning4j.nn.conf.inputs.InputType -import org.deeplearning4j.optimize.listeners.ScoreIterationListener -import org.deeplearning4j.scalnet.layers.convolutional.Convolution2D -import org.deeplearning4j.scalnet.layers.core.Dense -import org.deeplearning4j.scalnet.layers.pooling.MaxPooling2D -import org.deeplearning4j.scalnet.logging.Logging -import org.deeplearning4j.scalnet.models.NeuralNet -import org.deeplearning4j.scalnet.regularizers.L2 -import org.nd4j.linalg.activations.Activation -import org.nd4j.linalg.dataset.api.iterator.DataSetIterator -import org.nd4j.linalg.lossfunctions.LossFunctions.LossFunction - -/** - * Simple LeNet convolutional neural net for MNIST, using - * DL4J-style NeuralNet model construction pattern. 
- * - * @author David Kale - */ -object LeNetMnistExample extends App with Logging { - - val height: Int = 28 - val width: Int = 28 - val channels: Int = 1 - val nClasses: Int = 10 - - val batchSize: Int = 64 - val epochs: Int = 10 - val weightDecay: Double = 0.0005 - val seed: Int = 12345 - val scoreFrequency = 100 - - val mnistTrain: DataSetIterator = new MnistDataSetIterator(batchSize, true, seed) - val mnistTest: DataSetIterator = new MnistDataSetIterator(batchSize, false, seed) - - logger.info("Build model...") - val model = NeuralNet(inputType = InputType.convolutionalFlat(height, width, channels), rngSeed = seed) - - model.add(Convolution2D(20, List(5, 5), channels, regularizer = L2(weightDecay), activation = Activation.RELU)) - model.add(MaxPooling2D(List(2, 2), List(2, 2))) - - model.add(Convolution2D(50, List(5, 5), regularizer = L2(weightDecay), activation = Activation.RELU)) - model.add(MaxPooling2D(List(2, 2), List(2, 2))) - - model.add(Dense(512, regularizer = L2(weightDecay), activation = Activation.RELU)) - model.add(Dense(nClasses, activation = Activation.SOFTMAX)) - model.compile(LossFunction.NEGATIVELOGLIKELIHOOD) - - logger.info("Train model...") - model.fit(mnistTrain, epochs, List(new ScoreIterationListener(scoreFrequency))) - - logger.info("Evaluate model...") - logger.info(s"Train accuracy = ${model.evaluate(mnistTrain).accuracy}") - logger.info(s"Test accuracy = ${model.evaluate(mnistTest).accuracy}") -} diff --git a/contrib/attic/scalnet/src/test/scala/org/deeplearning4j/scalnet/examples/dl4j/feedforward/IrisCSVExample.scala b/contrib/attic/scalnet/src/test/scala/org/deeplearning4j/scalnet/examples/dl4j/feedforward/IrisCSVExample.scala deleted file mode 100644 index 097fc243f..000000000 --- a/contrib/attic/scalnet/src/test/scala/org/deeplearning4j/scalnet/examples/dl4j/feedforward/IrisCSVExample.scala +++ /dev/null @@ -1,79 +0,0 @@ -/* - * ****************************************************************************** - * * Copyright (c) 
2021 Deeplearning4j Contributors - * * - * * This program and the accompanying materials are made available under the - * * terms of the Apache License, Version 2.0 which is available at - * * https://www.apache.org/licenses/LICENSE-2.0. - * * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * * License for the specific language governing permissions and limitations - * * under the License. - * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ -package org.deeplearning4j.scalnet.examples.dl4j.feedforward - -import java.util - -import org.datavec.api.records.reader.impl.csv.CSVRecordReader -import org.datavec.api.split.FileSplit -import org.deeplearning4j.datasets.datavec.RecordReaderDataSetIterator -import org.deeplearning4j.datasets.iterator.impl.ListDataSetIterator -import org.deeplearning4j.nn.conf.Updater -import org.deeplearning4j.optimize.listeners.ScoreIterationListener -import org.deeplearning4j.scalnet.layers.core.Dense -import org.deeplearning4j.scalnet.logging.Logging -import org.deeplearning4j.scalnet.models.NeuralNet -import org.nd4j.common.io.ClassPathResource -import org.nd4j.linalg.activations.Activation -import org.nd4j.linalg.dataset.api.iterator.DataSetIterator -import org.nd4j.linalg.dataset.{ DataSet, SplitTestAndTrain } -import org.nd4j.linalg.learning.config.Adam -import org.nd4j.linalg.lossfunctions.LossFunctions.LossFunction - -object IrisCSVExample extends App with Logging { - - val numLinesToSkip = 0 - val delimiter = ',' - val labelIndex = 4 - val nClasses = 3 - val batchSize = 150 - val hiddenSize = 128 - val inputSize = 4 - val outputSize = 3 - val epochs = 20 - val scoreFrequency = 5 - val seed = 1234 - - logger.info("Reading data set...") - val recordReader = new 
CSVRecordReader(numLinesToSkip, delimiter) - recordReader.initialize(new FileSplit(new ClassPathResource("iris.txt").getFile)) - val iterator: DataSetIterator = new RecordReaderDataSetIterator(recordReader, batchSize, labelIndex, nClasses) - - logger.info("Prepare data set for training...") - val next: DataSet = iterator.next() - next.shuffle() - val testAndTrain: SplitTestAndTrain = next.splitTestAndTrain(0.75) - val test_data: DataSet = testAndTrain.getTest - val training_ : util.List[DataSet] = testAndTrain.getTrain.asList() - val training_data = new ListDataSetIterator(training_, training_.size) - - logger.info("Build model...") - val model: NeuralNet = NeuralNet(rngSeed = seed) - model.add(Dense(nIn = inputSize, nOut = hiddenSize, activation = Activation.RELU)) - model.add(Dense(nOut = hiddenSize, activation = Activation.RELU)) - model.add(Dense(nOut = hiddenSize, activation = Activation.RELU)) - model.add(Dense(outputSize, activation = Activation.SOFTMAX)) - model.compile(LossFunction.MCXENT, updater = Updater.ADAM) - - logger.info("Train model...") - model.fit(training_data, epochs, List(new ScoreIterationListener(scoreFrequency))) - - logger.info("Evaluate model...") - logger.info(s"Train accuracy = ${model.evaluate(training_data).accuracy}") - logger.info(s"Test accuracy = ${model.evaluate(test_data).accuracy}") -} diff --git a/contrib/attic/scalnet/src/test/scala/org/deeplearning4j/scalnet/examples/dl4j/feedforward/MLPMnistTwoLayerExample.scala b/contrib/attic/scalnet/src/test/scala/org/deeplearning4j/scalnet/examples/dl4j/feedforward/MLPMnistTwoLayerExample.scala deleted file mode 100644 index eb85ef5c9..000000000 --- a/contrib/attic/scalnet/src/test/scala/org/deeplearning4j/scalnet/examples/dl4j/feedforward/MLPMnistTwoLayerExample.scala +++ /dev/null @@ -1,66 +0,0 @@ -/* - * ****************************************************************************** - * * Copyright (c) 2021 Deeplearning4j Contributors - * * - * * This program and the accompanying 
materials are made available under the - * * terms of the Apache License, Version 2.0 which is available at - * * https://www.apache.org/licenses/LICENSE-2.0. - * * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * * License for the specific language governing permissions and limitations - * * under the License. - * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ -package org.deeplearning4j.scalnet.examples.dl4j.feedforward - -import org.deeplearning4j.datasets.iterator.impl.MnistDataSetIterator -import org.deeplearning4j.optimize.listeners.ScoreIterationListener -import org.deeplearning4j.scalnet.layers.core.Dense -import org.deeplearning4j.scalnet.logging.Logging -import org.deeplearning4j.scalnet.models.NeuralNet -import org.deeplearning4j.scalnet.regularizers.L2 -import org.nd4j.linalg.activations.Activation -import org.nd4j.linalg.dataset.api.iterator.DataSetIterator -import org.nd4j.linalg.lossfunctions.LossFunctions.LossFunction - -/** - * Two-layer MLP for MNIST using DL4J-style NeuralNet - * model construction pattern. 
- * - * @author David Kale - */ -object MLPMnistTwoLayerExample extends App with Logging { - - val height: Int = 28 - val width: Int = 28 - val nClasses: Int = 10 - val batchSize: Int = 64 - val hiddenSize = 512 - val seed: Int = 123 - val epochs: Int = 15 - val learningRate: Double = 0.0015 - val decay: Double = 0.005 - val scoreFrequency = 1000 - - val mnistTrain: DataSetIterator = new MnistDataSetIterator(batchSize, true, seed) - val mnistTest: DataSetIterator = new MnistDataSetIterator(batchSize, false, seed) - - logger.info("Build model...") - val model: NeuralNet = NeuralNet(rngSeed = seed) - - model.add(Dense(hiddenSize, height * width, activation = Activation.RELU, regularizer = L2(learningRate * decay))) - model.add(Dense(hiddenSize, activation = Activation.RELU, regularizer = L2(learningRate * decay))) - model.add(Dense(nClasses, activation = Activation.SOFTMAX, regularizer = L2(learningRate * decay))) - model.compile(LossFunction.NEGATIVELOGLIKELIHOOD) - - logger.info("Train model...") - model.fit(mnistTrain, epochs, List(new ScoreIterationListener(scoreFrequency))) - - logger.info("Evaluate model...") - logger.info(s"Train accuracy = ${model.evaluate(mnistTrain).accuracy}") - logger.info(s"Test accuracy = ${model.evaluate(mnistTest).accuracy}") -} diff --git a/contrib/attic/scalnet/src/test/scala/org/deeplearning4j/scalnet/examples/dl4j/recurrent/BasicRNNExample.scala b/contrib/attic/scalnet/src/test/scala/org/deeplearning4j/scalnet/examples/dl4j/recurrent/BasicRNNExample.scala deleted file mode 100644 index e4db376a5..000000000 --- a/contrib/attic/scalnet/src/test/scala/org/deeplearning4j/scalnet/examples/dl4j/recurrent/BasicRNNExample.scala +++ /dev/null @@ -1,71 +0,0 @@ -/* - * ****************************************************************************** - * * Copyright (c) 2021 Deeplearning4j Contributors - * * - * * This program and the accompanying materials are made available under the - * * terms of the Apache License, Version 2.0 which is 
available at - * * https://www.apache.org/licenses/LICENSE-2.0. - * * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * * License for the specific language governing permissions and limitations - * * under the License. - * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ -package org.deeplearning4j.scalnet.examples.dl4j.recurrent - -import org.deeplearning4j.nn.api.OptimizationAlgorithm -import org.deeplearning4j.nn.conf.Updater -import org.deeplearning4j.scalnet.layers.recurrent.{ LSTM, RnnOutputLayer } -import org.deeplearning4j.scalnet.logging.Logging -import org.deeplearning4j.scalnet.models.NeuralNet -import org.nd4j.linalg.activations.Activation -import org.nd4j.linalg.dataset.DataSet -import org.nd4j.linalg.factory.Nd4j -import org.nd4j.linalg.lossfunctions.LossFunctions.LossFunction - -import scala.util.Random - -object BasicRNNExample extends App with Logging { - - // define a sentence to learn. - // Add a special character at the beginning so the RNN learns the complete string and ends with the marker. 
- val learningString = "*Der Cottbuser Postkutscher putzt den Cottbuser Postkutschkasten.".toVector - val learningChars = learningString.distinct - val hiddenSize = 64 - val epochs = 20 - val seed = 1234 - val rand = new Random(seed) - - val input = Nd4j.zeros(1, learningChars.length, learningString.length) - val labels = Nd4j.zeros(1, learningChars.length, learningString.length) - - val trainingData: DataSet = { - learningString.zipWithIndex.foreach { - case (currentChar, index) => - val nextChar = if (index + 1 > learningString.indices.max) learningString(0) else learningString(index + 1) - input.putScalar(Array[Int](0, learningChars.indexOf(currentChar), index), 1) - labels.putScalar(Array[Int](0, learningChars.indexOf(nextChar), index), 1) - } - new DataSet(input, labels) - } - - logger.info("Build model...") - val model: NeuralNet = { - val model: NeuralNet = NeuralNet(rngSeed = seed, miniBatch = false) - model.add(LSTM(learningChars.length, hiddenSize, Activation.TANH)) - model.add(LSTM(hiddenSize, hiddenSize, Activation.TANH)) - model.add(RnnOutputLayer(hiddenSize, learningChars.length, Activation.SOFTMAX)) - model.compile(LossFunction.MCXENT, OptimizationAlgorithm.STOCHASTIC_GRADIENT_DESCENT, Updater.RMSPROP) - model - } - - val rnn = model.getNetwork - - (0 until epochs).foreach { e => - rnn.fit(trainingData) - } -} diff --git a/contrib/attic/scalnet/src/test/scala/org/deeplearning4j/scalnet/examples/dl4j/recurrent/RNNEmbeddingExample.scala b/contrib/attic/scalnet/src/test/scala/org/deeplearning4j/scalnet/examples/dl4j/recurrent/RNNEmbeddingExample.scala deleted file mode 100644 index 8eb7eff88..000000000 --- a/contrib/attic/scalnet/src/test/scala/org/deeplearning4j/scalnet/examples/dl4j/recurrent/RNNEmbeddingExample.scala +++ /dev/null @@ -1,62 +0,0 @@ -/* - * ****************************************************************************** - * * Copyright (c) 2021 Deeplearning4j Contributors - * * - * * This program and the accompanying materials are made 
available under the - * * terms of the Apache License, Version 2.0 which is available at - * * https://www.apache.org/licenses/LICENSE-2.0. - * * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * * License for the specific language governing permissions and limitations - * * under the License. - * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ -package org.deeplearning4j.scalnet.examples.dl4j.recurrent - -import org.deeplearning4j.nn.conf.inputs.InputType -import org.deeplearning4j.optimize.listeners.ScoreIterationListener -import org.deeplearning4j.scalnet.layers.embeddings.EmbeddingLayer -import org.deeplearning4j.scalnet.layers.recurrent.{ GravesLSTM, RnnOutputLayer } -import org.deeplearning4j.scalnet.models.NeuralNet -import org.nd4j.linalg.activations.Activation -import org.nd4j.linalg.dataset.DataSet -import org.nd4j.linalg.factory.Nd4j -import org.nd4j.linalg.lossfunctions.LossFunctions.LossFunction - -import scala.util.Random - -object RNNEmbeddingExample extends App { - - val nClassesIn = 10 - val batchSize = 3 - val timeSeriesLength = 8 - val inEmbedding = Nd4j.create(batchSize, 1, timeSeriesLength) - val outLabels = Nd4j.create(batchSize, 4, timeSeriesLength) - val seed = 12345 - val rand = new Random(seed) - - val timeSeries: DataSet = { - for (i <- 0 until batchSize; j <- 0 until timeSeriesLength) { - val classIdx = rand.nextInt(nClassesIn) - inEmbedding.putScalar(Array[Int](i, 0, j), classIdx) - val labelIdx = rand.nextInt(batchSize + 1) - outLabels.putScalar(Array[Int](i, labelIdx, j), 1.0) - } - new DataSet(inEmbedding, outLabels) - } - - val model: NeuralNet = { - val model: NeuralNet = NeuralNet(inputType = InputType.recurrent(3, 8), rngSeed = seed) - 
model.add(EmbeddingLayer(nClassesIn, 5)) - model.add(GravesLSTM(5, 7, Activation.SOFTSIGN)) - model.add(RnnOutputLayer(7, 4, Activation.SOFTMAX)) - model.compile(LossFunction.MCXENT) - model - } - - model.fit(timeSeries, 1, List(new ScoreIterationListener(1))) -} diff --git a/contrib/attic/scalnet/src/test/scala/org/deeplearning4j/scalnet/examples/dl4j/recurrent/SequenceClassification.scala b/contrib/attic/scalnet/src/test/scala/org/deeplearning4j/scalnet/examples/dl4j/recurrent/SequenceClassification.scala deleted file mode 100644 index ba0ab8a8b..000000000 --- a/contrib/attic/scalnet/src/test/scala/org/deeplearning4j/scalnet/examples/dl4j/recurrent/SequenceClassification.scala +++ /dev/null @@ -1,58 +0,0 @@ -/* - * ****************************************************************************** - * * Copyright (c) 2021 Deeplearning4j Contributors - * * - * * This program and the accompanying materials are made available under the - * * terms of the Apache License, Version 2.0 which is available at - * * https://www.apache.org/licenses/LICENSE-2.0. - * * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * * License for the specific language governing permissions and limitations - * * under the License. 
- * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ -package org.deeplearning4j.scalnet.examples.dl4j.recurrent - -import org.deeplearning4j.eval.RegressionEvaluation -import org.deeplearning4j.nn.conf.Updater -import org.deeplearning4j.optimize.listeners.ScoreIterationListener -import org.deeplearning4j.scalnet.layers.recurrent.{ Bidirectional, LSTM, RnnOutputLayer } -import org.deeplearning4j.scalnet.logging.Logging -import org.deeplearning4j.scalnet.models.NeuralNet -import org.deeplearning4j.scalnet.utils.SequenceGenerator -import org.nd4j.linalg.activations.Activation -import org.nd4j.linalg.lossfunctions.LossFunctions.LossFunction - -object SequenceClassification extends App with Logging { - - val timesteps = 10 - val hiddenSize = 32 - val epochs = 500 - val testSize = 100 - val scoreFrequency = 10 - - def generateDataset = SequenceGenerator.generate(timesteps) - - logger.info("Build model...") - val model: NeuralNet = NeuralNet() - model.add(Bidirectional(LSTM(timesteps, hiddenSize), Bidirectional.ADD)) - model.add(RnnOutputLayer(hiddenSize, timesteps, Activation.SIGMOID)) - model.compile(LossFunction.MEAN_ABSOLUTE_ERROR, updater = Updater.ADAM) - - logger.info("Train model...") - model.fit(generateDataset, epochs, List(new ScoreIterationListener(scoreFrequency))) - - logger.info("Evaluate model...") - val evaluator = new RegressionEvaluation(timesteps) - for (_ <- 0 until testSize) { - val testData = generateDataset - val trueLabels = testData.getLabels - val predicted = model.predict(testData.getFeatures) - evaluator.eval(trueLabels, predicted) - } - logger.info(s"MAE score: ${evaluator.averageMeanAbsoluteError}") -} diff --git a/contrib/attic/scalnet/src/test/scala/org/deeplearning4j/scalnet/examples/keras/convolution/LeNetMnistExample.scala b/contrib/attic/scalnet/src/test/scala/org/deeplearning4j/scalnet/examples/keras/convolution/LeNetMnistExample.scala deleted file mode 
100644 index 06c477c5e..000000000 --- a/contrib/attic/scalnet/src/test/scala/org/deeplearning4j/scalnet/examples/keras/convolution/LeNetMnistExample.scala +++ /dev/null @@ -1,76 +0,0 @@ -/* - * ****************************************************************************** - * * Copyright (c) 2021 Deeplearning4j Contributors - * * - * * This program and the accompanying materials are made available under the - * * terms of the Apache License, Version 2.0 which is available at - * * https://www.apache.org/licenses/LICENSE-2.0. - * * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * * License for the specific language governing permissions and limitations - * * under the License. - * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ -package org.deeplearning4j.scalnet.examples.keras.convolution - -import org.deeplearning4j.datasets.iterator.impl.MnistDataSetIterator -import org.deeplearning4j.optimize.listeners.ScoreIterationListener -import org.deeplearning4j.scalnet.layers.convolutional.Convolution2D -import org.deeplearning4j.scalnet.layers.core.Dense -import org.deeplearning4j.scalnet.layers.pooling.MaxPooling2D -import org.deeplearning4j.scalnet.layers.reshaping.{ Flatten3D, Unflatten3D } -import org.deeplearning4j.scalnet.logging.Logging -import org.deeplearning4j.scalnet.models.Sequential -import org.deeplearning4j.scalnet.regularizers.L2 -import org.nd4j.linalg.activations.Activation -import org.nd4j.linalg.dataset.api.iterator.DataSetIterator -import org.nd4j.linalg.lossfunctions.LossFunctions.LossFunction - -/** - * Simple LeNet convolutional neural net for MNIST, using - * keras-style Sequential model construction pattern. 
- * - * @author David Kale - */ -object LeNetMnistExample extends App with Logging { - - val height: Int = 28 - val width: Int = 28 - val channels: Int = 1 - val nClasses: Int = 10 - - val batchSize: Int = 64 - val epochs: Int = 10 - val weightDecay: Double = 0.0005 - val seed: Int = 12345 - val scoreFrequency = 100 - - val mnistTrain: DataSetIterator = new MnistDataSetIterator(batchSize, true, seed) - val mnistTest: DataSetIterator = new MnistDataSetIterator(batchSize, false, seed) - - logger.info("Build model...") - val model: Sequential = Sequential(rngSeed = seed) - - model.add(Unflatten3D(List(height, width, channels), nIn = height * width)) - model.add(Convolution2D(20, List(5, 5), channels, regularizer = L2(weightDecay), activation = Activation.RELU)) - model.add(MaxPooling2D(List(2, 2), List(2, 2))) - - model.add(Convolution2D(50, List(5, 5), regularizer = L2(weightDecay), activation = Activation.RELU)) - model.add(MaxPooling2D(List(2, 2), List(2, 2))) - model.add(Flatten3D()) - - model.add(Dense(512, regularizer = L2(weightDecay), activation = Activation.RELU)) - model.add(Dense(nClasses, activation = Activation.SOFTMAX)) - model.compile(LossFunction.NEGATIVELOGLIKELIHOOD) - - logger.info("Train model...") - model.fit(mnistTrain, epochs, List(new ScoreIterationListener(scoreFrequency))) - - logger.info("Evaluate model...") - logger.info(s"Train accuracy = ${model.evaluate(mnistTrain).accuracy}") - logger.info(s"Test accuracy = ${model.evaluate(mnistTest).accuracy}") -} diff --git a/contrib/attic/scalnet/src/test/scala/org/deeplearning4j/scalnet/examples/keras/feedforward/MLPMnistTwoLayerExample.scala b/contrib/attic/scalnet/src/test/scala/org/deeplearning4j/scalnet/examples/keras/feedforward/MLPMnistTwoLayerExample.scala deleted file mode 100644 index 5244ec676..000000000 --- a/contrib/attic/scalnet/src/test/scala/org/deeplearning4j/scalnet/examples/keras/feedforward/MLPMnistTwoLayerExample.scala +++ /dev/null @@ -1,66 +0,0 @@ -/* - * 
****************************************************************************** - * * Copyright (c) 2021 Deeplearning4j Contributors - * * - * * This program and the accompanying materials are made available under the - * * terms of the Apache License, Version 2.0 which is available at - * * https://www.apache.org/licenses/LICENSE-2.0. - * * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * * License for the specific language governing permissions and limitations - * * under the License. - * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ -package org.deeplearning4j.scalnet.examples.keras.feedforward - -import org.deeplearning4j.datasets.iterator.impl.MnistDataSetIterator -import org.deeplearning4j.optimize.listeners.ScoreIterationListener -import org.deeplearning4j.scalnet.layers.core.Dense -import org.deeplearning4j.scalnet.logging.Logging -import org.deeplearning4j.scalnet.models.Sequential -import org.deeplearning4j.scalnet.regularizers.L2 -import org.nd4j.linalg.activations.Activation -import org.nd4j.linalg.dataset.api.iterator.DataSetIterator -import org.nd4j.linalg.lossfunctions.LossFunctions.LossFunction - -/** - * Two-layer MLP for MNIST using keras-style Sequential - * model construction pattern. 
- * - * @author David Kale - */ -object MLPMnistTwoLayerExample extends App with Logging { - - val height: Int = 28 - val width: Int = 28 - val nClasses: Int = 10 - val batchSize: Int = 64 - val hiddenSize = 512 - val seed: Int = 123 - val epochs: Int = 15 - val learningRate: Double = 0.0015 - val decay: Double = 0.005 - val scoreFrequency = 1000 - - val mnistTrain: DataSetIterator = new MnistDataSetIterator(batchSize, true, seed) - val mnistTest: DataSetIterator = new MnistDataSetIterator(batchSize, false, seed) - - logger.info("Build model...") - val model: Sequential = Sequential(rngSeed = seed) - - model.add(Dense(hiddenSize, height * width, activation = Activation.RELU, regularizer = L2(learningRate * decay))) - model.add(Dense(hiddenSize, activation = Activation.RELU, regularizer = L2(learningRate * decay))) - model.add(Dense(nClasses, activation = Activation.SOFTMAX, regularizer = L2(learningRate * decay))) - model.compile(LossFunction.NEGATIVELOGLIKELIHOOD) - - logger.info("Train model...") - model.fit(mnistTrain, epochs, List(new ScoreIterationListener(scoreFrequency))) - - logger.info("Evaluate model...") - logger.info(s"Train accuracy = ${model.evaluate(mnistTrain).accuracy}") - logger.info(s"Test accuracy = ${model.evaluate(mnistTest).accuracy}") -} diff --git a/contrib/attic/scalnet/src/test/scala/org/deeplearning4j/scalnet/it/DL4Test.scala b/contrib/attic/scalnet/src/test/scala/org/deeplearning4j/scalnet/it/DL4Test.scala deleted file mode 100644 index 3c181b8a3..000000000 --- a/contrib/attic/scalnet/src/test/scala/org/deeplearning4j/scalnet/it/DL4Test.scala +++ /dev/null @@ -1,50 +0,0 @@ -/* - * ****************************************************************************** - * * Copyright (c) 2021 Deeplearning4j Contributors - * * - * * This program and the accompanying materials are made available under the - * * terms of the Apache License, Version 2.0 which is available at - * * https://www.apache.org/licenses/LICENSE-2.0. 
- * * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * * License for the specific language governing permissions and limitations - * * under the License. - * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ -package org.deeplearning4j.scalnet.it - -import org.deeplearning4j.scalnet.examples.dl4j.feedforward.IrisCSVExample -import org.deeplearning4j.scalnet.examples.dl4j.recurrent.{ BasicRNNExample, RNNEmbeddingExample } -import org.scalatest.{ Matchers, WordSpec } - -import scala.util.Try - -/** - * A suite of basic, short, non-CPU-heavy integration tests. These only verify that each example runs without errors. - */ -class DL4Test extends WordSpec with Matchers { - - "DL4J integration tests" should { - - "ensure that the Iris example runs without errors" in { - val runExample = Try(IrisCSVExample.main(Array(""))) - assert(runExample.isSuccess) - } - - "ensure that the basic RNN example runs without errors" in { - val runExample = Try(BasicRNNExample.main(Array(""))) - assert(runExample.isSuccess) - } - - "ensure that the RNN embedding example runs without errors" in { - val runExample = Try(RNNEmbeddingExample.main(Array(""))) - assert(runExample.isSuccess) - } - - } - -} diff --git a/contrib/attic/scalnet/src/test/scala/org/deeplearning4j/scalnet/layers/convolutional/Convolution2DTest.scala b/contrib/attic/scalnet/src/test/scala/org/deeplearning4j/scalnet/layers/convolutional/Convolution2DTest.scala deleted file mode 100644 index b9f5bd103..000000000 --- a/contrib/attic/scalnet/src/test/scala/org/deeplearning4j/scalnet/layers/convolutional/Convolution2DTest.scala +++ /dev/null @@ -1,47 +0,0 @@ -/* - * ****************************************************************************** - * * Copyright (c) 2021 Deeplearning4j
Contributors - * * - * * This program and the accompanying materials are made available under the - * * terms of the Apache License, Version 2.0 which is available at - * * https://www.apache.org/licenses/LICENSE-2.0. - * * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * * License for the specific language governing permissions and limitations - * * under the License. - * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ -package org.deeplearning4j.scalnet.layers.convolutional - -import org.deeplearning4j.nn.conf.layers.ConvolutionLayer -import org.scalatest.FunSpec - -/** - * Created by maxpumperla on 19/07/17. - */ -class Convolution2DTest extends FunSpec { - - describe("A 2D convolutional layer with 20 filters and kernel size (5,5)") { - val nFilter = 20 - val kernelSize = List(5, 5) - val conv = Convolution2D(nFilter, kernelSize) - it("should have inputShape List(0)") { - assert(conv.inputShape == List(0)) - } - it("should have an outputShape of List(20)") { - assert(conv.outputShape == List(nFilter)) - } - it("should accept a new input shape when provided") { - val reshapedConv = conv.reshapeInput(List(1, 2, 3)) - assert(reshapedConv.inputShape == List(1, 2, 3)) - } - it("should become a DL4J convolution layer when compiled") { - val compiledConv = conv.compile - assert(compiledConv.isInstanceOf[ConvolutionLayer]) - } - } -} diff --git a/contrib/attic/scalnet/src/test/scala/org/deeplearning4j/scalnet/layers/core/DenseTest.scala b/contrib/attic/scalnet/src/test/scala/org/deeplearning4j/scalnet/layers/core/DenseTest.scala deleted file mode 100644 index 05201246c..000000000 --- a/contrib/attic/scalnet/src/test/scala/org/deeplearning4j/scalnet/layers/core/DenseTest.scala +++ /dev/null @@ -1,68 +0,0 
@@ -/* - * ****************************************************************************** - * * Copyright (c) 2021 Deeplearning4j Contributors - * * - * * This program and the accompanying materials are made available under the - * * terms of the Apache License, Version 2.0 which is available at - * * https://www.apache.org/licenses/LICENSE-2.0. - * * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * * License for the specific language governing permissions and limitations - * * under the License. - * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ -package org.deeplearning4j.scalnet.layers.core - -import org.deeplearning4j.nn.conf.layers.{ DenseLayer, OutputLayer => JOutputLayer } -import org.nd4j.linalg.lossfunctions.LossFunctions.LossFunction -import org.scalatest.{ Matchers, WordSpec } - -class DenseTest extends WordSpec with Matchers { - - "A Dense layer" should { - - "have an input layer of shape (0, 100)" in { - val DenseLayer = Dense(100) - DenseLayer.inputShape shouldBe List(0) - } - - "have an output layer of shape (0, 100)" in { - val DenseLayer = Dense(100) - DenseLayer.outputShape shouldBe List(100) - } - - "compile to a DL4J Dense" in { - val DenseLayer = Dense(100) - val compiledLayer = DenseLayer.compile - compiledLayer.isInstanceOf[DenseLayer] shouldBe true - } - - "not become an output layer when compiled without proper loss" in { - val DenseLayer = Dense(100) - val compiledLayer = DenseLayer.compile - compiledLayer.isInstanceOf[JOutputLayer] shouldBe false - } - - "not become an output layer when converted to output layer without proper loss" in { - val DenseLayer = Dense(100) - val compiledLayer = DenseLayer.toOutputLayer(null) - compiledLayer.isInstanceOf[JOutputLayer]
shouldBe false - } - - "become an output layer when compiled with proper loss" in { - val DenseLayer = Dense(100, lossFunction = Option(LossFunction.NEGATIVELOGLIKELIHOOD)) - val compiledLayer = DenseLayer.compile - compiledLayer.isInstanceOf[JOutputLayer] shouldBe true - } - - "become an output layer when converted to output layer with proper loss" in { - val DenseLayer = Dense(100) - val compiledLayer = DenseLayer.toOutputLayer(LossFunction.NEGATIVELOGLIKELIHOOD) - compiledLayer.isInstanceOf[OutputLayer] shouldBe true - } - } -} diff --git a/contrib/attic/scalnet/src/test/scala/org/deeplearning4j/scalnet/layers/embeddings/EmbeddingLayerTest.scala b/contrib/attic/scalnet/src/test/scala/org/deeplearning4j/scalnet/layers/embeddings/EmbeddingLayerTest.scala deleted file mode 100644 index 8f95af0ca..000000000 --- a/contrib/attic/scalnet/src/test/scala/org/deeplearning4j/scalnet/layers/embeddings/EmbeddingLayerTest.scala +++ /dev/null @@ -1,44 +0,0 @@ -/* - * ****************************************************************************** - * * Copyright (c) 2021 Deeplearning4j Contributors - * * - * * This program and the accompanying materials are made available under the - * * terms of the Apache License, Version 2.0 which is available at - * * https://www.apache.org/licenses/LICENSE-2.0. - * * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * * License for the specific language governing permissions and limitations - * * under the License.
- * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ -package org.deeplearning4j.scalnet.layers.embeddings - -import org.scalatest.{ Matchers, WordSpec } - -class EmbeddingLayerTest extends WordSpec with Matchers { - - "An embedding layer" should { - - "have an input layer of shape (10, 100)" in { - val embeddingLayer = EmbeddingLayer(10, 100) - embeddingLayer.inputShape shouldBe List(10, 100) - } - - "have an output layer of shape (10, 100)" in { - val embeddingLayer = EmbeddingLayer(10, 100) - embeddingLayer.outputShape shouldBe List(100, 10) - } - - "compile to a DL4J EmbeddingLayer" in { - val embeddingLayer = EmbeddingLayer(10, 100) - val compiledLayer = embeddingLayer.compile - compiledLayer.isInstanceOf[org.deeplearning4j.nn.conf.layers.EmbeddingLayer] shouldBe true - } - - } - -} diff --git a/contrib/attic/scalnet/src/test/scala/org/deeplearning4j/scalnet/layers/pooling/AvgPooling2DTest.scala b/contrib/attic/scalnet/src/test/scala/org/deeplearning4j/scalnet/layers/pooling/AvgPooling2DTest.scala deleted file mode 100644 index e14e7750f..000000000 --- a/contrib/attic/scalnet/src/test/scala/org/deeplearning4j/scalnet/layers/pooling/AvgPooling2DTest.scala +++ /dev/null @@ -1,46 +0,0 @@ -/* - * ****************************************************************************** - * * Copyright (c) 2021 Deeplearning4j Contributors - * * - * * This program and the accompanying materials are made available under the - * * terms of the Apache License, Version 2.0 which is available at - * * https://www.apache.org/licenses/LICENSE-2.0. - * * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * * License for the specific language governing permissions and limitations - * * under the License.
- * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ -package org.deeplearning4j.scalnet.layers.pooling - -import org.deeplearning4j.nn.conf.layers.SubsamplingLayer -import org.scalatest.FunSpec - -/** - * Created by maxpumperla on 19/07/17. - */ -class AvgPooling2DTest extends FunSpec { - - describe("A 2D averaging pooling layer with kernel size (5,5)") { - val kernelSize = List(5, 5) - val avgPool = AvgPooling2D(kernelSize) - it("should have inputShape List(0)") { - assert(avgPool.inputShape == List(0)) - } - it("should have empty outputShape") { - assert(avgPool.outputShape == List()) - } - it("should accept a new input shape when provided") { - val reshapedPool = avgPool.reshapeInput(List(1, 2, 3)) - assert(reshapedPool.inputShape == List(1, 2, 3)) - } - it("should become a DL4J pooling layer when compiled") { - val compiledPool = avgPool.compile - assert(compiledPool.isInstanceOf[SubsamplingLayer]) - } - } -} diff --git a/contrib/attic/scalnet/src/test/scala/org/deeplearning4j/scalnet/layers/pooling/MaxPooling2DTest.scala b/contrib/attic/scalnet/src/test/scala/org/deeplearning4j/scalnet/layers/pooling/MaxPooling2DTest.scala deleted file mode 100644 index dc9e89c53..000000000 --- a/contrib/attic/scalnet/src/test/scala/org/deeplearning4j/scalnet/layers/pooling/MaxPooling2DTest.scala +++ /dev/null @@ -1,46 +0,0 @@ -/* - * ****************************************************************************** - * * Copyright (c) 2021 Deeplearning4j Contributors - * * - * * This program and the accompanying materials are made available under the - * * terms of the Apache License, Version 2.0 which is available at - * * https://www.apache.org/licenses/LICENSE-2.0. - * * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. 
See the - * * License for the specific language governing permissions and limitations - * * under the License. - * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ -package org.deeplearning4j.scalnet.layers.pooling - -import org.deeplearning4j.nn.conf.layers.SubsamplingLayer -import org.scalatest.FunSpec - -/** - * Created by maxpumperla on 19/07/17. - */ -class MaxPooling2DTest extends FunSpec { - - describe("A 2D max pooling layer with kernel size (5,5)") { - val kernelSize = List(5, 5) - val maxPool = MaxPooling2D(kernelSize) - it("should have inputShape List(0)") { - assert(maxPool.inputShape == List(0)) - } - it("should have empty outputShape") { - assert(maxPool.outputShape == List()) - } - it("should accept a new input shape when provided") { - val reshapedPool = maxPool.reshapeInput(List(1, 2, 3)) - assert(reshapedPool.inputShape == List(1, 2, 3)) - } - it("should become a DL4J pooling layer when compiled") { - val compiledPool = maxPool.compile - assert(compiledPool.isInstanceOf[SubsamplingLayer]) - } - } -} diff --git a/contrib/attic/scalnet/src/test/scala/org/deeplearning4j/scalnet/layers/recurrent/BidirectionalTest.scala b/contrib/attic/scalnet/src/test/scala/org/deeplearning4j/scalnet/layers/recurrent/BidirectionalTest.scala deleted file mode 100644 index f4bf2c471..000000000 --- a/contrib/attic/scalnet/src/test/scala/org/deeplearning4j/scalnet/layers/recurrent/BidirectionalTest.scala +++ /dev/null @@ -1,39 +0,0 @@ -/* - * ****************************************************************************** - * * Copyright (c) 2021 Deeplearning4j Contributors - * * - * * This program and the accompanying materials are made available under the - * * terms of the Apache License, Version 2.0 which is available at - * * https://www.apache.org/licenses/LICENSE-2.0. 
- * * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * * License for the specific language governing permissions and limitations - * * under the License. - * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ -package org.deeplearning4j.scalnet.layers.recurrent - -import org.scalatest.{ Matchers, WordSpec } - -class BidirectionalTest extends WordSpec with Matchers { - - "A Bidirectional wrapper layer" should { - - "compile to a DL4J Bidirectional wrapper layer with a LSTM" in { - val bidirectionalLSTM = Bidirectional(LSTM(10, 100)) - val compiledLayer = bidirectionalLSTM.compile - compiledLayer.isInstanceOf[org.deeplearning4j.nn.conf.layers.recurrent.Bidirectional] shouldBe true - } - - "compile to a DL4J Bidirectional wrapper layer with a GravesLSTM" in { - val bidirectionalLSTM = Bidirectional(GravesLSTM(10, 100)) - val compiledLayer = bidirectionalLSTM.compile - compiledLayer.isInstanceOf[org.deeplearning4j.nn.conf.layers.recurrent.Bidirectional] shouldBe true - } - - } -} diff --git a/contrib/attic/scalnet/src/test/scala/org/deeplearning4j/scalnet/layers/recurrent/GravesLSTMTest.scala b/contrib/attic/scalnet/src/test/scala/org/deeplearning4j/scalnet/layers/recurrent/GravesLSTMTest.scala deleted file mode 100644 index 4ec351ee4..000000000 --- a/contrib/attic/scalnet/src/test/scala/org/deeplearning4j/scalnet/layers/recurrent/GravesLSTMTest.scala +++ /dev/null @@ -1,43 +0,0 @@ -/* - * ****************************************************************************** - * * Copyright (c) 2021 Deeplearning4j Contributors - * * - * * This program and the accompanying materials are made available under the - * * terms of the Apache License, Version 2.0 which is available at - * * 
https://www.apache.org/licenses/LICENSE-2.0. - * * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * * License for the specific language governing permissions and limitations - * * under the License. - * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ -package org.deeplearning4j.scalnet.layers.recurrent - -import org.scalatest.{ Matchers, WordSpec } - -class GravesLSTMTest extends WordSpec with Matchers { - - "A Graves LSTM layer" should { - - "have an input layer of shape (10, 100)" in { - val gravesLSTMLayer = GravesLSTM(10, 100) - gravesLSTMLayer.inputShape shouldBe List(10, 100) - } - - "have an output layer of shape (100, 10)" in { - val gravesLSTMLayer = GravesLSTM(10, 100) - gravesLSTMLayer.outputShape shouldBe List(100, 10) - } - - "compile to a DL4J GravesLSTM" in { - val gravesLSTMLayer = GravesLSTM(10, 100) - val compiledLayer = gravesLSTMLayer.compile - compiledLayer.isInstanceOf[org.deeplearning4j.nn.conf.layers.GravesLSTM] shouldBe true - } - - } -} diff --git a/contrib/attic/scalnet/src/test/scala/org/deeplearning4j/scalnet/layers/recurrent/LSTMTest.scala b/contrib/attic/scalnet/src/test/scala/org/deeplearning4j/scalnet/layers/recurrent/LSTMTest.scala deleted file mode 100644 index ed0d14a85..000000000 --- a/contrib/attic/scalnet/src/test/scala/org/deeplearning4j/scalnet/layers/recurrent/LSTMTest.scala +++ /dev/null @@ -1,43 +0,0 @@ -/* - * ****************************************************************************** - * * Copyright (c) 2021 Deeplearning4j Contributors - * * - * * This program and the accompanying materials are made available under the - * * terms of the Apache License, Version 2.0 which is available at - * * https://www.apache.org/licenses/LICENSE-2.0. 
- * * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * * License for the specific language governing permissions and limitations - * * under the License. - * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ -package org.deeplearning4j.scalnet.layers.recurrent - -import org.scalatest.{ Matchers, WordSpec } - -class LSTMTest extends WordSpec with Matchers { - - "An LSTM layer" should { - - "have an input layer of shape (10, 100)" in { - val LSTMLayer = LSTM(10, 100) - LSTMLayer.inputShape shouldBe List(10, 100) - } - - "have an output layer of shape (100, 10)" in { - val LSTMLayer = LSTM(10, 100) - LSTMLayer.outputShape shouldBe List(100, 10) - } - - "compile to a DL4J LSTM" in { - val LSTMLayer = LSTM(10, 100) - val compiledLayer = LSTMLayer.compile - compiledLayer.isInstanceOf[org.deeplearning4j.nn.conf.layers.LSTM] shouldBe true - } - - } -} diff --git a/contrib/attic/scalnet/src/test/scala/org/deeplearning4j/scalnet/layers/recurrent/RnnOutputLayerTest.scala b/contrib/attic/scalnet/src/test/scala/org/deeplearning4j/scalnet/layers/recurrent/RnnOutputLayerTest.scala deleted file mode 100644 index f93bccc4c..000000000 --- a/contrib/attic/scalnet/src/test/scala/org/deeplearning4j/scalnet/layers/recurrent/RnnOutputLayerTest.scala +++ /dev/null @@ -1,70 +0,0 @@ -/* - * ****************************************************************************** - * * Copyright (c) 2021 Deeplearning4j Contributors - * * - * * This program and the accompanying materials are made available under the - * * terms of the Apache License, Version 2.0 which is available at - * * https://www.apache.org/licenses/LICENSE-2.0. 
- * * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * * License for the specific language governing permissions and limitations - * * under the License. - * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ -package org.deeplearning4j.scalnet.layers.recurrent - -import org.deeplearning4j.nn.conf.layers.{ OutputLayer => JOutputLayer } - -import org.deeplearning4j.nn.layers.BaseOutputLayer - -import org.deeplearning4j.scalnet.layers.core.OutputLayer - -import org.nd4j.linalg.activations.Activation - -import org.nd4j.linalg.lossfunctions.LossFunctions.LossFunction - -import org.scalatest.{ Matchers, WordSpec } - -class RnnOutputLayerTest extends WordSpec with Matchers { - - "A RnnOutput layer" should { - - "have an input layer of shape (10, 100)" in { - val rnnOutputLayer = RnnOutputLayer(10, 100, Activation.SOFTMAX) - rnnOutputLayer.inputShape shouldBe List(10, 100) - } - - "have an output layer of shape (100, 10)" in { - val rnnOutputLayer = RnnOutputLayer(10, 100, Activation.SOFTMAX) - rnnOutputLayer.outputShape shouldBe List(100, 10) - } - - "compile to a DL4J RnnOutputLayer without loss" in { - val rnnOutputLayer = RnnOutputLayer(10, 100, Activation.SOFTMAX) - val compiledLayer = rnnOutputLayer.compile - compiledLayer.isInstanceOf[org.deeplearning4j.nn.conf.layers.RnnOutputLayer] shouldBe true - } - - "compile to a DL4J RnnOutputLayer with loss" in { - val rnnOutputLayer = RnnOutputLayer(10, 100, Activation.SOFTMAX, Option(LossFunction.MCXENT)) - val compiledLayer = rnnOutputLayer.compile - compiledLayer.isInstanceOf[org.deeplearning4j.nn.conf.layers.RnnOutputLayer] shouldBe true - } - - "not become an output layer when instantiated without proper loss" in { - val rnnOutputLayer = RnnOutputLayer(10, 100, 
Activation.SOFTMAX) - rnnOutputLayer.output.isOutput shouldBe false - } - - "become an output layer when instantiated with proper loss" in { - val rnnOutputLayer = RnnOutputLayer(10, 100, Activation.SOFTMAX, Option(LossFunction.MCXENT)) - rnnOutputLayer.output.isOutput shouldBe true - } - - "become an output layer when converted to output layer with proper loss" in { - val rnnOutputLayer = RnnOutputLayer(10, 100, Activation.SOFTMAX) - val compiledLayer = rnnOutputLayer.toOutputLayer(LossFunction.MCXENT) - compiledLayer.isInstanceOf[OutputLayer] shouldBe true - } - - } -} diff --git a/contrib/attic/scalnet/src/test/scala/org/deeplearning4j/scalnet/layers/reshaping/Flatten3DTest.scala b/contrib/attic/scalnet/src/test/scala/org/deeplearning4j/scalnet/layers/reshaping/Flatten3DTest.scala deleted file mode 100644 index b1c283cd0..000000000 --- a/contrib/attic/scalnet/src/test/scala/org/deeplearning4j/scalnet/layers/reshaping/Flatten3DTest.scala +++ /dev/null @@ -1,52 +0,0 @@ -/* - * ****************************************************************************** - * * Copyright (c) 2021 Deeplearning4j Contributors - * * - * * This program and the accompanying materials are made available under the - * * terms of the Apache License, Version 2.0 which is available at - * * https://www.apache.org/licenses/LICENSE-2.0. - * * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * * License for the specific language governing permissions and limitations - * * under the License. - * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ -package org.deeplearning4j.scalnet.layers.reshaping - -import org.deeplearning4j.nn.conf.InputPreProcessor -import org.scalatest.FunSpec - -/** - * Created by maxpumperla on 19/07/17. 
- */ -class Flatten3DTest extends FunSpec { - - describe("A 3D flatten layer with output dim 20") { - val outShape = List(20) - val flatten = Flatten3D(outShape) - it("should have output shape as input shape") { - assert(flatten.inputShape == outShape) - } - it("should have outputShape as provided") { - assert(flatten.outputShape == outShape) - } - it("should accept a new input shape when provided") { - val reshapedFlatten = flatten.reshapeInput(List(10, 2, 10)) - assert(reshapedFlatten.inputShape == List(10, 2, 10)) - } - it("should not compile when input shape is not 3D") { - assertThrows[java.lang.IllegalArgumentException] { - flatten.compile - } - } - it("should become a DL4J InputPreProcessor when compiled") { - val reshapedFlatten = flatten.reshapeInput(List(10, 2, 10)) - val compiledFlatten = reshapedFlatten.compile - assert(compiledFlatten.isInstanceOf[InputPreProcessor]) - } - } -} diff --git a/contrib/attic/scalnet/src/test/scala/org/deeplearning4j/scalnet/layers/reshaping/ReshapeTest.scala b/contrib/attic/scalnet/src/test/scala/org/deeplearning4j/scalnet/layers/reshaping/ReshapeTest.scala deleted file mode 100644 index ed50dc571..000000000 --- a/contrib/attic/scalnet/src/test/scala/org/deeplearning4j/scalnet/layers/reshaping/ReshapeTest.scala +++ /dev/null @@ -1,47 +0,0 @@ -/* - * ****************************************************************************** - * * Copyright (c) 2021 Deeplearning4j Contributors - * * - * * This program and the accompanying materials are made available under the - * * terms of the Apache License, Version 2.0 which is available at - * * https://www.apache.org/licenses/LICENSE-2.0. - * * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * * License for the specific language governing permissions and limitations - * * under the License. 
- * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ -package org.deeplearning4j.scalnet.layers.reshaping - -import org.deeplearning4j.nn.conf.InputPreProcessor -import org.scalatest.FunSpec - -/** - * Created by maxpumperla on 19/07/17. - */ -class ReshapeTest extends FunSpec { - - describe("A reshape layer with in and out shapes") { - val inShape = List(20, 10) - val outShape = List(10, 20) - val reshape = Reshape(outShape, inShape) - it("should have inputShape as provided") { - assert(reshape.inputShape == inShape) - } - it("should have outputShape as provided") { - assert(reshape.outputShape == outShape) - } - it("should accept a new input shape when provided") { - val reshaped = reshape.reshapeInput(List(10, 2, 10)) - assert(reshaped.inputShape == List(10, 2, 10)) - } - it("should become a DL4J InputPreProcessor when compiled") { - val compiledReshape = reshape.compile - assert(compiledReshape.isInstanceOf[InputPreProcessor]) - } - } -} diff --git a/contrib/attic/scalnet/src/test/scala/org/deeplearning4j/scalnet/layers/reshaping/Unflatten3DTest.scala b/contrib/attic/scalnet/src/test/scala/org/deeplearning4j/scalnet/layers/reshaping/Unflatten3DTest.scala deleted file mode 100644 index c2da8f6dc..000000000 --- a/contrib/attic/scalnet/src/test/scala/org/deeplearning4j/scalnet/layers/reshaping/Unflatten3DTest.scala +++ /dev/null @@ -1,52 +0,0 @@ -/* - * ****************************************************************************** - * * Copyright (c) 2021 Deeplearning4j Contributors - * * - * * This program and the accompanying materials are made available under the - * * terms of the Apache License, Version 2.0 which is available at - * * https://www.apache.org/licenses/LICENSE-2.0. 
- * * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * * License for the specific language governing permissions and limitations - * * under the License. - * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ -package org.deeplearning4j.scalnet.layers.reshaping - -import org.deeplearning4j.nn.conf.InputPreProcessor -import org.scalatest.FunSpec - -/** - * Created by maxpumperla on 19/07/17. - */ -class Unflatten3DTest extends FunSpec { - - describe("A 3D unflatten layer with output dimensions (10, 20, 30)") { - val outShape = List(10, 20, 30) - val unflatten = Unflatten3D(outShape) - it("should have inputShape List(0)") { - assert(unflatten.inputShape == List(0)) - } - it("should have outputShape as provided") { - assert(unflatten.outputShape == outShape) - } - it("should accept a new input shape when provided") { - val reshapedUnflatten = unflatten.reshapeInput(List(10 * 20 * 30)) - assert(reshapedUnflatten.inputShape == List(10 * 20 * 30)) - } - it("should not compile if input shape is not set properly") { - assertThrows[java.lang.IllegalArgumentException] { - unflatten.compile - } - } - it("should become a DL4J InputPreProcessor when compiled correctly") { - val reshapedUnflatten = unflatten.reshapeInput(List(10 * 20 * 30)) - val compiledUnflatten = reshapedUnflatten.compile - assert(compiledUnflatten.isInstanceOf[InputPreProcessor]) - } - } -} diff --git a/contrib/attic/scalnet/src/test/scala/org/deeplearning4j/scalnet/models/NeuralNetTest.scala b/contrib/attic/scalnet/src/test/scala/org/deeplearning4j/scalnet/models/NeuralNetTest.scala deleted file mode 100644 index 105c67ecf..000000000 --- a/contrib/attic/scalnet/src/test/scala/org/deeplearning4j/scalnet/models/NeuralNetTest.scala +++ /dev/null 
@@ -1,55 +0,0 @@ -/* - * ****************************************************************************** - * * Copyright (c) 2021 Deeplearning4j Contributors - * * - * * This program and the accompanying materials are made available under the - * * terms of the Apache License, Version 2.0 which is available at - * * https://www.apache.org/licenses/LICENSE-2.0. - * * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * * License for the specific language governing permissions and limitations - * * under the License. - * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ -package org.deeplearning4j.scalnet.models - -import org.deeplearning4j.scalnet.layers.core.{ Dense, OutputLayer } -import org.nd4j.linalg.lossfunctions.LossFunctions.LossFunction -import org.scalatest.{ BeforeAndAfter, FunSpec } - -/** - * Created by maxpumperla on 19/07/17. 
- */ -class NeuralNetTest extends FunSpec with BeforeAndAfter { - - var model: NeuralNet = NeuralNet() - val shape = 100 - - before { - model = NeuralNet() - } - - describe("A NeuralNet network") { - - it("without layers should produce an IllegalArgumentException when compiled") { - assertThrows[java.lang.IllegalArgumentException] { - model.compile(null) - } - } - it("without buildOutput called should not have an output layer") { - model.add(Dense(shape, shape)) - assert(!model.getLayers.last.asInstanceOf[OutputLayer].output.isOutput) - } - - it("with buildOutput called should have an output layer") { - model.add(Dense(shape, shape)) - model.buildOutput(LossFunction.NEGATIVELOGLIKELIHOOD) - assert(model.getLayers.last.asInstanceOf[OutputLayer].output.isOutput) - } - - } -} diff --git a/contrib/attic/scalnet/src/test/scala/org/deeplearning4j/scalnet/models/SequentialTest.scala b/contrib/attic/scalnet/src/test/scala/org/deeplearning4j/scalnet/models/SequentialTest.scala deleted file mode 100644 index aee3382fa..000000000 --- a/contrib/attic/scalnet/src/test/scala/org/deeplearning4j/scalnet/models/SequentialTest.scala +++ /dev/null @@ -1,91 +0,0 @@ -/* - * ****************************************************************************** - * * Copyright (c) 2021 Deeplearning4j Contributors - * * - * * This program and the accompanying materials are made available under the - * * terms of the Apache License, Version 2.0 which is available at - * * https://www.apache.org/licenses/LICENSE-2.0. - * * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * * License for the specific language governing permissions and limitations - * * under the License. 
- * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ -package org.deeplearning4j.scalnet.models - -import org.deeplearning4j.scalnet.layers.convolutional.Convolution2D -import org.deeplearning4j.scalnet.layers.core.{ Dense, OutputLayer } -import org.deeplearning4j.scalnet.layers.pooling.MaxPooling2D -import org.deeplearning4j.scalnet.layers.reshaping.{ Flatten3D, Unflatten3D } -import org.deeplearning4j.scalnet.regularizers.L2 -import org.nd4j.linalg.activations.Activation -import org.nd4j.linalg.lossfunctions.LossFunctions.LossFunction -import org.scalatest._ - -/** - * Created by maxpumperla on 29/06/17. - */ -class SequentialTest extends FunSpec with BeforeAndAfter { - - var model: Sequential = Sequential() - val shape = 100 - val wrongInputShape = 10 - - val height: Int = 28 - val width: Int = 28 - val channels: Int = 1 - val nClasses: Int = 10 - - val weightDecay: Double = 0.005 - - before { - model = Sequential() - } - - describe("A Sequential network") { - - it("without layers should produce an IllegalArgumentException when compiled") { - assertThrows[java.lang.IllegalArgumentException] { - model.compile(null) - } - } - - it("without buildOutput called should not have an output layer") { - model.add(Dense(shape, shape)) - assert(!model.getLayers.last.asInstanceOf[OutputLayer].output.isOutput) - } - - it("with buildOutput called should have an output layer") { - model.add(Dense(shape, shape)) - model.buildOutput(LossFunction.NEGATIVELOGLIKELIHOOD) - assert(model.getLayers.last.asInstanceOf[OutputLayer].output.isOutput) - } - - it("should infer the correct shape of an incorrectly initialized layer") { - model.add(Dense(shape, shape)) - model.add(Dense(shape, wrongInputShape)) - assert(model.getLayers.last.inputShape == List(shape)) - } - - it("should propagate the correct shape of all layers and preprocessors") { - model.add(Unflatten3D(List(height, width, channels), nIn = height 
* width)) - model.add(Convolution2D(20, List(5, 5), channels, regularizer = L2(weightDecay), activation = Activation.RELU)) - model.add(MaxPooling2D(List(2, 2), List(2, 2))) - - model.add(Convolution2D(50, List(5, 5), regularizer = L2(weightDecay), activation = Activation.RELU)) - model.add(MaxPooling2D(List(2, 2), List(2, 2))) - model.add(Flatten3D()) - - val preprocessorOutShapes = model.getPreprocessors.values.map(_.outputShape) - assert(preprocessorOutShapes == List(List(height, width, channels), List(4 * 4 * 50))) - - val layerOutShapes = model.getLayers.map(_.outputShape) - assert(layerOutShapes == List(List(24, 24, 20), List(12, 12, 20), List(8, 8, 50), List(4, 4, 50))) - - } - } -} diff --git a/contrib/attic/scalnet/src/test/scala/org/deeplearning4j/scalnet/utils/SequenceGenerator.scala b/contrib/attic/scalnet/src/test/scala/org/deeplearning4j/scalnet/utils/SequenceGenerator.scala deleted file mode 100644 index 3702a5843..000000000 --- a/contrib/attic/scalnet/src/test/scala/org/deeplearning4j/scalnet/utils/SequenceGenerator.scala +++ /dev/null @@ -1,42 +0,0 @@ -/* - * ****************************************************************************** - * * Copyright (c) 2021 Deeplearning4j Contributors - * * - * * This program and the accompanying materials are made available under the - * * terms of the Apache License, Version 2.0 which is available at - * * https://www.apache.org/licenses/LICENSE-2.0. - * * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * * License for the specific language governing permissions and limitations - * * under the License. 
- * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ -package org.deeplearning4j.scalnet.utils - -import org.nd4j.linalg.dataset.DataSet -import org.nd4j.linalg.factory.Nd4j - -/** - * A sequence generator that outputs toy examples for sequence classification. - * Features are random values between 0 and 1, and the class is based on whether - * the cumulative sum has crossed 'timesteps * threshold'. - * i.e. for 10 timesteps and a 0.25 threshold: [0.6 0.2 0.9 0.9 0.3 0.6 0.8 0.1 0.8 0.2] [0 0 0 1 1 1 1 1 1 1] - */ -object SequenceGenerator { - - def generate(timesteps: Int, threshold: Double = 0.25): DataSet = { - val x = Nd4j.rand(1, timesteps) - val y = Nd4j.create(1, timesteps) - for (i <- 0 until timesteps) { - val cumulativeSum = Nd4j.cumsum(x.getRow(0), 1) - val limit = timesteps * threshold - y.putScalar(0, i, if (cumulativeSum.getDouble(0l, i) > limit) 1 else 0) - } - new DataSet(x.reshape(1, timesteps, 1), y.reshape(1, timesteps, 1)) - } - -} diff --git a/contrib/codegen-tools/codegen/src/main/java/org/nd4j/codegen/Namespace.java b/contrib/codegen-tools/codegen/src/main/java/org/nd4j/codegen/Namespace.java index 93dffaea5..ae5c79972 100644 --- a/contrib/codegen-tools/codegen/src/main/java/org/nd4j/codegen/Namespace.java +++ b/contrib/codegen-tools/codegen/src/main/java/org/nd4j/codegen/Namespace.java @@ -1,3 +1,23 @@ +/* + * ****************************************************************************** + * * + * * + * * This program and the accompanying materials are made available under the + * * terms of the Apache License, Version 2.0 which is available at + * * https://www.apache.org/licenses/LICENSE-2.0. + * * + * * See the NOTICE file distributed with this work for additional + * * information regarding copyright ownership. 
+ * * Unless required by applicable law or agreed to in writing, software + * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT + * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the + * * License for the specific language governing permissions and limitations + * * under the License. + * * + * * SPDX-License-Identifier: Apache-2.0 + * ***************************************************************************** + */ + package org.nd4j.codegen; import org.nd4j.codegen.api.NamespaceOps; diff --git a/contrib/codegen-tools/codegen/src/main/java/org/nd4j/codegen/cli/CLI.java b/contrib/codegen-tools/codegen/src/main/java/org/nd4j/codegen/cli/CLI.java index 8ef386604..8d659cc7b 100644 --- a/contrib/codegen-tools/codegen/src/main/java/org/nd4j/codegen/cli/CLI.java +++ b/contrib/codegen-tools/codegen/src/main/java/org/nd4j/codegen/cli/CLI.java @@ -1,3 +1,23 @@ +/* + * ****************************************************************************** + * * + * * + * * This program and the accompanying materials are made available under the + * * terms of the Apache License, Version 2.0 which is available at + * * https://www.apache.org/licenses/LICENSE-2.0. + * * + * * See the NOTICE file distributed with this work for additional + * * information regarding copyright ownership. + * * Unless required by applicable law or agreed to in writing, software + * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT + * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the + * * License for the specific language governing permissions and limitations + * * under the License. 
+ * * + * * SPDX-License-Identifier: Apache-2.0 + * ***************************************************************************** + */ + package org.nd4j.codegen.cli; import com.beust.jcommander.*; diff --git a/contrib/codegen-tools/codegen/src/main/java/org/nd4j/codegen/impl/cpp/CppGenerator.java b/contrib/codegen-tools/codegen/src/main/java/org/nd4j/codegen/impl/cpp/CppGenerator.java index 5e77c1b61..0ae3538b4 100644 --- a/contrib/codegen-tools/codegen/src/main/java/org/nd4j/codegen/impl/cpp/CppGenerator.java +++ b/contrib/codegen-tools/codegen/src/main/java/org/nd4j/codegen/impl/cpp/CppGenerator.java @@ -1,3 +1,23 @@ +/* + * ****************************************************************************** + * * + * * + * * This program and the accompanying materials are made available under the + * * terms of the Apache License, Version 2.0 which is available at + * * https://www.apache.org/licenses/LICENSE-2.0. + * * + * * See the NOTICE file distributed with this work for additional + * * information regarding copyright ownership. + * * Unless required by applicable law or agreed to in writing, software + * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT + * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the + * * License for the specific language governing permissions and limitations + * * under the License. 
+ * * + * * SPDX-License-Identifier: Apache-2.0 + * ***************************************************************************** + */ + package org.nd4j.codegen.impl.cpp; import org.apache.commons.io.FileUtils; diff --git a/contrib/codegen-tools/codegen/src/main/java/org/nd4j/codegen/impl/java/DocsGenerator.java b/contrib/codegen-tools/codegen/src/main/java/org/nd4j/codegen/impl/java/DocsGenerator.java index 00854e8cf..d90d20d49 100644 --- a/contrib/codegen-tools/codegen/src/main/java/org/nd4j/codegen/impl/java/DocsGenerator.java +++ b/contrib/codegen-tools/codegen/src/main/java/org/nd4j/codegen/impl/java/DocsGenerator.java @@ -1,3 +1,23 @@ +/* + * ****************************************************************************** + * * + * * + * * This program and the accompanying materials are made available under the + * * terms of the Apache License, Version 2.0 which is available at + * * https://www.apache.org/licenses/LICENSE-2.0. + * * + * * See the NOTICE file distributed with this work for additional + * * information regarding copyright ownership. + * * Unless required by applicable law or agreed to in writing, software + * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT + * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the + * * License for the specific language governing permissions and limitations + * * under the License. 
+ * * + * * SPDX-License-Identifier: Apache-2.0 + * ***************************************************************************** + */ + package org.nd4j.codegen.impl.java; import com.squareup.javapoet.MethodSpec; diff --git a/contrib/codegen-tools/codegen/src/main/java/org/nd4j/codegen/impl/java/JavaPoetGenerator.java b/contrib/codegen-tools/codegen/src/main/java/org/nd4j/codegen/impl/java/JavaPoetGenerator.java index e671f2d37..7f445ed4b 100644 --- a/contrib/codegen-tools/codegen/src/main/java/org/nd4j/codegen/impl/java/JavaPoetGenerator.java +++ b/contrib/codegen-tools/codegen/src/main/java/org/nd4j/codegen/impl/java/JavaPoetGenerator.java @@ -1,3 +1,23 @@ +/* + * ****************************************************************************** + * * + * * + * * This program and the accompanying materials are made available under the + * * terms of the Apache License, Version 2.0 which is available at + * * https://www.apache.org/licenses/LICENSE-2.0. + * * + * * See the NOTICE file distributed with this work for additional + * * information regarding copyright ownership. + * * Unless required by applicable law or agreed to in writing, software + * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT + * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the + * * License for the specific language governing permissions and limitations + * * under the License. 
+ * * + * * SPDX-License-Identifier: Apache-2.0 + * ***************************************************************************** + */ + package org.nd4j.codegen.impl.java; import org.apache.commons.lang3.StringUtils; diff --git a/contrib/codegen-tools/codegen/src/main/java/org/nd4j/codegen/impl/java/Nd4jNamespaceGenerator.java b/contrib/codegen-tools/codegen/src/main/java/org/nd4j/codegen/impl/java/Nd4jNamespaceGenerator.java index 383304292..cf3ae44e9 100644 --- a/contrib/codegen-tools/codegen/src/main/java/org/nd4j/codegen/impl/java/Nd4jNamespaceGenerator.java +++ b/contrib/codegen-tools/codegen/src/main/java/org/nd4j/codegen/impl/java/Nd4jNamespaceGenerator.java @@ -1,3 +1,23 @@ +/* + * ****************************************************************************** + * * + * * + * * This program and the accompanying materials are made available under the + * * terms of the Apache License, Version 2.0 which is available at + * * https://www.apache.org/licenses/LICENSE-2.0. + * * + * * See the NOTICE file distributed with this work for additional + * * information regarding copyright ownership. + * * Unless required by applicable law or agreed to in writing, software + * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT + * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the + * * License for the specific language governing permissions and limitations + * * under the License. 
+ * * + * * SPDX-License-Identifier: Apache-2.0 + * ***************************************************************************** + */ + package org.nd4j.codegen.impl.java; import com.squareup.javapoet.*; diff --git a/contrib/codegen-tools/codegen/src/main/java/org/nd4j/codegen/impl/python/PythonGenerator.java b/contrib/codegen-tools/codegen/src/main/java/org/nd4j/codegen/impl/python/PythonGenerator.java index 0b99465be..cf627a51b 100644 --- a/contrib/codegen-tools/codegen/src/main/java/org/nd4j/codegen/impl/python/PythonGenerator.java +++ b/contrib/codegen-tools/codegen/src/main/java/org/nd4j/codegen/impl/python/PythonGenerator.java @@ -1,3 +1,23 @@ +/* + * ****************************************************************************** + * * + * * + * * This program and the accompanying materials are made available under the + * * terms of the Apache License, Version 2.0 which is available at + * * https://www.apache.org/licenses/LICENSE-2.0. + * * + * * See the NOTICE file distributed with this work for additional + * * information regarding copyright ownership. + * * Unless required by applicable law or agreed to in writing, software + * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT + * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the + * * License for the specific language governing permissions and limitations + * * under the License. 
+ * * + * * SPDX-License-Identifier: Apache-2.0 + * ***************************************************************************** + */ + package org.nd4j.codegen.impl.python; import org.apache.commons.io.FileUtils; diff --git a/contrib/codegen-tools/codegen/src/main/java/org/nd4j/codegen/ir/SerializationTest.java b/contrib/codegen-tools/codegen/src/main/java/org/nd4j/codegen/ir/SerializationTest.java index 1322d8fc4..f41bd93a6 100644 --- a/contrib/codegen-tools/codegen/src/main/java/org/nd4j/codegen/ir/SerializationTest.java +++ b/contrib/codegen-tools/codegen/src/main/java/org/nd4j/codegen/ir/SerializationTest.java @@ -1,3 +1,23 @@ +/* + * ****************************************************************************** + * * + * * + * * This program and the accompanying materials are made available under the + * * terms of the Apache License, Version 2.0 which is available at + * * https://www.apache.org/licenses/LICENSE-2.0. + * * + * * See the NOTICE file distributed with this work for additional + * * information regarding copyright ownership. + * * Unless required by applicable law or agreed to in writing, software + * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT + * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the + * * License for the specific language governing permissions and limitations + * * under the License. 
+ * * + * * SPDX-License-Identifier: Apache-2.0 + * ***************************************************************************** + */ + package org.nd4j.codegen.ir; public class SerializationTest { diff --git a/contrib/codegen-tools/codegen/src/main/java/org/nd4j/codegen/util/GenUtil.java b/contrib/codegen-tools/codegen/src/main/java/org/nd4j/codegen/util/GenUtil.java index 80491faf1..c92df439f 100644 --- a/contrib/codegen-tools/codegen/src/main/java/org/nd4j/codegen/util/GenUtil.java +++ b/contrib/codegen-tools/codegen/src/main/java/org/nd4j/codegen/util/GenUtil.java @@ -1,3 +1,23 @@ +/* + * ****************************************************************************** + * * + * * + * * This program and the accompanying materials are made available under the + * * terms of the Apache License, Version 2.0 which is available at + * * https://www.apache.org/licenses/LICENSE-2.0. + * * + * * See the NOTICE file distributed with this work for additional + * * information regarding copyright ownership. + * * Unless required by applicable law or agreed to in writing, software + * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT + * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the + * * License for the specific language governing permissions and limitations + * * under the License. 
+ * * + * * SPDX-License-Identifier: Apache-2.0 + * ***************************************************************************** + */ + package org.nd4j.codegen.util; import org.nd4j.codegen.api.Op; diff --git a/contrib/codegen-tools/codegen/src/main/java/org/nd4j/codegen/util/JsonMapper.java b/contrib/codegen-tools/codegen/src/main/java/org/nd4j/codegen/util/JsonMapper.java index d475ccb4c..6728a290a 100644 --- a/contrib/codegen-tools/codegen/src/main/java/org/nd4j/codegen/util/JsonMapper.java +++ b/contrib/codegen-tools/codegen/src/main/java/org/nd4j/codegen/util/JsonMapper.java @@ -1,3 +1,23 @@ +/* + * ****************************************************************************** + * * + * * + * * This program and the accompanying materials are made available under the + * * terms of the Apache License, Version 2.0 which is available at + * * https://www.apache.org/licenses/LICENSE-2.0. + * * + * * See the NOTICE file distributed with this work for additional + * * information regarding copyright ownership. + * * Unless required by applicable law or agreed to in writing, software + * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT + * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the + * * License for the specific language governing permissions and limitations + * * under the License. 
+ * * + * * SPDX-License-Identifier: Apache-2.0 + * ***************************************************************************** + */ + package org.nd4j.codegen.util; diff --git a/contrib/codegen-tools/codegen/src/main/kotlin/org/nd4j/codegen/api/CodeComponent.kt b/contrib/codegen-tools/codegen/src/main/kotlin/org/nd4j/codegen/api/CodeComponent.kt index 4a7e9328a..d0cb1213a 100644 --- a/contrib/codegen-tools/codegen/src/main/kotlin/org/nd4j/codegen/api/CodeComponent.kt +++ b/contrib/codegen-tools/codegen/src/main/kotlin/org/nd4j/codegen/api/CodeComponent.kt @@ -1,3 +1,23 @@ +/* + * ****************************************************************************** + * * + * * + * * This program and the accompanying materials are made available under the + * * terms of the Apache License, Version 2.0 which is available at + * * https://www.apache.org/licenses/LICENSE-2.0. + * * + * * See the NOTICE file distributed with this work for additional + * * information regarding copyright ownership. + * * Unless required by applicable law or agreed to in writing, software + * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT + * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the + * * License for the specific language governing permissions and limitations + * * under the License. 
+ * * + * * SPDX-License-Identifier: Apache-2.0 + * ***************************************************************************** + */ + package org.nd4j.codegen.api enum class CodeComponent { CLASS_DOC, CONSTRUCTOR, OP_CREATOR } \ No newline at end of file diff --git a/contrib/codegen-tools/codegen/src/main/kotlin/org/nd4j/codegen/api/DataType.kt b/contrib/codegen-tools/codegen/src/main/kotlin/org/nd4j/codegen/api/DataType.kt index 89d8c6a34..bf77e7a08 100644 --- a/contrib/codegen-tools/codegen/src/main/kotlin/org/nd4j/codegen/api/DataType.kt +++ b/contrib/codegen-tools/codegen/src/main/kotlin/org/nd4j/codegen/api/DataType.kt @@ -1,3 +1,23 @@ +/* + * ****************************************************************************** + * * + * * + * * This program and the accompanying materials are made available under the + * * terms of the Apache License, Version 2.0 which is available at + * * https://www.apache.org/licenses/LICENSE-2.0. + * * + * * See the NOTICE file distributed with this work for additional + * * information regarding copyright ownership. + * * Unless required by applicable law or agreed to in writing, software + * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT + * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the + * * License for the specific language governing permissions and limitations + * * under the License. 
+ * * + * * SPDX-License-Identifier: Apache-2.0 + * ***************************************************************************** + */ + package org.nd4j.codegen.api enum class DataType { diff --git a/contrib/codegen-tools/codegen/src/main/kotlin/org/nd4j/codegen/api/Language.kt b/contrib/codegen-tools/codegen/src/main/kotlin/org/nd4j/codegen/api/Language.kt index a1c7bc6dc..7ca3fe673 100644 --- a/contrib/codegen-tools/codegen/src/main/kotlin/org/nd4j/codegen/api/Language.kt +++ b/contrib/codegen-tools/codegen/src/main/kotlin/org/nd4j/codegen/api/Language.kt @@ -1,3 +1,23 @@ +/* + * ****************************************************************************** + * * + * * + * * This program and the accompanying materials are made available under the + * * terms of the Apache License, Version 2.0 which is available at + * * https://www.apache.org/licenses/LICENSE-2.0. + * * + * * See the NOTICE file distributed with this work for additional + * * information regarding copyright ownership. + * * Unless required by applicable law or agreed to in writing, software + * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT + * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the + * * License for the specific language governing permissions and limitations + * * under the License. 
+ * * + * * SPDX-License-Identifier: Apache-2.0 + * ***************************************************************************** + */ + package org.nd4j.codegen.api enum class Language { ANY, JAVA, SCALA, KOTLIN, PYTHON, CPP } \ No newline at end of file diff --git a/contrib/codegen-tools/codegen/src/main/kotlin/org/nd4j/codegen/api/LossReduce.kt b/contrib/codegen-tools/codegen/src/main/kotlin/org/nd4j/codegen/api/LossReduce.kt index 9c0276603..054a12c29 100644 --- a/contrib/codegen-tools/codegen/src/main/kotlin/org/nd4j/codegen/api/LossReduce.kt +++ b/contrib/codegen-tools/codegen/src/main/kotlin/org/nd4j/codegen/api/LossReduce.kt @@ -1,3 +1,23 @@ +/* + * ****************************************************************************** + * * + * * + * * This program and the accompanying materials are made available under the + * * terms of the Apache License, Version 2.0 which is available at + * * https://www.apache.org/licenses/LICENSE-2.0. + * * + * * See the NOTICE file distributed with this work for additional + * * information regarding copyright ownership. + * * Unless required by applicable law or agreed to in writing, software + * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT + * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the + * * License for the specific language governing permissions and limitations + * * under the License. 
+ * * + * * SPDX-License-Identifier: Apache-2.0 + * ***************************************************************************** + */ + package org.nd4j.codegen.api /** diff --git a/contrib/codegen-tools/codegen/src/main/kotlin/org/nd4j/codegen/api/Namespace.kt b/contrib/codegen-tools/codegen/src/main/kotlin/org/nd4j/codegen/api/Namespace.kt index 528fa447c..39e7cbb63 100644 --- a/contrib/codegen-tools/codegen/src/main/kotlin/org/nd4j/codegen/api/Namespace.kt +++ b/contrib/codegen-tools/codegen/src/main/kotlin/org/nd4j/codegen/api/Namespace.kt @@ -1,3 +1,23 @@ +/* + * ****************************************************************************** + * * + * * + * * This program and the accompanying materials are made available under the + * * terms of the Apache License, Version 2.0 which is available at + * * https://www.apache.org/licenses/LICENSE-2.0. + * * + * * See the NOTICE file distributed with this work for additional + * * information regarding copyright ownership. + * * Unless required by applicable law or agreed to in writing, software + * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT + * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the + * * License for the specific language governing permissions and limitations + * * under the License. 
+ * * + * * SPDX-License-Identifier: Apache-2.0 + * ***************************************************************************** + */ + package org.nd4j.codegen.api enum class Namespace { diff --git a/contrib/codegen-tools/codegen/src/main/kotlin/org/nd4j/codegen/api/NamespaceOps.kt b/contrib/codegen-tools/codegen/src/main/kotlin/org/nd4j/codegen/api/NamespaceOps.kt index d07d8f632..eb5e756f2 100644 --- a/contrib/codegen-tools/codegen/src/main/kotlin/org/nd4j/codegen/api/NamespaceOps.kt +++ b/contrib/codegen-tools/codegen/src/main/kotlin/org/nd4j/codegen/api/NamespaceOps.kt @@ -1,3 +1,23 @@ +/* + * ****************************************************************************** + * * + * * + * * This program and the accompanying materials are made available under the + * * terms of the Apache License, Version 2.0 which is available at + * * https://www.apache.org/licenses/LICENSE-2.0. + * * + * * See the NOTICE file distributed with this work for additional + * * information regarding copyright ownership. + * * Unless required by applicable law or agreed to in writing, software + * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT + * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the + * * License for the specific language governing permissions and limitations + * * under the License. 
+ * * + * * SPDX-License-Identifier: Apache-2.0 + * ***************************************************************************** + */ + package org.nd4j.codegen.api data class NamespaceOps @JvmOverloads constructor( diff --git a/contrib/codegen-tools/codegen/src/main/kotlin/org/nd4j/codegen/api/Op.kt b/contrib/codegen-tools/codegen/src/main/kotlin/org/nd4j/codegen/api/Op.kt index 760952ce4..97799ee45 100644 --- a/contrib/codegen-tools/codegen/src/main/kotlin/org/nd4j/codegen/api/Op.kt +++ b/contrib/codegen-tools/codegen/src/main/kotlin/org/nd4j/codegen/api/Op.kt @@ -1,3 +1,23 @@ +/* + * ****************************************************************************** + * * + * * + * * This program and the accompanying materials are made available under the + * * terms of the Apache License, Version 2.0 which is available at + * * https://www.apache.org/licenses/LICENSE-2.0. + * * + * * See the NOTICE file distributed with this work for additional + * * information regarding copyright ownership. + * * Unless required by applicable law or agreed to in writing, software + * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT + * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the + * * License for the specific language governing permissions and limitations + * * under the License. 
+ * * + * * SPDX-License-Identifier: Apache-2.0 + * ***************************************************************************** + */ + package org.nd4j.codegen.api import org.nd4j.codegen.api.doc.DocSection diff --git a/contrib/codegen-tools/codegen/src/main/kotlin/org/nd4j/codegen/api/Registry.kt b/contrib/codegen-tools/codegen/src/main/kotlin/org/nd4j/codegen/api/Registry.kt index 5e1de4e13..15ed9292c 100644 --- a/contrib/codegen-tools/codegen/src/main/kotlin/org/nd4j/codegen/api/Registry.kt +++ b/contrib/codegen-tools/codegen/src/main/kotlin/org/nd4j/codegen/api/Registry.kt @@ -1,3 +1,23 @@ +/* + * ****************************************************************************** + * * + * * + * * This program and the accompanying materials are made available under the + * * terms of the Apache License, Version 2.0 which is available at + * * https://www.apache.org/licenses/LICENSE-2.0. + * * + * * See the NOTICE file distributed with this work for additional + * * information regarding copyright ownership. + * * Unless required by applicable law or agreed to in writing, software + * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT + * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the + * * License for the specific language governing permissions and limitations + * * under the License. 
+ * * + * * SPDX-License-Identifier: Apache-2.0 + * ***************************************************************************** + */ + package org.nd4j.codegen.api object Registry { diff --git a/contrib/codegen-tools/codegen/src/main/kotlin/org/nd4j/codegen/api/Variables.kt b/contrib/codegen-tools/codegen/src/main/kotlin/org/nd4j/codegen/api/Variables.kt index 320aab78e..f8efa9677 100644 --- a/contrib/codegen-tools/codegen/src/main/kotlin/org/nd4j/codegen/api/Variables.kt +++ b/contrib/codegen-tools/codegen/src/main/kotlin/org/nd4j/codegen/api/Variables.kt @@ -1,3 +1,23 @@ +/* + * ****************************************************************************** + * * + * * + * * This program and the accompanying materials are made available under the + * * terms of the Apache License, Version 2.0 which is available at + * * https://www.apache.org/licenses/LICENSE-2.0. + * * + * * See the NOTICE file distributed with this work for additional + * * information regarding copyright ownership. + * * Unless required by applicable law or agreed to in writing, software + * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT + * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the + * * License for the specific language governing permissions and limitations + * * under the License. 
+ * * + * * SPDX-License-Identifier: Apache-2.0 + * ***************************************************************************** + */ + package org.nd4j.codegen.api import org.nd4j.codegen.api.doc.DocSection diff --git a/contrib/codegen-tools/codegen/src/main/kotlin/org/nd4j/codegen/api/doc/DocScope.kt b/contrib/codegen-tools/codegen/src/main/kotlin/org/nd4j/codegen/api/doc/DocScope.kt index 81f48cc2a..63c260d64 100644 --- a/contrib/codegen-tools/codegen/src/main/kotlin/org/nd4j/codegen/api/doc/DocScope.kt +++ b/contrib/codegen-tools/codegen/src/main/kotlin/org/nd4j/codegen/api/doc/DocScope.kt @@ -1,3 +1,23 @@ +/* + * ****************************************************************************** + * * + * * + * * This program and the accompanying materials are made available under the + * * terms of the Apache License, Version 2.0 which is available at + * * https://www.apache.org/licenses/LICENSE-2.0. + * * + * * See the NOTICE file distributed with this work for additional + * * information regarding copyright ownership. + * * Unless required by applicable law or agreed to in writing, software + * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT + * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the + * * License for the specific language governing permissions and limitations + * * under the License. 
+ * * + * * SPDX-License-Identifier: Apache-2.0 + * ***************************************************************************** + */ + package org.nd4j.codegen.api.doc import org.nd4j.codegen.api.CodeComponent diff --git a/contrib/codegen-tools/codegen/src/main/kotlin/org/nd4j/codegen/api/doc/DocSection.kt b/contrib/codegen-tools/codegen/src/main/kotlin/org/nd4j/codegen/api/doc/DocSection.kt index a1f592620..5d15f3d25 100644 --- a/contrib/codegen-tools/codegen/src/main/kotlin/org/nd4j/codegen/api/doc/DocSection.kt +++ b/contrib/codegen-tools/codegen/src/main/kotlin/org/nd4j/codegen/api/doc/DocSection.kt @@ -1,3 +1,23 @@ +/* + * ****************************************************************************** + * * + * * + * * This program and the accompanying materials are made available under the + * * terms of the Apache License, Version 2.0 which is available at + * * https://www.apache.org/licenses/LICENSE-2.0. + * * + * * See the NOTICE file distributed with this work for additional + * * information regarding copyright ownership. + * * Unless required by applicable law or agreed to in writing, software + * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT + * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the + * * License for the specific language governing permissions and limitations + * * under the License. 
+ * * + * * SPDX-License-Identifier: Apache-2.0 + * ***************************************************************************** + */ + package org.nd4j.codegen.api.doc import org.nd4j.codegen.api.CodeComponent diff --git a/contrib/codegen-tools/codegen/src/main/kotlin/org/nd4j/codegen/api/doc/DocTokens.kt b/contrib/codegen-tools/codegen/src/main/kotlin/org/nd4j/codegen/api/doc/DocTokens.kt index 5163283b8..47144b0a3 100644 --- a/contrib/codegen-tools/codegen/src/main/kotlin/org/nd4j/codegen/api/doc/DocTokens.kt +++ b/contrib/codegen-tools/codegen/src/main/kotlin/org/nd4j/codegen/api/doc/DocTokens.kt @@ -1,3 +1,23 @@ +/* + * ****************************************************************************** + * * + * * + * * This program and the accompanying materials are made available under the + * * terms of the Apache License, Version 2.0 which is available at + * * https://www.apache.org/licenses/LICENSE-2.0. + * * + * * See the NOTICE file distributed with this work for additional + * * information regarding copyright ownership. + * * Unless required by applicable law or agreed to in writing, software + * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT + * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the + * * License for the specific language governing permissions and limitations + * * under the License. 
+ * * + * * SPDX-License-Identifier: Apache-2.0 + * ***************************************************************************** + */ + package org.nd4j.codegen.api.doc import org.nd4j.codegen.api.Op diff --git a/contrib/codegen-tools/codegen/src/main/kotlin/org/nd4j/codegen/api/generator/ConstraintCodeGenerator.kt b/contrib/codegen-tools/codegen/src/main/kotlin/org/nd4j/codegen/api/generator/ConstraintCodeGenerator.kt index 2a50ac1f5..bfa75d800 100644 --- a/contrib/codegen-tools/codegen/src/main/kotlin/org/nd4j/codegen/api/generator/ConstraintCodeGenerator.kt +++ b/contrib/codegen-tools/codegen/src/main/kotlin/org/nd4j/codegen/api/generator/ConstraintCodeGenerator.kt @@ -1,3 +1,23 @@ +/* + * ****************************************************************************** + * * + * * + * * This program and the accompanying materials are made available under the + * * terms of the Apache License, Version 2.0 which is available at + * * https://www.apache.org/licenses/LICENSE-2.0. + * * + * * See the NOTICE file distributed with this work for additional + * * information regarding copyright ownership. + * * Unless required by applicable law or agreed to in writing, software + * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT + * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the + * * License for the specific language governing permissions and limitations + * * under the License. 
+ * * + * * SPDX-License-Identifier: Apache-2.0 + * ***************************************************************************** + */ + package org.nd4j.codegen.api.generator import org.nd4j.codegen.api.Expression diff --git a/contrib/codegen-tools/codegen/src/main/kotlin/org/nd4j/codegen/api/generator/Generator.kt b/contrib/codegen-tools/codegen/src/main/kotlin/org/nd4j/codegen/api/generator/Generator.kt index abf1c52cb..9e0e60125 100644 --- a/contrib/codegen-tools/codegen/src/main/kotlin/org/nd4j/codegen/api/generator/Generator.kt +++ b/contrib/codegen-tools/codegen/src/main/kotlin/org/nd4j/codegen/api/generator/Generator.kt @@ -1,3 +1,23 @@ +/* + * ****************************************************************************** + * * + * * + * * This program and the accompanying materials are made available under the + * * terms of the Apache License, Version 2.0 which is available at + * * https://www.apache.org/licenses/LICENSE-2.0. + * * + * * See the NOTICE file distributed with this work for additional + * * information regarding copyright ownership. + * * Unless required by applicable law or agreed to in writing, software + * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT + * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the + * * License for the specific language governing permissions and limitations + * * under the License. 
+ * * + * * SPDX-License-Identifier: Apache-2.0 + * ***************************************************************************** + */ + package org.nd4j.codegen.api.generator import org.nd4j.codegen.api.Language diff --git a/contrib/codegen-tools/codegen/src/main/kotlin/org/nd4j/codegen/api/generator/GeneratorConfig.kt b/contrib/codegen-tools/codegen/src/main/kotlin/org/nd4j/codegen/api/generator/GeneratorConfig.kt index afb2f4789..47cf6e2ec 100644 --- a/contrib/codegen-tools/codegen/src/main/kotlin/org/nd4j/codegen/api/generator/GeneratorConfig.kt +++ b/contrib/codegen-tools/codegen/src/main/kotlin/org/nd4j/codegen/api/generator/GeneratorConfig.kt @@ -1,3 +1,23 @@ +/* + * ****************************************************************************** + * * + * * + * * This program and the accompanying materials are made available under the + * * terms of the Apache License, Version 2.0 which is available at + * * https://www.apache.org/licenses/LICENSE-2.0. + * * + * * See the NOTICE file distributed with this work for additional + * * information regarding copyright ownership. + * * Unless required by applicable law or agreed to in writing, software + * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT + * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the + * * License for the specific language governing permissions and limitations + * * under the License. 
+ * * + * * SPDX-License-Identifier: Apache-2.0 + * ***************************************************************************** + */ + package org.nd4j.codegen.api.generator import org.nd4j.codegen.api.Op diff --git a/contrib/codegen-tools/codegen/src/main/kotlin/org/nd4j/codegen/dsl/OpBuilder.kt b/contrib/codegen-tools/codegen/src/main/kotlin/org/nd4j/codegen/dsl/OpBuilder.kt index 40f0cf171..f36784830 100644 --- a/contrib/codegen-tools/codegen/src/main/kotlin/org/nd4j/codegen/dsl/OpBuilder.kt +++ b/contrib/codegen-tools/codegen/src/main/kotlin/org/nd4j/codegen/dsl/OpBuilder.kt @@ -1,3 +1,23 @@ +/* + * ****************************************************************************** + * * + * * + * * This program and the accompanying materials are made available under the + * * terms of the Apache License, Version 2.0 which is available at + * * https://www.apache.org/licenses/LICENSE-2.0. + * * + * * See the NOTICE file distributed with this work for additional + * * information regarding copyright ownership. + * * Unless required by applicable law or agreed to in writing, software + * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT + * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the + * * License for the specific language governing permissions and limitations + * * under the License. 
+ * * + * * SPDX-License-Identifier: Apache-2.0 + * ***************************************************************************** + */ + package org.nd4j.codegen.dsl import org.nd4j.codegen.api.* diff --git a/contrib/codegen-tools/codegen/src/main/kotlin/org/nd4j/codegen/impl/java/JavaConstraintCodeGenerator.kt b/contrib/codegen-tools/codegen/src/main/kotlin/org/nd4j/codegen/impl/java/JavaConstraintCodeGenerator.kt index 587816888..bf11b822e 100644 --- a/contrib/codegen-tools/codegen/src/main/kotlin/org/nd4j/codegen/impl/java/JavaConstraintCodeGenerator.kt +++ b/contrib/codegen-tools/codegen/src/main/kotlin/org/nd4j/codegen/impl/java/JavaConstraintCodeGenerator.kt @@ -1,3 +1,23 @@ +/* + * ****************************************************************************** + * * + * * + * * This program and the accompanying materials are made available under the + * * terms of the Apache License, Version 2.0 which is available at + * * https://www.apache.org/licenses/LICENSE-2.0. + * * + * * See the NOTICE file distributed with this work for additional + * * information regarding copyright ownership. + * * Unless required by applicable law or agreed to in writing, software + * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT + * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the + * * License for the specific language governing permissions and limitations + * * under the License. 
+ * * + * * SPDX-License-Identifier: Apache-2.0 + * ***************************************************************************** + */ + package org.nd4j.codegen.impl.java import org.nd4j.codegen.api.* diff --git a/contrib/codegen-tools/codegen/src/main/kotlin/org/nd4j/codegen/impl/python/KotlinExamplePythonGenerator.kt b/contrib/codegen-tools/codegen/src/main/kotlin/org/nd4j/codegen/impl/python/KotlinExamplePythonGenerator.kt index a26f264b7..df9882cf3 100644 --- a/contrib/codegen-tools/codegen/src/main/kotlin/org/nd4j/codegen/impl/python/KotlinExamplePythonGenerator.kt +++ b/contrib/codegen-tools/codegen/src/main/kotlin/org/nd4j/codegen/impl/python/KotlinExamplePythonGenerator.kt @@ -1,3 +1,23 @@ +/* + * ****************************************************************************** + * * + * * + * * This program and the accompanying materials are made available under the + * * terms of the Apache License, Version 2.0 which is available at + * * https://www.apache.org/licenses/LICENSE-2.0. + * * + * * See the NOTICE file distributed with this work for additional + * * information regarding copyright ownership. + * * Unless required by applicable law or agreed to in writing, software + * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT + * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the + * * License for the specific language governing permissions and limitations + * * under the License. 
+ * *
+ * * SPDX-License-Identifier: Apache-2.0
+ * *****************************************************************************
+ */
+
 package org.nd4j.codegen.impl.python
 
 import org.apache.commons.io.FileUtils
diff --git a/contrib/codegen-tools/codegen/src/main/kotlin/org/nd4j/codegen/ir/FrameworkImporter.kt b/contrib/codegen-tools/codegen/src/main/kotlin/org/nd4j/codegen/ir/FrameworkImporter.kt
deleted file mode 100644
index bb551a834..000000000
--- a/contrib/codegen-tools/codegen/src/main/kotlin/org/nd4j/codegen/ir/FrameworkImporter.kt
+++ /dev/null
@@ -1,43 +0,0 @@
-package org.nd4j.codegen.ir
-
-import com.google.common.reflect.TypeToken
-import org.apache.commons.lang3.reflect.TypeUtils
-import org.nd4j.autodiff.samediff.SameDiff
-import org.nd4j.codegen.ir.registry.OpMappingRegistry
-import org.tensorflow.framework.*
-import kotlin.reflect.KClass
-
-class FrameworkImporter(inputFrameworkName: String,nodeType: String,graphType: String,opDefType: String,tensorType: String,dataType: String,attributeType: String,attributeValueType: String) {
-
-    val graphType = graphType
-    val nodeType = nodeType
-    val tensorType = tensorType
-    val dataType = dataType
-    val attributeType = attributeType
-    val attributeValueType = attributeValueType
-    val opDefType = opDefType
-    val inputFrameworkName = inputFrameworkName
-
-    fun runImport(): SameDiff {
-        val outputType = TypeUtils.parameterize(
-            OpMappingRegistry::class.java, Class.forName(graphType), Class.forName(nodeType), Class.forName(opDefType),
-            Class.forName(tensorType), Class.forName(dataType), Class.forName(attributeType), Class.forName(attributeValueType))
-
-        val importGraphParameterized = TypeUtils.parameterize(ImportGraph::class.java,
-            Class.forName(graphType),
-            Class.forName(nodeType),
-            Class.forName(opDefType),
-            Class.forName(tensorType),
-            Class.forName(attributeType),
-            Class.forName(attributeValueType),
-            Class.forName(dataType))
-
-        val rawRegistryType = TypeToken.of(outputType).rawType
-        val
rawRegistryInstance = rawRegistryType.getConstructor(String::class.java).newInstance(inputFrameworkName) - val importGraphType = TypeToken.of(importGraphParameterized).rawType - val importGraphInstance = importGraphType.getConstructor().newInstance() - return SameDiff.create() - } - - -} \ No newline at end of file diff --git a/contrib/codegen-tools/codegen/src/main/kotlin/org/nd4j/codegen/ir/IR.kt b/contrib/codegen-tools/codegen/src/main/kotlin/org/nd4j/codegen/ir/IR.kt deleted file mode 100644 index d0e65acaf..000000000 --- a/contrib/codegen-tools/codegen/src/main/kotlin/org/nd4j/codegen/ir/IR.kt +++ /dev/null @@ -1,1488 +0,0 @@ -package org.nd4j.codegen.ir - -import io.github.classgraph.ClassGraph -import org.apache.commons.io.IOUtils -import org.nd4j.autodiff.functions.DifferentialFunction -import org.nd4j.autodiff.samediff.SDVariable -import org.nd4j.autodiff.samediff.SameDiff -import org.nd4j.autodiff.samediff.VariableType -import org.nd4j.codegen.ir.registry.OpMappingRegistry -import org.nd4j.codegen.ir.registry.OpRegistryHolder -import org.nd4j.codegen.ir.tensorflow.* -import org.nd4j.common.io.ClassPathResource -import org.nd4j.common.io.ReflectionUtils -import org.nd4j.common.util.ArrayUtil -import org.nd4j.gen.OpDeclarationDescriptor -import org.nd4j.ir.MapperNamespace -import org.nd4j.ir.OpNamespace -import org.nd4j.ir.TensorNamespace -import org.nd4j.linalg.api.buffer.DataType -import org.nd4j.linalg.api.ndarray.INDArray -import org.nd4j.linalg.api.ops.DynamicCustomOp -import org.nd4j.linalg.api.ops.Op -import org.nd4j.linalg.factory.Nd4j -import org.nd4j.shade.protobuf.ByteString -import org.nd4j.shade.protobuf.GeneratedMessageV3 -import org.nd4j.shade.protobuf.ProtocolMessageEnum -import org.nd4j.shade.protobuf.TextFormat -import java.lang.IllegalArgumentException -import java.lang.reflect.Modifier -import java.nio.ByteBuffer -import java.nio.charset.Charset -import kotlin.collections.ArrayList -import kotlin.collections.HashMap - - -fun 
loadNd4jOpDescriptors(): OpNamespace.OpDescriptorList { - val nd4jOpDescriptorResourceStream = ClassPathResource("nd4j-op-defs-2.proto").inputStream - val resourceString = IOUtils.toString(nd4jOpDescriptorResourceStream, Charset.defaultCharset()) - val descriptorListBuilder = OpNamespace.OpDescriptorList.newBuilder() - TextFormat.merge(resourceString,descriptorListBuilder) - val ret = descriptorListBuilder.build() - val mutableList = ArrayList(ret.opListList) - mutableList.sortBy { it.name } - - val newResultBuilder = OpNamespace.OpDescriptorList.newBuilder() - newResultBuilder.addAllOpList(mutableList) - return newResultBuilder.build() -} - -fun nd4jDifferentialFunctions(): List> { - return ClassGraph().enableAllInfo() - .scan().getSubclasses("org.nd4j.autodiff.functions.DifferentialFunction").filter { - clazz-> !clazz.isAbstract && !clazz.isAnnotation && !clazz.isInterface - }.map { clazz -> Class.forName(clazz.name) as Class } -} - -val differentialFunctionClasses = nd4jDifferentialFunctions() - -fun cachedOpInstances2(): List { - return differentialFunctionClasses.map { clazz -> clazz.newInstance() as DifferentialFunction}.filter { - it.opName() != null - } -} - -val cachedOpInstances = cachedOpInstances2() - - -fun createDifferentialFunctionInstanceForName(name: String): DifferentialFunction { - return cachedOpInstances.first { op -> op.opName() == name }.javaClass.newInstance() -} - -fun isOutputFrameworkAttributeName(name: String,opDescriptor: OpNamespace.OpDescriptor): Boolean { - return opDescriptor.argDescriptorList.filter { argDescriptor -> argDescriptor.argType != OpNamespace.ArgDescriptor.ArgType.INPUT_TENSOR - && argDescriptor.argType != OpNamespace.ArgDescriptor.ArgType.OUTPUT_TENSOR } - .map { inputArg -> inputArg.name }.contains(name) -} - -fun isNd4jTensorName(name: String,opDescriptor: OpNamespace.OpDescriptor): Boolean { - return opDescriptor.argDescriptorList.filter { argDescriptor -> argDescriptor.argType == 
OpNamespace.ArgDescriptor.ArgType.INPUT_TENSOR } - .map { inputArg -> inputArg.name } - .contains(name) -} - - -fun argDescriptorType(name: String, opDescriptor: OpNamespace.OpDescriptor): OpNamespace.ArgDescriptor.ArgType { - return opDescriptor.argDescriptorList.filter { argDescriptor -> argDescriptor.name == name }[0].argType -} - -val nd4jOpDescriptors = loadNd4jOpDescriptors() - -fun OpNamespace.OpDescriptorList.findOp(opName: String): OpNamespace.OpDescriptor { - return this.opListList.first { it.name == opName } -} - - -interface IRTensor - where DATA_TYPE: ProtocolMessageEnum { - fun shape(): List - fun stride(): List - fun dataType(): IRDataType - fun toArgTensor(): TensorNamespace.TensorProto - fun rawValue(): TENSOR_TYPE - fun toNd4jNDArray(): INDArray - -} - - -enum class AttributeValueType { - FLOAT, - LIST_FLOAT, - INT, - LIST_INT, - BOOL, - LIST_BOOL, - STRING, - LIST_STRING, - TENSOR, - LIST_TENSOR, - DATA_TYPE, - INVALID -} - -interface IRAttribute { - - fun name(): String - - fun floatValue(): Float - - fun listFloatValue(): List - - fun tensorValue(): IRTensor - - fun listTensorValue(): List> - - fun intValue(): Long - - fun listIntValue(): List - - fun boolValue(): Boolean - - fun listBoolValue(): List - - fun stringValue(): String - - fun listStringValue(): List - - fun attributeValueType(): AttributeValueType - - fun dataTataTypeValue(): IRDataType - - fun internalAttributeDef(): ATTRIBUTE_TYPE - - - fun internalAttributeValue(): ATTRIBUTE_VALUE_TYPE -} - - - -interface MappingProcess< - GRAPH_TYPE: GeneratedMessageV3, - OP_DEF_TYPE: GeneratedMessageV3, - NODE_DEF_TYPE: GeneratedMessageV3, - TENSOR_TYPE : GeneratedMessageV3, - ATTRIBUTE_TYPE : GeneratedMessageV3, - ATTRIBUTE_VALUE_TYPE : GeneratedMessageV3, - DATA_TYPE: ProtocolMessageEnum> { - - - - fun inputOpDefValueTypes(): Map - - fun opName(): String - - fun frameworkVersion(): String - - fun inputFramework(): String - - fun inputFrameworkOpName(): String - - fun attributeMappingRules(): 
List> - - fun tensorMappingRules(): List> - - fun applyProcess(mappingCtx: MappingContext): - Pair, OpNamespace.OpDescriptor> - - fun applyProcessReverse(input: OpDeclarationDescriptor): IRNode - - - fun indexOverrides() : Map - - fun serialize(): MapperNamespace.MapperDeclaration - - -} - - -interface TensorMappingRule - where DATA_TYPE: ProtocolMessageEnum { - - - fun initWithMappingProcess(mappingProcess: MappingProcess) - - - fun name(): String - - - fun serialize(): MapperNamespace.MappingRule - - - fun mappingNamesToPerform(): Map - - /** - * Convert 1 or more attributes in to a list of {@link ArgDescriptor} - */ - fun convertInput(mappingContext: MappingContext): List - - - fun inputArgumentMappings(): Map - - fun convertInputsReverse(toReverse: List): List - - fun isInputTensorName(inputName: String): Boolean - - fun isOutputTensorName(outputName: String): Boolean - -} - - -interface AttributeMappingRule - where DATA_TYPE: ProtocolMessageEnum { - - fun initWithMappingProcess(mappingProcess: MappingProcess) - - fun mappingNamesToPerform(): Map - - fun mappingTransformerArgs(): Map> - - fun name(): String - - fun serialize(): MapperNamespace.MappingRule - - fun convertAttributes(mappingCtx: MappingContext): List - - fun convertAttributesReverse(allInputArguments: List, inputArgumentsToProcess: List): List> - - - fun isInputFrameworkTensorName(name: String,mappingProcess: MappingProcess): Boolean - - fun isNd4jTensorName(name: String,mappingProcess: MappingProcess): Boolean - - fun isInputFrameworkAttributeName(name: String,mappingProcess: MappingProcess): Boolean - - fun isOutputFrameworkAttributeName(name: String,mappingProcess: MappingProcess): Boolean - - fun argDescriptorType(name: String,mappingProcess: MappingProcess): OpNamespace.ArgDescriptor.ArgType - - fun acceptsInputType(argDescriptorType: AttributeValueType): Boolean - - fun outputsType(argDescriptorType: List): Boolean - - fun attributeValueTypeFor(name: String,mappingProcess: MappingProcess): 
AttributeValueType - - fun argDescriptorTypesForOutputName( - name: String, mappingProcess: - MappingProcess): List -} - - -interface MappingContext { - /** - * Whether to resolve dynamic place holder variables where - * scalar values are present. An example scenario is when a value is an input ndarray - * such as pow(..) where 1 is always an ndarray and the other is a scalar value - * represented as a double argument in nd4j, but might be a placeholder - * in the input framework. - */ - fun resolveDynamic(): Boolean - - /** - * Input variables for dynamic resolution required for import. - * This is important for any cases where a placeholder variable - * can be imported and resolved dynamically and later passed on as scalars. - */ - fun dynamicResolutionVariables(): Map - - fun node(): NODE_TYPE - - fun irNode(): IRNode - - fun opDef(): OP_DEF_TYPE - - fun opName(): String - - fun nodeName(): String - - fun attrDef(name: String): ATTRIBUTE_TYPE - - fun tensorInputFor(name: String): IRTensor - - fun tensorInputFromInputFrameworkName(name: String): IRTensor - - fun tensorAttributeFor(name: String): IRTensor - - - fun createIRTensorFromNDArray(ndaray:INDArray): IRTensor - - fun nd4jDataTypeFor(input: IRTensor): DataType - - fun irAttributeValueForNode(valueName: String): IRAttribute - - fun argDescriptorTypeForName(nd4jName: String): List - - fun graph(): IRGraph - - fun nd4jOpName(): String - -} - -abstract class AbstractMappingContext( - opDef: OP_DEF_TYPE, - node: NODE_TYPE, - graph: - IRGraph, - dynamicVariables: Map = emptyMap()): - MappingContext { - - val opDef = opDef - val node = node - val graph = graph - val dynamicVariables: Map = dynamicVariables - - override fun dynamicResolutionVariables(): Map { - return dynamicVariables - } - - override fun resolveDynamic(): Boolean { - return dynamicVariables.isNotEmpty() - } - - override fun node(): NODE_TYPE { - return node - } - - override fun opDef(): OP_DEF_TYPE { - return opDef - } - - override fun graph(): 
IRGraph { - return graph - } - - override fun argDescriptorTypeForName(nd4jName: String): List { - val opDescriptor = nd4jOpDescriptors.findOp(graph.nd4jNameForInternalOpName(opName())) - return opDescriptor.argDescriptorList.filter { argDescriptor -> argDescriptor.name == nd4jName }.map { argDescriptor -> argDescriptor.argType } - } - - override fun nd4jOpName(): String { - return nd4jOpDescriptors.findOp(graph.nd4jNameForInternalOpName(opName())).name - } -} - - -interface IRGraphRunner< - GRAPH_TYPE: GeneratedMessageV3, - NODE_TYPE: GeneratedMessageV3, - OP_DEF_TYPE: GeneratedMessageV3, - TENSOR_TYPE: GeneratedMessageV3, - ATTRIBUTE_TYPE: GeneratedMessageV3, - ATTRIBUTE_VALUE_TYPE: GeneratedMessageV3, - DATA_TYPE : ProtocolMessageEnum> { - - fun graph(): IRGraph - - fun run(inputs: Map): Map -} - - -interface IRGraph< - GRAPH_TYPE: GeneratedMessageV3, - NODE_TYPE: GeneratedMessageV3, - OP_DEF_TYPE: GeneratedMessageV3, - TENSOR_TYPE: GeneratedMessageV3, - ATTRIBUTE_TYPE: GeneratedMessageV3, - ATTRIBUTE_VALUE_TYPE: GeneratedMessageV3, - DATA_TYPE : ProtocolMessageEnum> { - - fun importInfoForEachNode(dynamicVariables: Map): Map, OpNamespace.OpDescriptor>> - - fun shapeOfInput(varName: String): LongArray? 
- - fun dataTypeForVariable(varName: String): IRDataType - - fun isConstant(opName: String): Boolean - - fun nodeIsPlaceHolder(nodeName: String): Boolean - - fun isPlaceHolder(opName: String): Boolean - - fun isConstantOpName(name: String): Boolean - - fun nodeByName(input: String): NODE_TYPE - - fun nodeList(): List> - - fun internalValue(): GRAPH_TYPE - - fun createMappingContext( - opDef: OP_DEF_TYPE, - node: NODE_TYPE, - dynamicVariables: Map - ): MappingContext - - fun frameworkName(): String - - fun nd4jNameForInternalOpName(name: String): String -} - - - -fun importInfoForEachNodeInGraph ( - graph: IRGraph, - dynamicVariables: Map) - : Map,OpNamespace.OpDescriptor>> { - - val opMappingRegistry = OpRegistryHolder.opMappingRegistryForName(graph.frameworkName()) - - val ret = HashMap,OpNamespace.OpDescriptor>>() - - graph.nodeList().forEach { node -> - val name = node.nodeName() - val opMappingProcess = OpRegistryHolder.lookupOpMappingProcess< - GRAPH_TYPE, - NODE_TYPE, - OP_DEF_TYPE, - TENSOR_TYPE, - DATA_TYPE, - ATTRIBUTE_TYPE, - ATTRIBUTE_VALUE_TYPE>(inputFrameworkOpName = node.opName(), inputFrameworkName = graph.frameworkName()) - val opDefLookup = opMappingRegistry.lookupInputFrameworkOpDef(node.opName()) - val mappingContext = graph.createMappingContext( - opDef = opDefLookup, - node = graph.nodeByName(node.nodeName()), - dynamicVariables = dynamicVariables - ) - - val applied = opMappingProcess.applyProcess(mappingContext) - ret[name] = applied - } - - return ret -} - -interface IRNode - where DATA_TYPE: ProtocolMessageEnum { - - - fun nd4jInputs(tensorMappings: Map): List - - fun computeAdjustedOffsetForInput( - nd4jName: String, - inputFrameworkName: String, - tensorInputMappings: Map - ): Int - - /** - * Get the list of inputs from the node that represent a particular - * [OpDef] input list name. 
- */ - fun inputNamesForListOfInputValues(inputListName: String): List - - /** - * Compute the number of inputs - * for a list of tensors that reflect 1 or more inputs - * as 1 name. - */ - fun numInputsForListOfTensors(name: String): Int - - /** - * List of inputs in to the node - * @return the list of input names for this node - */ - fun createInputsFrom(inputData: List): List> - - /** - * List of outputs - * @return the list of output names for this node - */ - fun createOutputsFrom(inputValues: List): List> - - /** - * Op name - */ - fun opName(): String - - /** - * The name of the node - * @return the name of the node - */ - fun nodeName(): String - - /** - * List of input names - */ - fun inputs(): List - - /** - * List of output names - */ - fun outputs(): List - - /** - * The input at a particular index - * @return the name at the particular index - */ - fun inputAt(index: Int): String - fun outputAt(index: Int): String - - fun numInputs(): Int - - fun numOutputs(): Int - - fun attributeMap(): Map> - fun getAttribute(inputName: String): IRAttribute - fun hasAttribute(inputName: String): Boolean - - fun internalValue(): NODE_TYPE -} - -interface IRArgDef - where DATA_TYPE: ProtocolMessageEnum { - fun name(): String - - fun description(): String - - fun dataType(): IRDataType - - fun internalValue(): T - - fun indexOf(): Integer -} - -interface IROpDef< - GRAPH_DEF: GeneratedMessageV3, - OP_DEF_TYPE : GeneratedMessageV3, - TENSOR_TYPE : GeneratedMessageV3, - ARG_DEF_TYPE : GeneratedMessageV3, DATA_TYPE, - ATTRIBUTE_TYPE : GeneratedMessageV3, - ATTRIBUTE_VALUE_TYPE : GeneratedMessageV3> - where DATA_TYPE: ProtocolMessageEnum { - fun opName(): String - - fun internalValue(): OP_DEF_TYPE - - fun inputArgs(): List> - - fun outputArgs(): List> - - fun attributes(): List> - -} - -enum class IRDataTypeValue { - DT_FLOAT, - DT_DOUBLE, - DT_INT32, - DT_UINT8, - DT_INT16, - DT_INT8, - DT_STRING, - DT_COMPLEX64, // Single-precision complex - DT_INT64, - DT_BOOL, - 
DT_QINT8, // Quantized int8 - DT_QUINT8, // Quantized uint8 - DT_QINT32, // Quantized int32 - DT_BFLOAT16, // Float32 truncated to 16 bits. Only for cast ops. - DT_QINT16, // Quantized int16 - DT_QUINT16, // Quantized uint16 - DT_UINT16, - DT_COMPLEX128, // Double-precision complex - DT_HALF, - DT_RESOURCE, - DT_VARIANT, // Arbitrary C++ data types - DT_UINT32, - DT_UINT64, - DT_INVALID - -} - -interface IRDataType where DATA_TYPE: ProtocolMessageEnum { - fun convertToDataType(input: DATA_TYPE): IRDataTypeValue - - fun dataType(): IRDataTypeValue - - fun internalValue(): DATA_TYPE - - fun nd4jDataType(): DataType - - fun nameSpaceDataType(): TensorNamespace.DataType -} - - - -abstract class AbstractMappingProcess< - GRAPH_TYPE: GeneratedMessageV3, - OP_DEF_TYPE: GeneratedMessageV3, - NODE_TYPE : GeneratedMessageV3, - TENSOR_TYPE : GeneratedMessageV3, - ATTRIBUTE_TYPE : GeneratedMessageV3, - ATTRIBUTE_VALUE_TYPE : GeneratedMessageV3, DATA_TYPE: ProtocolMessageEnum>(inputFramework: String, - frameworkVersion: String, - inputFrameworkOpName: String, - inputIndexOverrides: Map = emptyMap(), - opName: String, - opMappingRegistry: OpMappingRegistry, - tensorMappingRules: List>, - attributeMappingRules: List>): - MappingProcess { - - protected val inputFramework = inputFramework - protected val frameworkVersion = frameworkVersion - protected val inputFrameworkOpName = inputFrameworkOpName - protected val opName = opName - protected val tensorMappingRules = tensorMappingRules - protected val attributeMappingRules = attributeMappingRules - protected var opDef: IROpDef? 
= null - protected val opMappingRegistry = opMappingRegistry - protected val inputIndexOverrides = inputIndexOverrides - init { - tensorMappingRules.forEach { tensorMappingRule -> - tensorMappingRule.initWithMappingProcess(this) - tensorMappingRule.mappingNamesToPerform().forEach { (nd4jName, inputFrameworkName) -> - if(!tensorMappingRule.isInputTensorName(inputFrameworkName)) { - throw IllegalArgumentException("Found invalid input tensor named ${inputFrameworkName} for rule ${tensorMappingRule.name()} and mapping process for op ${opName} and input framework name ${inputFrameworkOpName} with definition being ${nd4jOpDescriptors.findOp(opName)}") - } - - if(!tensorMappingRule.isOutputTensorName(nd4jName)) { - throw IllegalArgumentException("Found invalid output tensor named ${nd4jName} for rule ${tensorMappingRule.name()} and mapping process for op ${opName} and input framework name ${inputFrameworkOpName} with definition being ${nd4jOpDescriptors.findOp(opName)}") - } - - } - } - - attributeMappingRules.forEach { - it.initWithMappingProcess(this) - attributeMappingRules.forEach { attributeMappingRule -> - attributeMappingRule.mappingNamesToPerform().forEach { (nd4jName, inputFrameworkName) -> - val inputType = attributeMappingRule.attributeValueTypeFor(inputFrameworkName,this) - if(!attributeMappingRule.acceptsInputType(inputType)) { - throw IllegalArgumentException("Rule ${attributeMappingRule.name()} for framework $inputFramework does not accept input type ${inputType} for attribute name ${inputFrameworkName} and mapping process for op ${opName} and input framework name ${inputFrameworkOpName}") - } - - val outputType = attributeMappingRule.argDescriptorTypesForOutputName(nd4jName,this) - if(!attributeMappingRule.outputsType(outputType)) { - throw IllegalArgumentException("Rule ${attributeMappingRule.name()} for framework $inputFramework with input framework name $inputFrameworkName does not accept output type ${outputType} for attribute name ${nd4jName} and 
mapping process for op ${opName}") - } - - } - } - } - - - opMappingRegistry.registerMappingProcess( - inputFrameworkOpName = inputFrameworkOpName, - processToRegister = this - ) - - - } - - override fun indexOverrides(): Map { - return inputIndexOverrides - } - - override fun attributeMappingRules(): List> { - return attributeMappingRules - } - - override fun tensorMappingRules(): List> { - return tensorMappingRules - } - - override fun applyProcessReverse(input: OpDeclarationDescriptor): IRNode { - TODO("Not yet implemented") - } - - override fun inputFrameworkOpName(): String { - return inputFrameworkOpName - } - - override fun opName(): String { - return opName - } - - override fun frameworkVersion(): String { - return frameworkVersion - } - - override fun inputFramework(): String { - return inputFramework - } - - override fun applyProcess(mappingCtx: MappingContext): Pair, OpNamespace.OpDescriptor> { - val descriptorBuilder = OpNamespace.OpDescriptor.newBuilder() - descriptorBuilder.name = opName() - tensorMappingRules.forEach { - it.convertInput(mappingCtx).forEach { descriptor -> - descriptorBuilder.addArgDescriptor(descriptor) - } - } - - - attributeMappingRules.forEach { - it.convertAttributes(mappingCtx).forEach { - descriptor -> descriptorBuilder.addArgDescriptor(descriptor) - } - } - - val fullDescriptor = nd4jOpDescriptors.findOp(opName()) - descriptorBuilder.opDeclarationType = fullDescriptor.opDeclarationType - - return Pair(mappingCtx,descriptorBuilder.build()) - } - - override fun serialize(): MapperNamespace.MapperDeclaration { - val retBuilder = MapperNamespace.MapperDeclaration.newBuilder() - retBuilder.frameworkName = inputFramework() - retBuilder.opName = opName() - - - /** - * TODO: add input index overrides - */ - - indexOverrides().forEach { indexToOverride, replacementIndex -> - retBuilder.putIndexOverrides(indexToOverride.toLong(),replacementIndex.toLong()) - } - - tensorMappingRules.forEach { - 
retBuilder.addRule(it.serialize().toBuilder()) - } - - attributeMappingRules.forEach { - retBuilder.addRule(it.serialize().toBuilder()) - } - - return retBuilder.build() - } -} - - -fun ArgDescriptor(block: OpNamespace.ArgDescriptor.Builder.() -> Unit): OpNamespace.ArgDescriptor { - return OpNamespace.ArgDescriptor.newBuilder() - .apply(block).build() -} - -fun NameSpaceTensor(block: TensorNamespace.TensorProto.Builder.() -> Unit): TensorNamespace.TensorProto { - return TensorNamespace.TensorProto.newBuilder() - .apply(block).build() -} - - - -fun TensorNamespace.TensorProto.Builder.RawData(rawData: ByteArray) { - this.rawData = ByteString.copyFrom(rawData) -} - -fun TensorNamespace.TensorProto.Builder.IntData(intData: List) { - this.addAllInt32Data(intData) -} - -fun TensorNamespace.TensorProto.Builder.FloatData(floatData: List) { - this.addAllFloatData(floatData) -} - -fun TensorNamespace.TensorProto.Builder.DoubleData(doubleData: List) { - this.addAllDoubleData(doubleData) -} - -fun TensorNamespace.TensorProto.Builder.StringData(stringData: List) { - this.addAllStringData(stringData.map { input -> ByteString.copyFrom(input.toByteArray(Charset.defaultCharset())) }) -} - -fun TensorNamespace.TensorProto.Builder.Int64Data(intData: List) { - this.addAllInt64Data(intData) -} - -fun TensorNamespace.TensorProto.Builder.Dims(shape: List) { - shape.forEach { this.addDims(it) } -} - - -fun convertNd4jDataTypeFromNameSpaceTensorDataType(dataType: TensorNamespace.DataType): DataType { - return when(dataType) { - TensorNamespace.DataType.UINT32 -> return DataType.UINT32 - TensorNamespace.DataType.INT64 -> return DataType.INT64 - TensorNamespace.DataType.INT16 -> return DataType.INT16 - TensorNamespace.DataType.UINT64 -> return DataType.UINT64 - TensorNamespace.DataType.DOUBLE -> return DataType.DOUBLE - TensorNamespace.DataType.FLOAT -> return DataType.FLOAT - TensorNamespace.DataType.FLOAT16 -> return DataType.FLOAT16 - TensorNamespace.DataType.FLOAT16 -> return 
DataType.FLOAT16 - TensorNamespace.DataType.INT32 -> return DataType.INT32 - TensorNamespace.DataType.STRING -> return DataType.UTF8 - TensorNamespace.DataType.BOOL -> return DataType.BOOL - TensorNamespace.DataType.BFLOAT16 -> return DataType.BFLOAT16 - TensorNamespace.DataType.INT8 -> return DataType.INT8 - TensorNamespace.DataType.UINT16 -> return DataType.UINT16 - TensorNamespace.DataType.UNDEFINED,TensorNamespace.DataType.UNRECOGNIZED -> return DataType.UNKNOWN - else -> { - throw IllegalArgumentException("Illegal data type $dataType") - } - } -} - -fun convertNameSpaceTensorDataTypeFromNd4jDataType(dataType: DataType): TensorNamespace.DataType { - return when(dataType) { - DataType.UINT32 -> return TensorNamespace.DataType.UINT32 - DataType.INT64,DataType.LONG -> return TensorNamespace.DataType.INT64 - DataType.UINT64 -> return TensorNamespace.DataType.UINT64 - DataType.DOUBLE -> return TensorNamespace.DataType.DOUBLE - DataType.FLOAT -> return TensorNamespace.DataType.FLOAT - DataType.FLOAT16,DataType.HALF -> return TensorNamespace.DataType.FLOAT16 - DataType.HALF -> return TensorNamespace.DataType.FLOAT16 - DataType.INT32,DataType.INT -> return TensorNamespace.DataType.INT32 - DataType.UTF8 -> return TensorNamespace.DataType.STRING - DataType.BOOL -> return TensorNamespace.DataType.BOOL - DataType.BFLOAT16 -> return TensorNamespace.DataType.BFLOAT16 - DataType.SHORT,DataType.INT8 -> return TensorNamespace.DataType.INT8 - DataType.UINT16 -> return TensorNamespace.DataType.UINT16 - DataType.BYTE,DataType.UINT8,DataType.UBYTE -> return TensorNamespace.DataType.UINT8 - else -> { - throw IllegalArgumentException("Illegal data type $dataType") - } - } -} - - -fun ndarrayFromNameSpaceTensor(inputTensor: TensorNamespace.TensorProto): INDArray { - val dtype = convertNd4jDataTypeFromNameSpaceTensorDataType(TensorNamespace.DataType.values()[inputTensor.dataType]) - val shape = inputTensor.dimsList.toLongArray() - when(dtype) { - DataType.FLOAT -> { - val floatArray = 
inputTensor.floatDataList.toFloatArray() - if(floatArray.isEmpty()) - return loadDataBufferFromRawData(inputTensor) - val dataBuffer = Nd4j.createBuffer(floatArray) - return Nd4j.create(dataBuffer).reshape(*shape) - } - DataType.DOUBLE -> { - val doubleArray = inputTensor.doubleDataList.toDoubleArray() - if(doubleArray.isEmpty()) - return loadDataBufferFromRawData(inputTensor) - - val dataBuffer = Nd4j.createBuffer(doubleArray) - return Nd4j.create(dataBuffer).reshape(*shape) - } - DataType.INT64 -> { - val longArray = inputTensor.int64DataList.toLongArray() - if(longArray.isEmpty()) - return loadDataBufferFromRawData(inputTensor) - val dataBuffer = Nd4j.createBuffer(longArray) - return Nd4j.create(dataBuffer).reshape(*shape) - } - DataType.INT32 -> { - val intArray = inputTensor.int32DataList.toIntArray() - if(intArray.isEmpty()) - return loadDataBufferFromRawData(inputTensor) - - val dataBuffer = Nd4j.createBuffer(intArray) - return Nd4j.create(dataBuffer).reshape(*shape) - } - - DataType.BOOL -> { - val intArray = inputTensor.int32DataList.toIntArray() - if(intArray.isEmpty()) - return loadDataBufferFromRawData(inputTensor) - - val dataBuffer = Nd4j.createBuffer(intArray) - return Nd4j.create(dataBuffer).reshape(*shape) - } - - DataType.UTF8 -> { - val stringList = inputTensor.stringDataList.map { input -> input.toStringUtf8() } - if(stringList.isEmpty()) - return loadDataBufferFromRawData(inputTensor) - - return Nd4j.create(stringList).reshape(*shape) - } - DataType.UNKNOWN -> { - val ret = Nd4j.empty() - return ret - } - - else -> { - return loadDataBufferFromRawData(inputTensor) - } - - } - - throw IllegalArgumentException("Illegal type found for conversion ${dtype}") -} - -fun loadDataBufferFromRawData(inputTensor: TensorNamespace.TensorProto): INDArray { - val shape = inputTensor.dimsList.toLongArray() - val dtype = convertNd4jDataTypeFromNameSpaceTensorDataType(TensorNamespace.DataType.values()[inputTensor.dataType]) - val byteArray = 
inputTensor.rawData.toByteArray() - //note: scalar can be zero - val totalLen = Math.max(ArrayUtil.prod(*shape),1) - val byteBuffer = ByteBuffer.allocateDirect(totalLen * dtype.width()) - byteBuffer.put(byteArray) - byteBuffer.rewind() - val rawDataBuffer = Nd4j.createBuffer(byteBuffer,dtype,totalLen,0) - return Nd4j.create(rawDataBuffer).reshape(*shape) -} - - - -fun nameSpaceTensorFromNDarray(ndarray:INDArray): TensorNamespace.TensorProto { - val nameSpaceDataType = convertNameSpaceTensorDataTypeFromNd4jDataType(ndarray.dataType()).ordinal - when(ndarray.dataType()) { - DataType.INT64 -> { - return NameSpaceTensor { - dataType = nameSpaceDataType - Int64Data(ndarray.data().asLong().toList()) - Dims(ndarray.shape().asList()) - } - } - - DataType.INT32 -> { - return NameSpaceTensor { - dataType = nameSpaceDataType - IntData(ndarray.data().asInt().toList()) - Dims(ndarray.shape().asList()) - } - } - - DataType.DOUBLE -> { - return NameSpaceTensor { - dataType = nameSpaceDataType - DoubleData(ndarray.data().asDouble().toList()) - Dims(ndarray.shape().asList()) - } - } - - DataType.FLOAT -> { - return NameSpaceTensor { - dataType = nameSpaceDataType - FloatData(ndarray.data().asFloat().toList()) - Dims(ndarray.shape().asList()) - } - } - - DataType.UTF8 -> { - val stringList = ArrayList() - for(i in 0 until ndarray.length()) { - stringList.add(ndarray.getString(i)) - } - - return NameSpaceTensor { - dataType = nameSpaceDataType - StringData(stringList) - Dims(ndarray.shape().asList()) - } - } - - else -> { - throw IllegalArgumentException("Illegal data type ${ndarray.dataType()}") - } - } - -} - - - - -interface ImportContext< - GRAPH_TYPE: GeneratedMessageV3, - OP_DEF_TYPE: GeneratedMessageV3, - NODE_TYPE : GeneratedMessageV3, - TENSOR_TYPE : GeneratedMessageV3, - ATTRIBUTE_TYPE : GeneratedMessageV3, - ATTRIBUTE_VALUE_TYPE : GeneratedMessageV3, - DATA_TYPE: ProtocolMessageEnum> { - - fun process(): MappingProcess - - fun mappingContext(): MappingContext - -} - 
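The `loadDataBufferFromRawData` helper in the deleted `IR.kt` above decodes a tensor's raw byte payload into an ndarray, guarding the scalar case with `totalLen = max(prod(shape), 1)` before allocating a buffer of `totalLen * dtype.width()` bytes. A minimal stdlib-Java sketch of that decoding idea follows; the `RawTensorDemo` class name, the float-only scope, and the explicit little-endian byte order are illustrative assumptions (the real code delegates to `Nd4j.createBuffer`, which handles arbitrary dtypes and native byte order):

```java
import java.nio.ByteBuffer;
import java.nio.ByteOrder;

public class RawTensorDemo {
    // Decode a raw little-endian byte payload into floats. The shape product
    // is clamped to a minimum of 1, mirroring the scalar guard in
    // loadDataBufferFromRawData (a scalar has an empty dims list, so the
    // naive product would be wrong).
    static float[] floatsFromRaw(byte[] raw, long[] shape) {
        long prod = 1;
        for (long d : shape) prod *= d;
        int totalLen = (int) Math.max(prod, 1L);
        ByteBuffer buf = ByteBuffer.wrap(raw).order(ByteOrder.LITTLE_ENDIAN);
        float[] out = new float[totalLen];
        for (int i = 0; i < totalLen; i++) {
            out[i] = buf.getFloat();
        }
        return out;
    }

    public static void main(String[] args) {
        // Encode six floats as a raw payload, then decode with shape [2, 3].
        float[] in = {1.5f, -2.0f, 3.25f, 0.0f, 4.0f, 5.5f};
        ByteBuffer enc = ByteBuffer.allocate(in.length * Float.BYTES).order(ByteOrder.LITTLE_ENDIAN);
        for (float f : in) enc.putFloat(f);
        float[] out = floatsFromRaw(enc.array(), new long[]{2, 3});
        System.out.println(out.length + " " + out[0] + " " + out[5]); // prints: 6 1.5 5.5
    }
}
```

The reshape to the declared dims happens after decoding in the original (`Nd4j.create(rawDataBuffer).reshape(*shape)`); the flat decode above is the part the scalar guard protects.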
-abstract class AbstractImportContext< - GRAPH_TYPE: GeneratedMessageV3, - OP_DEF_TYPE: GeneratedMessageV3, - NODE_TYPE : GeneratedMessageV3, - TENSOR_TYPE : GeneratedMessageV3, - ATTRIBUTE_TYPE : GeneratedMessageV3, - ATTRIBUTE_VALUE_TYPE : GeneratedMessageV3, - DATA_TYPE: ProtocolMessageEnum> - (process: MappingProcess, - mappingContext: MappingContext): - ImportContext -{ - val process = process - val mappingContext = mappingContext - - override fun process(): MappingProcess< - GRAPH_TYPE, - OP_DEF_TYPE, - NODE_TYPE, - TENSOR_TYPE, - ATTRIBUTE_TYPE, - ATTRIBUTE_VALUE_TYPE, - DATA_TYPE> { - return process - } - - override fun mappingContext(): MappingContext { - return mappingContext - } -} - - -interface ImportProcess< - GRAPH_TYPE: GeneratedMessageV3, - OP_DEF_TYPE: GeneratedMessageV3, - NODE_TYPE : GeneratedMessageV3, - TENSOR_TYPE : GeneratedMessageV3, - ATTRIBUTE_TYPE : GeneratedMessageV3, - ATTRIBUTE_VALUE_TYPE : GeneratedMessageV3, - DATA_TYPE: ProtocolMessageEnum> { - - fun createMappingProcesses(graph: IRGraph) - : List> - - fun createMappingContext(graph: - IRGraph,node: IRNode): - MappingContext - - - fun createImportContext(mappingProcess: MappingProcess,mappingContext: - MappingContext) - : ImportContext - - fun runImportProcess(mappingProcesses: List>): SameDiff - -} - - -fun lookupIndexForArgDescriptor( - argDescriptorName: String, - opDescriptorName: String, - argDescriptorType: OpNamespace.ArgDescriptor.ArgType -): Int { - println("Trying to find arg descriptor index for op $opDescriptorName and descriptor name $argDescriptorName with type $argDescriptorType") - val op = nd4jOpDescriptors.findOp(opDescriptorName) - val names = op.argDescriptorList.map { argDescriptor -> argDescriptor.name } - if(!names.contains(argDescriptorName)) { - throw IllegalArgumentException("Invalid name $argDescriptorName for op $opDescriptorName passed in. $argDescriptorName not found in $opDescriptorName. 
Available names were ${names}")
-    }
-    val ret = op
-        .argDescriptorList.firstOrNull { argDescriptor -> argDescriptor.name == argDescriptorName &&
-                argDescriptor.argType == argDescriptorType }
-    return ret?.argIndex ?: -1
-}
-
-fun createVariable(varName: String, varType: VariableType, sameDiff: SameDiff, shape: List<Long>, dataType: DataType): SDVariable {
-    return SDVariable(varName, varType, sameDiff, shape.toLongArray(), dataType)
-}
-
-
-interface ImportRunner<
-        GRAPH_TYPE : GeneratedMessageV3,
-        NODE_TYPE : GeneratedMessageV3,
-        OP_DEF_TYPE : GeneratedMessageV3,
-        TENSOR_TYPE : GeneratedMessageV3,
-        DATA_TYPE : ProtocolMessageEnum,
-        ATTR_DEF_TYPE : GeneratedMessageV3,
-        ATTR_VALUE_TYPE : GeneratedMessageV3> {
-    fun initAttributes(
-        df: DifferentialFunction,
-        frameworkName: String,
-        mappingContext: MappingContext<GRAPH_TYPE, NODE_TYPE, OP_DEF_TYPE, TENSOR_TYPE, ATTR_DEF_TYPE, ATTR_VALUE_TYPE, DATA_TYPE>,
-        sd: SameDiff,
-        inputFrameworkOpName: String)
-}
-
-
-
-
-class DefaultImportRunner<
-        GRAPH_TYPE : GeneratedMessageV3,
-        NODE_TYPE : GeneratedMessageV3,
-        OP_DEF_TYPE : GeneratedMessageV3,
-        TENSOR_TYPE : GeneratedMessageV3,
-        DATA_TYPE : ProtocolMessageEnum,
-        ATTR_DEF_TYPE : GeneratedMessageV3,
-        ATTR_VALUE_TYPE : GeneratedMessageV3> :
-    ImportRunner<GRAPH_TYPE, NODE_TYPE, OP_DEF_TYPE, TENSOR_TYPE, DATA_TYPE, ATTR_DEF_TYPE, ATTR_VALUE_TYPE> {
-    override fun initAttributes(
-        df: DifferentialFunction,
-        frameworkName: String,
-        mappingContext: MappingContext<GRAPH_TYPE, NODE_TYPE, OP_DEF_TYPE, TENSOR_TYPE, ATTR_DEF_TYPE, ATTR_VALUE_TYPE, DATA_TYPE>,
-        sd: SameDiff,
-        inputFrameworkOpName: String
-    ) {
-
-        val opMappingProcess = OpRegistryHolder.lookupOpMappingProcess<
-                GRAPH_TYPE,
-                NODE_TYPE,
-                OP_DEF_TYPE,
-                TENSOR_TYPE,
-                DATA_TYPE,
-                ATTR_DEF_TYPE,
-                ATTR_VALUE_TYPE>(inputFrameworkOpName = inputFrameworkOpName, inputFrameworkName = frameworkName)
-
-        val applied = opMappingProcess.applyProcess(mappingContext)
-
-        when (df.opType()) {
-            Op.Type.CUSTOM -> {
-                val dynamicCustomOp = df as DynamicCustomOp
-                val grouped = applied.second.argDescriptorList.groupBy { descriptor ->
-                    descriptor.argType
-                }
-
-                val sortedMap = HashMap<OpNamespace.ArgDescriptor.ArgType, List<OpNamespace.ArgDescriptor>>()
-                grouped.forEach { (argType, list) ->
-                    sortedMap[argType] = list.sortedBy { arg -> arg.argIndex }
-                }
-
-                sortedMap.forEach { (argType, listOfArgsSortedByIndex) ->
-                    when (argType) {
-                        OpNamespace.ArgDescriptor.ArgType.INPUT_TENSOR -> {
-                            val args = dynamicCustomOp.args()
-                            val arraysToAdd = ArrayList<INDArray>()
-                            listOfArgsSortedByIndex.forEachIndexed { index, argDescriptor ->
-                                val convertedTensor = ndarrayFromNameSpaceTensor(argDescriptor.inputValue)
-                                if (index < args.size) {
-                                    val arg = args[index]
-                                    if (arg.variableType != VariableType.ARRAY) {
-                                        if (arg.shape == null) {
-                                            val emptyLongArray = LongArray(0)
-                                            arg.setShape(*emptyLongArray)
-                                        }
-
-                                        arraysToAdd.add(convertedTensor)
-
-                                    }
-                                } else {
-                                    sd.constant(sd.generateNewVarName(argDescriptor.name, 0), convertedTensor)
-                                    arraysToAdd.add(convertedTensor)
-                                }
-
-                            }
-
-                            //note: we don't add the arrays one at a time because addInputArgument requires all of the input arrays to be added at once
-                            //dynamicCustomOp.addInputArgument(*arraysToAdd.toTypedArray())
-
-
-                        }
-
-
-                        OpNamespace.ArgDescriptor.ArgType.INT64, OpNamespace.ArgDescriptor.ArgType.INT32 -> {
-                            listOfArgsSortedByIndex.forEach { dynamicCustomOp.addIArgument(it.int64Value) }
-                        }
-
-                        OpNamespace.ArgDescriptor.ArgType.DOUBLE, OpNamespace.ArgDescriptor.ArgType.FLOAT -> {
-                            listOfArgsSortedByIndex.forEach { dynamicCustomOp.addTArgument(it.doubleValue) }
-                        }
-
-                        OpNamespace.ArgDescriptor.ArgType.OUTPUT_TENSOR -> {
-                            listOfArgsSortedByIndex.forEach {
-                                val convertedTensor = ndarrayFromNameSpaceTensor(it.inputValue)
-                                dynamicCustomOp.addOutputArgument(convertedTensor)
-                            }
-                        }
-
-                        OpNamespace.ArgDescriptor.ArgType.BOOL -> {
-                            listOfArgsSortedByIndex.forEach {
-                                dynamicCustomOp.addBArgument(it.boolValue)
-                            }
-                        }
-
-                        OpNamespace.ArgDescriptor.ArgType.DATA_TYPE -> {
-                            listOfArgsSortedByIndex.forEach {
-                                val dtype = convertNd4jDataTypeFromNameSpaceTensorDataType(it.dataTypeValue!!)
- val dtypeJavaClass = Class.forName("org.nd4j.linalg.api.buffer.DataType") - dynamicCustomOp.addDArgument(dtype) - df.javaClass.declaredFields.forEach { field -> - if (!Modifier.isStatic(field.modifiers) && !Modifier.isFinal(field.modifiers) - && dtypeJavaClass.isAssignableFrom(field.type) - ) { - field.isAccessible = true - ReflectionUtils.setField(field, df, dtype) - } - } - } - } - else -> { - throw IllegalArgumentException("Illegal type") - } - - } - - //set any left over fields if they're found - setNameForFunctionFromDescriptors(listOfArgsSortedByIndex, df) - } - - - } - Op.Type.SCALAR -> { - applied.second.argDescriptorList.forEach { argDescriptor -> - val field = ReflectionUtils.findField(df.javaClass, argDescriptor.name) - if (field != null) { - field.isAccessible = true - when (argDescriptor.name) { - "x", "y", "z" -> { - val tensorName = opMappingProcess.tensorMappingRules().filter { mappingRule -> - mappingRule.mappingNamesToPerform() - .containsKey(argDescriptor.name) - } - .map { rule -> rule.mappingNamesToPerform()[argDescriptor.name] }.first()!! 
- val createdNDArray = mappingContext.tensorInputFor(tensorName).toNd4jNDArray() - ReflectionUtils.setField(field, df, createdNDArray) - } - else -> { - } - } - - } else { - if (argDescriptor.argType in listOf( - OpNamespace.ArgDescriptor.ArgType.INT64, - OpNamespace.ArgDescriptor.ArgType.DOUBLE, OpNamespace.ArgDescriptor.ArgType.INT32, - OpNamespace.ArgDescriptor.ArgType.FLOAT - ) - ) { - val scalarField = ReflectionUtils.findField(df.javaClass, "scalarValue") - scalarField.isAccessible = true - //access the first input (should have been set) and make sure the scalar type is the - //the same - val firstValue = sd.variables().first() - val dtype = firstValue.dataType() - when (argDescriptor.argType) { - OpNamespace.ArgDescriptor.ArgType.DOUBLE -> { - val nd4jScalarValue = Nd4j.scalar(argDescriptor.doubleValue).castTo(dtype) - ReflectionUtils.setField(scalarField, df, nd4jScalarValue) - - } - OpNamespace.ArgDescriptor.ArgType.FLOAT -> { - val nd4jScalarValue = Nd4j.scalar(argDescriptor.floatValue).castTo(dtype) - ReflectionUtils.setField(scalarField, df, nd4jScalarValue) - - } - OpNamespace.ArgDescriptor.ArgType.INT32 -> { - val nd4jScalarValue = Nd4j.scalar(argDescriptor.int32Value).castTo(dtype) - ReflectionUtils.setField(scalarField, df, nd4jScalarValue) - - } - OpNamespace.ArgDescriptor.ArgType.INT64 -> { - val nd4jScalarValue = Nd4j.scalar(argDescriptor.int64Value).castTo(dtype) - ReflectionUtils.setField(scalarField, df, nd4jScalarValue) - - } - } - } - } - } - } - else -> { - var hasDimensions = false - applied.second.argDescriptorList.forEach { argDescriptor -> - if (argDescriptor.name == "dimensions") - hasDimensions = true - val field = ReflectionUtils.findField(df.javaClass, argDescriptor.name) - if (field != null) { - field.isAccessible = true - when (argDescriptor.name) { - "x", "y", "z" -> { - val tensorName = opMappingProcess.tensorMappingRules().filter { mappingRule -> - mappingRule.mappingNamesToPerform() - .containsKey(argDescriptor.name) - } - 
.map { rule -> rule.mappingNamesToPerform()[argDescriptor.name] }.first()!! - val createdNDArray = mappingContext.tensorInputFor(tensorName).toNd4jNDArray() - ReflectionUtils.setField(field, df, createdNDArray) - } - "keepDims" -> ReflectionUtils.setField(field, df, argDescriptor.boolValue) - else -> { - } - } - } - } - - if (hasDimensions) { - //dimensions sorted by index - val dimArgs = - applied.second.argDescriptorList.filter { argDescriptor -> argDescriptor.name.contains("dimensions") } - .sortedBy { argDescriptor -> argDescriptor.argIndex } - .map { argDescriptor -> argDescriptor.int64Value.toInt() }.toIntArray() - val dimensionsField = ReflectionUtils.findField(df.javaClass, "dimensions") - val dimensionzField = ReflectionUtils.findField(df.javaClass, "dimensionz") - if (dimensionsField != null) { - dimensionsField.isAccessible = true - if (intArrayOf(0).javaClass.isAssignableFrom(dimensionsField.type)) { - ReflectionUtils.setField(dimensionsField, df, dimArgs) - } - } - - if (dimensionzField != null) { - dimensionzField.isAccessible = true - if (INDArray::class.java.isAssignableFrom(dimensionzField.type)) { - val buffer = Nd4j.createBuffer(dimArgs) - val createdArr = Nd4j.create(buffer) - ReflectionUtils.setField(dimensionzField, df, createdArr) - } - } - - } - - } - } - } -} - - -fun descriptorsForName( - name: String, - argDescriptors: Collection): List { - return argDescriptors.filter { argDescriptor -> argDescriptor.name == name }!! 
-}
-
-fun setNameForFunctionFromDescriptors(argDescriptors: Collection<OpNamespace.ArgDescriptor>, func: DifferentialFunction) {
-    func.javaClass.declaredFields.forEach { field ->
-        if(hasArgDescriptorWithNameAndType(argDescriptors, field.name)) {
-            val descriptors = descriptorsForName(field.name, argDescriptors)
-            descriptors.forEach { descriptor ->
-                when(descriptor.argType) {
-                    OpNamespace.ArgDescriptor.ArgType.BOOL -> {
-                        if(Boolean::class.javaObjectType.isAssignableFrom(field.type) || Boolean::class.javaPrimitiveType!!.isAssignableFrom(field.type)) {
-                            field.isAccessible = true
-                            ReflectionUtils.setField(field, func, descriptor.boolValue)
-                        }
-                    }
-                    OpNamespace.ArgDescriptor.ArgType.INT64, OpNamespace.ArgDescriptor.ArgType.INT32 -> {
-                        if(Int::class.javaObjectType.isAssignableFrom(field.type) || Int::class.javaPrimitiveType!!.isAssignableFrom(field.type)) {
-                            field.isAccessible = true
-                            ReflectionUtils.setField(field, func, descriptor.int64Value.toInt())
-                        }
-
-                        if(Long::class.javaObjectType.isAssignableFrom(field.type) || Long::class.javaPrimitiveType!!.isAssignableFrom(field.type)) {
-                            field.isAccessible = true
-                            ReflectionUtils.setField(field, func, descriptor.int64Value)
-                        }
-
-                        if(DataType::class.java.isAssignableFrom(field.type)) {
-                            field.isAccessible = true
-                            ReflectionUtils.setField(field, func, DataType.fromInt(descriptor.int64Value.toInt()))
-                        }
-
-                    }
-                    OpNamespace.ArgDescriptor.ArgType.FLOAT, OpNamespace.ArgDescriptor.ArgType.DOUBLE -> {
-                        if(Float::class.javaObjectType.isAssignableFrom(field.type) || Float::class.javaPrimitiveType!!.isAssignableFrom(field.type)) {
-                            field.isAccessible = true
-                            ReflectionUtils.setField(field, func, descriptor.doubleValue.toFloat())
-                        }
-
-                        if(Double::class.javaObjectType.isAssignableFrom(field.type) || Double::class.javaPrimitiveType!!.isAssignableFrom(field.type)) {
-                            field.isAccessible = true
-                            ReflectionUtils.setField(field, func, descriptor.doubleValue)
-                        }
-                    }
-
-                    OpNamespace.ArgDescriptor.ArgType.DATA_TYPE -> {
-                        if(DataType::class.java.isAssignableFrom(field.type)) {
-                            field.isAccessible = true
-                            ReflectionUtils.setField(field, func, convertNd4jDataTypeFromNameSpaceTensorDataType(descriptor.dataTypeValue))
-                        }
-                    }
-
-                }
-
-            }
-
-        }
-    }
-
-}
-fun hasArgDescriptorWithNameAndType(argDescriptors: Collection<OpNamespace.ArgDescriptor>, name: String): Boolean {
-    return argDescriptors.map { input -> input.name }.contains(name)
-}
-
diff --git a/contrib/codegen-tools/codegen/src/main/kotlin/org/nd4j/codegen/ir/IRMappingFunctions.kt b/contrib/codegen-tools/codegen/src/main/kotlin/org/nd4j/codegen/ir/IRMappingFunctions.kt
deleted file mode 100644
index 6f924a7f2..000000000
--- a/contrib/codegen-tools/codegen/src/main/kotlin/org/nd4j/codegen/ir/IRMappingFunctions.kt
+++ /dev/null
@@ -1,2025 +0,0 @@
-package org.nd4j.codegen.ir
-
-import org.nd4j.ir.MapperNamespace
-import org.nd4j.ir.OpNamespace.*
-import org.nd4j.ir.TensorNamespace
-import org.nd4j.linalg.api.buffer.DataType
-import org.nd4j.linalg.factory.Nd4j
-import org.nd4j.shade.protobuf.GeneratedMessageV3
-import org.nd4j.shade.protobuf.ProtocolMessageEnum
-import kotlin.collections.ArrayList
-
-abstract class BaseAttributeExtractionRule<
-        GRAPH_DEF: GeneratedMessageV3,
-        OP_DEF_TYPE: GeneratedMessageV3,
-        NODE_TYPE: GeneratedMessageV3,
-        ATTR_DEF : GeneratedMessageV3,
-        ATTR_VALUE_TYPE : GeneratedMessageV3,
-        TENSOR_TYPE : GeneratedMessageV3, DATA_TYPE>(
-    name: String,
-    mappingNamesToPerform: Map<String, String>,
-    transformerArgs: Map<String, List<ArgDescriptor>>):
-    AttributeMappingRule<GRAPH_DEF, OP_DEF_TYPE, NODE_TYPE, ATTR_DEF, ATTR_VALUE_TYPE, TENSOR_TYPE, DATA_TYPE>
-        where DATA_TYPE: ProtocolMessageEnum {
-
-    protected var opDescriptor: OpDescriptor? = null
-    protected val mappingNamesToPerform = mappingNamesToPerform
-    protected var frameworkName: String? = null
-    protected var inputFrameworkOpName: String? = null
-    protected val transformerArgs = transformerArgs
-    protected val name = name
-    protected var inputOpDefTypes: Map<String, AttributeValueType>? 
= null - - - override fun initWithMappingProcess(mappingProcess: MappingProcess) { - this.opDescriptor = nd4jOpDescriptors.findOp(mappingProcess.opName()) - this.frameworkName = mappingProcess.inputFramework() - this.inputFrameworkOpName = mappingProcess.inputFrameworkOpName() - this.inputOpDefTypes = mappingProcess.inputOpDefValueTypes() - } - - override fun mappingNamesToPerform(): Map { - return mappingNamesToPerform - } - - override fun name(): String { - return name - } - - override fun mappingTransformerArgs(): Map> { - return transformerArgs - } - - - abstract fun createIRAttribute(name: String, attrDef: ATTR_DEF, attributeValueType: ATTR_VALUE_TYPE): IRAttribute - - - override fun serialize(): MapperNamespace.MappingRule { - val builder = MapperNamespace.MappingRule.newBuilder() - builder.ruleName = name() - builder.functionName = name() - builder.ruleType = "attribute" - val descriptorList = opDescriptor!!.argDescriptorList - println("Serializing op ${opDescriptor!!.name}") - for ((k, v) in transformerArgs) { - v.forEach { descriptor -> - when (descriptor.argType) { - ArgDescriptor.ArgType.STRING -> builder.addInputStringAttrName(descriptor.name) - ArgDescriptor.ArgType.BOOL -> builder.addInputBooleanName(descriptor.name) - ArgDescriptor.ArgType.DOUBLE, ArgDescriptor.ArgType.FLOAT -> builder.addInputFloatName(descriptor.name) - ArgDescriptor.ArgType.INT32, ArgDescriptor.ArgType.INT64 -> builder.addInputIntName(descriptor.name) - ArgDescriptor.ArgType.INPUT_TENSOR -> builder.addInputTensorName(descriptor.name) - } - } - - } - - /** - * TODO: metadata (perhaps looking up from each framework for each attribute) - * what each named type is. 
- */ - mappingNamesToPerform.forEach { outputName, inputName -> - val descriptorForName = opDescriptor!!.argDescriptorList.first { descriptor -> descriptor.name == outputName } - builder.putInputToOutput(outputName,inputName) - when(descriptorForName.argType) { - ArgDescriptor.ArgType.BOOL -> { builder.addOutputBooleanName(outputName)} - ArgDescriptor.ArgType.INT64 -> {builder.addOutputIntName(outputName)} - ArgDescriptor.ArgType.DOUBLE -> {builder.addOutputDoubleName(outputName)} - ArgDescriptor.ArgType.DATA_TYPE -> builder.addOutputDataTypeName(outputName) - ArgDescriptor.ArgType.OUTPUT_TENSOR -> builder.addOutputTensorName(outputName) - ArgDescriptor.ArgType.STRING -> builder.addOutputStringAttrName(outputName) - } - - //not all associated outputs will have inputs - if(inputOpDefTypes!!.containsKey(inputName)) { - when(inputOpDefTypes!![inputName]!!) { - AttributeValueType.FLOAT -> builder.addInputFloatName(inputName) - AttributeValueType.INT -> builder.addInputIntName(inputName) - AttributeValueType.BOOL -> builder.addInputBooleanName(inputName) - AttributeValueType.STRING -> builder.addInputStringAttrName(inputName) - AttributeValueType.DATA_TYPE -> builder.addInputDataTypeName(inputName) - AttributeValueType.TENSOR -> builder.addInputTensorName(inputName) - } - - } - - - - } - - - return builder.build() - } - - override fun argDescriptorTypesForOutputName( - name: String, mappingProcess: - MappingProcess): List { - val nd4jOpDescriptor = nd4jOpDescriptors.findOp(mappingProcess.opName()) - val names = nd4jOpDescriptor.argDescriptorList.map { input -> input.name } - if(!names.contains(name)) { - throw java.lang.IllegalArgumentException("Unable to find name $name for op $nd4jOpDescriptor.name") - } - - return nd4jOpDescriptor.argDescriptorList.filter { argDescriptor -> argDescriptor.name == name }.map { argDescriptor -> argDescriptor.argType} - } -} - -abstract class StringEqualsAdapterRule< - GRAPH_DEF :GeneratedMessageV3, - OP_DEF_TYPE: GeneratedMessageV3, - 
NODE_TYPE: GeneratedMessageV3,ATTR_DEF : GeneratedMessageV3, - ATTR_VALUE_TYPE : GeneratedMessageV3, - TENSOR_TYPE : GeneratedMessageV3, DATA_TYPE>( - mappingNamesToPerform: Map = emptyMap(), - transformerArgs: Map> = emptyMap()): - BaseAttributeExtractionRule - (name = "stringequals", - mappingNamesToPerform = mappingNamesToPerform, - transformerArgs = transformerArgs) - where DATA_TYPE: ProtocolMessageEnum { - - override fun acceptsInputType(argDescriptorType: AttributeValueType): Boolean { - return argDescriptorType == AttributeValueType.STRING - } - - override fun outputsType(argDescriptorType: List): Boolean { - return argDescriptorType.contains(ArgDescriptor.ArgType.BOOL) || argDescriptorType.contains(ArgDescriptor.ArgType.INT64) - } - - override fun convertAttributes(mappingCtx: MappingContext): List { - val ret = ArrayList() - - for((k, v) in mappingNamesToPerform()) { - val descriptorForName = transformerArgs[k] - val argDescriptorTypeList = mappingCtx.argDescriptorTypeForName(k) - - val compString = descriptorForName!![0].stringValue - val testValue = mappingCtx.irAttributeValueForNode(v).stringValue() - argDescriptorTypeList.forEach { argDescriptorType -> - val descriptorBuilder = ArgDescriptor.newBuilder() - descriptorBuilder.name = v - descriptorBuilder.argType = argDescriptorType - descriptorBuilder.argIndex = lookupIndexForArgDescriptor( - argDescriptorName = k, - opDescriptorName = mappingCtx.nd4jOpName(), - argDescriptorType = argDescriptorType - ) - - when(argDescriptorType) { - ArgDescriptor.ArgType.BOOL -> { - descriptorBuilder.boolValue = testValue == compString - } - - ArgDescriptor.ArgType.INT64 -> { - descriptorBuilder.int64Value = if (testValue == compString) 1 else 0 - - } - } - - ret.add(descriptorBuilder.build()) - } - - - } - return ret - } -} - - -abstract class StringContainsAdapterRule< - GRAPH_DEF: GeneratedMessageV3, - OP_DEF_TYPE: GeneratedMessageV3, - NODE_TYPE: GeneratedMessageV3,ATTR_DEF : GeneratedMessageV3, - ATTR_VALUE_TYPE 
: GeneratedMessageV3, - TENSOR_TYPE : GeneratedMessageV3, DATA_TYPE>( - mappingNamesToPerform: Map = emptyMap(), - transformerArgs: Map> = emptyMap()): - BaseAttributeExtractionRule - (name = "stringcontains", - mappingNamesToPerform = mappingNamesToPerform, - transformerArgs = transformerArgs) - where DATA_TYPE: ProtocolMessageEnum { - override fun acceptsInputType(argDescriptorType: AttributeValueType): Boolean { - return argDescriptorType == AttributeValueType.STRING - } - - override fun outputsType(argDescriptorType: List): Boolean { - return argDescriptorType.contains(ArgDescriptor.ArgType.BOOL) || argDescriptorType.contains(ArgDescriptor.ArgType.INT64) - } - - override fun convertAttributes(mappingCtx: MappingContext): List { - val ret = ArrayList() - - for((k, v) in mappingNamesToPerform()) { - val argDescriptorTypeList = mappingCtx.argDescriptorTypeForName(k) - val descriptorForName = transformerArgs[k] - val compString = descriptorForName!![0].stringValue - val testValue = mappingCtx.irAttributeValueForNode(v).stringValue() - argDescriptorTypeList.forEach { argDescriptorType -> - val descriptorBuilder = ArgDescriptor.newBuilder() - descriptorBuilder.name = k - descriptorBuilder.argType = argDescriptorType - descriptorBuilder.argIndex = lookupIndexForArgDescriptor( - argDescriptorName = k, - opDescriptorName = mappingCtx.nd4jOpName(), - argDescriptorType = argDescriptorType - ) - - when(argDescriptorType) { - ArgDescriptor.ArgType.BOOL -> { - descriptorBuilder.boolValue = compString.contains(testValue) - } - - ArgDescriptor.ArgType.INT64 -> { - descriptorBuilder.int64Value = if (compString.contains(testValue)) 1 else 0 - - } - - } - ret.add(descriptorBuilder.build()) - } - - - } - return ret - } -} - -abstract class StringNotEqualsAdapterRule< - GRAPH_DEF: GeneratedMessageV3, - OP_DEF_TYPE: GeneratedMessageV3, - NODE_TYPE: GeneratedMessageV3,ATTR_DEF : GeneratedMessageV3, - ATTR_VALUE_TYPE : GeneratedMessageV3, - TENSOR_TYPE : GeneratedMessageV3, 
DATA_TYPE>( - mappingNamesToPerform: Map = emptyMap(), - transformerArgs: Map> = emptyMap()): - BaseAttributeExtractionRule - (name = "stringnotequalsadapterrule", - mappingNamesToPerform = mappingNamesToPerform, - transformerArgs = transformerArgs) - where DATA_TYPE: ProtocolMessageEnum { - - override fun convertAttributes(mappingCtx: MappingContext): List { - val ret = ArrayList() - for((k, v) in mappingNamesToPerform()) { - val descriptorForName = transformerArgs[k] - val compString = descriptorForName!![0].stringValue - val testValue = mappingCtx.irAttributeValueForNode(v).stringValue() - val argDescriptorTypeList = mappingCtx.argDescriptorTypeForName(k) - argDescriptorTypeList.forEach { argDescriptorType -> - when(argDescriptorType) { - ArgDescriptor.ArgType.INT64 -> { - ret.add(ArgDescriptor { - name = k - argType = argDescriptorType - int64Value = if(testValue != compString) 1 else 0 - argIndex = lookupIndexForArgDescriptor( - argDescriptorName = k, - opDescriptorName = mappingCtx.nd4jOpName(), - argDescriptorType = ArgDescriptor.ArgType.INT64 - ) - - }) - } - - ArgDescriptor.ArgType.BOOL -> { - ret.add(ArgDescriptor { - name = k - argType = argDescriptorType - boolValue = testValue != compString - argIndex = lookupIndexForArgDescriptor( - argDescriptorName = k, - opDescriptorName = mappingCtx.nd4jOpName(), - argDescriptorType = ArgDescriptor.ArgType.BOOL - ) - - }) - } - } - } - - - } - return ret - } - - override fun acceptsInputType(argDescriptorType: AttributeValueType): Boolean { - return argDescriptorType == AttributeValueType.STRING - } - - override fun outputsType(argDescriptorType: List): Boolean { - return argDescriptorType.contains(ArgDescriptor.ArgType.BOOL) || - argDescriptorType.contains(ArgDescriptor.ArgType.INT64) - } -} - -abstract class SizeThresholdIntArrayIntIndexRule< - GRAPH_DEF: GeneratedMessageV3, - OP_DEF_TYPE: GeneratedMessageV3, - NODE_TYPE: GeneratedMessageV3, - ATTR_DEF : GeneratedMessageV3, - ATTR_VALUE_TYPE : 
GeneratedMessageV3, - TENSOR_TYPE : GeneratedMessageV3, DATA_TYPE>(mappingNamesToPerform: Map, - transformerArgs: Map>): - BaseAttributeExtractionRule - (name = "sizethresholdarrayint", mappingNamesToPerform = mappingNamesToPerform, transformerArgs = transformerArgs) where DATA_TYPE: ProtocolMessageEnum { - - - - override fun convertAttributes(mappingCtx: MappingContext): List { - val ret = ArrayList() - - for((k, v) in mappingNamesToPerform()) { - val descriptorForName = transformerArgs[k] - val inputArr = mappingCtx.irAttributeValueForNode(v).listIntValue() - val index = descriptorForName!![0].int32Value - val sizeThreshold = descriptorForName!![1].int64Value - val fallbackIndex = descriptorForName!![2].stringValue - val descriptorBuilder = ArgDescriptor.newBuilder() - descriptorBuilder.name = v - descriptorBuilder.argType = ArgDescriptor.ArgType.INT64 - if(inputArr.size < sizeThreshold) { - descriptorBuilder.int64Value = inputArr[fallbackIndex.toInt()] - } else { - descriptorBuilder.int64Value = inputArr[index] - } - - descriptorBuilder.argIndex = lookupIndexForArgDescriptor( - argDescriptorName = k, - opDescriptorName = mappingCtx.nd4jOpName(), - argDescriptorType = ArgDescriptor.ArgType.INT64 - ) - - - ret.add(descriptorBuilder.build()) - - } - return ret - } - - override fun acceptsInputType(argDescriptorType: AttributeValueType): Boolean { - return argDescriptorType == AttributeValueType.INT || - argDescriptorType == AttributeValueType.STRING - } - - override fun outputsType(argDescriptorType: List): Boolean { - return argDescriptorType.contains(ArgDescriptor.ArgType.INT64) - } -} - -abstract class ConditionalFieldValueIntIndexArrayRule< - GRAPH_DEF: GeneratedMessageV3, - OP_DEF_TYPE: GeneratedMessageV3, - NODE_TYPE: GeneratedMessageV3, - ATTR_DEF : GeneratedMessageV3, - ATTR_VALUE_TYPE : GeneratedMessageV3, - TENSOR_TYPE : GeneratedMessageV3, DATA_TYPE>(mappingNamesToPerform: Map, - transformerArgs: Map>): - BaseAttributeExtractionRule - (name = 
"conditionalfieldvalueintindex", mappingNamesToPerform = mappingNamesToPerform, transformerArgs = transformerArgs) - where DATA_TYPE: ProtocolMessageEnum { - - override fun acceptsInputType(argDescriptorType: AttributeValueType): Boolean { - return argDescriptorType == AttributeValueType.STRING || argDescriptorType ==AttributeValueType.INT - } - - override fun outputsType(argDescriptorType: List): Boolean { - return argDescriptorType.contains(ArgDescriptor.ArgType.INT64) - } - - override fun convertAttributes(mappingCtx: MappingContext): List { - val ret = ArrayList() - - for((k, v) in mappingNamesToPerform()) { - val listOfArgs = transformerArgs[k] - val inputArr = mappingCtx.irAttributeValueForNode(listOfArgs!![3].stringValue).listIntValue() - val trueIndex = listOfArgs!![1].int32Value - val falseIndex = listOfArgs!![2].int32Value - val targetValueToTest = listOfArgs!![0].stringValue - val testValue = mappingCtx.irAttributeValueForNode(v).stringValue() - val intValueToSet = if (testValue == targetValueToTest) inputArr[trueIndex] else inputArr[falseIndex] - ret.add(ArgDescriptor { - name = v - int64Value = intValueToSet - argType = ArgDescriptor.ArgType.INT64 - argIndex = lookupIndexForArgDescriptor( - argDescriptorName = k, - opDescriptorName = mappingCtx.nd4jOpName(), - argDescriptorType = ArgDescriptor.ArgType.INT64 - ) - }) - - } - return ret - } -} - - -abstract class ConditionalFieldValueIntIndexNDArrayRule< - GRAPH_DEF: GeneratedMessageV3, - OP_DEF_TYPE: GeneratedMessageV3, - NODE_TYPE: GeneratedMessageV3, - ATTR_DEF : GeneratedMessageV3, - ATTR_VALUE_TYPE : GeneratedMessageV3, - TENSOR_TYPE : GeneratedMessageV3, DATA_TYPE>(mappingNamesToPerform: Map, - transformerArgs: Map>): - BaseAttributeExtractionRule - (name = "conditionalfieldvalueintindexndarray", mappingNamesToPerform = mappingNamesToPerform, transformerArgs = transformerArgs) - where DATA_TYPE: ProtocolMessageEnum { - - override fun acceptsInputType(argDescriptorType: AttributeValueType): Boolean 
{ - return argDescriptorType == AttributeValueType.TENSOR || argDescriptorType == AttributeValueType.STRING - } - - override fun outputsType(argDescriptorType: List): Boolean { - return argDescriptorType.contains(ArgDescriptor.ArgType.INT64) - } - - override fun convertAttributes(mappingCtx: MappingContext): List { - val ret = ArrayList() - for((k, v) in mappingNamesToPerform()) { - val listOfArgs = transformerArgs[k] - val inputArr = mappingCtx.tensorInputFor(listOfArgs!![3].stringValue).toNd4jNDArray().ravel() - val trueIndex = listOfArgs!![1].int32Value - val falseIndex = listOfArgs!![2].int32Value - val targetValueToTest = listOfArgs!![0].stringValue - val testValue = mappingCtx.irAttributeValueForNode(v).stringValue() - val intValueToSet = if (testValue == targetValueToTest) inputArr.getInt(trueIndex) else inputArr.getInt(falseIndex) - ret.add(ArgDescriptor { - name = v - int64Value = intValueToSet.toLong() - argType = ArgDescriptor.ArgType.INT64 - argIndex = lookupIndexForArgDescriptor( - argDescriptorName = k, - opDescriptorName = mappingCtx.nd4jOpName(), - argDescriptorType = ArgDescriptor.ArgType.INT64 - ) - }) - - } - return ret - } -} - - -/** - * Need to implement tensor size extraction value at index - */ - - -abstract class NDArraySizeAtRule< - GRAPH_DEF: GeneratedMessageV3, - OP_DEF_TYPE: GeneratedMessageV3, - NODE_TYPE: GeneratedMessageV3, - ATTR_DEF : GeneratedMessageV3, - ATTR_VALUE_TYPE : GeneratedMessageV3, - TENSOR_TYPE : GeneratedMessageV3, DATA_TYPE>(mappingNamesToPerform: Map, - transformerArgs: Map>): - BaseAttributeExtractionRule - (name = "ndarraysizeat", mappingNamesToPerform = mappingNamesToPerform, transformerArgs = transformerArgs) - where DATA_TYPE: ProtocolMessageEnum { - - override fun acceptsInputType(argDescriptorType: AttributeValueType): Boolean { - return argDescriptorType == AttributeValueType.TENSOR - } - - override fun outputsType(argDescriptorType: List): Boolean { - return 
argDescriptorType.contains(ArgDescriptor.ArgType.INT64) - } - - override fun convertAttributes(mappingCtx: MappingContext): List { - val ret = ArrayList() - mappingNamesToPerform().forEach { (k, v) -> - val transformArgsForAttribute = transformerArgs[k] - //note that this finds a value for a named tensor within either the graph or the node - //some frameworks may have a value node with a value attribute - //others may have the actual tensor value - val inputArr = mappingCtx.tensorInputFor(v) - val sizeIndex = transformArgsForAttribute!![0].int32Value - val sizeAt = inputArr.shape()[sizeIndex] - val argDescriptor = ArgDescriptor { - name = v - argType = ArgDescriptor.ArgType.INT64 - int64Value = sizeAt - argIndex = lookupIndexForArgDescriptor( - argDescriptorName = k, - opDescriptorName = mappingCtx.nd4jOpName(), - argDescriptorType = ArgDescriptor.ArgType.INT64 - ) - } - ret.add(argDescriptor) - } - - return ret - } -} - - -abstract class NDArrayExtractScalarValue< - GRAPH_DEF: GeneratedMessageV3, - OP_DEF_TYPE: GeneratedMessageV3, - NODE_TYPE: GeneratedMessageV3, - ATTR_DEF : GeneratedMessageV3, - ATTR_VALUE_TYPE : GeneratedMessageV3, - TENSOR_TYPE : GeneratedMessageV3, DATA_TYPE>(mappingNamesToPerform: Map, - transformerArgs: Map>): - BaseAttributeExtractionRule - (name = "ndarrayextractscalarvalue", mappingNamesToPerform = mappingNamesToPerform, transformerArgs = transformerArgs) - where DATA_TYPE: ProtocolMessageEnum { - - override fun acceptsInputType(argDescriptorType: AttributeValueType): Boolean { - return argDescriptorType == AttributeValueType.TENSOR - } - - override fun outputsType(argDescriptorType: List): Boolean { - return argDescriptorType.contains(ArgDescriptor.ArgType.INPUT_TENSOR) - } - - override fun convertAttributes(mappingCtx: MappingContext): List { - val ret = ArrayList() - mappingNamesToPerform().forEach { (k, v) -> - val indexValueToAbstract = transformerArgs[k]!![0].int64Value - val ndarrayInput = 
mappingCtx.tensorInputFor(v).toNd4jNDArray() - val argDescriptor = ArgDescriptor { - name = k - argType = ArgDescriptor.ArgType.INPUT_TENSOR - inputValue = nameSpaceTensorFromNDarray(Nd4j.scalar(ndarrayInput.getDouble(indexValueToAbstract))) - argIndex = lookupIndexForArgDescriptor( - argDescriptorName = k, - opDescriptorName = mappingCtx.nd4jOpName(), - argDescriptorType = ArgDescriptor.ArgType.INPUT_TENSOR - ) - } - ret.add(argDescriptor) - } - - return ret - } -} - - -/** - * Need to implement tensor size extraction value at index - */ - - -abstract class ValueMapping< - GRAPH_DEF: GeneratedMessageV3, - OP_DEF_TYPE: GeneratedMessageV3, - NODE_TYPE: GeneratedMessageV3, - ATTR_DEF : GeneratedMessageV3, - ATTR_VALUE_TYPE : GeneratedMessageV3, - TENSOR_TYPE : GeneratedMessageV3, DATA_TYPE: ProtocolMessageEnum>(mappingNamesToPerform: Map, - transformerArgs: Map>): - BaseAttributeExtractionRule - (name = "valuemapping", mappingNamesToPerform = mappingNamesToPerform, transformerArgs = transformerArgs) { - override fun acceptsInputType(argDescriptorType: AttributeValueType): Boolean { - return argDescriptorType != AttributeValueType.TENSOR - } - - override fun outputsType(argDescriptorType: List): Boolean { - return !argDescriptorType.containsAll(listOf(ArgDescriptor.ArgType.INPUT_TENSOR, - ArgDescriptor.ArgType.OUTPUT_TENSOR,ArgDescriptor.ArgType.DATA_TYPE)) - } - - override fun convertAttributes(mappingCtx: MappingContext): List { - val ret = ArrayList() - for((k, v) in mappingNamesToPerform()) { - val descriptorBuilder = ArgDescriptor.newBuilder() - descriptorBuilder.name = k - val op = nd4jOpDescriptors.findOp(mappingCtx.nd4jOpName()) - val irAttribute = mappingCtx.irAttributeValueForNode(v) - when(irAttribute.attributeValueType()) { - AttributeValueType.INT -> { - descriptorBuilder.argType = ArgDescriptor.ArgType.INT64 - descriptorBuilder.int64Value = irAttribute.intValue() - descriptorBuilder.argIndex = lookupIndexForArgDescriptor( - argDescriptorName = k, - 
opDescriptorName = mappingCtx.nd4jOpName(), - argDescriptorType = ArgDescriptor.ArgType.INT64 - ) - - } - - AttributeValueType.FLOAT -> { - descriptorBuilder.argType = ArgDescriptor.ArgType.DOUBLE - //DO NOT REMOVE work around for numerical underflow that happens at the JVM level, this does a safe cast allowing us to get the real value out - val realValue = Nd4j.scalar(irAttribute.floatValue()).castTo(DataType.DOUBLE) - descriptorBuilder.doubleValue = realValue.getDouble(0) - descriptorBuilder.argIndex = lookupIndexForArgDescriptor( - argDescriptorName = k, - opDescriptorName = mappingCtx.nd4jOpName(), - argDescriptorType = ArgDescriptor.ArgType.DOUBLE - ) - - } - - AttributeValueType.BOOL -> { - descriptorBuilder.argType = ArgDescriptor.ArgType.BOOL - descriptorBuilder.boolValue = irAttribute.boolValue() - descriptorBuilder.argIndex = lookupIndexForArgDescriptor( - argDescriptorName = k, - opDescriptorName = mappingCtx.nd4jOpName(), - argDescriptorType = ArgDescriptor.ArgType.BOOL - ) - } - - AttributeValueType.STRING -> { - descriptorBuilder.argType = ArgDescriptor.ArgType.STRING - descriptorBuilder.stringValue = irAttribute.stringValue() - descriptorBuilder.argIndex = lookupIndexForArgDescriptor( - argDescriptorName = k, - opDescriptorName = mappingCtx.nd4jOpName(), - argDescriptorType = ArgDescriptor.ArgType.STRING - ) - } - - AttributeValueType.DATA_TYPE -> { - descriptorBuilder.argType = ArgDescriptor.ArgType.DATA_TYPE - descriptorBuilder.dataTypeValue = irAttribute.dataTataTypeValue().nameSpaceDataType() - descriptorBuilder.argIndex = lookupIndexForArgDescriptor( - argDescriptorName = k, - opDescriptorName = mappingCtx.nd4jOpName(), - argDescriptorType = ArgDescriptor.ArgType.DATA_TYPE - ) - } - - else -> { - throw IllegalArgumentException("Unable to map value $k. 
Please use different rule for list values and tensors.") - } - } - - - ret.add(descriptorBuilder.build()) - - } - return ret - } -} - - -abstract class NumberToBoolean< - GRAPH_DEF: GeneratedMessageV3, - OP_DEF_TYPE: GeneratedMessageV3, - NODE_TYPE: GeneratedMessageV3, - ATTR_DEF : GeneratedMessageV3, - ATTR_VALUE_TYPE : GeneratedMessageV3, - TENSOR_TYPE : GeneratedMessageV3, DATA_TYPE: ProtocolMessageEnum>(mappingNamesToPerform: Map, - transformerArgs: Map>): - BaseAttributeExtractionRule - (name = "booleantonumber", mappingNamesToPerform = mappingNamesToPerform, transformerArgs = transformerArgs) { - - override fun acceptsInputType(argDescriptorType: AttributeValueType): Boolean { - return argDescriptorType == AttributeValueType.INT || argDescriptorType == AttributeValueType.FLOAT - } - - override fun outputsType(argDescriptorType: List): Boolean { - return argDescriptorType.contains(ArgDescriptor.ArgType.BOOL) - } - - override fun convertAttributes(mappingCtx: MappingContext): List { - val ret = ArrayList() - - for ((k, v) in mappingNamesToPerform()) { - val descriptorBuilder = ArgDescriptor.newBuilder() - descriptorBuilder.name = k - val irAttribute = mappingCtx.irAttributeValueForNode(v) - val targetIdx = lookupIndexForArgDescriptor( - argDescriptorName = k, - opDescriptorName = mappingCtx.nd4jOpName(), - argDescriptorType = ArgDescriptor.ArgType.BOOL - ) - - if(targetIdx < 0) { - throw java.lang.IllegalArgumentException("Output attribute $k not found with boolean type for op name ${mappingCtx.nd4jOpName()} and input op name ${mappingCtx.opName()}") - } - - - descriptorBuilder.argIndex = targetIdx - descriptorBuilder.argType = ArgDescriptor.ArgType.BOOL - - - when(irAttribute.attributeValueType()) { - AttributeValueType.FLOAT -> { - descriptorBuilder.boolValue = irAttribute.floatValue() > 0 - } - AttributeValueType.INT -> { - descriptorBuilder.boolValue = irAttribute.intValue() > 0 - } - } - - ret.add(descriptorBuilder.build()) - } - return ret - } -} - - -/** 
- * Change a boolean to an int - * or an int or double to a boolean - */ -abstract class InvertBooleanNumber< - GRAPH_DEF: GeneratedMessageV3, - OP_DEF_TYPE: GeneratedMessageV3, - NODE_TYPE: GeneratedMessageV3, - ATTR_DEF : GeneratedMessageV3, - ATTR_VALUE_TYPE : GeneratedMessageV3, - TENSOR_TYPE : GeneratedMessageV3, DATA_TYPE: ProtocolMessageEnum>(mappingNamesToPerform: Map, - transformerArgs: Map>): - BaseAttributeExtractionRule - (name = "invertbooleannumber", mappingNamesToPerform = mappingNamesToPerform, transformerArgs = transformerArgs) { - - override fun acceptsInputType(argDescriptorType: AttributeValueType): Boolean { - return argDescriptorType == AttributeValueType.INT || argDescriptorType == AttributeValueType.BOOL - } - - override fun outputsType(argDescriptorType: List): Boolean { - return argDescriptorType.contains(ArgDescriptor.ArgType.INT64) || - argDescriptorType.contains(ArgDescriptor.ArgType.DOUBLE) || - argDescriptorType.contains(ArgDescriptor.ArgType.BOOL) - } - - override fun convertAttributes(mappingCtx: MappingContext): List { - val ret = ArrayList() - - for ((k, v) in mappingNamesToPerform()) { - val descriptorBuilder = ArgDescriptor.newBuilder() - descriptorBuilder.name = k - val irAttribute = mappingCtx.irAttributeValueForNode(v) - when(irAttribute.attributeValueType()) { - AttributeValueType.INT -> { - val targetIdx = lookupIndexForArgDescriptor( - argDescriptorName = k, - opDescriptorName = mappingCtx.nd4jOpName(), - argDescriptorType = ArgDescriptor.ArgType.BOOL - ) - - descriptorBuilder.argType = ArgDescriptor.ArgType.BOOL - descriptorBuilder.boolValue = irAttribute.intValue() > 0 - descriptorBuilder.argIndex = targetIdx - ret.add(descriptorBuilder.build()) - - } - AttributeValueType.FLOAT -> { - val targetIdx = lookupIndexForArgDescriptor( - argDescriptorName = k, - opDescriptorName = mappingCtx.nd4jOpName(), - argDescriptorType = ArgDescriptor.ArgType.BOOL - ) - - descriptorBuilder.argType = ArgDescriptor.ArgType.BOOL - 
descriptorBuilder.boolValue = irAttribute.floatValue() > 0 - descriptorBuilder.argIndex = targetIdx - ret.add(descriptorBuilder.build()) - - } - else -> { - listOf(ArgDescriptor.ArgType.INT64, ArgDescriptor.ArgType.DOUBLE) - .forEach { argDescriptorType -> - val targetIdx = lookupIndexForArgDescriptor( - argDescriptorName = k, - opDescriptorName = mappingCtx.nd4jOpName(), - argDescriptorType = argDescriptorType - ) - - if (targetIdx >= 0) { - when (argDescriptorType) { - ArgDescriptor.ArgType.DOUBLE -> { - descriptorBuilder.argType = argDescriptorType - descriptorBuilder.doubleValue = if (irAttribute.boolValue()) 1.0 else 0.0 - descriptorBuilder.argIndex = targetIdx - } - ArgDescriptor.ArgType.INT64 -> { - descriptorBuilder.argType = argDescriptorType - descriptorBuilder.int64Value = if (irAttribute.boolValue()) 1 else 0 - descriptorBuilder.argIndex = targetIdx - } - - else -> { - throw IllegalArgumentException("Illegal type passed in $argDescriptorType") - } - } - - ret.add(descriptorBuilder.build()) - - } - - - } - } - } - - - - } - return ret - } -} - - -abstract class StringToInt< - GRAPH_DEF : GeneratedMessageV3, - OP_DEF_TYPE : GeneratedMessageV3, - NODE_TYPE : GeneratedMessageV3, - ATTR_DEF : GeneratedMessageV3, - ATTR_VALUE_TYPE : GeneratedMessageV3, - TENSOR_TYPE : GeneratedMessageV3, DATA_TYPE : ProtocolMessageEnum>( - mappingNamesToPerform: Map, - transformerArgs: Map> -) : - BaseAttributeExtractionRule - (name = "stringtoindex", mappingNamesToPerform = mappingNamesToPerform, transformerArgs = transformerArgs) { - - override fun acceptsInputType(argDescriptorType: AttributeValueType): Boolean { - return argDescriptorType == AttributeValueType.STRING - } - - override fun outputsType(argDescriptorType: List): Boolean { - return argDescriptorType.contains(ArgDescriptor.ArgType.INT64) - } - - override fun convertAttributes(mappingCtx: MappingContext): List { - val ret = ArrayList() - - for ((k, v) in mappingNamesToPerform()) { - val listOfValues = 
(transformerArgs[k] ?: error("Unable to map value $v to a type string for op name ${mappingCtx.nd4jOpName()} and input op name ${mappingCtx.opName()}")).map { argDescriptor -> argDescriptor.stringValue } - val stringValIndex = mappingCtx.irAttributeValueForNode(v).stringValue() - val argDescriptor = ArgDescriptor { - name = k - argType = ArgDescriptor.ArgType.INT64 - argIndex = lookupIndexForArgDescriptor( - argDescriptorName = k, - opDescriptorName = mappingCtx.nd4jOpName(), - argDescriptorType = ArgDescriptor.ArgType.INT64 - ) - int64Value = listOfValues.indexOf(stringValIndex).toLong() - } - - ret.add(argDescriptor) - - } - - return ret - } -} - - - - -abstract class MapStringToInt< - GRAPH_DEF : GeneratedMessageV3, - OP_DEF_TYPE : GeneratedMessageV3, - NODE_TYPE : GeneratedMessageV3, - ATTR_DEF : GeneratedMessageV3, - ATTR_VALUE_TYPE : GeneratedMessageV3, - TENSOR_TYPE : GeneratedMessageV3, DATA_TYPE : ProtocolMessageEnum>( - mappingNamesToPerform: Map, - transformerArgs: Map> -) : - BaseAttributeExtractionRule - (name = "mapstringtoindex", mappingNamesToPerform = mappingNamesToPerform, transformerArgs = transformerArgs) { - - override fun acceptsInputType(argDescriptorType: AttributeValueType): Boolean { - return argDescriptorType == AttributeValueType.LIST_STRING - } - - override fun outputsType(argDescriptorType: List): Boolean { - return argDescriptorType.contains(ArgDescriptor.ArgType.INT64) - } - - override fun convertAttributes(mappingCtx: MappingContext): List { - val ret = ArrayList() - val indexOfValue = transformerArgs["index"]!![0].int64Value - for ((k, v) in mappingNamesToPerform()) { - - val stringVal = mappingCtx.irAttributeValueForNode(v).listStringValue()[indexOfValue.toInt()] - val activationInt = (transformerArgs[k] ?: error("Unable to map value $v to a type string for op name ${mappingCtx.nd4jOpName()} and input op name ${mappingCtx.opName()}")) - .filter {argDescriptor -> argDescriptor.name == stringVal } - .map { argDescriptor -> 
argDescriptor.int64Value }.first() - val argDescriptor = ArgDescriptor { - name = k - argType = ArgDescriptor.ArgType.INT64 - argIndex = lookupIndexForArgDescriptor( - argDescriptorName = k, - opDescriptorName = mappingCtx.nd4jOpName(), - argDescriptorType = ArgDescriptor.ArgType.INT64 - ) - int64Value = activationInt - } - - ret.add(argDescriptor) - - } - - return ret - } -} - - -abstract class ListAttributeValueLookupToIndex< - GRAPH_DEF : GeneratedMessageV3, - OP_DEF_TYPE : GeneratedMessageV3, - NODE_TYPE : GeneratedMessageV3, - ATTR_DEF : GeneratedMessageV3, - ATTR_VALUE_TYPE : GeneratedMessageV3, - TENSOR_TYPE : GeneratedMessageV3, DATA_TYPE : ProtocolMessageEnum>( - mappingNamesToPerform: Map, - transformerArgs: Map> -) : - BaseAttributeExtractionRule - ( - name = "listattributevaluelookuptoindex", - mappingNamesToPerform = mappingNamesToPerform, - transformerArgs = transformerArgs - ) { - - override fun acceptsInputType(argDescriptorType: AttributeValueType): Boolean { - return argDescriptorType == AttributeValueType.LIST_FLOAT || - argDescriptorType == AttributeValueType.LIST_INT || - argDescriptorType == AttributeValueType.LIST_STRING || - argDescriptorType == AttributeValueType.LIST_TENSOR || - argDescriptorType == AttributeValueType.LIST_BOOL || - argDescriptorType == AttributeValueType.INT - } - - override fun outputsType(argDescriptorType: List): Boolean { - return !argDescriptorType.contains(ArgDescriptor.ArgType.OUTPUT_TENSOR) - } - - override fun convertAttributes(mappingCtx: MappingContext): List { - val ret = ArrayList() - for ((k, v) in mappingNamesToPerform()) { - val index = (transformerArgs[k] ?: error(""))[0]!!.int64Value - val listOfValues = mappingCtx.irAttributeValueForNode(v) - when (listOfValues.attributeValueType()) { - AttributeValueType.LIST_FLOAT -> { - val listFloat = listOfValues.listFloatValue() - val argDescriptor = ArgDescriptor { - name = k - doubleValue = listFloat[index.toInt()].toDouble() - argType = 
ArgDescriptor.ArgType.DOUBLE - argIndex = lookupIndexForArgDescriptor( - argDescriptorName = k, - opDescriptorName = mappingCtx.nd4jOpName(), - argDescriptorType = ArgDescriptor.ArgType.DOUBLE - ) - } - - ret.add(argDescriptor) - } - AttributeValueType.LIST_INT -> { - val listInt = listOfValues.listIntValue() - val argDescriptor = ArgDescriptor { - name = k - int64Value = listInt[index.toInt()] - argType = ArgDescriptor.ArgType.INT64 - argIndex = lookupIndexForArgDescriptor( - argDescriptorName = k, - opDescriptorName = mappingCtx.nd4jOpName(), - argDescriptorType = ArgDescriptor.ArgType.INT64 - ) - } - - ret.add(argDescriptor) - } - - AttributeValueType.LIST_STRING -> { - val listString = listOfValues.listStringValue() - val argDescriptor = ArgDescriptor { - name = k - stringValue = listString[index.toInt()] - argType = ArgDescriptor.ArgType.STRING - argIndex = lookupIndexForArgDescriptor( - argDescriptorName = k, - opDescriptorName = mappingCtx.nd4jOpName(), - argDescriptorType = ArgDescriptor.ArgType.STRING - ) - } - - ret.add(argDescriptor) - } - - AttributeValueType.LIST_TENSOR -> { - val listTensor = listOfValues.listTensorValue() - val argDescriptor = ArgDescriptor { - name = k - inputValue = listTensor[index.toInt()].toArgTensor() - argType = ArgDescriptor.ArgType.INPUT_TENSOR - argIndex = lookupIndexForArgDescriptor( - argDescriptorName = k, - opDescriptorName = mappingCtx.nd4jOpName(), - argDescriptorType = ArgDescriptor.ArgType.INPUT_TENSOR - ) - } - - ret.add(argDescriptor) - } - - AttributeValueType.LIST_BOOL -> { - val listBool = listOfValues.listBoolValue() - val argDescriptor = ArgDescriptor { - name = k - boolValue = listBool[index.toInt()] - argType = ArgDescriptor.ArgType.BOOL - argIndex = lookupIndexForArgDescriptor( - argDescriptorName = k, - opDescriptorName = mappingCtx.nd4jOpName(), - argDescriptorType = ArgDescriptor.ArgType.BOOL - ) - } - - ret.add(argDescriptor) - } - - } - - - } - - return ret - } -} - - -abstract class 
StringAttributeToNDArray< - GRAPH_DEF : GeneratedMessageV3, - OP_DEF_TYPE : GeneratedMessageV3, - NODE_TYPE : GeneratedMessageV3, - ATTR_DEF : GeneratedMessageV3, - ATTR_VALUE_TYPE : GeneratedMessageV3, - TENSOR_TYPE : GeneratedMessageV3, DATA_TYPE : ProtocolMessageEnum>( - mappingNamesToPerform: Map, - transformerArgs: Map> -) : - BaseAttributeExtractionRule - ( - name = "convertinputstringtondarray", - mappingNamesToPerform = mappingNamesToPerform, - transformerArgs = transformerArgs - ) { - - - override fun acceptsInputType(argDescriptorType: AttributeValueType): Boolean { - return argDescriptorType == AttributeValueType.STRING - } - - override fun outputsType(argDescriptorType: List): Boolean { - return argDescriptorType.contains(ArgDescriptor.ArgType.INPUT_TENSOR) - } - - override fun convertAttributes(mappingCtx: MappingContext): List { - val ret = ArrayList() - for ((k, v) in mappingNamesToPerform()) { - val irAttribute = mappingCtx.irAttributeValueForNode(v) - when (irAttribute.attributeValueType()) { - AttributeValueType.STRING -> { - val listArr = irAttribute.stringValue() - val ndarray = Nd4j.create(listArr) - ret.add(ArgDescriptor { - argType = ArgDescriptor.ArgType.INPUT_TENSOR - name = k - inputValue = nameSpaceTensorFromNDarray(ndarray) - argIndex = lookupIndexForArgDescriptor( - argDescriptorName = k, - opDescriptorName = mappingCtx.nd4jOpName(), - argDescriptorType = ArgDescriptor.ArgType.INPUT_TENSOR - ) - }) - } - } - - } - - return ret - } -} - - - - -abstract class AttributeNumberListNDArray< - GRAPH_DEF : GeneratedMessageV3, - OP_DEF_TYPE : GeneratedMessageV3, - NODE_TYPE : GeneratedMessageV3, - ATTR_DEF : GeneratedMessageV3, - ATTR_VALUE_TYPE : GeneratedMessageV3, - TENSOR_TYPE : GeneratedMessageV3, DATA_TYPE : ProtocolMessageEnum>( - mappingNamesToPerform: Map, - transformerArgs: Map> -) : - BaseAttributeExtractionRule - ( - name = "convertinputnumberlisttondarray", - mappingNamesToPerform = mappingNamesToPerform, - transformerArgs = 
transformerArgs - ) { - - - override fun acceptsInputType(argDescriptorType: AttributeValueType): Boolean { - return argDescriptorType == AttributeValueType.LIST_FLOAT || - argDescriptorType == AttributeValueType.LIST_INT - } - - override fun outputsType(argDescriptorType: List): Boolean { - return argDescriptorType.contains(ArgDescriptor.ArgType.INPUT_TENSOR) - } - - override fun convertAttributes(mappingCtx: MappingContext): List { - val ret = ArrayList() - for ((k, v) in mappingNamesToPerform()) { - val irAttribute = mappingCtx.irAttributeValueForNode(v) - when (irAttribute.attributeValueType()) { - AttributeValueType.LIST_FLOAT -> { - val listArr = irAttribute.listFloatValue().toFloatArray() - val ndarray = Nd4j.create(listArr) - ret.add(ArgDescriptor { - argType = ArgDescriptor.ArgType.INPUT_TENSOR - name = k - inputValue = nameSpaceTensorFromNDarray(ndarray) - argIndex = lookupIndexForArgDescriptor( - argDescriptorName = k, - opDescriptorName = mappingCtx.nd4jOpName(), - argDescriptorType = ArgDescriptor.ArgType.DOUBLE - ) - }) - } - - AttributeValueType.LIST_INT -> { - val intArr = irAttribute.listIntValue().toLongArray() - val strides = Nd4j.getStrides(1, 4).toList().map { it.toLong() }.toLongArray() - val ndarray = - Nd4j.create(intArr, longArrayOf(1, intArr.size.toLong()), strides, 'c', DataType.INT64) - ret.add(ArgDescriptor { - argType = ArgDescriptor.ArgType.INPUT_TENSOR - name = k - inputValue = nameSpaceTensorFromNDarray(ndarray) - argIndex = lookupIndexForArgDescriptor( - argDescriptorName = k, - opDescriptorName = mappingCtx.nd4jOpName(), - argDescriptorType = ArgDescriptor.ArgType.INT64 - ) - }) - } - - } - - } - - return ret - } -} - -abstract class ListNumberToListNumber< - GRAPH_DEF : GeneratedMessageV3, - OP_DEF_TYPE : GeneratedMessageV3, - NODE_TYPE : GeneratedMessageV3, - ATTR_DEF : GeneratedMessageV3, - ATTR_VALUE_TYPE : GeneratedMessageV3, - TENSOR_TYPE : GeneratedMessageV3, DATA_TYPE : ProtocolMessageEnum>( - mappingNamesToPerform: Map, - 
transformerArgs: Map> -) : - BaseAttributeExtractionRule - ( - name = "listnumbertolistnumber", - mappingNamesToPerform = mappingNamesToPerform, - transformerArgs = transformerArgs - ) { - override fun acceptsInputType(argDescriptorType: AttributeValueType): Boolean { - return argDescriptorType == AttributeValueType.INT || - argDescriptorType == AttributeValueType.FLOAT || - argDescriptorType == AttributeValueType.LIST_INT || - argDescriptorType == AttributeValueType.LIST_FLOAT - } - - override fun outputsType(argDescriptorType: List): Boolean { - return argDescriptorType.contains(ArgDescriptor.ArgType.INT64) || - argDescriptorType.contains(ArgDescriptor.ArgType.DOUBLE) - } - - override fun convertAttributes(mappingCtx: MappingContext): List { - val ret = ArrayList() - for ((k, v) in mappingNamesToPerform()) { - - val irAttribute = mappingCtx.irAttributeValueForNode(v) - when (irAttribute.attributeValueType()) { - AttributeValueType.LIST_INT -> { - val baseIndex = lookupIndexForArgDescriptor( - argDescriptorName = k, - opDescriptorName = mappingCtx.nd4jOpName(), - argDescriptorType = ArgDescriptor.ArgType.INT64 - ) - val listInts = irAttribute.listIntValue() - listInts.forEachIndexed { index, element -> - val finalName = if (index > 0) k + "$index" else k - val argDescriptor = ArgDescriptor { - name = finalName - int64Value = element - argType = ArgDescriptor.ArgType.INT64 - argIndex = baseIndex + index - } - - ret.add(argDescriptor) - } - } - AttributeValueType.LIST_FLOAT -> { - val baseIndex = lookupIndexForArgDescriptor( - argDescriptorName = k, - opDescriptorName = mappingCtx.nd4jOpName(), - argDescriptorType = ArgDescriptor.ArgType.DOUBLE - ) - val listFloats = irAttribute.listFloatValue() - listFloats.forEachIndexed { index, element -> - val finalName = if (index > 0) k + "$index" else k - val argDescriptor = ArgDescriptor { - name = finalName - doubleValue = element.toDouble() - argType = ArgDescriptor.ArgType.DOUBLE - argIndex = baseIndex + index - } - - 
ret.add(argDescriptor) - } - } - } - } - - return ret - } -} - - -abstract class ListNumberToNDArray< - GRAPH_DEF : GeneratedMessageV3, - OP_DEF_TYPE : GeneratedMessageV3, - NODE_TYPE : GeneratedMessageV3, - ATTR_DEF : GeneratedMessageV3, - ATTR_VALUE_TYPE : GeneratedMessageV3, - TENSOR_TYPE : GeneratedMessageV3, DATA_TYPE : ProtocolMessageEnum>( - mappingNamesToPerform: Map, - transformerArgs: Map> -) : - BaseAttributeExtractionRule - ( - name = "listnumbertondarray", - mappingNamesToPerform = mappingNamesToPerform, - transformerArgs = transformerArgs - ) { - override fun acceptsInputType(argDescriptorType: AttributeValueType): Boolean { - return argDescriptorType == AttributeValueType.INT || - argDescriptorType == AttributeValueType.FLOAT || - argDescriptorType == AttributeValueType.LIST_INT || - argDescriptorType == AttributeValueType.LIST_FLOAT - } - - override fun outputsType(argDescriptorType: List): Boolean { - return argDescriptorType.contains(ArgDescriptor.ArgType.INPUT_TENSOR) - } - - override fun convertAttributes(mappingCtx: MappingContext): List { - val ret = ArrayList() - for ((k, v) in mappingNamesToPerform()) { - val listOfValues = mappingCtx.irAttributeValueForNode(v) - val baseIndex = lookupIndexForArgDescriptor( - argDescriptorName = k, - opDescriptorName = mappingCtx.nd4jOpName(), - argDescriptorType = ArgDescriptor.ArgType.INPUT_TENSOR - ) - - when (listOfValues.attributeValueType()) { - AttributeValueType.LIST_FLOAT -> { - val nd4jArray = Nd4j.create(listOfValues.listFloatValue().toFloatArray()) - val inputTensor = nameSpaceTensorFromNDarray(nd4jArray) - ret.add(ArgDescriptor { - name = k - inputValue = inputTensor - argType = ArgDescriptor.ArgType.INPUT_TENSOR - argIndex = baseIndex - }) - } - - AttributeValueType.LIST_INT -> { - val nd4jArray = Nd4j.create(Nd4j.createBuffer(listOfValues.listIntValue().toLongArray())) - val inputTensor = nameSpaceTensorFromNDarray(nd4jArray) - ret.add(ArgDescriptor { - name = k - inputValue = inputTensor - 
argType = ArgDescriptor.ArgType.INPUT_TENSOR - argIndex = baseIndex - }) - } - - } - - } - - return ret - } -} - - -abstract class NDArrayAttributeToNDArrayInput< - GRAPH_DEF : GeneratedMessageV3, - OP_DEF_TYPE : GeneratedMessageV3, - NODE_TYPE : GeneratedMessageV3, - ATTR_DEF : GeneratedMessageV3, - ATTR_VALUE_TYPE : GeneratedMessageV3, - TENSOR_TYPE : GeneratedMessageV3, DATA_TYPE : ProtocolMessageEnum>( - mappingNamesToPerform: Map, - transformerArgs: Map> -) : - BaseAttributeExtractionRule - ( - name = "ndarrayinputtondarray", - mappingNamesToPerform = mappingNamesToPerform, - transformerArgs = transformerArgs - ) { - override fun acceptsInputType(argDescriptorType: AttributeValueType): Boolean { - return argDescriptorType == AttributeValueType.TENSOR - } - - override fun outputsType(argDescriptorType: List): Boolean { - return argDescriptorType.contains(ArgDescriptor.ArgType.INPUT_TENSOR) - } - - override fun convertAttributes(mappingCtx: MappingContext): List { - val ret = ArrayList() - for ((k, v) in mappingNamesToPerform()) { - val baseIndex = lookupIndexForArgDescriptor( - argDescriptorName = k, - opDescriptorName = mappingCtx.nd4jOpName(), - argDescriptorType = ArgDescriptor.ArgType.INPUT_TENSOR - ) - val attr = mappingCtx.irAttributeValueForNode(v).tensorValue() - ret.add(ArgDescriptor { - name = k - inputValue = attr.toArgTensor() - argType = ArgDescriptor.ArgType.INPUT_TENSOR - argIndex = baseIndex - }) - - } - - - return ret - } -} - - - - - -abstract class NDArrayInputToNumericalAttribute< - GRAPH_DEF : GeneratedMessageV3, - OP_DEF_TYPE : GeneratedMessageV3, - NODE_TYPE : GeneratedMessageV3, - ATTR_DEF : GeneratedMessageV3, - ATTR_VALUE_TYPE : GeneratedMessageV3, - TENSOR_TYPE : GeneratedMessageV3, DATA_TYPE : ProtocolMessageEnum>( - mappingNamesToPerform: Map, - transformerArgs: Map> -) : - BaseAttributeExtractionRule - ( - name = "ndarrayinputtonumericalattribute", - mappingNamesToPerform = mappingNamesToPerform, - transformerArgs = transformerArgs 
- ) {
-
-    override fun acceptsInputType(argDescriptorType: AttributeValueType): Boolean {
-        return argDescriptorType == AttributeValueType.TENSOR
-    }
-
-    override fun outputsType(argDescriptorType: List<ArgDescriptor.ArgType>): Boolean {
-        return argDescriptorType.contains(ArgDescriptor.ArgType.DOUBLE) ||
-                argDescriptorType.contains(ArgDescriptor.ArgType.INT64) ||
-                argDescriptorType.contains(ArgDescriptor.ArgType.FLOAT)
-    }
-
-    override fun convertAttributes(mappingCtx: MappingContext): List<ArgDescriptor> {
-        val ret = ArrayList<ArgDescriptor>()
-        val realDescriptor = nd4jOpDescriptors.findOp(mappingCtx.nd4jOpName())
-
-        for ((k, v) in mappingNamesToPerform()) {
-            val inputTensor = mappingCtx.tensorInputFor(v).toNd4jNDArray()
-            realDescriptor.argDescriptorList
-                .filter { argDescriptor ->
-                    argDescriptor.name == k &&
-                            (argDescriptor.argType == ArgDescriptor.ArgType.INT64 ||
-                                    argDescriptor.argType == ArgDescriptor.ArgType.DOUBLE)
-                }
-                .forEach { argDescriptor ->
-                    val baseIndex = lookupIndexForArgDescriptor(
-                        argDescriptorName = k,
-                        opDescriptorName = mappingCtx.nd4jOpName(),
-                        argDescriptorType = argDescriptor.argType
-                    )
-                    //note: only the first element of the input tensor is extracted here
-                    for (i in 0 until 1) {
-                        val nameToUse = if (i > 0) k + "$i" else k
-                        when (argDescriptor.argType) {
-                            ArgDescriptor.ArgType.DOUBLE -> {
-                                ret.add(ArgDescriptor {
-                                    name = nameToUse
-                                    argType = ArgDescriptor.ArgType.DOUBLE
-                                    doubleValue = inputTensor.getDouble(i)
-                                    argIndex = baseIndex + i
-                                })
-                            }
-
-                            ArgDescriptor.ArgType.INT64 -> {
-                                ret.add(ArgDescriptor {
-                                    name = nameToUse
-                                    argType = ArgDescriptor.ArgType.INT64
-                                    int64Value = inputTensor.getInt(i).toLong()
-                                    argIndex = baseIndex + i
-                                })
-                            }
-                        }
-                    }
-                }
-        }
-
-        return ret
-    }
-}
-
-/**
- * Referenced from https://github.com/eclipse/deeplearning4j/blob/63fa3c2ef3c4e5e33cdb99bb4804997b40ad4590/libnd4j/include/array/DataType.h#L39
- * Used to convert ints to
- * data types. These ints are used in certain ops where an int is taken in for expressing a data type.
- */
-fun dataTypeFromInt(inputInt: Int): TensorNamespace.DataType {
-    when(inputInt) {
-        1 -> return TensorNamespace.DataType.BOOL
-        2 -> return TensorNamespace.DataType.BFLOAT16
-        3 -> return TensorNamespace.DataType.FLOAT16
-        4 -> return TensorNamespace.DataType.FLOAT
-        5 -> return TensorNamespace.DataType.FLOAT
-        6 -> return TensorNamespace.DataType.DOUBLE
-        7 -> return TensorNamespace.DataType.INT8
-        8 -> return TensorNamespace.DataType.INT16
-        9 -> return TensorNamespace.DataType.INT32
-        10 -> return TensorNamespace.DataType.INT64
-        11 -> return TensorNamespace.DataType.UINT8
-        12 -> return TensorNamespace.DataType.UINT16
-        13 -> return TensorNamespace.DataType.UINT32
-        14 -> return TensorNamespace.DataType.UINT64
-        17 -> return TensorNamespace.DataType.BFLOAT16
-        50,51,52 -> return TensorNamespace.DataType.STRING
-        else -> return TensorNamespace.DataType.UNDEFINED
-    }
-}
-
-/**
- * Reverse of [dataTypeFromInt]:
- * converts a [TensorNamespace.DataType] to its int argument value.
- * Note that [dataTypeFromInt] is not injective (e.g. both 4 and 5 map to FLOAT),
- * so this returns the lowest such int for each data type.
- */
-fun intArgFromDataType(inputDataType: TensorNamespace.DataType): Int {
-    when(inputDataType) {
-        TensorNamespace.DataType.BOOL -> return 1
-        TensorNamespace.DataType.BFLOAT16 -> return 2
-        TensorNamespace.DataType.FLOAT16 -> return 3
-        TensorNamespace.DataType.FLOAT -> return 4
-        TensorNamespace.DataType.DOUBLE -> return 6
-        TensorNamespace.DataType.INT8 -> return 7
-        TensorNamespace.DataType.INT16 -> return 8
-        TensorNamespace.DataType.INT32 -> return 9
-        TensorNamespace.DataType.INT64 -> return 10
-        TensorNamespace.DataType.UINT8 -> return 11
-        TensorNamespace.DataType.UINT16 -> return 12
-        TensorNamespace.DataType.UINT32 -> return 13
-        TensorNamespace.DataType.UINT64 -> return 14
-        TensorNamespace.DataType.STRING -> return 50
-        else -> return -1
-    }
-}
-
-
-abstract class DataTypeToInt<
-    GRAPH_DEF : GeneratedMessageV3,
-    OP_DEF_TYPE : GeneratedMessageV3,
-    NODE_TYPE : GeneratedMessageV3,
-    ATTR_DEF : GeneratedMessageV3,
-    ATTR_VALUE_TYPE : GeneratedMessageV3,
-    TENSOR_TYPE : GeneratedMessageV3, DATA_TYPE : ProtocolMessageEnum>(
-    mappingNamesToPerform: Map<String, String>,
-    transformerArgs: Map<String, List<ArgDescriptor>>
-) :
-    BaseAttributeExtractionRule
-    (
-        name = "datatypetoint",
-        mappingNamesToPerform = mappingNamesToPerform,
-        transformerArgs = transformerArgs
-    ) {
-
-    override fun acceptsInputType(argDescriptorType: AttributeValueType): Boolean {
-        return argDescriptorType == AttributeValueType.DATA_TYPE
-    }
-
-    override fun outputsType(argDescriptorType: List<ArgDescriptor.ArgType>): Boolean {
-        return argDescriptorType.contains(ArgDescriptor.ArgType.INT64)
-    }
-
-    override fun convertAttributes(mappingCtx: MappingContext): List<ArgDescriptor> {
-        val ret = ArrayList<ArgDescriptor>()
-        for ((k, v) in mappingNamesToPerform()) {
-            val irAttribute = mappingCtx.irAttributeValueForNode(v).dataTataTypeValue()
-            ret.add(ArgDescriptor {
-                argType = ArgDescriptor.ArgType.INT64
-                name = k
-                int64Value = intArgFromDataType(irAttribute.nameSpaceDataType()).toLong()
-                argIndex = lookupIndexForArgDescriptor(
-                    argDescriptorName = k,
-                    opDescriptorName = mappingCtx.nd4jOpName(),
-                    argDescriptorType = ArgDescriptor.ArgType.INT64
-                )
-            })
-        }
-
-        return ret
-    }
-}
-
-
-abstract class AttributeNDArrayToScalarAttribute<
-    GRAPH_DEF : GeneratedMessageV3,
-    OP_DEF_TYPE : GeneratedMessageV3,
-    NODE_TYPE : GeneratedMessageV3,
-    ATTR_DEF : GeneratedMessageV3,
-    ATTR_VALUE_TYPE : GeneratedMessageV3,
-    TENSOR_TYPE : GeneratedMessageV3, DATA_TYPE : ProtocolMessageEnum>(
-    mappingNamesToPerform: Map<String, String>,
-    transformerArgs: Map<String, List<ArgDescriptor>>
-) :
-    BaseAttributeExtractionRule
-    (
-        name = "attributendarraytoscalarattribute",
-        mappingNamesToPerform = mappingNamesToPerform,
-        transformerArgs = transformerArgs
-    ) {
-
-    override fun acceptsInputType(argDescriptorType: AttributeValueType): Boolean {
-        return argDescriptorType == AttributeValueType.TENSOR
} - - override fun outputsType(argDescriptorType: List): Boolean { - return argDescriptorType.contains(ArgDescriptor.ArgType.INT64) || - argDescriptorType.contains(ArgDescriptor.ArgType.DOUBLE) || - argDescriptorType.contains(ArgDescriptor.ArgType.INT32) - } - - override fun convertAttributes(mappingCtx: MappingContext): List { - val ret = ArrayList() - for ((k, v) in mappingNamesToPerform()) { - val irAttribute = mappingCtx.tensorAttributeFor(v).toNd4jNDArray() - val realDataType = argDescriptorType(k, nd4jOpDescriptors.findOp(mappingCtx.nd4jOpName())) - when(realDataType) { - ArgDescriptor.ArgType.DOUBLE -> { - ret.add(ArgDescriptor { - argType = ArgDescriptor.ArgType.DOUBLE - name = k - doubleValue = irAttribute.getDouble(0) - argIndex = lookupIndexForArgDescriptor( - argDescriptorName = k, - opDescriptorName = mappingCtx.nd4jOpName(), - argDescriptorType = ArgDescriptor.ArgType.DOUBLE - ) - }) - } - - ArgDescriptor.ArgType.INT64 -> { - ret.add(ArgDescriptor { - argType = ArgDescriptor.ArgType.INT64 - name = k - int64Value = irAttribute.getInt(0).toLong() - argIndex = lookupIndexForArgDescriptor( - argDescriptorName = k, - opDescriptorName = mappingCtx.nd4jOpName(), - argDescriptorType = ArgDescriptor.ArgType.INT64 - ) - }) - } - } - - } - - return ret - } -} - - -abstract class AttributeScalarNDArrayAttribute< - GRAPH_DEF : GeneratedMessageV3, - OP_DEF_TYPE : GeneratedMessageV3, - NODE_TYPE : GeneratedMessageV3, - ATTR_DEF : GeneratedMessageV3, - ATTR_VALUE_TYPE : GeneratedMessageV3, - TENSOR_TYPE : GeneratedMessageV3, DATA_TYPE : ProtocolMessageEnum>( - mappingNamesToPerform: Map, - transformerArgs: Map> -) : - BaseAttributeExtractionRule - ( - name = "attributescalarndarrayattribute", - mappingNamesToPerform = mappingNamesToPerform, - transformerArgs = transformerArgs - ) { - - override fun acceptsInputType(argDescriptorType: AttributeValueType): Boolean { - return argDescriptorType == AttributeValueType.FLOAT || argDescriptorType == AttributeValueType.INT - 
} - - override fun outputsType(argDescriptorType: List): Boolean { - return argDescriptorType.contains(ArgDescriptor.ArgType.INPUT_TENSOR) - } - - override fun convertAttributes(mappingCtx: MappingContext): List { - val ret = ArrayList() - for ((k, v) in mappingNamesToPerform()) { - val irAttribute = mappingCtx.irAttributeValueForNode(v) - when (irAttribute.attributeValueType()) { - AttributeValueType.FLOAT -> { - ret.add(ArgDescriptor { - argType = ArgDescriptor.ArgType.INPUT_TENSOR - name = k - inputValue = nameSpaceTensorFromNDarray(Nd4j.scalar(irAttribute.floatValue()).reshape(1, 1)) - argIndex = lookupIndexForArgDescriptor( - argDescriptorName = k, - opDescriptorName = mappingCtx.nd4jOpName(), - argDescriptorType = ArgDescriptor.ArgType.INPUT_TENSOR - ) - }) - } - - AttributeValueType.INT -> { - ret.add(ArgDescriptor { - argType = ArgDescriptor.ArgType.INPUT_TENSOR - name = k - inputValue = nameSpaceTensorFromNDarray(Nd4j.scalar(irAttribute.intValue()).reshape(1, 1)) - argIndex = lookupIndexForArgDescriptor( - argDescriptorName = k, - opDescriptorName = mappingCtx.nd4jOpName(), - argDescriptorType = ArgDescriptor.ArgType.INT64 - ) - }) - } - else -> { - throw IllegalArgumentException("Attribute $v is not a valid type. 
Type was ${irAttribute.attributeValueType()}") - } - - } - - } - - return ret - } -} - - -abstract class NDArrayToIntAttributeValue< - GRAPH_DEF : GeneratedMessageV3, - OP_DEF_TYPE : GeneratedMessageV3, - NODE_TYPE : GeneratedMessageV3, - ATTR_DEF : GeneratedMessageV3, - ATTR_VALUE_TYPE : GeneratedMessageV3, - TENSOR_TYPE : GeneratedMessageV3, DATA_TYPE : ProtocolMessageEnum>( - mappingNamesToPerform: Map, - transformerArgs: Map> -) : - BaseAttributeExtractionRule - ( - name = "ndarraytointattributevalue", - mappingNamesToPerform = mappingNamesToPerform, - transformerArgs = transformerArgs - ) { - override fun acceptsInputType(argDescriptorType: AttributeValueType): Boolean { - return argDescriptorType == AttributeValueType.TENSOR - } - - override fun outputsType(argDescriptorType: List): Boolean { - return argDescriptorType.contains(ArgDescriptor.ArgType.INT64) - } - - override fun convertAttributes(mappingCtx: MappingContext): List { - val ret = ArrayList() - for ((k, v) in mappingNamesToPerform()) { - val ndarray = mappingCtx.tensorInputFor(v).toNd4jNDArray() - val arrInts = ndarray.ravel().toIntVector() - val baseIndex = lookupIndexForArgDescriptor( - argDescriptorName = k, - opDescriptorName = mappingCtx.nd4jOpName(), - argDescriptorType = ArgDescriptor.ArgType.INT64 - ) - for (i in 0 until ndarray.length()) { - val argDescriptor = ArgDescriptor { - name = k - int64Value = arrInts[i.toInt()].toLong() - argType = ArgDescriptor.ArgType.INT64 - argIndex = (baseIndex + i).toInt() - } - - ret.add(argDescriptor) - } - } - - return ret - } -} - - -abstract class BaseNDArrayMappingRule< - GRAPH_DEF : GeneratedMessageV3, - OP_DEF_TYPE : GeneratedMessageV3, NODE_DEF_TYPE : GeneratedMessageV3, ATTR_DEF : GeneratedMessageV3, - ATTR_VALUE_TYPE : GeneratedMessageV3, TENSOR_TYPE : GeneratedMessageV3, - DATA_TYPE>( - mappingNamesToPerform: MutableMap = mutableMapOf(), - transformerArgs: Map> = emptyMap() -) : - TensorMappingRule - where DATA_TYPE : ProtocolMessageEnum { - - 
protected var opDescriptor: OpDescriptor? = null - protected val mappingNamesToPerform = mappingNamesToPerform - protected val transformerArgs = transformerArgs - protected var mappingProcess: MappingProcess? = - null - - - override fun initWithMappingProcess(mappingProcess: MappingProcess) { - val opDescriptorList = nd4jOpDescriptors - if (!opDescriptorList.opListList.map { it -> it.name }.contains(mappingProcess.opName())) { - throw java.lang.IllegalArgumentException("Op name ${mappingProcess.opName()} not found!") - } - opDescriptor = opDescriptorList.opListList.first { input -> - input.name == mappingProcess.opName() - } ?: error("") - this.mappingProcess = mappingProcess - } - - - operator fun set(outputAttribute: String, inputAttribute: String) { - mappingNamesToPerform[outputAttribute] = inputAttribute - } - - override fun name(): String { - return "ndarraymapping" - } - - - override fun mappingNamesToPerform(): Map { - return mappingNamesToPerform - } - - - override fun convertInput(mappingContext: MappingContext): List { - val ret = ArrayList() - val mappingsToPerform = inputArgumentMappings() - mappingsToPerform.forEach { (k, v) -> - ret.add(ArgDescriptor { - name = k - argType = ArgDescriptor.ArgType.INPUT_TENSOR - inputValue = mappingContext.tensorInputFor(v).toArgTensor() - argIndex = lookupIndexForArgDescriptor( - argDescriptorName = k, - opDescriptorName = mappingContext.nd4jOpName(), - argDescriptorType = ArgDescriptor.ArgType.INPUT_TENSOR - ) - }) - } - - - return ret - } - - abstract fun createTensorProto(input: TENSOR_TYPE): TensorNamespace.TensorProto - - - override fun convertInputsReverse(toReverse: List): List { - for (argument in toReverse) { - require(argument.argType == ArgDescriptor.ArgType.INPUT_TENSOR) { "Type to reverse must be an input tensor." 
} - } - TODO("Not yet implemented") - } - - override fun inputArgumentMappings(): Map { - return mappingNamesToPerform - } - - override fun serialize(): MapperNamespace.MappingRule { - val builder = MapperNamespace.MappingRule.newBuilder() - builder.ruleName = name() - builder.functionName = name() - builder.ruleType = "tensor" - - for ((k, v) in transformerArgs) { - val descriptor = opDescriptor!!.argDescriptorList.filter { input -> input.name == k }[0] - when (descriptor.argType) { - ArgDescriptor.ArgType.BOOL -> builder.addOutputBooleanName(k) - ArgDescriptor.ArgType.INT64 -> builder.addOutputIntName(k) - ArgDescriptor.ArgType.FLOAT -> builder.addOutputFloatName(k) - ArgDescriptor.ArgType.DOUBLE -> builder.addOutputDoubleName(k) - ArgDescriptor.ArgType.INPUT_TENSOR -> builder.addInputTensorName(k) - ArgDescriptor.ArgType.OUTPUT_TENSOR -> builder.addOutputTensorName(k) - } - - for (associatedInput in v) { - when (associatedInput.argType) { - ArgDescriptor.ArgType.STRING -> builder.addInputStringAttrName(associatedInput.name) - ArgDescriptor.ArgType.BOOL -> builder.addInputBooleanName(associatedInput.name) - ArgDescriptor.ArgType.DOUBLE,ArgDescriptor.ArgType.FLOAT -> builder.addInputFloatName(associatedInput.name) - ArgDescriptor.ArgType.INT32, ArgDescriptor.ArgType.INT64 -> builder.addInputIntName(associatedInput.name) - ArgDescriptor.ArgType.INPUT_TENSOR -> builder.addInputTensorName(associatedInput.name) - } - } - - - } - - mappingNamesToPerform.forEach { outputName, inputName -> - builder.addInputTensorName(inputName) - builder.addOutputTensorName(outputName) - builder.putInputToOutput(outputName,inputName) - } - - - return builder.build() - } - -} - - -abstract class MultiInputIndexMappingRule< - GRAPH_DEF : GeneratedMessageV3, - OP_DEF_TYPE : GeneratedMessageV3, NODE_DEF_TYPE : GeneratedMessageV3, ATTR_DEF : GeneratedMessageV3, - ATTR_VALUE_TYPE : GeneratedMessageV3, TENSOR_TYPE :
GeneratedMessageV3, - DATA_TYPE>( - mappingNamesToPerform: MutableMap = mutableMapOf(), - transformerArgs: Map> = emptyMap() -) : - TensorMappingRule - where DATA_TYPE : ProtocolMessageEnum { - - protected var opDescriptor: OpDescriptor? = null - protected val mappingNamesToPerform = mappingNamesToPerform - protected val transformerArgs = transformerArgs - protected var mappingProcess: MappingProcess? = - null - - - override fun initWithMappingProcess(mappingProcess: MappingProcess) { - val opDescriptorList = nd4jOpDescriptors - if (!opDescriptorList.opListList.map { it -> it.name }.contains(mappingProcess.opName())) { - throw java.lang.IllegalArgumentException("Op name ${mappingProcess.opName()} not found!") - } - opDescriptor = opDescriptorList.opListList.first { input -> - input.name == mappingProcess.opName() - } ?: error("") - this.mappingProcess = mappingProcess - } - - - operator fun set(outputAttribute: String, inputAttribute: String) { - mappingNamesToPerform[outputAttribute] = inputAttribute - } - - override fun name(): String { - return "multiinputindex" - } - - - override fun mappingNamesToPerform(): Map { - return mappingNamesToPerform - } - - - override fun convertInput(mappingContext: MappingContext): List { - val ret = ArrayList() - val mappingsToPerform = inputArgumentMappings() - mappingsToPerform.forEach { (k, v) -> - val relevantInputs = mappingContext.irNode().inputNamesForListOfInputValues(v) - //get the base index of the key value and use that as the offset for the array initialization - val baseIndex = mappingContext.irNode().computeAdjustedOffsetForInput(k, v,mappingsToPerform) - //note this looks up node names from the input framework perspective, so these are not variable names present - //in the outputs yet - relevantInputs.forEachIndexed {index,inputName -> - ret.add(ArgDescriptor { - name = "$k:$index" - argType = ArgDescriptor.ArgType.INPUT_TENSOR - inputValue = mappingContext.tensorInputFromInputFrameworkName(inputName).toArgTensor() 
- argIndex = baseIndex + index - }) - } - } - - - return ret - } - - abstract fun createTensorProto(input: TENSOR_TYPE): TensorNamespace.TensorProto - - - override fun convertInputsReverse(toReverse: List): List { - for (argument in toReverse) { - require(argument.argType == ArgDescriptor.ArgType.INPUT_TENSOR) { "Type to reverse must be an input tensor." } - } - TODO("Not yet implemented") - } - - override fun inputArgumentMappings(): Map { - return mappingNamesToPerform - } - - override fun serialize(): MapperNamespace.MappingRule { - val builder = MapperNamespace.MappingRule.newBuilder() - builder.ruleName = name() - builder.functionName = name() - for ((k, v) in transformerArgs) { - val descriptor = opDescriptor!!.argDescriptorList.filter { input -> input.name == k }[0] - when (descriptor.argType) { - ArgDescriptor.ArgType.BOOL -> builder.addOutputBooleanName(k) - ArgDescriptor.ArgType.INT64 -> builder.addOutputIntName(k) - ArgDescriptor.ArgType.FLOAT -> builder.addOutputFloatName(k) - ArgDescriptor.ArgType.DOUBLE -> builder.addOutputDoubleName(k) - ArgDescriptor.ArgType.INPUT_TENSOR -> builder.addInputTensorName(k) - ArgDescriptor.ArgType.OUTPUT_TENSOR -> builder.addOutputTensorName(k) - } - - for (associatedInput in v) { - when (associatedInput.argType) { - ArgDescriptor.ArgType.STRING -> builder.addInputStringAttrName(associatedInput.name) - ArgDescriptor.ArgType.BOOL -> builder.addInputBooleanName(associatedInput.name) - ArgDescriptor.ArgType.FLOAT, ArgDescriptor.ArgType.DOUBLE -> builder.addInputFloatName(associatedInput.name) - ArgDescriptor.ArgType.INT32, ArgDescriptor.ArgType.INT64 -> builder.addInputIntName(associatedInput.name) - ArgDescriptor.ArgType.INPUT_TENSOR -> builder.addInputTensorName(associatedInput.name) - } - } - - - } - - - return builder.build() - } - -} - -abstract class ArgDescriptorConstant< - GRAPH_DEF : GeneratedMessageV3, - OP_DEF_TYPE : GeneratedMessageV3, - NODE_TYPE :
GeneratedMessageV3, - ATTR_DEF : GeneratedMessageV3, - ATTR_VALUE_TYPE : GeneratedMessageV3, - TENSOR_TYPE : GeneratedMessageV3, DATA_TYPE : ProtocolMessageEnum>( - mappingNamesToPerform: Map = emptyMap(), - transformerArgs: Map> -) : - BaseAttributeExtractionRule - ( - name = "argdescriptorconstant", - mappingNamesToPerform = mappingNamesToPerform, - transformerArgs = transformerArgs - ) { - - override fun acceptsInputType(argDescriptorType: AttributeValueType): Boolean { - return true - } - - override fun outputsType(argDescriptorType: List): Boolean { - return true - } - - override fun convertAttributes( - mappingCtx: MappingContext - ): List { - return transformerArgs.flatMap { - it.value.map { descriptor -> - ArgDescriptor { - name = descriptor.name - argIndex = lookupIndexForArgDescriptor( - argDescriptorName = descriptor.name, - opDescriptorName = mappingCtx.nd4jOpName(), - argDescriptorType = descriptor.argType - ) - argType = descriptor.argType - boolValue = descriptor.boolValue - floatValue = descriptor.floatValue - doubleValue = descriptor.doubleValue - int32Value = descriptor.int32Value - int64Value = descriptor.int64Value - stringValue = descriptor.stringValue - inputValue = descriptor.inputValue - outputValue = descriptor.outputValue - - } - } - } - } -} diff --git a/contrib/codegen-tools/codegen/src/main/kotlin/org/nd4j/codegen/ir/ImportGraph.kt b/contrib/codegen-tools/codegen/src/main/kotlin/org/nd4j/codegen/ir/ImportGraph.kt deleted file mode 100644 index 3279da649..000000000 --- a/contrib/codegen-tools/codegen/src/main/kotlin/org/nd4j/codegen/ir/ImportGraph.kt +++ /dev/null @@ -1,578 +0,0 @@ -package org.nd4j.codegen.ir - -import org.nd4j.autodiff.functions.DifferentialFunction -import org.nd4j.autodiff.samediff.SDVariable -import org.nd4j.autodiff.samediff.SameDiff -import org.nd4j.autodiff.samediff.VariableType -import org.nd4j.autodiff.samediff.internal.SameDiffOp -import org.nd4j.autodiff.samediff.internal.Variable -import 
org.nd4j.codegen.ir.registry.OpMappingRegistry -import org.nd4j.codegen.ir.registry.OpRegistryHolder -import org.nd4j.codegen.ir.tensorflow.isControlDep -import org.nd4j.codegen.ir.tensorflow.stripControl -import org.nd4j.codegen.ir.tensorflow.stripVarSuffix -import org.nd4j.common.base.Preconditions -import org.nd4j.imports.converters.DifferentialFunctionClassHolder -import org.nd4j.imports.graphmapper.OpImportFilter -import org.nd4j.ir.OpNamespace -import org.nd4j.linalg.api.buffer.DataType -import org.nd4j.linalg.api.ops.DynamicCustomOp -import org.nd4j.linalg.api.ops.impl.controlflow.compat.Merge -import org.nd4j.linalg.api.ops.impl.shape.Concat -import org.nd4j.shade.protobuf.GeneratedMessageV3 -import org.nd4j.shade.protobuf.ProtocolMessageEnum -import java.lang.IllegalArgumentException -import java.util.* -import kotlin.collections.ArrayList -import kotlin.collections.HashMap -import kotlin.collections.HashSet - - -class ImportGraph { - /** - * Import a Graph based on a {@link IRGraph} model from a GraphDef, with optional import overrides - * - * @param irGraph IRGraph reflecting the needed model import - * @param importOverride Optional import override for specific ops, keyed by op name - * @param opFilter Optional filter - ops to exclude/ignore - * @return Imported model - */ - fun importGraph(irGraph: IRGraph, - importOverride: Map?>?, - opFilter: OpImportFilter?, - dynamicVariables: Map = emptyMap(), - opMappingRegistry: OpMappingRegistry - ): SameDiff { - - /* - First, build an in-memory representation of the graph that allows us to build the graph incrementally - If we can build the graph incrementally, we can make sure that the added variables are set up with the correct - datatype and (once implemented) greedy shape inference - */ - val availableToAddSet = HashSet() //TODO maybe unnecessary? 
- val availableToAdd: Queue> = LinkedList() - val remainingNodes: MutableMap> = - HashMap() //All other nodes, not in availableToAdd - val nodeInputTo: MutableMap> = - HashMap() // For op x -> y, x is key, y is value. Note that these are OP names not VARIABLE names - val nNodes = irGraph.nodeList().size - val importInfo = irGraph.importInfoForEachNode(dynamicVariables = dynamicVariables) - //First, add any constants, placeholders, and zero-input ops - val sd = SameDiff.create() - irGraph.nodeList().forEach { node -> - val importInfoForNode = importInfo[node.nodeName()]!! - val numInputs = node.numInputs() - val nodeInputs = ArrayList() - val name = node.nodeName() - - for(inputIdx in 0 until numInputs) { - var inOpName = stripVarSuffix(stripControl(node.inputAt(inputIdx))) - nodeInputs.add(inOpName) - if (!nodeInputTo.containsKey(inOpName)) { - nodeInputTo[inOpName!!] = ArrayList() - } - - nodeInputTo[inOpName]!!.add(name) - } - - val inputs = importInfoForNode.second.argDescriptorList.filter { input -> input.argType == OpNamespace.ArgDescriptor.ArgType.INPUT_TENSOR } - if(numInputs < inputs.size) { - for(i in numInputs until inputs.size) { - val newName = name + "-" + inputs[i].name - nodeInputTo[newName!!] = ArrayList() - nodeInputTo[newName]!!.add(name) - sd.constant(newName, ndarrayFromNameSpaceTensor(inputs[i].inputValue)) - } - - } - - - } - - for (i in 0 until nNodes) { - val nd = irGraph.nodeList()[i] - - val op = nd.opName() - val numInputs = nd.numInputs() - val name = nd.nodeName() - Preconditions.checkState(name.isNotEmpty(), "Node name was empty!") - if (irGraph.isConstantOpName(op)|| numInputs == 0) { - availableToAdd.add(nd) - availableToAddSet.add(name) - } else { - remainingNodes[name] = nd - - for (inputIdx in 0 until numInputs) { - var inOpName = stripControl(nd.inputAt(inputIdx)) - if (!nodeInputTo.containsKey(inOpName)) { - nodeInputTo[inOpName!!] 
= ArrayList() - } - nodeInputTo[inOpName]!!.add(name) - } - - } - } - - - val mergeOpsPostProcess: MutableMap = HashMap() - val defaultRunner = - DefaultImportRunner() - //Go through ops in order, and add to the graph - val constControlDeps: MutableMap> = HashMap() //Key: constant name. Value: control dependencies - while (!availableToAdd.isEmpty()) { - val nd = availableToAdd.remove() - val name = nd.nodeName() - val opName = nd.opName() - val importInfoForNode = importInfo[name] - - availableToAddSet.remove(name) - println("Adding operation to graph: $opName (name=$name)") - var skipCase = false - val rawAttrMap = HashMap() - nd.attributeMap().forEach { (name, def) -> - rawAttrMap[name] = def.internalAttributeValue() - } - - - if (opFilter != null && opFilter.skipOp( - nd.internalValue(), - sd,rawAttrMap, irGraph.internalValue())) { - println("Skipping op $name of type $opName due to op filter") - //Don't continue at this point - we still need to process what this feeds into... - skipCase = true - } else { - if (importOverride == null || !importOverride.containsKey(name)) { - //Standard case - //note, ordering matters here for onnx - if (irGraph.nodeIsPlaceHolder(nd.nodeName())) { - var shape = irGraph.shapeOfInput(nd.nodeName()) - - - val dt = irGraph.dataTypeForVariable(nd.nodeName()).nd4jDataType() - if(shape != null) - sd.placeHolder(name, dt, *shape) - else - sd.placeHolder(name, dt) - } - else if (irGraph.isConstant(opName)) { - //Get array, create a constant - val tfTensor = nd.getAttribute("value").tensorValue() - val arr = tfTensor.toNd4jNDArray() - sd.constant(name, arr) - val inputCount = nd.numInputs() - if (inputCount > 0) { - //Very likely control dependency. 
i.e., "we must execute op X before the constant is really available to be used" - val l: MutableList = java.util.ArrayList(inputCount) - for (i in 0 until inputCount) { - val n = nd.inputAt(i) - check(isControlDep(n)) { "Found non-control dependency input \"$n\" for constant \"$name\"" } - val n2 = stripControl(n) - l.add(n2) - } - constControlDeps[name] = l - } - } else if(opName.equals("Variable") || opName.equals("VariableV2")) { - var shape = irGraph.shapeOfInput(nd.nodeName()) - - - val dt = irGraph.dataTypeForVariable(nd.nodeName()).nd4jDataType() - if(shape != null) - sd.`var`(name, dt, *shape) - else - sd.`var`(name, dt,-1) - } - else { - /* - Normal ops. Process in the following order: - 1. Create the op instance - 2. Add op to graph - 3. Import from TF (to set attributes) - 4. Calculate output dtypes - 5. Create and add output variables to graph - - Note: one constraint on this order is that some ops import modify the graph structure. - Notable example: concat op - it removes the axis op and converts the value to an iArg - https://github.com/eclipse/deeplearning4j/issues/8285 - */ - - val opMappingProcess = OpRegistryHolder.lookupOpMappingProcess< - GRAPH_TYPE, - NODE_TYPE, - OP_DEF_TYPE, - TENSOR_TYPE, - DATA_TYPE, - ATTR_DEF_TYPE, - ATTR_VALUE_TYPE>( - inputFrameworkOpName = opName, - inputFrameworkName = irGraph.frameworkName() - ) - - - val nd4jOpName = opMappingRegistry.lookupOpMappingProcess(opName).opName() - - val dfInstance = if( DifferentialFunctionClassHolder.getInstance() - .hasName(nd4jOpName)) DifferentialFunctionClassHolder.getInstance().getInstance(nd4jOpName) - else DynamicCustomOp.builder(nd4jOpName).build() - Preconditions.checkState(dfInstance != null, "Could not find class for TF Ops: %s", opName) - var df: DifferentialFunction - df = try { - dfInstance.javaClass.newInstance() - } catch (t: Throwable) { - //Should never happen because function was already created via no-arg constructor earlier - throw RuntimeException(t) - } - - 
df.sameDiff = sd - df.ownName = name - - //Process inputs - var controlDeps: MutableList? = null - val numInputs = nd.numInputs() - - /** - * Note that ndarrays actually need to be reordered here when input indices aren't equal to what's in the original framework. - * We should potentially run the import process sooner and compute the input name - * ordering from that instead. - */ - val opDefLookup = opMappingRegistry.lookupInputFrameworkOpDef(opName) - val mappingContext = irGraph.createMappingContext( - opDef = opDefLookup, - node = irGraph.nodeByName(name), - dynamicVariables = dynamicVariables - ) - - val tensorInputMappings = HashMap() - opMappingProcess.tensorMappingRules().forEach { tensorMappingRule -> - tensorInputMappings.putAll(tensorMappingRule.inputArgumentMappings()) - } - - - - val inNames: MutableList = java.util.ArrayList(numInputs) - - for (i in 0 until numInputs) { - //use input name if it exists and matches, otherwise if the input names do not map 1 to 1 for import - //use samediff to generate a unique name - val origInName = nd.inputAt(i) - var inName = stripControl(origInName) - if (inName.endsWith(":0")) { - //Strip ":0" suffix. 
Some ops can depend on placeholders, like "image_tensor:0" but in SameDiff this is a variable called "image_tensor" - inName = inName.substring(0, inName.length - 2) - } - val isControlDep = isControlDep(origInName) - if (isControlDep) { - if (controlDeps == null) controlDeps = java.util.ArrayList() - controlDeps.add(inName) - } - if (!isControlDep) { - inNames.add(inName) - } - - //Update Variable.inputsForOp for all variables that feed into this op - // Such variables must have already been created, given we process in order - //declare empty variable for anything that's an input > 0 - if(!sd.hasVariable(inName) && inName.contains(':')) { - val knownBaseName = stripVarSuffix(inName) - if(!sd.hasVariable(knownBaseName)) { - throw IllegalArgumentException("No variable name found for $inName") - } else { - val knownBaseVar = sd.getVariable(stripVarSuffix(inName)) - sd.`var`( - SDVariable( - inName, - VariableType.ARRAY, - sd, - knownBaseVar.shape, - knownBaseVar.dataType() - ) - ) - - } - } - val v = sd.variables[inName] - if (v == null && df is Merge) { - //Edge case for import - we allow merge ops to be added before both inputs are available - //This is to break the cycles in loops, otherwise we can't process anything in order - mergeOpsPostProcess[df.getOwnName()] = inName - continue - } - - if (!isControlDep && (v!!.inputsForOp == null || !v.inputsForOp.contains(name))) { - //May already be present - for example, add(x,x) - if (v.inputsForOp == null) v.inputsForOp = java.util.ArrayList() - v.inputsForOp.add(name) - } else if (isControlDep) { - if (v!!.controlDepsForOp == null) v.controlDepsForOp = java.util.ArrayList() - if (!v.controlDepsForOp.contains(name)) { - v.controlDepsForOp.add(name) - } - } - } - - - val inputs = importInfoForNode!!.second.argDescriptorList.filter { input -> input.argType == OpNamespace.ArgDescriptor.ArgType.INPUT_TENSOR } - if(numInputs < inputs.size) { - for(i in numInputs until inputs.size) { - val newName = name + "-" + 
inputs[i].name - val v = sd.variables[newName]!! - if (v.inputsForOp == null) v.inputsForOp = java.util.ArrayList() - v.inputsForOp.add(newName) - inNames.add(newName) - } - - - } - - val inputNames = nd.nd4jInputs(tensorInputMappings) - - - /** - * TODO: evaluate if pre/post processing is needed. - * May need to add new input names before and after each op. - * We could also modularize this part of the process in general. - */ - //Create SameDiffOp instance and add to graph - val op = SameDiffOp.builder() - .name(name) - .op(df) - .inputsToOp(inNames) //.outputsOfOp(outNames) //We'll set this later - .controlDeps(controlDeps) - .build() - sd.ops[name] = op - defaultRunner.initAttributes(df, irGraph.frameworkName(), mappingContext, sd,opName) - - - /** - * TODO: Figure out if post processing is needed. - * - */ - //Calculate datatypes for output variables (set/correct if necessary) - val newInNames = sd.ops[name]!!.inputsToOp //Just in case import has modified this, like for concat case - val newInDtypes: MutableList = - java.util.ArrayList(newInNames.size) - if (df is Merge) { - //Merge op: as noted elsewhere, we allow merge to be processed when only one of the inputs is available - // to break cycles for loops - //We know that the Merge op requires the same datatype for both inputs, so we'll use whichever one is available - val v1 = sd.getVariable(newInNames[0]) - val v2 = sd.getVariable(newInNames[1]) - val dt1 = if (v1 == null) v2!!.dataType() else v1.dataType() - val dt2 = if (v2 == null) v1!!.dataType() else v2.dataType() - newInDtypes.add(dt1) - newInDtypes.add(dt2) - } else if(df is Concat) { - //note we use the nd4j data types here so we only have input data types indexed by the actual - //output from nd4j. A common scenario during import is dimensions being converted to ints - //Dimensions are converted from inputs in the input framework to plain integers elsewhere. - //This lets the import process dictate the actual ordering of the data types.
- for (s in inputNames) { - val v = sd.getVariable(s) - newInDtypes.add(v.dataType()) - } - - op.inputsToOp = inputNames - } - else { - for (s in newInNames) { - val v = sd.getVariable(s) - newInDtypes.add(v.dataType()) - } - } - - //note we validate the op definition here to ensure that all ops have at least 1 output unless otherwise specified. - val outputDataTypes = df.calculateOutputDataTypes(newInDtypes) - val numOutputs = outputDataTypes.size - if(numOutputs < 1 && nd4jOpName != "noop") { - throw java.lang.IllegalStateException("Op $nd4jOpName does not have any outputs!") - } - - //println("Out dtypes size ${outDTypes.size} and numOutputs $numOutputs") - val outSDVars = arrayOfNulls(numOutputs) - val outVars = arrayOfNulls(numOutputs) - val outNames: MutableList = java.util.ArrayList(numOutputs) - - //Create output variables and add to graph - for (i in 0 until numOutputs) { - val dt = outputDataTypes[i] - val varName = name + if (i == 0) "" else ":$i" - //TODO: handle variadic type in kotlin - /** - * TODO: handle data type import - */ - outSDVars[i] = sd.`var`(varName, VariableType.ARRAY, null, dt) - outNames.add(varName) - outVars[i] = Variable.builder() - .name(varName) - .variable(outSDVars[i]) - .inputsForOp(null) //This is updated incrementally as other ops are added - .controlDepsForOp(null) //Control deps are handled later - .controlDepsForVar(null) - .outputOfOp(name) - .build() - sd.variables[varName] = outVars[i] - println("Added variable to graph: $varName (output of op $name)") - } - sd.ops[name]!!.outputsOfOp = outNames - println("Imported op: $opName (name=$name)") - } - } else { - - val opMappingProcess = OpRegistryHolder.lookupOpMappingProcess< - GRAPH_TYPE, - NODE_TYPE, - OP_DEF_TYPE, - TENSOR_TYPE, - DATA_TYPE, - ATTR_DEF_TYPE, - ATTR_VALUE_TYPE>(inputFrameworkOpName = opName, inputFrameworkName = irGraph.frameworkName()) - - - - val dfInstance = if( DifferentialFunctionClassHolder.getInstance() - .hasName(opName))
DifferentialFunctionClassHolder.getInstance().getInstance(opName) - else DynamicCustomOp.builder(opName).build() - Preconditions.checkState( - dfInstance != null, - "Could not find class for ${opMappingProcess.opName()}", - opName - ) - var df: DifferentialFunction - df = try { - dfInstance.javaClass.newInstance() - } catch (t: Throwable) { - //Should never happen because function was already created via no-arg constructor earlier - throw RuntimeException(t) - } - - df.sameDiff = sd - df.ownName = name - - val opDefLookup = opMappingRegistry.lookupInputFrameworkOpDef(opName) as OP_DEF_TYPE - val mappingContext = irGraph.createMappingContext( - opDef = opDefLookup, - node = irGraph.nodeByName(name), - dynamicVariables = dynamicVariables - ) - - //Import override case - val o = importOverride[name] - println("Importing op $opName using override $importOverride") - - //First, get inputs: - val inputs: MutableList = java.util.ArrayList() - var controlDeps: MutableList? = null - val nd4jOpName = opMappingRegistry.lookupOpMappingProcess(opName).opName() - val opDescriptor = opMappingRegistry.lookupNd4jOpDef(nd4jOpName) - val opInputs = opDescriptor.argDescriptorList.filter { argDescriptor -> argDescriptor.argType == OpNamespace.ArgDescriptor.ArgType.INPUT_TENSOR } - val numInputs = opInputs.size - - - for (i in 0 until numInputs) { - val inName = nodeInputTo[nd.nodeName()]!![i]!! - val controlDep = isControlDep(inName) - val v = sd.getVariable(name) - if (controlDep) { - if (controlDeps == null) controlDeps = java.util.ArrayList() - controlDeps.add(v) - } else { - inputs.add(v) - } - - o!!.initAttributes(df,irGraph.frameworkName(),mappingContext,sd,opName) - } - } - } - - - //Now that we have just added an op (or variable) - check what this feeds into, and see what we can now process - // as a result - if (nodeInputTo.containsKey(name)) { - val set: List? = nodeInputTo[name] - for (nextOp in set!!) 
{ - val nextOpDef = remainingNodes[nextOp] - if (nextOpDef == null) { - if (sd.ops.containsKey(nextOp)) { - //Already processed this. - //Almost certainly the close of a loop - like NextIteration -> Merge case - continue - } - throw IllegalStateException("Could not find op definition for op to import: $nextOp") - } - val nInNext = nextOpDef.numInputs() - var allAlreadyInGraph = true - var nonControlSeenCount = 0 - - for (i in 0 until nInNext) { - val s = nextOpDef.inputAt(i) - var inName = stripControl(stripVarSuffix((nextOpDef.inputAt(i)))) - if (inName.endsWith(":0")) { - //Strip ":0" suffix. Some ops can depend on placeholders, like "image_tensor:0" but in SameDiff this is a variable called "image_tensor" - inName = inName.substring(0, inName.length - 2) - } - -// log.info("Input: {}, {}", s, inName); - if (!sd.hasVariable(inName) && !skipCase) { -// log.info("Not found: {} for op {}", inName, nextOpDef.getName()); - allAlreadyInGraph = false - break - } else if (!isControlDep(s)) { - nonControlSeenCount++ - } - } - - //Merge ops are an edge case. We'll allow these to be executed with just ONE input, to break - // the cycle in loops.
In loops, generally we have (Enter, NextIteration) -> Merge, which - // of course can't be done if we strictly require all inputs to be available - val mergeCase = nonControlSeenCount > 0 && "Merge" == nextOpDef.opName() - if (allAlreadyInGraph || mergeCase) { - //Can process this op, add it to the queue for processing - if (!availableToAddSet.contains(nextOp)) { - //Avoid processing same op multiple times, for repeated inputs to one op, etc - availableToAdd.add(nextOpDef) - availableToAddSet.add(nextOp) - println("Added to processing queue: ${nextOpDef.opName()} (name=$nextOp)") - } - } - } - } - - //Finally, remove the just processed op from remainingNodes map: - remainingNodes.remove(name) - } - - //Post process the control dependencies, if any (done after because dependencies may not exist when imported) - for ((varName, cdOpNames) in constControlDeps) { - sd.variables[varName]!!.controlDeps = cdOpNames - for (s in cdOpNames) { - val sdo = sd.ops[s] - if (sdo!!.controlDepFor == null) sdo.controlDepFor = java.util.ArrayList() - val l = sdo.controlDepFor - if (!l.contains(s)) l.add(varName) - } - } - - //Post process the merge ops - all we are missing is a Variable.getInputsForOp().add(mergeOpName); - for ((key, value) in mergeOpsPostProcess) { - val v = sd.variables[value] - if (v!!.inputsForOp == null) v.inputsForOp = java.util.ArrayList() - v.inputsForOp.add(key) - } - Preconditions.checkState( - remainingNodes.isEmpty(), - "%s Unprocessed nodes: %s", - remainingNodes.size, - remainingNodes.keys - ) - return sd - } -} - diff --git a/contrib/codegen-tools/codegen/src/main/kotlin/org/nd4j/codegen/ir/SaveProcesesAndRules.kt b/contrib/codegen-tools/codegen/src/main/kotlin/org/nd4j/codegen/ir/SaveProcesesAndRules.kt deleted file mode 100644 index 2d79fbf53..000000000 --- a/contrib/codegen-tools/codegen/src/main/kotlin/org/nd4j/codegen/ir/SaveProcesesAndRules.kt +++ /dev/null @@ -1,20 +0,0 @@ -package org.nd4j.codegen.ir - -import 
org.nd4j.codegen.ir.onnx.OnnxOpDeclarations -import org.nd4j.codegen.ir.registry.OpMappingRegistry -import org.nd4j.codegen.ir.registry.OpRegistryHolder -import org.nd4j.codegen.ir.tensorflow.TensorflowOpDeclarations - -class SaveProcesesAndRules { - - companion object { - @JvmStatic fun main(args : Array) { - val tensorflowDeclarations = TensorflowOpDeclarations - val onnxDeclarations = OnnxOpDeclarations - OpRegistryHolder.tensorflow().saveProcessesAndRuleSet() - OpRegistryHolder.onnx().saveProcessesAndRuleSet() - } - } - - -} \ No newline at end of file diff --git a/contrib/codegen-tools/codegen/src/main/kotlin/org/nd4j/codegen/ir/onnx/OnnxIR.kt b/contrib/codegen-tools/codegen/src/main/kotlin/org/nd4j/codegen/ir/onnx/OnnxIR.kt deleted file mode 100644 index 363fea03e..000000000 --- a/contrib/codegen-tools/codegen/src/main/kotlin/org/nd4j/codegen/ir/onnx/OnnxIR.kt +++ /dev/null @@ -1,859 +0,0 @@ -package org.nd4j.codegen.ir.onnx - -import onnx.Onnx -import onnx.OnnxMl -import org.apache.commons.io.FileUtils -import org.nd4j.codegen.ir.* -import org.nd4j.codegen.ir.tensorflow.AttrValue -import org.nd4j.codegen.ir.tensorflow.TensorflowIRTensor -import org.nd4j.common.io.ClassPathResource -import org.nd4j.ir.OpNamespace -import org.nd4j.ir.TensorNamespace -import org.nd4j.linalg.api.buffer.DataType -import org.nd4j.linalg.api.ndarray.INDArray - -import kotlin.collections.HashMap -import org.nd4j.linalg.factory.Nd4j -import org.nd4j.onnxruntime.runner.OnnxRuntimeRunner -import org.nd4j.shade.protobuf.ByteString -import org.tensorflow.framework.TensorProto -import java.io.File -import java.lang.IllegalArgumentException -import java.nio.charset.Charset -import java.util.* -import kotlin.collections.ArrayList -import kotlin.math.min - -fun loadOnnxOps(): List { - val graphProto = Onnx.GraphProto.parseFrom(ClassPathResource("onnx-op-defs.pb").inputStream) - return graphProto.nodeList -} - -val onnxops = loadOnnxOps() - -class OnnxIRTensor(input: Onnx.TensorProto): 
IRTensor { - - val tensor = input - - - override fun shape(): List { - return tensor.dimsList - } - - override fun stride(): List { - return Nd4j.getStrides(shape().toTypedArray().toLongArray(),'c').asList() - } - - override fun dataType(): IRDataType { - return OnnxIRDataType(Onnx.TensorProto.DataType.values()[tensor.dataType.ordinal]) - } - - override fun toArgTensor(): TensorNamespace.TensorProto { - val builder = TensorNamespace.TensorProto.newBuilder() - .setDataLocation(TensorNamespace.TensorProto.DataLocation.DEFAULT) - - for(i in 0 until tensor.dimsCount) { - builder.addDims(tensor.getDims(i)) - } - - when(tensor.dataType) { - Onnx.TensorProto.DataType.UINT64 -> builder.dataType = TensorNamespace.DataType.UINT64.ordinal - Onnx.TensorProto.DataType.UINT32 -> builder.dataType = TensorNamespace.DataType.UINT32.ordinal - Onnx.TensorProto.DataType.UINT16 -> builder.dataType = TensorNamespace.DataType.UINT16.ordinal - Onnx.TensorProto.DataType.FLOAT16 -> builder.dataType = TensorNamespace.DataType.FLOAT16.ordinal - Onnx.TensorProto.DataType.STRING -> builder.dataType = TensorNamespace.DataType.STRING.ordinal - Onnx.TensorProto.DataType.FLOAT -> builder.dataType = TensorNamespace.DataType.FLOAT.ordinal - Onnx.TensorProto.DataType.DOUBLE -> builder.dataType = TensorNamespace.DataType.DOUBLE.ordinal - Onnx.TensorProto.DataType.BOOL -> builder.dataType = TensorNamespace.DataType.BOOL.ordinal - Onnx.TensorProto.DataType.INT64 -> builder.dataType = TensorNamespace.DataType.INT64.ordinal - Onnx.TensorProto.DataType.INT32 -> builder.dataType = TensorNamespace.DataType.INT32.ordinal - Onnx.TensorProto.DataType.INT16 -> builder.dataType = TensorNamespace.DataType.INT16.ordinal - Onnx.TensorProto.DataType.COMPLEX64 -> builder.dataType = TensorNamespace.DataType.COMPLEX64.ordinal - Onnx.TensorProto.DataType.COMPLEX128 -> builder.dataType = TensorNamespace.DataType.COMPLEX128.ordinal - Onnx.TensorProto.DataType.UNDEFINED,Onnx.TensorProto.DataType.UNRECOGNIZED -> 
TensorNamespace.DataType.UNRECOGNIZED.ordinal - - } - - - if(tensor.doubleDataList != null && tensor.doubleDataCount > 0) { - builder.addAllDoubleData(tensor.doubleDataList) - } - - if(tensor.stringDataList != null && tensor.stringDataCount > 0) { - builder.addAllStringData(tensor.stringDataList) - } - - if(tensor.floatDataList != null && tensor.floatDataCount > 0) { - builder.addAllFloatData(tensor.floatDataList) - } - - if(tensor.int32DataList != null && tensor.int32DataCount > 0) { - builder.addAllInt32Data(tensor.int32DataList) - } - - if(tensor.int64DataCount != null && tensor.int64DataCount > 0) { - builder.addAllInt64Data(tensor.int64DataList) - } - - if(tensor.uint64DataList != null && tensor.uint64DataCount > 0) { - builder.addAllInt64Data(tensor.uint64DataList) - } - - if(tensor.rawData != null) { - builder.rawData = tensor.rawData - } - - builder.dataType = tensor.dataType.ordinal - - return builder.build() - } - - override fun rawValue(): Onnx.TensorProto { - return tensor - } - - override fun toNd4jNDArray(): INDArray { - return ndarrayFromNameSpaceTensor(toArgTensor()) - } - - -} - -class OnnxIRDataType(inputDataType: Onnx.TensorProto.DataType): IRDataType { - val dataType = inputDataType - - override fun convertToDataType(input: Onnx.TensorProto.DataType): IRDataTypeValue { - when(input) { - Onnx.TensorProto.DataType.UINT64 -> return IRDataTypeValue.DT_UINT64 - Onnx.TensorProto.DataType.UINT32 -> return IRDataTypeValue.DT_UINT32 - Onnx.TensorProto.DataType.UINT16 -> return IRDataTypeValue.DT_UINT16 - Onnx.TensorProto.DataType.FLOAT16 -> return IRDataTypeValue.DT_HALF - Onnx.TensorProto.DataType.STRING -> return IRDataTypeValue.DT_STRING - Onnx.TensorProto.DataType.FLOAT -> return IRDataTypeValue.DT_FLOAT - Onnx.TensorProto.DataType.DOUBLE -> return IRDataTypeValue.DT_DOUBLE - Onnx.TensorProto.DataType.BOOL -> return IRDataTypeValue.DT_BOOL - Onnx.TensorProto.DataType.INT64 -> return IRDataTypeValue.DT_INT64 - Onnx.TensorProto.DataType.INT32 -> return 
IRDataTypeValue.DT_INT32 - Onnx.TensorProto.DataType.INT16 -> return IRDataTypeValue.DT_INT16 - Onnx.TensorProto.DataType.COMPLEX64 -> return IRDataTypeValue.DT_COMPLEX64 - Onnx.TensorProto.DataType.COMPLEX128 -> return IRDataTypeValue.DT_COMPLEX128 - Onnx.TensorProto.DataType.UNDEFINED,Onnx.TensorProto.DataType.UNRECOGNIZED -> TensorNamespace.DataType.UNRECOGNIZED.ordinal - - } - - return IRDataTypeValue.DT_INVALID - } - - override fun dataType(): IRDataTypeValue { - return convertToDataType(this.dataType) - } - - override fun internalValue(): Onnx.TensorProto.DataType { - return this.dataType - } - - override fun nd4jDataType(): DataType { - when(this.dataType) { - Onnx.TensorProto.DataType.UINT64 -> return return DataType.INT64 - Onnx.TensorProto.DataType.UINT32 -> return return DataType.INT32 - Onnx.TensorProto.DataType.UINT16 -> return return DataType.INT16 - Onnx.TensorProto.DataType.FLOAT16 -> return return DataType.FLOAT16 - Onnx.TensorProto.DataType.STRING -> return return DataType.UTF8 - Onnx.TensorProto.DataType.FLOAT -> return return DataType.FLOAT - Onnx.TensorProto.DataType.DOUBLE -> return return DataType.DOUBLE - Onnx.TensorProto.DataType.BOOL -> return return DataType.BOOL - Onnx.TensorProto.DataType.INT64 -> return return DataType.INT64 - Onnx.TensorProto.DataType.INT32 -> return return DataType.INT32 - Onnx.TensorProto.DataType.INT16 -> return return DataType.INT16 - - } - - return return DataType.UNKNOWN - - } - - override fun nameSpaceDataType(): TensorNamespace.DataType { - when(this.dataType) { - Onnx.TensorProto.DataType.UINT64 -> return return TensorNamespace.DataType.INT64 - Onnx.TensorProto.DataType.UINT32 -> return return TensorNamespace.DataType.INT32 - Onnx.TensorProto.DataType.UINT16 -> return return TensorNamespace.DataType.INT16 - Onnx.TensorProto.DataType.FLOAT16 -> return return TensorNamespace.DataType.FLOAT16 - Onnx.TensorProto.DataType.STRING -> return return TensorNamespace.DataType.STRING - Onnx.TensorProto.DataType.FLOAT -> 
return TensorNamespace.DataType.FLOAT - Onnx.TensorProto.DataType.DOUBLE -> return TensorNamespace.DataType.DOUBLE - Onnx.TensorProto.DataType.BOOL -> return return TensorNamespace.DataType.BOOL - Onnx.TensorProto.DataType.INT64 -> return return TensorNamespace.DataType.INT64 - Onnx.TensorProto.DataType.INT32 -> return return TensorNamespace.DataType.INT32 - Onnx.TensorProto.DataType.INT16 -> return return TensorNamespace.DataType.INT16 - - } - - return TensorNamespace.DataType.UNDEFINED - } - -} - -fun attrDefaultValue(): IRAttribute { - return OnnxIRAttr(Onnx.AttributeProto.getDefaultInstance(), Onnx.AttributeProto.getDefaultInstance()) -} - -class OnnxIRAttr(inputAttributeDef: Onnx.AttributeProto, inputAttributeValue: Onnx.AttributeProto): - IRAttribute { - - private val attributeDef = inputAttributeDef - private val attributeValue = inputAttributeValue - - override fun name(): String { - return attributeDef.name - } - - override fun floatValue(): Float { - return attributeValue.f - } - - override fun listFloatValue(): List { - return attributeValue.floatsList - } - - - override fun intValue(): Long { - return attributeValue.i - } - - override fun listIntValue(): List { - return attributeValue.intsList - } - - override fun boolValue(): Boolean { - return attributeValue.i > 0 - } - - override fun listBoolValue(): List { - TODO("Implement") - } - - override fun attributeValueType(): AttributeValueType { - when(attributeDef.type) { - Onnx.AttributeProto.AttributeType.STRING -> return AttributeValueType.STRING - Onnx.AttributeProto.AttributeType.STRINGS -> return AttributeValueType.LIST_STRING - Onnx.AttributeProto.AttributeType.INT-> return AttributeValueType.INT - Onnx.AttributeProto.AttributeType.INTS -> return AttributeValueType.LIST_INT - Onnx.AttributeProto.AttributeType.FLOAT -> return AttributeValueType.FLOAT - Onnx.AttributeProto.AttributeType.FLOATS -> return AttributeValueType.LIST_FLOAT - Onnx.AttributeProto.AttributeType.TENSOR -> return 
AttributeValueType.TENSOR - Onnx.AttributeProto.AttributeType.TENSORS -> return AttributeValueType.LIST_TENSOR - } - - return AttributeValueType.INVALID - } - - - - override fun internalAttributeDef(): Onnx.AttributeProto { - return attributeDef - } - - override fun internalAttributeValue(): Onnx.AttributeProto { - return attributeValue - } - - override fun listTensorValue(): List> { - return attributeValue.tensorsList.map { - input -> OnnxIRTensor(input) - } - } - - override fun tensorValue(): IRTensor { - return OnnxIRTensor(attributeValue.t) - } - - override fun stringValue(): String { - return attributeValue.s.toStringUtf8() - } - - override fun listStringValue(): List { - return attributeValue.stringsList.map { it.toStringUtf8() } - } - - override fun dataTataTypeValue(): IRDataType { - return OnnxIRDataType(Onnx.TensorProto.DataType.values()[attributeDef.t.dataType.ordinal]) - } - -} - -class OnnxIRArgDef(input: Onnx.NodeProto): IRArgDef { - private val argDefValue = input - - override fun dataType(): IRDataType { - return OnnxIRArgDef(argDefValue).dataType() - } - - override fun name(): String { - return argDefValue.name - } - - override fun description(): String { - return argDefValue.docString - } - - override fun internalValue(): Onnx.NodeProto { - return argDefValue - } - - override fun indexOf(): Integer { - TODO("Not yet implemented") - } - -} - -class OnnxIROp(input: Onnx.NodeProto): IROpDef { - - val opDef = input - - override fun attributes(): List> { - return opDef.attributeList.map { - OnnxIRAttr(it, Onnx.AttributeProto.getDefaultInstance()) - } - } - - override fun opName(): String { - return opDef.name - } - - override fun internalValue(): Onnx.NodeProto { - return opDef - } - - override fun inputArgs(): List> { - return opDef.inputList.map { - OnnxIRArgDef(opDef) - } - } - - override fun outputArgs(): List> { - return opDef.outputList.map { - OnnxIRArgDef(opDef) - } - } - -} - -class OnnxIRNode(inputNode: Onnx.NodeProto, inputOpDef: 
Onnx.NodeProto): IRNode { - - private val nodeDef = inputNode - private val opDef = inputOpDef - private val attrDefsMap = attrDefsByName(inputOpDef.attributeList) - private val attrMap: Map> = - initAttrMapFromNode(inputNode) - private val mappingProcess: MappingProcess - init { - mappingProcess = onnxOpRegistry.lookupOpMappingProcess(inputNode.opType) - } - - private fun attrDefsByName(input: List): Map { - val ret = HashMap() - input.forEach { - ret[it.name] = it - } - return ret - } - - private fun initAttrMapFromNode(input: Onnx.NodeProto): Map> { - val ret = HashMap>() - input.attributeList.forEach { - ret[it.name] = OnnxIRAttr(it,it) - - } - return ret - } - - override fun opName(): String { - return nodeDef.opType - } - - override fun nodeName(): String { - return nodeDef.name - } - - override fun inputAt(index: Int): String { - if(mappingProcess.indexOverrides().containsKey(index)) - return nodeDef.getInput(mappingProcess.indexOverrides()[index]!!) - return nodeDef.getInput(index) - } - - override fun outputAt(index: Int): String { - return opDef.getOutput(index) - } - - - - override fun hasAttribute(inputName: String): Boolean { - return nodeDef.attributeList.filter { it.name == inputName }.size > 0 - } - - override fun attributeMap(): Map> { - return attrMap - } - - override fun createInputsFrom(inputData: List): List> { - return inputData.map { OnnxIRTensor(it) } - } - - override fun createOutputsFrom(inputValues: List): List> { - return inputValues.map { OnnxIRTensor(it) } - } - - override fun getAttribute(inputName: String): IRAttribute { - return attrMap.getOrDefault(inputName, attrDefaultValue()) - } - - override fun internalValue(): Onnx.NodeProto { - return nodeDef - } - - override fun numInputs(): Int { - return nodeDef.inputCount - } - - override fun numOutputs(): Int { - return nodeDef.outputCount - } - - override fun inputs(): List { - return nodeDef.inputList - } - - override fun outputs(): List { - return nodeDef.outputList - } - - override 
fun numInputsForListOfTensors(name: String): Int { - return nodeDef.inputCount - } - - override fun inputNamesForListOfInputValues(inputListName: String): List { - return nodeDef.inputList - } - - override fun computeAdjustedOffsetForInput( - nd4jName: String, - inputFrameworkName: String, - tensorInputMappings: Map - ): Int { - //onnx doesn't have lists of values like this - return lookupIndexForArgDescriptor( - argDescriptorName = nd4jName, - opDescriptorName = this.opName(), - argDescriptorType = OpNamespace.ArgDescriptor.ArgType.INPUT_TENSOR - ) - } - - override fun nd4jInputs(tensorMappings: Map): List { - return nodeDef.inputList - } - -} - - -fun Onnx.GraphProto.nodeByName(name: String): Onnx.NodeProto { - return this.nodeList.first { it.name == name }!! -} - - -class OnnxIRGraphRunner(graphDef: OnnxIRGraph,inputNames: List,outputNames: List): IRGraphRunner< - Onnx.GraphProto, - Onnx.NodeProto, - Onnx.NodeProto, - Onnx.TensorProto,Onnx.AttributeProto,Onnx.AttributeProto,Onnx.TensorProto.DataType> { - val graphDef = graphDef - val inputNames = inputNames - val outputNames = outputNames - val graphRunner: OnnxRuntimeRunner - - init { - val uuid = UUID.randomUUID().toString() - val tempFile = File("tempFile-$uuid.proto") - - val modelProto = ModelProto { - OpSetImport(OperatorSetIdProto { - version = 12 - }) - - irVersion = 7 - graph = graph().internalValue() - } - - FileUtils.writeByteArrayToFile(tempFile,modelProto.toByteArray()) - graphRunner = OnnxRuntimeRunner.builder() - .modelUri(tempFile.absolutePath) - .inputs(inputNames) - .outputs(outputNames) - .build() - tempFile.deleteOnExit() - } - - override fun graph(): IRGraph { - return graphDef - } - - override fun run(inputs: Map): Map { - return graphRunner.exec(inputs) - } - -} - - -class OnnxIRGraph(graphDef: Onnx.GraphProto): IRGraph< - Onnx.GraphProto,Onnx.NodeProto, - Onnx.NodeProto,Onnx.TensorProto,Onnx.AttributeProto,Onnx.AttributeProto, - Onnx.TensorProto.DataType> { - - val graphDef = graphDef - 
val opList = graphDef.nodeList - var cachedNodeList = nodeList() - override fun nodeByName(input: String): Onnx.NodeProto { - return cachedNodeList.first { inputNode -> inputNode.nodeName() == input }.internalValue() - } - - override fun nodeList(): List> { - val ret2 = ArrayList>() - //add all inputs, outputs, initializers together as "nodes" similar to TF - val identityOp = onnxops.first { op -> op.name == "Constant" } - //for model import purposes, add identity ops as dummies similar to how tensorflow does placeholders/constants - graphDef.inputList.forEach { input -> - //note: this is not a real op name in onnx, this is purely for flagging for import to grab the node from the initializer - //add dummy values for placeholders - val nodeToAdd = NodeProto { - opType = "Constant" - name = input.name - Attribute(Onnx.AttributeProto.newBuilder().setName("value"). - addTensors(Onnx.TensorProto.getDefaultInstance()).build()) - } - - ret2.add(OnnxIRNode(nodeToAdd,identityOp)) - } - - graphDef.nodeList.forEach { - val opDefOrNull = onnxops.firstOrNull { opDef -> opDef.name == it.opType } - if(opDefOrNull == null) { - throw IllegalArgumentException("Op def name ${it.opType} not found!") - } - - ret2.add(OnnxIRNode(it,opDefOrNull!!)) - } - - //create dummy nodes by inferring which nodes have outputs - //setup identity nodes that reflect the output to automatically - //map index outputs to nodes that actually have outputs - val outputNames = graphDef.outputList.map { input -> input.name }.toSet() - val outputNodes = ArrayList() - graphDef.nodeList.forEach { nodeProto -> - val outputList = nodeProto.outputList.map { input -> input.toString() }.toSet() - val containsAny = outputNames.intersect(outputList) - if(containsAny.isNotEmpty()) { - outputNodes.add(nodeProto) - } - } - - outputNodes.forEach { nodeProto -> - nodeProto.outputList.forEachIndexed { index, outputName -> - val indexOfOutput = if(index < 1) "" else ":$index" - if(!ret2.map { node -> node.nodeName() 
}.contains(outputName)) { - val nodeToAdd = NodeProto { - opType = "Identity" - name = outputName - Input("${nodeProto.name}$indexOfOutput") - } - - ret2.add(OnnxIRNode(nodeToAdd, identityOp)) - } - } - - } - - - - graphDef.initializerList.forEach { initializer -> - //note: this is not a real op name in onnx, this is purely for flagging for import to grab the node from the initializer - val nodeToAdd = NodeProto { - opType = "Constant" - name = initializer.name - Attribute(Onnx.AttributeProto.newBuilder().setName("value"). - addTensors(Onnx.TensorProto.getDefaultInstance()).build()) - } - - ret2.add(OnnxIRNode(nodeToAdd,identityOp)) - } - - return ret2 - } - - - fun graphDef(): Onnx.GraphProto { - return graphDef - } - - override fun internalValue(): Onnx.GraphProto { - return graphDef - } - - - - override fun createMappingContext( - opDef: Onnx.NodeProto, - node: Onnx.NodeProto, - dynamicVariables: Map - ): MappingContext { - return OnnxMappingContext(opDef = opDef,node = node,graph = this,dynamicVariables = dynamicVariables) - } - - override fun frameworkName(): String { - return "onnx" - } - - override fun nd4jNameForInternalOpName(name: String): String { - return onnxOpRegistry.lookupOpMappingProcess(name).opName() - } - - override fun isConstantOpName(name: String): Boolean { - return name == "Constant" || name == "Placeholder" - } - - override fun isConstant(opName: String): Boolean { - return opName == "Constant" - } - - override fun isPlaceHolder(opName: String): Boolean { - return opName == "Placeholder" - } - - override fun shapeOfInput(varName: String): LongArray? 
{ - val firstOrNull = graphDef.initializerList.firstOrNull { inputNode -> inputNode.name == varName } - if(firstOrNull != null) - return firstOrNull.dimsList.toLongArray() - return null - } - - override fun dataTypeForVariable(varName: String): IRDataType { - val firstOrNull = graphDef.initializerList.firstOrNull { - inputNode -> inputNode.name == varName } - val input = graphDef.inputList.firstOrNull { input2 -> - input2.name == varName - } - if(firstOrNull != null) - return OnnxIRDataType(Onnx.TensorProto.DataType.values()[firstOrNull!!.dataType.ordinal]) - else if(input != null) - return OnnxIRDataType(input.type.tensorType.elemType) - else - return OnnxIRDataType(Onnx.TensorProto.DataType.UNDEFINED) - } - - override fun importInfoForEachNode(dynamicVariables: Map): Map, OpNamespace.OpDescriptor>> { - return importInfoForEachNodeInGraph(graph = this,dynamicVariables = dynamicVariables) - } - - override fun nodeIsPlaceHolder(nodeName: String): Boolean { - return graphDef.inputList.map { input -> input.name }.contains(nodeName) - } -} - - -class OnnxMappingContext(opDef: Onnx.NodeProto, node: Onnx.NodeProto, graph: -IRGraph< Onnx.GraphProto,Onnx.NodeProto, Onnx.NodeProto, Onnx.TensorProto, - Onnx.AttributeProto, - Onnx.AttributeProto, Onnx.TensorProto.DataType>,dynamicVariables: Map) : - AbstractMappingContext< Onnx.GraphProto,Onnx.NodeProto, Onnx.NodeProto, Onnx.TensorProto, - Onnx.AttributeProto, Onnx.AttributeProto, Onnx.TensorProto.DataType>(opDef, node, graph,dynamicVariables) { - - override fun attrDef(name: String): Onnx.AttributeProto { - val ret = opDef().attributeList.firstOrNull { it.name == name } - return ret!! 
- } - - override fun irAttributeValueForNode(valueName: String): IRAttribute { - val attrDef = attrDef(valueName) - var attrValue = node.attributeList.firstOrNull { it.name == valueName } - if(attrValue == null && attrDef.name == "value" && opDef.opType == "Constant") - //allow dummy values - attrValue = Onnx.AttributeProto.newBuilder().setName("value").addTensors(Onnx.TensorProto.getDefaultInstance()) - .build() - else if(attrValue == null) - throw IllegalArgumentException("Unable to resolve attribute for name $valueName for node ${nodeName()} for op type ${opName()}") - return OnnxIRAttr(inputAttributeDef = attrDef,inputAttributeValue = attrValue!!) - - } - - override fun tensorInputFor(name: String): IRTensor { - var foundIndex = -1 - opDef.inputList.forEachIndexed { - index,argDef -> if(argDef == name) foundIndex = index - } - - return tensorInputFromInputFrameworkName(name) - } - - override fun opName(): String { - return opDef.opType - } - - override fun nodeName(): String { - return opDef.name - } - - override fun nd4jDataTypeFor(input: IRTensor): DataType { - return input.dataType().nd4jDataType() - } - - override fun createIRTensorFromNDArray(ndarray: INDArray): IRTensor { - return OnnxIRTensor(convertToOnnxTensor(ndarray,"tensor")) - } - - override fun tensorAttributeFor(name: String): IRTensor { - return irAttributeValueForNode(name).tensorValue() - } - - override fun irNode(): IRNode { - return OnnxIRNode(node, onnxops.first { input -> input.name == node.opType }) - } - - override fun tensorInputFromInputFrameworkName(name: String): IRTensor { - val castedGraph = graph as OnnxIRGraph - val graphDef = castedGraph.graphDef() - var foundIndex = opDef.inputList.map { input -> input.toString() }.indexOf(name) - - - - if(foundIndex < 0) { - throw java.lang.IllegalArgumentException("Node with name ${nodeName()} for opdef with name ${opDef.name} did not contain a tensor with name ${name}") - } - - /** - * Use op definition name as 1 unified reference name in 
rules for static purposes, but - * look up via index for specific node mappings. - * - * This is equivalent to the tf input position attribute value in the previous tensorflow import. - */ - val graphNode = if(node.opType == "Constant") name else node.getInput(foundIndex) - val attemptedTensor = graphDef.initializerList.firstOrNull { it.name == graphNode } - - //no value to be found on placeholder, return default instance - //if no value exists it's an output from another node - if(attemptedTensor == null) { - println("Value for node $graphNode is not a constant! This method only works for constants. Consider replacing the Placeholder node with a Constant node. This will return an empty tensor.") - if(!dynamicVariables.containsKey(graphNode)) - return OnnxIRTensor(Onnx.TensorProto.getDefaultInstance()) - else { - val toConvert = dynamicVariables[graphNode]!! - return OnnxIRTensor(toConvert) - } - } - - //value nodes are the values of attributes that are input nodes in a frozen graph - if(attemptedTensor == null) { - throw IllegalArgumentException("Name $name not found in initializer list.") - } - return OnnxIRTensor(attemptedTensor!!) 
- } - -} - - - -fun onnxAttributeTypeFor(attributeName: String,opDef: Onnx.NodeProto): AttributeValueType { - if(isOnnxTensorName(attributeName,opDef)) - return AttributeValueType.TENSOR - return OnnxIRAttr(opDef.attributeList.first { - attributeProto -> attributeProto.name == attributeName }, - Onnx.AttributeProto.getDefaultInstance()).attributeValueType() -} - -fun isOnnxTensorName(name: String, opDef: Onnx.NodeProto): Boolean { - return opDef.inputList.contains(name) -} - - -fun isOnnxAttributeName(name: String, opDef: Onnx.NodeProto): Boolean { - return opDef.attributeList.map { attrDef -> attrDef.name }.contains(name) -} - - -fun convertToOnnxDataType(dataType: DataType): Onnx.TensorProto.DataType { - return when (dataType) { - DataType.UINT16 -> Onnx.TensorProto.DataType.UINT16 - DataType.UINT32 -> Onnx.TensorProto.DataType.UINT32 - DataType.UINT64 -> Onnx.TensorProto.DataType.UINT64 - DataType.BOOL -> Onnx.TensorProto.DataType.BOOL - DataType.FLOAT -> Onnx.TensorProto.DataType.FLOAT - DataType.INT -> Onnx.TensorProto.DataType.INT32 - DataType.LONG -> Onnx.TensorProto.DataType.INT64 - DataType.BYTE -> Onnx.TensorProto.DataType.INT8 - DataType.SHORT -> Onnx.TensorProto.DataType.INT16 - DataType.DOUBLE -> Onnx.TensorProto.DataType.DOUBLE - DataType.UBYTE -> Onnx.TensorProto.DataType.UINT8 - DataType.HALF -> Onnx.TensorProto.DataType.FLOAT16 - DataType.UTF8 -> Onnx.TensorProto.DataType.STRING - else -> throw UnsupportedOperationException("Unknown Onnx data type: [" + dataType.name + "]") - } -} - - -fun convertToOnnxTensor(inputArray: INDArray, name: String): Onnx.TensorProto { - val dtype = convertToOnnxDataType(inputArray.dataType()) - val newBuilder = Onnx.TensorProto.newBuilder() - newBuilder.dataType = dtype - newBuilder.addAllDims(inputArray.shape().toList()) - newBuilder.name = name - when(dtype) { - Onnx.TensorProto.DataType.STRING -> { - return OnnxTensorProto { - val stringList = ArrayList() - for (i in 0 until inputArray.length()) { - 
stringList.add(inputArray.getString(i)) - } - - newBuilder.addAllStringData(stringList.map { input -> ByteString.copyFrom(input.toByteArray(Charset.defaultCharset())) }) - } - } - - Onnx.TensorProto.DataType.DOUBLE -> { - newBuilder.addAllDoubleData(inputArray.data().asDouble().asList()) - } - - Onnx.TensorProto.DataType.FLOAT -> { - newBuilder.addAllFloatData(inputArray.data().asFloat().asList()) - } - - Onnx.TensorProto.DataType.INT32 -> { - newBuilder.addAllInt32Data(inputArray.data().asInt().asList()) - } - - Onnx.TensorProto.DataType.INT64 -> { - newBuilder.addAllInt64Data(inputArray.data().asLong().asList()) - } - - else -> { - newBuilder.rawData = ByteString.copyFrom(inputArray.data().asBytes()) - } - } - return newBuilder.build() - -} - - -fun attributeValueTypeForOnnxAttribute(attributeDef: Onnx.AttributeProto) : AttributeValueType { - when(attributeDef.type) { - Onnx.AttributeProto.AttributeType.STRING -> return AttributeValueType.STRING - Onnx.AttributeProto.AttributeType.STRINGS -> return AttributeValueType.LIST_STRING - Onnx.AttributeProto.AttributeType.INT-> return AttributeValueType.INT - Onnx.AttributeProto.AttributeType.INTS -> return AttributeValueType.LIST_INT - Onnx.AttributeProto.AttributeType.FLOAT -> return AttributeValueType.FLOAT - Onnx.AttributeProto.AttributeType.FLOATS -> return AttributeValueType.LIST_FLOAT - Onnx.AttributeProto.AttributeType.TENSOR -> return AttributeValueType.TENSOR - Onnx.AttributeProto.AttributeType.TENSORS -> return AttributeValueType.LIST_TENSOR - } - - return AttributeValueType.INVALID -} - - - diff --git a/contrib/codegen-tools/codegen/src/main/kotlin/org/nd4j/codegen/ir/onnx/OnnxMappingProcess.kt b/contrib/codegen-tools/codegen/src/main/kotlin/org/nd4j/codegen/ir/onnx/OnnxMappingProcess.kt deleted file mode 100644 index 4f4e3dcf8..000000000 --- a/contrib/codegen-tools/codegen/src/main/kotlin/org/nd4j/codegen/ir/onnx/OnnxMappingProcess.kt +++ /dev/null @@ -1,47 +0,0 @@ -package org.nd4j.codegen.ir.onnx - -import 
onnx.Onnx -import org.nd4j.codegen.ir.AbstractMappingProcess -import org.nd4j.codegen.ir.AttributeMappingRule -import org.nd4j.codegen.ir.AttributeValueType -import org.nd4j.codegen.ir.TensorMappingRule -import org.nd4j.codegen.ir.registry.OpMappingRegistry - -open class OnnxMappingProcess(inputFramework: String = "onnx", - frameworkVersion: String = "1.4", - inputFrameworkOpName: String, - opName: String, - opMappingRegistry: OpMappingRegistry, - tensorMappingRules: List> = emptyList(), - inputIndexOverrides: Map = emptyMap(), - attributeMappingRules: List> = emptyList()) - : AbstractMappingProcess( - inputFramework, - frameworkVersion, - inputFrameworkOpName, - inputIndexOverrides, - opName, - opMappingRegistry, - tensorMappingRules, - attributeMappingRules) { - override fun inputOpDefValueTypes(): Map { - val opDef = opMappingRegistry.lookupInputFrameworkOpDef(inputFrameworkOpName) - val ret = HashMap() - opDef.attributeList.forEach { attributeProto -> - ret[attributeProto.name] = attributeValueTypeForOnnxAttribute(attributeProto) - } - - return ret - } - -} - diff --git a/contrib/codegen-tools/codegen/src/main/kotlin/org/nd4j/codegen/ir/onnx/OnnxOpDeclarations.kt b/contrib/codegen-tools/codegen/src/main/kotlin/org/nd4j/codegen/ir/onnx/OnnxOpDeclarations.kt deleted file mode 100644 index 02d25b81e..000000000 --- a/contrib/codegen-tools/codegen/src/main/kotlin/org/nd4j/codegen/ir/onnx/OnnxOpDeclarations.kt +++ /dev/null @@ -1,996 +0,0 @@ -package org.nd4j.codegen.ir.onnx - -import onnx.Onnx -import org.nd4j.codegen.ir.ArgDescriptor -import org.nd4j.codegen.ir.AttributeMappingRule -import org.nd4j.codegen.ir.nd4jOpDescriptors -import org.nd4j.codegen.ir.registry.OpMappingRegistry -import org.nd4j.codegen.ir.registry.OpRegistryHolder -import org.nd4j.ir.OpNamespace - -val onnxOpRegistry = OpMappingRegistry("onnx") -val names = mapOf( - "Acos" to "acos", - "Acosh" to "acosh", - "Asin" to "asin", - "Asinh" to "asinh", - "Atan" to "atan", - "Atanh" to "atanh", - "Cos" 
to "cos", - "Cosh" to "cosh", - "Erf" to "erf", - "Exp" to "exp", - "Identity" to "identity", - "Log" to "log", - "Sign" to "sign", - "Sin" to "sin", - "Sinh" to "sinh", - "Softsign" to "softsign", - "Tan" to "tan", - "Tanh" to "tanh" - -) - -val pairWiseNames = mapOf( - "And" to "boolean_and") - -val equal = OnnxMappingProcess( - inputFrameworkOpName = "Equal", - opName = "equals", - tensorMappingRules = listOf(mappingNDArrayInputs(mutableMapOf("input" to "A","y" to "B"))), - attributeMappingRules = booleanConstant(inputName = "inPlace",constantValue = false,argumentIndex = 0), - opMappingRegistry = onnxOpRegistry) - - -val sub = OnnxMappingProcess( - inputFrameworkOpName = "Sub", - opName = "subtract", - tensorMappingRules = listOf(mappingNDArrayInputs(mutableMapOf("input" to "A","y" to "B"))), - attributeMappingRules = booleanConstant(inputName = "inPlace",constantValue = false,argumentIndex = 0), - opMappingRegistry = onnxOpRegistry) - -val mul = OnnxMappingProcess( - inputFrameworkOpName = "Mul", - opName = "multiply", - tensorMappingRules = listOf(mappingNDArrayInputs(mutableMapOf("input" to "A","y" to "B"))), - attributeMappingRules = booleanConstant(inputName = "inPlace",constantValue = false,argumentIndex = 0), - opMappingRegistry = onnxOpRegistry) - -val lessEqual = OnnxMappingProcess( - inputFrameworkOpName = "LessOrEqual", - opName = "less_equal", - tensorMappingRules = listOf(mappingNDArrayInputs(mutableMapOf("input" to "A","y" to "B"))), - attributeMappingRules = booleanConstant(inputName = "inPlace",constantValue = false,argumentIndex = 0), - opMappingRegistry = onnxOpRegistry) - - -val less = OnnxMappingProcess( - inputFrameworkOpName = "Less", - opName = "less", - tensorMappingRules = listOf(mappingNDArrayInputs(mutableMapOf("input" to "A","y" to "B"))), - attributeMappingRules = booleanConstant(inputName = "inPlace",constantValue = false,argumentIndex = 0), - opMappingRegistry = onnxOpRegistry) - -val greaterEqual = OnnxMappingProcess( - 
inputFrameworkOpName = "GreaterOrEqual", - opName = "greater_equal", - tensorMappingRules = listOf(mappingNDArrayInputs(mutableMapOf("input" to "A","y" to "B"))), - attributeMappingRules = booleanConstant(inputName = "inPlace",constantValue = false,argumentIndex = 0), - opMappingRegistry = onnxOpRegistry) - - -val greater = OnnxMappingProcess( - inputFrameworkOpName = "Greater", - opName = "greater", - tensorMappingRules = listOf(mappingNDArrayInputs(mutableMapOf("input" to "A","y" to "B"))), - attributeMappingRules = booleanConstant(inputName = "inPlace",constantValue = false,argumentIndex = 0), - opMappingRegistry = onnxOpRegistry) - -val divide = OnnxMappingProcess( - inputFrameworkOpName = "Div", - opName = "divide", - tensorMappingRules = listOf(mappingNDArrayInputs(mutableMapOf("input" to "A","y" to "B"))), - attributeMappingRules = booleanConstant(inputName = "inPlace",constantValue = false,argumentIndex = 0), - opMappingRegistry = onnxOpRegistry) - - -val add = OnnxMappingProcess( - inputFrameworkOpName = "Add", - opName = "add", - tensorMappingRules = listOf(mappingNDArrayInputs(mutableMapOf("input" to "A","y" to "B"))), - attributeMappingRules = booleanConstant(inputName = "inPlace",constantValue = false,argumentIndex = 0), - opMappingRegistry = onnxOpRegistry) -//Adagrad -//Adam - - -//unmapped: select_last_index -val argMax = OnnxMappingProcess( - opName = "argmax", - inputFrameworkOpName = "ArgMax", - tensorMappingRules = listOf(NDArrayMappingRule(mappingNamesToPerform = mutableMapOf("input" to "data"))), - attributeMappingRules = listOf( - invertBooleanNumber(mapOf("keepDims" to "keepdims")), - valueMappings(mutableMapOf("dimensions" to "axis"))), - opMappingRegistry = onnxOpRegistry -) - -//unmapped: select_last_index -val argMin = OnnxMappingProcess( - opName = "argmin", - inputFrameworkOpName = "ArgMin", - tensorMappingRules = listOf(NDArrayMappingRule(mappingNamesToPerform = mutableMapOf("input" to "data"))), - attributeMappingRules = listOf( - 
invertBooleanNumber(mapOf("keepDims" to "keepdims")), - valueMappings(mutableMapOf("dimensions" to "axis"))), - opMappingRegistry = onnxOpRegistry -) - - -//Note: weight formats are NCHW in ONNX -val avgPool = OnnxMappingProcess( - inputFrameworkOpName = "AveragePool", - opName = "avgpool2d", - opMappingRegistry = onnxOpRegistry, - tensorMappingRules = listOf(mappingNDArrayInputs(mutableMapOf("input" to "X"))), - attributeMappingRules = listOf( - argDescriptorConstant(argDescriptorConstants = listOf(ArgDescriptor { - name = "isNCHW" - int64Value = 1 - argIndex = 10 - })), - intConstant(inputName = "dH",constantValue = 0 as Integer,argumentIndex = 6)[0], - intConstant(inputName = "dW",constantValue = 0 as Integer,argumentIndex = 7)[0], - intConstant(inputName = "extraParam0",constantValue = 0 as Integer,argumentIndex = 9)[0], - stringContainsRule(outputAttribute = "isSameMode",inputFrameworkAttributeName = "auto_pad",valueToTest = "SAME",argumentIndex = 8), - listAttributeValueLookup(outputAttributeValue = "pH",inputAttributeValue = "pads",indexValue = 0,argumentIndex = 4), - listAttributeValueLookup(outputAttributeValue = "pW",inputAttributeValue = "pads",indexValue = 1,argumentIndex = 5), - listAttributeValueLookup(outputAttributeValue = "sH",inputAttributeValue = "strides",indexValue = 0,argumentIndex = 2), - listAttributeValueLookup(outputAttributeValue = "sW",inputAttributeValue = "strides",indexValue = 1,argumentIndex = 3), - listAttributeValueLookup(outputAttributeValue = "kW",inputAttributeValue = "kernel_shape",indexValue = 1,argumentIndex = 1), - listAttributeValueLookup(outputAttributeValue = "kH",inputAttributeValue = "kernel_shape",indexValue = 0,argumentIndex = 0))) - -val batchNorm = OnnxMappingProcess( - opName = "batchnorm", - opMappingRegistry = onnxOpRegistry, - inputFrameworkOpName = "BatchNormalization", - tensorMappingRules = listOf(mappingNDArrayInputs(mutableMapOf("input" to "X","mean" to "mean","variance" to "var","gamma" to "scale"))), - 
attributeMappingRules = listOf(valueMappings(mapOf("epsilon" to "epsilon")), - booleanConstant(inputName = "inPlace",constantValue = false,argumentIndex = 0)[0], - booleanConstant(inputName = "applyGamma",constantValue = true,argumentIndex = 1)[0], - booleanConstant(inputName = "applyBeta",constantValue = true,argumentIndex = 2)[0], - intConstant(inputName = "applyScale",constantValue = 1 as Integer,argumentIndex = 0)[0], - intConstant(inputName = "applyOffset",constantValue = 1 as Integer,argumentIndex = 1)[0] - )) -//TODO: Binarizer -//TODO: Bitshift -//TODO: Cast -//TODO: CastMap -//TODO: CategoryMapper -//TODO: Celu -//TODO: Clip -//TODO: Compress -val concat = OnnxMappingProcess( - opName = "concat", - inputFrameworkOpName = "Concat", - opMappingRegistry = onnxOpRegistry, - tensorMappingRules = listOf(mappingNDArrayInputs(mutableMapOf("input" to "inputs"))), - attributeMappingRules = listOf(valueMappings(mapOf("concatDimension" to "axis")), - booleanConstant(inputName = "isDynamicAxis",constantValue = false,argumentIndex = 0)[0]) - -) -//TODO: ConcatFromSequence -val constantFill = OnnxMappingProcess( - opName = "fill", - inputFrameworkOpName = "ConstantOfShape", - opMappingRegistry = onnxOpRegistry, - tensorMappingRules = listOf(mappingNDArrayInputs(mutableMapOf("shapeArray" to "input"))), - attributeMappingRules = listOf(ndarrayAttributeToScalarAttribute(outputAttributeValue = "value",inputAttributeValue = "value"), - intConstant(inputName = "outputDataType",constantValue = 0 as Integer,argumentIndex = 0)[0]) -) - -//TODO: ConvInteger -//TODO: ConvTranspose -val cumSum = OnnxMappingProcess( - opName = "cumsum", - inputFrameworkOpName = "CumSum", - opMappingRegistry = onnxOpRegistry, - tensorMappingRules = listOf(mappingNDArrayInputs(mutableMapOf("input" to "x"))), - attributeMappingRules = listOf(valueMappings(mapOf("exclusive" to "exclusive","reverse" to "reverse")), - ndarrayToIntList(ndarrayNameToAttributeName = mutableMapOf("dimensions" to "axis"))) -) - 
-val depthToSpace = OnnxMappingProcess( - opName = "depth_to_space", - inputFrameworkOpName = "DepthToSpace", - tensorMappingRules = listOf(mappingNDArrayInputs(mutableMapOf("input" to "input"))), - //note onnx is NCHW by default - attributeMappingRules = listOf(valueMappings(mapOf("block_size" to "blocksize")), - intConstant(inputName = "isNHWC",constantValue = 1 as Integer,argumentIndex = 1)[0]), - opMappingRegistry = onnxOpRegistry -) - -//TODO: DequantizeLinear -val determinant = OnnxMappingProcess( - opName = "matrix_determinant", - inputFrameworkOpName = "Det", - tensorMappingRules = listOf(mappingNDArrayInputs(mutableMapOf("input" to "X"))), - attributeMappingRules = booleanConstant(inputName = "inPlace",constantValue = false,argumentIndex = 0), - opMappingRegistry = onnxOpRegistry -) - - -//TODO: DictVectorizer -//Dropout: Note https://github.com/eclipse/deeplearning4j/issues/5650 -val dropout = OnnxMappingProcess( - opName = "dropout_inverted", - inputFrameworkOpName = "Dropout", - tensorMappingRules = listOf(mappingNDArrayInputs(mutableMapOf("input" to "data"))), - attributeMappingRules = listOf(convertNDArrayInputToScalarAttr(outputAttributeValue = "p" ,inputAttributeValue = "ratio")), - opMappingRegistry = onnxOpRegistry -) - - -val floor = OnnxMappingProcess( - opName = "floor", - inputFrameworkOpName = "Floor", - tensorMappingRules = listOf(mappingNDArrayInputs(mutableMapOf("input" to "X"))), - attributeMappingRules = booleanConstant(inputName = "inPlace",constantValue = false,argumentIndex = 0), - opMappingRegistry = onnxOpRegistry -) - -val round = OnnxMappingProcess( - opName = "round", - inputFrameworkOpName = "Round", - tensorMappingRules = listOf(mappingNDArrayInputs(mutableMapOf("input" to "X"))), - attributeMappingRules = booleanConstant(inputName = "inPlace",constantValue = false,argumentIndex = 0), - opMappingRegistry = onnxOpRegistry -) - -val mod = OnnxMappingProcess( - opName = "mod", - inputFrameworkOpName = "Mod", - tensorMappingRules = 
listOf(mappingNDArrayInputs(mutableMapOf("input" to "A","y" to "B"))), - attributeMappingRules = booleanConstant(inputName = "inPlace",constantValue = false,argumentIndex = 0), - opMappingRegistry = onnxOpRegistry -) - - -val sigmoid = OnnxMappingProcess( - opName = "sigmoid", - inputFrameworkOpName = "Sigmoid", - attributeMappingRules = booleanConstant(inputName = "inPlace",constantValue = false,argumentIndex = 0), - tensorMappingRules = listOf(mappingNDArrayInputs(mutableMapOf("input" to "X"))), - opMappingRegistry = onnxOpRegistry -) - - -val logSoftmax = OnnxMappingProcess( - opName = "log_softmax", - inputFrameworkOpName = "LogSoftmax", - tensorMappingRules = listOf(mappingNDArrayInputs(mutableMapOf("input" to "input"))), - attributeMappingRules = listOf(valueMappings(mutableMapOf("dimension" to "axis"))), - opMappingRegistry = onnxOpRegistry -) -val softmax = OnnxMappingProcess( - opName = "softmax", - inputFrameworkOpName = "Softmax", - tensorMappingRules = listOf(mappingNDArrayInputs(mutableMapOf("input" to "input"))), - attributeMappingRules = listOf(valueMappings(mutableMapOf("dimension" to "axis")), - booleanConstant(inputName = "inPlace",constantValue = false,argumentIndex = 0)[0]), - opMappingRegistry = onnxOpRegistry -) - - -//TODO: DynamicQuantizeLinear -//TODO: Einsum -//TODO: Expand -//TODO: EyeLike -//TODO: FeatureVectorizer -//TODO: Flatten -val gru = OnnxMappingProcess( - opMappingRegistry = onnxOpRegistry, - inputFrameworkOpName = "GRU", - opName = "gruCell", - tensorMappingRules = listOf(mappingNDArrayInputs(mutableMapOf( - "input" to "X", - "Wru" to "R", - "Wc" to "W", - "bc" to "B", - "hLast" to "initial_h", - //TODO: erroneous mappings - "bru" to "B"))), - attributeMappingRules = listOf() -) - -val gather = OnnxMappingProcess( - opMappingRegistry = onnxOpRegistry, - inputFrameworkOpName = "Gather", - opName = "gather", - tensorMappingRules = listOf(mappingNDArrayInputs(mutableMapOf("indices" to "indices","input" to "data"))), - 
attributeMappingRules = listOf(valueMappings(mapOf("dimensions" to "axis")), - booleanConstant(inputName = "inPlace",constantValue = false,argumentIndex = 0)[0]) -) -//TODO: GatherElements -val gatherNd = OnnxMappingProcess( - opMappingRegistry = onnxOpRegistry, - inputFrameworkOpName = "GatherND", - opName = "gather_nd", - attributeMappingRules = booleanConstant(inputName = "checkIndices",constantValue = true,argumentIndex = 0), - tensorMappingRules = listOf(mappingNDArrayInputs(mutableMapOf("indices" to "indices","input" to "data"))) -) - - -val gemm = OnnxMappingProcess( - opMappingRegistry = onnxOpRegistry, - inputFrameworkOpName = "Gemm", - opName = "matmul", - tensorMappingRules = listOf(mappingNDArrayInputs(mutableMapOf("input" to "A","y" to "B"))), - attributeMappingRules = listOf(valueMappings(mapOf("alpha" to "alpha","beta" to "beta")), - booleanConstant(inputName = "transposeZ",constantValue = false,argumentIndex = 2)[0], - invertBooleanNumber(mutableMapOf("transposeX" to "transA","transposeY" to "transB"))) -) -//TODO: GlobalAveragePool -//TODO: GlobalLpPool -//TODO: GlobalMaxPool -//TODO: Gradient -//TODO: GraphCall -val hardSigmoid = OnnxMappingProcess( - opName = "hard_sigmoid", - inputFrameworkOpName = "HardSigmoid", - opMappingRegistry = onnxOpRegistry, - attributeMappingRules = booleanConstant(inputName = "inPlace",constantValue = false,argumentIndex = 0), - tensorMappingRules = listOf(mappingNDArrayInputs(mutableMapOf("input" to "X"))) -) - - - -//TODO: map is-negative,is-positive -val isInf = OnnxMappingProcess( - opName = "isinf", - inputFrameworkOpName = "IsInf", - opMappingRegistry = onnxOpRegistry, - attributeMappingRules = booleanConstant(inputName = "inPlace", constantValue = false,argumentIndex = 0), - tensorMappingRules = listOf(mappingNDArrayInputs(mutableMapOf("input" to "X"))) -) - - - -val or = OnnxMappingProcess( - opName = "or", - inputFrameworkOpName = "Or", - opMappingRegistry = onnxOpRegistry, - attributeMappingRules = 
listOf(booleanConstant(inputName = "inPlace", constantValue = false,argumentIndex = 0)[0], - doubleConstant(inputName = "comparable", constantValue = 0.0,argumentIndex = 0)[0]), - tensorMappingRules = listOf(mappingNDArrayInputs((mutableMapOf("input" to "A","y" to "B")))) -) - -val xor = OnnxMappingProcess( - opName = "bitwise_xor", - inputFrameworkOpName = "Xor", - opMappingRegistry = onnxOpRegistry, - attributeMappingRules = listOf(booleanConstant(inputName = "inPlace", constantValue = false,argumentIndex = 0)[0]), - tensorMappingRules = listOf(mappingNDArrayInputs((mutableMapOf("input" to "A","y" to "B")))) -) - - - -//TODO: Hardmax -//TODO: If -//TODO: Imputer -//TODO: InstanceNormalization -val lrn = OnnxMappingProcess( - opName = "lrn", - inputFrameworkOpName = "LRN", - opMappingRegistry = onnxOpRegistry, - tensorMappingRules = listOf(mappingNDArrayInputs(mutableMapOf("input" to "X"))), - attributeMappingRules = listOf(valueMappings(mapOf("alpha" to "alpha","beta" to "beta","bias" to "bias","depth" to "size")), - booleanConstant(inputName = "inPlace",constantValue = false,argumentIndex = 0)[0]) - -) - -//0=tanh, 1=relu, 2=sigmoid, 3=affine, 4=leaky relu, 5= thresholded relu, 6=scaled tanh, 7=hard sigmoid, 8=ELU, 9=softsign, 10=softplus - -val lstmActivationMap = mapOf( - "Relu" to 1, - "Tanh" to 0, - "Sigmoid" to 2, - "Affine" to 3, - "LeakyRelu" to 4, - "ThresholdedRelu" to 5, - "ScaledTanh" to 6, - "HardSigmoid" to 7, - "Elu" to 8, - "Softsign" to 9, - "Softplus" to 10 -) - -val lstm = OnnxMappingProcess( - opMappingRegistry = onnxOpRegistry, - inputFrameworkOpName = "LSTM", - opName = "lstmLayer", - tensorMappingRules = listOf(mappingNDArrayInputs(mutableMapOf( - "input" to "X", - "Wx" to "W", - "Wr" to "R", - "Wp" to "P", - "b" to "B", - "seqLen" to "sequence_lens", - "hI" to "initial_h", - "cI" to "initial_c"))), - attributeMappingRules = listOf(valueMappings(mapOf("cellClip" to "clip")), - stringToIndex(outputAttributeValue = "directionMode", - 
inputAttributeValue = "direction", - listOfValues = listOf("forward","reverse","bidirectional"),argumentIndex = 1), - intConstant(inputName = "dataFormat",constantValue = 0 as Integer,argumentIndex = 0)[0], - booleanConstant(inputName = "hasBiases",constantValue = true,argumentIndex = 0)[0], - booleanConstant(inputName = "hasSeqLen",constantValue = true,argumentIndex = 1)[0], - booleanConstant(inputName = "hasInitH",constantValue = true,argumentIndex = 2)[0], - booleanConstant(inputName = "hasInitC",constantValue = true,argumentIndex = 3)[0], - booleanConstant(inputName = "hasPH",constantValue = true,argumentIndex = 4)[0], - booleanConstant(inputName = "retFullSeq",constantValue = true,argumentIndex = 5)[0], - booleanConstant(inputName = "retLastH",constantValue = true,argumentIndex = 6)[0], - booleanConstant(inputName = "retLastC",constantValue = true,argumentIndex = 7)[0], - listAttributeValueLookup(outputAttributeValue = "gateAlpha",inputAttributeValue = "activation_alpha",indexValue = 0,argumentIndex = 1), - listAttributeValueLookup(outputAttributeValue = "cellAlpha",inputAttributeValue = "activation_alpha",indexValue = 1,argumentIndex = 3), - listAttributeValueLookup(outputAttributeValue = "outAlpha",inputAttributeValue = "activation_alpha",indexValue = 2,argumentIndex = 5), - listAttributeValueLookup(outputAttributeValue = "gateBeta",inputAttributeValue = "activation_beta",indexValue = 0,argumentIndex = 2), - listAttributeValueLookup(outputAttributeValue = "cellBeta",inputAttributeValue = "activation_beta",indexValue = 1,argumentIndex = 4), - listAttributeValueLookup(outputAttributeValue = "outBeta",inputAttributeValue = "activation_beta",indexValue = 2,argumentIndex = 6), - mapStringToInt(outputAttributeValue = "gateAct",inputAttributeValue = "activations",argumentIndex = 2,mapOfValuesToInts = lstmActivationMap,lookupIndex = 0), - mapStringToInt(outputAttributeValue = "cellAct",inputAttributeValue = "activations",argumentIndex = 3,mapOfValuesToInts 
=lstmActivationMap,lookupIndex = 1), - mapStringToInt(outputAttributeValue = "outAct",inputAttributeValue = "activations",argumentIndex = 4,mapOfValuesToInts = lstmActivationMap,lookupIndex = 2)) -) -//TODO: LabelEncoder -val leakyRelu = OnnxMappingProcess( - inputFrameworkOpName = "LeakyRelu", - opName = "leakyrelu", - tensorMappingRules = listOf(mappingNDArrayInputs(mutableMapOf("input" to "X"))), - attributeMappingRules = listOf(valueMappings(mapOf("alpha" to "alpha"))), - opMappingRegistry = onnxOpRegistry -) -//TODO: LinearClassifier -//TODO: LinearRegressor -//TODO: Loop -//TODO: LpNormalization -//TODO: LpPool -val matMul = OnnxMappingProcess( - opMappingRegistry = onnxOpRegistry, - inputFrameworkOpName = "MatMul", - opName = "matmul", - attributeMappingRules = listOf(booleanConstant(inputName = "transposeX",constantValue = false,argumentIndex = 0)[0], - booleanConstant(inputName = "transposeY",constantValue = false,argumentIndex = 1)[0], - booleanConstant(inputName = "transposeZ",constantValue = false,argumentIndex = 2)[0], - doubleConstant(inputName = "alpha",constantValue = 0.0,argumentIndex = 0)[0], - doubleConstant(inputName = "beta",constantValue = 1.0,argumentIndex = 1)[0]), - tensorMappingRules = listOf(mappingNDArrayInputs(mutableMapOf("input" to "A","y" to "B"))) -) - - -//TODO: MatMulInteger -//TODO: Max -val maxPool = OnnxMappingProcess( - inputFrameworkOpName = "MaxPool", - opName = "maxpool2d", - opMappingRegistry = onnxOpRegistry, - tensorMappingRules = listOf(mappingNDArrayInputs(mutableMapOf("input" to "X"))), - attributeMappingRules = listOf( - argDescriptorConstant(argDescriptorConstants = listOf(ArgDescriptor { - name = "isNCHW" - int64Value = 1 - argIndex = 10 - argType = OpNamespace.ArgDescriptor.ArgType.INT64 - })), - intConstant(inputName = "extraParam0",argumentIndex = 9,constantValue = 0 as Integer)[0], - stringContainsRule(outputAttribute = "isSameMode",inputFrameworkAttributeName = "auto_pad",valueToTest = "SAME",argumentIndex = 
8), - listAttributeValueLookup(outputAttributeValue = "dH",inputAttributeValue = "dilations",indexValue = 0,argumentIndex = 6), - listAttributeValueLookup(outputAttributeValue = "dW",inputAttributeValue = "dilations",indexValue = 1,argumentIndex = 7), - listAttributeValueLookup(outputAttributeValue = "pH",inputAttributeValue = "pads",indexValue = 0,argumentIndex = 4), - listAttributeValueLookup(outputAttributeValue = "pW",inputAttributeValue = "pads",indexValue = 1,argumentIndex = 5), - listAttributeValueLookup(outputAttributeValue = "sH",inputAttributeValue = "strides",indexValue = 0,argumentIndex = 2), - listAttributeValueLookup(outputAttributeValue = "sW",inputAttributeValue = "strides",indexValue = 1,argumentIndex = 3), - listAttributeValueLookup(outputAttributeValue = "kH",inputAttributeValue = "kernel_shape",indexValue = 0,argumentIndex = 0), - listAttributeValueLookup(outputAttributeValue = "kW",inputAttributeValue = "kernel_shape",indexValue = 1,argumentIndex = 1))) - - -//TODO: MaxRoiPool -//TODO: MaxUnpool -//TODO: name: "MeanVarianceNormalization" -//todo: Momentum -//TODO: Multinomial -//TODO: NegativeLogLikelihoodLoss -val nonMaxSuppression = OnnxMappingProcess( - inputFrameworkOpName = "NonMaxSuppression", - opName = "non_max_suppression_v3", - opMappingRegistry = onnxOpRegistry, - attributeMappingRules = listOf(ndarrayToIntList(mutableMapOf("maxOutputSize" to "max_output_boxes_per_class"))), - tensorMappingRules = listOf(mappingNDArrayInputs(mutableMapOf( - "boxes" to "boxes", - "scales" to "scores", - "maxOutSize" to "max_output_boxes_per_class", - "iouThreshold" to "iou_threshold", - "scoreThreshold" to "score_threshold"))) -) -//TODO: NonZero PRIORITIZE -//TODO: Normalizer -//TODO: OneHot -//TODO: OneHotEncoder -//TODO: look at broadcasting rules between slope input -val pRelu = OnnxMappingProcess( - inputFrameworkOpName = "PRelu", - opName = "prelu", - //TODO: verify default value - attributeMappingRules = listOf(argDescriptorConstant(listOf( - 
ArgDescriptor { - name = "sharedAxes" - argIndex = 0 - int64Value = -1 - argType = OpNamespace.ArgDescriptor.ArgType.INT64 - } - ))), - tensorMappingRules = listOf(mappingNDArrayInputs(mutableMapOf("input" to "X","alpha" to "slope"))), - opMappingRegistry = onnxOpRegistry -) - -val pad = OnnxMappingProcess( - inputFrameworkOpName = "Pad", - opMappingRegistry = onnxOpRegistry, - opName = "pad", - tensorMappingRules = listOf(mappingNDArrayInputs(mutableMapOf("input" to "data","paddings" to "pads"))), - attributeMappingRules = listOf( - stringToIndex(outputAttributeValue = "mode",inputAttributeValue = "mode",listOfValues = listOf("constant","reflect","edge"),argumentIndex = 0), - doubleConstant(inputName = "padValue",constantValue = 0.0,argumentIndex = 0)[0]) -) - -//TODO: QLinearConv -//TODO: QLinearMatMul -//TODO: QuantizeLinear -//TODO: RNN PRIORITIZE -val randomNormal = OnnxMappingProcess( - inputFrameworkOpName = "RandomNormal", - opName = "random_normal", - opMappingRegistry = onnxOpRegistry, - attributeMappingRules = listOf(listNumberToNDarray(outputAttributeValue = "input",inputAttributeValue = "shape")) -) - - -//TODO: RandomNormalLike -//TODO: Note that the attributes for random uniform are wrong and had to be discovered through other means. -//The combination of a lack of a Java class + the C++ calling out to other functions which had the actual parameter -//names prevented resolution of the real parameter names. May have to look into values that are passed inline into functions and look up -//parameter names that way. 
- -val randomUniform = OnnxMappingProcess( - inputFrameworkOpName = "RandomUniform", - opName = "randomuniform", - opMappingRegistry = onnxOpRegistry, - attributeMappingRules = listOf(valueMappings(mapOf("min" to "low","max" to "high")), - listNumberToNDarray(outputAttributeValue = "shape",inputAttributeValue = "shape")) -) - -//TODO: RandomUniformLike -val range = OnnxMappingProcess( - inputFrameworkOpName = "Range", - opName = "range", - opMappingRegistry = onnxOpRegistry, - tensorMappingRules = listOf(mappingNDArrayInputs(mutableMapOf("from" to "start","to" to "limit","step" to "delta"))), - attributeMappingRules = listOf( - convertNDArrayInputToScalarAttr(outputAttributeValue = "from",inputAttributeValue = "start"), - convertNDArrayInputToScalarAttr(outputAttributeValue = "to",inputAttributeValue = "limit"), - convertNDArrayInputToScalarAttr(outputAttributeValue = "step",inputAttributeValue = "delta")) -) - -val neg = OnnxMappingProcess( - opName = "neg", - inputFrameworkOpName = "Neg", - opMappingRegistry = onnxOpRegistry, - attributeMappingRules = booleanConstant(inputName = "inPlace",constantValue = false,argumentIndex = 0), - tensorMappingRules = listOf(mappingNDArrayInputs(mutableMapOf("input" to "X"))) -) - - -val norm1 = OnnxMappingProcess( - inputFrameworkOpName = "ReduceL1", - opMappingRegistry = onnxOpRegistry, - opName = "reduce_norm1", - tensorMappingRules = listOf(mappingNDArrayInputs(mutableMapOf("input" to "data"))), - attributeMappingRules = listOf(invertBooleanNumber(mapOf("keepDims" to "keepdims")), - listNumberToListNumber(outputAttributeValue = "dimensions",inputAttributeValue = "axes")) - -) - -val norm2 = OnnxMappingProcess( - inputFrameworkOpName = "ReduceL2", - opMappingRegistry = onnxOpRegistry, - opName = "reduce_norm2", - tensorMappingRules = listOf(mappingNDArrayInputs(mutableMapOf("input" to "data"))), - attributeMappingRules = listOf( - invertBooleanNumber(mapOf("keepDims" to "keepdims")), - 
listNumberToListNumber(outputAttributeValue = "dimensions",inputAttributeValue = "axes")) -) - -//TODO: ReduceLogSum -val reduceLogSumExp = OnnxMappingProcess( - inputFrameworkOpName = "ReduceLogSumExp", - opName = "reduce_logsumexp", - tensorMappingRules = listOf(mappingNDArrayInputs(mutableMapOf("input" to "data"))), - attributeMappingRules = listOf( - invertBooleanNumber(mutableMapOf("keepDims" to "keepdims")), - valueMappings(mutableMapOf("keepDims" to "keepdims")), - listNumberToListNumber(outputAttributeValue = "dimensions",inputAttributeValue = "axes")), - opMappingRegistry = onnxOpRegistry -) -val reduceMax = OnnxMappingProcess( - inputFrameworkOpName = "ReduceMax", - opName = "reduce_max", - tensorMappingRules = listOf(mappingNDArrayInputs(mutableMapOf("input" to "data"))), - attributeMappingRules = listOf( - invertBooleanNumber(mapOf("keepDims" to "keepdims")), - listNumberToListNumber(outputAttributeValue = "dimensions",inputAttributeValue = "axes")), - opMappingRegistry = onnxOpRegistry -) -val reduceMean = OnnxMappingProcess( - inputFrameworkOpName = "ReduceMean", - opName = "reduce_mean", - tensorMappingRules = listOf(mappingNDArrayInputs(mutableMapOf("input" to "data"))), - attributeMappingRules = listOf( - invertBooleanNumber(mapOf("keepDims" to "keepdims")), - listNumberToListNumber(outputAttributeValue = "dimensions",inputAttributeValue = "axes")), - opMappingRegistry = onnxOpRegistry -) -val reduceMin = OnnxMappingProcess( - inputFrameworkOpName = "ReduceMin", - opName = "reduce_min", - tensorMappingRules = listOf(mappingNDArrayInputs(mutableMapOf("input" to "data"))), - attributeMappingRules = listOf( - invertBooleanNumber(mapOf("keepDims" to "keepdims")), - listNumberToListNumber(outputAttributeValue = "dimensions",inputAttributeValue = "axes")), - opMappingRegistry = onnxOpRegistry -) -val reduceProd = OnnxMappingProcess( - inputFrameworkOpName = "ReduceProd", - opName = "reduce_prod", - tensorMappingRules = 
listOf(mappingNDArrayInputs(mutableMapOf("input" to "data"))), - attributeMappingRules = listOf(invertBooleanNumber(mapOf("keepDims" to "keepdims")), - listNumberToListNumber(outputAttributeValue = "dimensions",inputAttributeValue = "axes")), - opMappingRegistry = onnxOpRegistry -) - -val reduceSum = OnnxMappingProcess( - inputFrameworkOpName = "ReduceSum", - opName = "reduce_sum", - tensorMappingRules = listOf(mappingNDArrayInputs(mutableMapOf("input" to "data"))), - attributeMappingRules = listOf(invertBooleanNumber(mapOf("keepDims" to "keepdims")), - listNumberToListNumber(outputAttributeValue = "dimensions",inputAttributeValue = "axes")), - opMappingRegistry = onnxOpRegistry -) -//TODO: ReduceSumSquare -//TODO: Resize PRIORITIZE -//TODO: ReverseSequence -//TODO: RoiAlign -//TODO: SVMClassifier -//TODO: SVMRegressor -//TODO: Scaler -//TODO: Scan -val scatter = OnnxMappingProcess( - opMappingRegistry = onnxOpRegistry, - inputFrameworkOpName = "ScatterElements", - opName = "scatter_update", - attributeMappingRules = listOf( - valueMappings(mutableMapOf("dimension" to "axis")), - ndarrayToIntList(ndarrayNameToAttributeName = mutableMapOf("indices" to "indices"))), - tensorMappingRules = listOf(mappingNDArrayInputs(mutableMapOf("operand" to "data","updates" to "updates"))) -) - -/* -val scatterNd = OnnxMappingProcess( - opName = "scatter_nd_update", - inputFrameworkOpName = "ScatterNd", - tensorMappingRules = listOf(mappingNDArrayInputs(mutableMapOf("input" to "data","indices" to "indices","updates" to "updates"))), - opMappingRegistry = onnxOpRegistry -) -*/ - -//TODO: SequenceAt -//TODO: SequenceConstruct -//TODO: SequenceErase -//TODO: SequenceInsert -//TODO: SequenceLength -val shape = OnnxMappingProcess( - opName = "shape_of", - inputFrameworkOpName = "Shape", - opMappingRegistry = onnxOpRegistry, - attributeMappingRules = booleanConstant(inputName = "inPlace",constantValue = false,argumentIndex = 0), - tensorMappingRules = 
listOf(mappingNDArrayInputs((mutableMapOf("input" to "data")))) -) -//TODO: Shrink - -val not = OnnxMappingProcess( - opName = "not", - inputFrameworkOpName = "Not", - opMappingRegistry = onnxOpRegistry, - attributeMappingRules = doubleConstant(inputName = "comparable",constantValue = 0.0,argumentIndex = 0), - tensorMappingRules = listOf(mappingNDArrayInputs((mutableMapOf("input" to "X")))) -) - - -val pow = OnnxMappingProcess( - opName = "pow_pairwise", - inputFrameworkOpName = "Pow", - opMappingRegistry = onnxOpRegistry, - attributeMappingRules = listOf( - booleanConstant(inputName = "inPlace",constantValue = false,argumentIndex = 0)[0]), - tensorMappingRules = listOf(mappingNDArrayInputs((mutableMapOf("input" to "X","y" to "Y")))) -) - -val size = OnnxMappingProcess( - opName = "size", - inputFrameworkOpName = "Size", - opMappingRegistry = onnxOpRegistry, - tensorMappingRules = listOf(mappingNDArrayInputs((mutableMapOf("input" to "data")))) -) - -//TODO: map axes -//TODO: slice and strided slice work too differently,revisit one -/*val slice = OnnxMappingProcess( - opMappingRegistry = onnxOpRegistry, - inputFrameworkOpName = "Slice", - opName = "strided_slice", - tensorMappingRules = listOf(mappingNDArrayInputs(mutableMapOf("input" to "data"))), - attributeMappingRules = listOf(ndarrayToIntList(mutableMapOf("v_begin" to "starts","v_end" to "ends","v_stride" to "steps", - //TODO: note these mappings are erroneous, we need better default values here for equivalent functionality in onnx - "begin_mask" to "begin","end_mask" to "end"))) -)*/ - - -//TODO: SoftmaxCrossEntropyLoss -val spaceToDepth = OnnxMappingProcess( - opName = "space_to_depth", - inputFrameworkOpName = "SpaceToDepth", - tensorMappingRules = listOf(mappingNDArrayInputs(mutableMapOf("input" to "input"))), - attributeMappingRules = listOf(valueMappings(mapOf("block_size" to "blocksize")), - argDescriptorConstant(listOf(ArgDescriptor { - name = "isNHWC" - int64Value = 1 - argIndex = 1 - argType = 
OpNamespace.ArgDescriptor.ArgType.INT64 - - }))), - opMappingRegistry = onnxOpRegistry -) - -//TODO: don't know a good default value for num_splits, look at TF and implementation in libnd4j to figure out best value -val split = OnnxMappingProcess( - opName = "split", - inputFrameworkOpName = "Split", - opMappingRegistry = onnxOpRegistry, - tensorMappingRules = listOf(mappingNDArrayInputs(mutableMapOf("a" to "input"))), - attributeMappingRules = listOf(valueMappings(mapOf("dimensions" to "axis")), - intConstant(inputName = "num_splits",constantValue = 0 as Integer,argumentIndex = 0)[0], - listNumberToNDarray(outputAttributeValue = "b" ,inputAttributeValue = "split")) -) - -val sqrt = OnnxMappingProcess( - opName = "sqrt", - inputFrameworkOpName = "Sqrt", - opMappingRegistry = onnxOpRegistry, - attributeMappingRules = booleanConstant(inputName = "inPlace",constantValue = false,argumentIndex = 0), - tensorMappingRules = listOf(mappingNDArrayInputs((mutableMapOf("input" to "X")))) -) - -val softplus = OnnxMappingProcess( - opMappingRegistry = onnxOpRegistry, - inputFrameworkOpName = "Softplus", - opName = "softplus", - attributeMappingRules = booleanConstant(inputName = "inPlace",constantValue = false,argumentIndex = 0), - tensorMappingRules = listOf(mappingNDArrayInputs(mutableMapOf("input" to "X"))) -) - -//TODO: SplitToSequence -val squeeze = OnnxMappingProcess( - opName = "squeeze", - inputFrameworkOpName = "Squeeze", - opMappingRegistry = onnxOpRegistry, - tensorMappingRules = listOf(mappingNDArrayInputs(mutableMapOf("input" to "data"))), - attributeMappingRules = listOf(convertNumericalListToNDArray(outputAttributeValue = "a" ,inputAttributeValue = "axes"), - listNumberToListNumber(outputAttributeValue = "_a",inputAttributeValue = "axes")) -) - -//TODO: StringNormalizer -//TODO: TfIdfVectorizer -//TODO: ThresholdedRelu -val tile = OnnxMappingProcess( - opMappingRegistry = onnxOpRegistry, - inputFrameworkOpName = "Tile", - opName = "tile", - attributeMappingRules 
= listOf(booleanConstant(inputName = "is_static_reps",constantValue = true,argumentIndex = 0)[0], - intConstant(inputName = "dimensions",constantValue = 0 as Integer,argumentIndex = 0)[0]), - tensorMappingRules = listOf(mappingNDArrayInputs(mutableMapOf("input" to "input","reps_vector" to "repeats"))) -) - -val topK = OnnxMappingProcess( - opName = "top_k", - inputFrameworkOpName = "TopK", - tensorMappingRules = listOf(mappingNDArrayInputs(mutableMapOf("input" to "X"))), - attributeMappingRules = listOf( - invertBooleanNumber(mutableMapOf("needSort" to "sorted")), - convertNDArrayInputToScalarAttr(outputAttributeValue = "k",inputAttributeValue = "K")), - opMappingRegistry = onnxOpRegistry -) - -val transpose = OnnxMappingProcess( - opName = "transpose", - inputFrameworkOpName = "Transpose", - tensorMappingRules = listOf(mappingNDArrayInputs(mutableMapOf("input" to "data"))), - attributeMappingRules = listOf(listNumberToNDarray(outputAttributeValue = "permuteDims", inputAttributeValue = "perm")), - opMappingRegistry = onnxOpRegistry -) - -//TODO: TreeEnsembleClassifier -//TODO: TreeEnsembleRegressor -//TODO: Unique PRIORITIZE -//TODO: Unsqueeze PRIORITIZE -//TODO: Upsample PRIORITIZE -//TODO: Where PRIORITIZE -//TODO: ZipMap -fun defOnnxSingleTransform(opName: String, inputFrameworkOpName: String, outputName: String, inputFrameworkInput: String = "input", attributeMappingRules: List> = emptyList()): OnnxMappingProcess { - return OnnxMappingProcess( - opName = opName, - tensorMappingRules = listOf( - NDArrayMappingRule(mappingNamesToPerform = mutableMapOf(outputName to inputFrameworkInput))), - inputFrameworkOpName = inputFrameworkOpName, - inputFramework = "onnx", - attributeMappingRules = attributeMappingRules, - opMappingRegistry = onnxOpRegistry) -} - -fun defineOnnxPairwiseTransforms(opName: String, inputFrameworkOpName: String, - firstOutputName: String = "input", - secondOutputName: String = "y", - firstInput: String = "A", secondInput: String = "B") : 
OnnxMappingProcess { - return OnnxMappingProcess( - opName = opName, - tensorMappingRules = listOf(NDArrayMappingRule(mappingNamesToPerform = mutableMapOf( - firstOutputName to firstInput, - secondOutputName to secondInput))), - inputFrameworkOpName = inputFrameworkOpName, - inputFramework = "onnx", - opMappingRegistry = onnxOpRegistry) -} - -fun defineOnnxSingleTransform(inputOpName: String, inputFrameworkOpName: String): OnnxMappingProcess { - return OnnxMappingProcess( - opName = inputOpName, - inputFrameworkOpName = inputFrameworkOpName, tensorMappingRules = listOf(NDArrayMappingRule( - mappingNamesToPerform = mutableMapOf("input" to "input"))), - attributeMappingRules = booleanConstant(inputName = "inPlace",constantValue = false,argumentIndex = 0), - opMappingRegistry = onnxOpRegistry) - -} - - -fun booleanConstant(inputName: String, constantValue: Boolean,argumentIndex: Int): List { - return listOf(argDescriptorConstant(listOf( - ArgDescriptor { - argType = OpNamespace.ArgDescriptor.ArgType.BOOL - name = inputName - argIndex = argumentIndex - boolValue = constantValue - } - ))) -} - -fun doubleConstant(inputName: String, constantValue: Double,argumentIndex: Int): List { - - return listOf(argDescriptorConstant(listOf( - ArgDescriptor { - argType = OpNamespace.ArgDescriptor.ArgType.DOUBLE - name = inputName - argIndex = argumentIndex - doubleValue = constantValue - } - ))) -} - -fun intConstant(inputName: String, constantValue: Integer,argumentIndex: Int): List { - return listOf(argDescriptorConstant(listOf( - ArgDescriptor { - argType = OpNamespace.ArgDescriptor.ArgType.INT64 - name = inputName - argIndex = argumentIndex - int64Value = constantValue.toLong() - } - ))) -} - - -val abs = OnnxMappingProcess( - opName = "abs", tensorMappingRules = listOf(NDArrayMappingRule(mappingNamesToPerform = mutableMapOf("input" to "X"))), - inputFrameworkOpName = "Abs", - inputFramework = "onnx", - attributeMappingRules = booleanConstant(inputName = "inPlace",constantValue = 
false,argumentIndex = 0), - opMappingRegistry = onnxOpRegistry) - - - -val ceil = defOnnxSingleTransform(inputFrameworkOpName = "Ceil",opName = "ceil",inputFrameworkInput = "X",outputName = "input", - attributeMappingRules = booleanConstant(inputName = "inPlace",constantValue = false,argumentIndex = 0)) - - -val const = OnnxMappingProcess( - inputFrameworkOpName = "Constant", - opName = "noop", - opMappingRegistry = onnxOpRegistry, - tensorMappingRules = listOf(), - attributeMappingRules = listOf()) - - -val conv2d = OnnxMappingProcess( - inputFramework = "onnx", - inputFrameworkOpName = "Conv", - opName = "conv2d", - tensorMappingRules = listOf(mappingNDArrayInputs(mutableMapOf( - "input" to "X","weights" to "W","bias" to "B"))), - attributeMappingRules = listOf( - intConstant(inputName = "isNCHW",constantValue = 1 as Integer,argumentIndex = 9)[0], - intConstant(inputName = "wFormat",constantValue = 1 as Integer,argumentIndex = 10)[0], - stringEqualsRule(outputAttribute = "isSameMode",inputFrameworkAttributeName = "auto_pad",valueToTest = "SAME",argumentIndex = 8), - listAttributeValueLookup(outputAttributeValue = "dH",inputAttributeValue = "dilations",indexValue = 0,argumentIndex = 6), - listAttributeValueLookup(outputAttributeValue = "dW",inputAttributeValue = "dilations",indexValue = 1,argumentIndex = 7), - listAttributeValueLookup(outputAttributeValue = "pH",inputAttributeValue = "pads",indexValue = 0,argumentIndex = 4), - listAttributeValueLookup(outputAttributeValue = "pW",inputAttributeValue = "pads",indexValue = 1,argumentIndex = 5), - listAttributeValueLookup(outputAttributeValue = "sH",inputAttributeValue = "strides",indexValue = 0,argumentIndex = 2), - listAttributeValueLookup(outputAttributeValue = "sW",inputAttributeValue = "strides",indexValue = 1,argumentIndex = 3), - listAttributeValueLookup(outputAttributeValue = "kW",inputAttributeValue = "kernel_shape",indexValue = 1,argumentIndex = 0), - listAttributeValueLookup(outputAttributeValue = 
"kH",inputAttributeValue = "kernel_shape",indexValue = 0,argumentIndex = 1) - ),opMappingRegistry = onnxOpRegistry) - -val elu = defOnnxSingleTransform(opName = "elu",inputFrameworkOpName = "Elu",outputName = "input",inputFrameworkInput = "X", - attributeMappingRules = listOf(valueMappings(mutableMapOf("alpha" to "alpha")))) - - - -val relu = defOnnxSingleTransform(inputFrameworkOpName = "Relu",opName = "relu",inputFrameworkInput = "X",outputName = "input", - attributeMappingRules = listOf(booleanConstant(inputName = "inPlace",constantValue = false,argumentIndex = 0)[0], - doubleConstant(inputName = "cutoff",constantValue = 0.0,argumentIndex = 0)[0])) - -val isNan = defOnnxSingleTransform(inputFrameworkOpName = "IsNaN",opName = "isnan",inputFrameworkInput = "X",outputName = "input", - attributeMappingRules = booleanConstant(inputName = "inPlace",constantValue = false,argumentIndex = 0)) - - -val selu = defOnnxSingleTransform(inputFrameworkOpName = "Selu",opName = "selu",inputFrameworkInput = "X",outputName = "input",attributeMappingRules = -booleanConstant(inputName = "inPlace",constantValue = false,argumentIndex = 0)) - - -object OnnxOpDeclarations { - init { - val groupedOps = onnxops.groupBy { input -> input.name } - val singleGroupedOps = HashMap<String, Onnx.NodeProto>() - groupedOps.forEach { name, node -> - singleGroupedOps[name] = node[0] - } - - OpRegistryHolder.registerOpList("onnx", singleGroupedOps) - - names.forEach { - defineOnnxSingleTransform(inputFrameworkOpName = it.key,inputOpName = it.value) - }
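
The registration block above collapses duplicate op definitions by grouping on name and keeping the first definition seen for each. As a standalone sketch of that dedup (strings stand in for `Onnx.NodeProto`, and the function and op names here are illustrative, not part of the original file):

```kotlin
// Sketch of the dedup in OnnxOpDeclarations.init: group op definitions
// by name, then keep only the first definition seen per name.
fun firstDefPerName(ops: List<Pair<String, String>>): Map<String, String> {
    // groupBy preserves encounter order within each group, so defs[0] is
    // the first definition that appeared for that op name
    return ops.groupBy({ it.first }, { it.second })
        .mapValues { (_, defs) -> defs[0] }
}

fun main() {
    val ops = listOf("Abs" to "Abs-v1", "Abs" to "Abs-v13", "Relu" to "Relu-v14")
    // First definition wins for duplicate names
    println(firstDefPerName(ops))
}
```

Note that `associateBy { it.first }` would not be an equivalent shortcut: it keeps the *last* entry for a duplicate key, whereas the `groupBy` form above keeps the first.
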
- - pairWiseNames.forEach { - defineOnnxPairwiseTransforms(opName = it.value,inputFrameworkOpName = it.key) - } - - onnxops.forEach { - onnxOpRegistry.registerInputFrameworkOpDef(it.name,it) - } - - nd4jOpDescriptors.opListList.forEach { - onnxOpRegistry.registerNd4jOpDef(it.name,it) - } - - OpRegistryHolder.registerOpMappingRegistry("onnx", onnxOpRegistry) - - } -} - - diff --git a/contrib/codegen-tools/codegen/src/main/kotlin/org/nd4j/codegen/ir/onnx/OnnxProtobufExtensions.kt b/contrib/codegen-tools/codegen/src/main/kotlin/org/nd4j/codegen/ir/onnx/OnnxProtobufExtensions.kt deleted file mode 100644 index f7394ddd7..000000000 --- a/contrib/codegen-tools/codegen/src/main/kotlin/org/nd4j/codegen/ir/onnx/OnnxProtobufExtensions.kt +++ /dev/null @@ -1,176 +0,0 @@ -package org.nd4j.codegen.ir.onnx - -import onnx.Onnx -import onnx.OnnxOperators -import org.nd4j.linalg.api.ndarray.INDArray -import org.nd4j.shade.protobuf.ByteString -import java.nio.charset.Charset - -fun NodeProto(block: Onnx.NodeProto.Builder.() -> Unit): Onnx.NodeProto { - return Onnx.NodeProto.newBuilder().apply(block).build() -} - -fun AttributeProto(block: Onnx.AttributeProto.Builder.() -> Unit) : Onnx.AttributeProto { - return Onnx.AttributeProto.newBuilder().apply(block).build() -} - -fun Onnx.AttributeProto.Builder.TensorValue(inputValue: Onnx.TensorProto) { - this.addTensors(inputValue) -} - -fun Onnx.AttributeProto.Builder.StringValue(inputValue: String) { - this.addStrings(ByteString.copyFrom(inputValue.toByteArray(Charset.defaultCharset()))) -} - -fun Onnx.NodeProto.Builder.Attribute(attribute: Onnx.AttributeProto) { - this.addAttribute(attribute) -} - -fun Onnx.NodeProto.Builder.Input(inputName: String) { - this.addInput(inputName) -} - -fun Onnx.NodeProto.Builder.Output(inputName: String) { - this.addOutput(inputName) -} - -fun Onnx.GraphProto.Builder.Initializer(tensor: Onnx.TensorProto) { - this.addInitializer(tensor) -} - -fun
OperatorSetIdProto(block: Onnx.OperatorSetIdProto.Builder.() -> Unit): Onnx.OperatorSetIdProto { - return Onnx.OperatorSetIdProto.newBuilder().apply(block).build() -} - -fun OperatorSetProto(block: OnnxOperators.OperatorSetProto.Builder.() -> Unit): OnnxOperators.OperatorSetProto { - return OnnxOperators.OperatorSetProto.newBuilder().apply(block).build() -} - -fun Onnx.ModelProto.Builder.OpSetImport(opSetImport: Onnx.OperatorSetIdProto) { - this.addOpsetImport(opSetImport) -} - -fun ModelProto(block: Onnx.ModelProto.Builder.() -> Unit): Onnx.ModelProto { - return Onnx.ModelProto.newBuilder() - .apply(block).build() -} - -fun TensorDefinition(block: Onnx.TypeProto.Tensor.Builder.() -> Unit) : Onnx.TypeProto.Tensor { - return Onnx.TypeProto.Tensor.newBuilder().apply(block).build() -} - -fun TypeProto(block: Onnx.TypeProto.Builder.() -> Unit): Onnx.TypeProto { - return Onnx.TypeProto.newBuilder().apply(block).build() -} - -fun GraphProto(block: Onnx.GraphProto.Builder.() -> Unit): Onnx.GraphProto { - return Onnx.GraphProto.newBuilder() - .apply(block).build() -} - -fun OnnxDim(block: Onnx.TensorShapeProto.Dimension.Builder.() -> Unit): Onnx.TensorShapeProto.Dimension { - return Onnx.TensorShapeProto.Dimension.newBuilder().apply(block).build() -} - - -fun Onnx.TensorShapeProto.Builder.OnnxShape(dims: List<Long>) { - this.addAllDim(dims.map { inputDim -> OnnxDim { - dimValue = inputDim - } }) -} - - -fun OnnxShapeProto(block: Onnx.TensorShapeProto.Builder.() -> Unit): Onnx.TensorShapeProto { - return Onnx.TensorShapeProto.newBuilder().apply(block).build() -} - -fun ValueInfoProto(block: Onnx.ValueInfoProto.Builder.() -> Unit): Onnx.ValueInfoProto { - return Onnx.ValueInfoProto.newBuilder() - .apply(block).build() -} - -fun Onnx.GraphProto.Builder.Output(input: Onnx.ValueInfoProto) { - this.addOutput(input) -} - - -fun Onnx.GraphProto.Builder.Input(input: Onnx.ValueInfoProto) { - this.addInput(input) -} - -fun Onnx.GraphProto.Builder.Node(inputNode: Onnx.NodeProto) { -
this.addNode(inputNode) -} - -fun Onnx.AttributeProto.Builder.Tensor(inputTensor: Onnx.TensorProto) { - this.addTensors(inputTensor) -} - -fun OnnxTensorProto(block: Onnx.TensorProto.Builder.() -> Unit): Onnx.TensorProto { - return Onnx.TensorProto.newBuilder().apply(block).build() -} - -fun Onnx.TensorProto.Builder.OnnxDataType(value: Onnx.TensorProto.DataType) { - this.dataType = value -} - -fun Onnx.TensorProto.Builder.OnnxRawData(byteArray: ByteArray) { - this.rawData = ByteString.copyFrom(byteArray) -} - -fun Onnx.TensorProto.Builder.Shape(shape: List<Long>) { - this.clearDims() - this.addAllDims(shape) -} - -fun Onnx.TensorProto.Builder.LongData(longData: List<Long>) { - this.addAllInt64Data(longData) -} - -fun Onnx.TensorProto.Builder.IntData(intData: List<Int>) { - this.addAllInt32Data(intData) -} - -fun Onnx.TensorProto.Builder.FloatData(floatData: List<Float>) { - this.addAllFloatData(floatData) -} - - -fun Onnx.TensorProto.Builder.DoubleData(doubleData: List<Double>) { - this.addAllDoubleData(doubleData) -} - -fun Onnx.TensorProto.Builder.StringData(stringData: List<String>) { - this.addAllStringData(stringData.map { ByteString.copyFrom(it.toByteArray(Charset.defaultCharset())) }) -} - -fun Onnx.TensorProto.Builder.BoolData(boolData: List<Boolean>) { - this.addAllInt32Data(boolData.map { input -> if(input) 1 else 0 }) -} - - -fun createValueInfoFromTensor(arr: INDArray,valueInfoName: String,useShape: Boolean = true): Onnx.ValueInfoProto { - if(useShape) - return ValueInfoProto { - name = valueInfoName - type = TypeProto { - tensorType = TensorDefinition { - elemType = convertToOnnxDataType(arr.dataType()) - shape = OnnxShapeProto { - OnnxShape(arr.shape().toList()) - } - } - - } - } - else - return ValueInfoProto { - name = valueInfoName - type = TypeProto { - tensorType = TensorDefinition { - elemType = convertToOnnxDataType(arr.dataType()) - } - - } - } -} \ No newline at end of file diff --git
a/contrib/codegen-tools/codegen/src/main/kotlin/org/nd4j/codegen/ir/onnx/OnnxRuleDeclarations.kt b/contrib/codegen-tools/codegen/src/main/kotlin/org/nd4j/codegen/ir/onnx/OnnxRuleDeclarations.kt deleted file mode 100644 index 6931b661f..000000000 --- a/contrib/codegen-tools/codegen/src/main/kotlin/org/nd4j/codegen/ir/onnx/OnnxRuleDeclarations.kt +++ /dev/null @@ -1,1327 +0,0 @@ -package org.nd4j.codegen.ir.onnx - -import onnx.Onnx -import org.nd4j.codegen.ir.* -import org.nd4j.ir.OpNamespace -import org.nd4j.ir.TensorNamespace - -class NDArrayMappingRule(mappingNamesToPerform: MutableMap, - transformerArgs: Map> = emptyMap()): - BaseNDArrayMappingRule(mappingNamesToPerform = mappingNamesToPerform, transformerArgs = transformerArgs) { - - - - override fun createTensorProto(input: Onnx.TensorProto): TensorNamespace.TensorProto { - return OnnxIRTensor(input).toArgTensor() - } - - override fun isInputTensorName(inputName: String): Boolean { - val onnxOp = onnxops.first { opDef -> opDef.name == mappingProcess!!.inputFrameworkOpName() } - return onnxOp.inputList.contains(inputName) - } - - override fun isOutputTensorName(outputName: String): Boolean { - val nd4jOpDescriptor = nd4jOpDescriptors.findOp(mappingProcess!!.opName()) - return nd4jOpDescriptor.argDescriptorList.filter { inputDescriptor -> inputDescriptor.argType == OpNamespace.ArgDescriptor.ArgType.INPUT_TENSOR } - .map {inputDescriptor -> inputDescriptor.name }.contains(outputName) - } -} - -fun mappingNDArrayInputs(inputs: MutableMap) : NDArrayMappingRule { - return NDArrayMappingRule( - mappingNamesToPerform = inputs) -} - - - - -class OnnxMultiInputIndexMappingRule(mappingNamesToPerform: MutableMap, - transformerArgs: Map> = emptyMap()): - MultiInputIndexMappingRule(mappingNamesToPerform = mappingNamesToPerform, transformerArgs = transformerArgs) { - - - - override fun createTensorProto(input: Onnx.TensorProto): TensorNamespace.TensorProto { - return OnnxIRTensor(input).toArgTensor() - } - - override fun 
isInputTensorName(inputName: String): Boolean { - val onnxOp = onnxops.first { opDef -> opDef.name == mappingProcess!!.inputFrameworkOpName() } - return onnxOp.inputList.contains(inputName) - } - - override fun isOutputTensorName(outputName: String): Boolean { - val nd4jOpDescriptor = nd4jOpDescriptors.findOp(mappingProcess!!.opName()) - return nd4jOpDescriptor.argDescriptorList.filter { inputDescriptor -> inputDescriptor.argType == OpNamespace.ArgDescriptor.ArgType.INPUT_TENSOR } - .map {inputDescriptor -> inputDescriptor.name }.contains(outputName) - } -} - -fun mappingListNDArrays(inputs: MutableMap) : OnnxMultiInputIndexMappingRule { - return OnnxMultiInputIndexMappingRule( - mappingNamesToPerform = inputs) -} - -class OnnxConditionalFieldValueIntIndexNDArrayRule - (mappingNamesToPerform: MutableMap, transformerArgs: Map>) : - ConditionalFieldValueIntIndexNDArrayRule - (mappingNamesToPerform, transformerArgs) { - - override fun createIRAttribute(name: String, attrDef: Onnx.AttributeProto, attributeValueType: Onnx.AttributeProto): IRAttribute { - return OnnxIRAttr(attrDef, attributeValueType) - } - - override fun convertAttributesReverse(allInputArguments: List, inputArgumentsToProcess: List): List> { - TODO("Not yet implemented") - } - - override fun isInputFrameworkTensorName(name: String, mappingProcess: MappingProcess): Boolean { - val onnxOp = onnxops.find { op -> op.name == mappingProcess.inputFrameworkOpName() }!! - return isOnnxTensorName(name,onnxOp) - } - - override fun isNd4jTensorName(name: String, mappingProcess: MappingProcess): Boolean { - val nd4jOpDescriptor = nd4jOpDescriptors.findOp(mappingProcess.opName()) - return isNd4jTensorName(name,nd4jOpDescriptor) - } - - override fun isInputFrameworkAttributeName(name: String, mappingProcess: MappingProcess): Boolean { - val onnxOp = onnxops.find { op -> op.name == mappingProcess.inputFrameworkOpName() }!! 
- return isOnnxAttributeName(name,onnxOp) - } - - override fun isOutputFrameworkAttributeName(name: String, mappingProcess: MappingProcess): Boolean { - val nd4jOpDescriptor = nd4jOpDescriptors.findOp(mappingProcess.opName()) - return isOutputFrameworkAttributeName(name,nd4jOpDescriptor) } - - override fun argDescriptorType(name: String, mappingProcess: MappingProcess): OpNamespace.ArgDescriptor.ArgType { - val nd4jOpDescriptor = nd4jOpDescriptors.findOp(mappingProcess.opName()) - return argDescriptorType(name,nd4jOpDescriptor) - } - - override fun attributeValueTypeFor(name: String, mappingProcess: MappingProcess): AttributeValueType { - val onnxOp = onnxops.find { op -> op.name == mappingProcess.inputFrameworkOpName() }!! - return onnxAttributeTypeFor(name,onnxOp) - } - -} - -fun conditionalFieldValueIntIndexNDArrayRule(outputAttribute: String, - inputFrameworkAttributeName: String, - targetValue: String, - trueIndex: Int, - falseIndex: Int, - argumentIndex: Int): OnnxConditionalFieldValueIntIndexNDArrayRule { - return OnnxConditionalFieldValueIntIndexNDArrayRule( - mappingNamesToPerform = mutableMapOf(outputAttribute to inputFrameworkAttributeName), - transformerArgs = mapOf(outputAttribute to listOf(ArgDescriptor { - name = "targetValue" - stringValue = targetValue - argIndex = argumentIndex - }, - ArgDescriptor { - name = "trueIndex" - int32Value = trueIndex - argIndex = argumentIndex - }, - ArgDescriptor { - name = "falseIndex" - int32Value = falseIndex - argIndex = argumentIndex - })) - ) -} - - -class OnnxNDArrayExtractScalarValue(mappingNamesToPerform: MutableMap, - transformerArgs: Map> = emptyMap()): - NDArrayExtractScalarValue(mappingNamesToPerform = mappingNamesToPerform, transformerArgs = transformerArgs) { - - - override fun createIRAttribute(name: String, attrDef: Onnx.AttributeProto, attributeValueType: Onnx.AttributeProto): IRAttribute { - return OnnxIRAttr(attrDef, attributeValueType) - } - - override fun 
convertAttributesReverse(allInputArguments: List, inputArgumentsToProcess: List): List> { - TODO("Not yet implemented") - } - - override fun isInputFrameworkTensorName(name: String, mappingProcess: MappingProcess): Boolean { - val onnxOp = onnxops.find { op -> op.name == mappingProcess.inputFrameworkOpName() }!! - return isOnnxTensorName(name,onnxOp) - } - - override fun isNd4jTensorName(name: String, mappingProcess: MappingProcess): Boolean { - val nd4jOpDescriptor = nd4jOpDescriptors.findOp(mappingProcess.opName()) - return isNd4jTensorName(name,nd4jOpDescriptor) - } - - override fun isInputFrameworkAttributeName(name: String, mappingProcess: MappingProcess): Boolean { - val onnxOp = onnxops.find { op -> op.name == mappingProcess.inputFrameworkOpName() }!! - return isOnnxAttributeName(name,onnxOp) - } - - override fun isOutputFrameworkAttributeName(name: String, mappingProcess: MappingProcess): Boolean { - val nd4jOpDescriptor = nd4jOpDescriptors.findOp(mappingProcess.opName()) - return isOutputFrameworkAttributeName(name,nd4jOpDescriptor) } - - override fun argDescriptorType(name: String, mappingProcess: MappingProcess): OpNamespace.ArgDescriptor.ArgType { - val nd4jOpDescriptor = nd4jOpDescriptors.findOp(mappingProcess.opName()) - return argDescriptorType(name,nd4jOpDescriptor) - } - - override fun attributeValueTypeFor(name: String, mappingProcess: MappingProcess): AttributeValueType { - val onnxOp = onnxops.find { op -> op.name == mappingProcess.inputFrameworkOpName() }!! 
- return onnxAttributeTypeFor(name,onnxOp) - } - -} - -fun ndarrayExtractScalarValue(outputAttribute: String, - inputFrameworkAttributeName: String, - argumentIndex: Int, - scalarIndex: Int) : OnnxNDArrayExtractScalarValue { - return OnnxNDArrayExtractScalarValue( - mappingNamesToPerform = mutableMapOf(outputAttribute to inputFrameworkAttributeName), - transformerArgs = mapOf(outputAttribute to listOf( - ArgDescriptor { - name = inputFrameworkAttributeName - int64Value = scalarIndex.toLong() - argType = OpNamespace.ArgDescriptor.ArgType.INT64 - argIndex = argumentIndex - }))) -} - - - -class OnnxConditionalFieldValueIntIndexArrayRule - (mappingNamesToPerform: MutableMap, transformerArgs: Map>) : - ConditionalFieldValueIntIndexArrayRule - (mappingNamesToPerform, transformerArgs) { - - override fun createIRAttribute(name: String, attrDef: Onnx.AttributeProto, attributeValueType: Onnx.AttributeProto): IRAttribute { - return OnnxIRAttr(attrDef, attributeValueType) - } - - override fun convertAttributesReverse(allInputArguments: List, inputArgumentsToProcess: List): List> { - TODO("Not yet implemented") - } - - override fun isInputFrameworkTensorName(name: String, mappingProcess: MappingProcess): Boolean { - val onnxOp = onnxops.find { op -> op.name == mappingProcess.inputFrameworkOpName() }!! - return isOnnxTensorName(name,onnxOp) - } - - override fun isNd4jTensorName(name: String, mappingProcess: MappingProcess): Boolean { - val nd4jOpDescriptor = nd4jOpDescriptors.findOp(mappingProcess.opName()) - return isNd4jTensorName(name,nd4jOpDescriptor) - } - - override fun isInputFrameworkAttributeName(name: String, mappingProcess: MappingProcess): Boolean { - val onnxOp = onnxops.find { op -> op.name == mappingProcess.inputFrameworkOpName() }!! 
- return isOnnxAttributeName(name,onnxOp) - } - - override fun isOutputFrameworkAttributeName(name: String, mappingProcess: MappingProcess): Boolean { - val nd4jOpDescriptor = nd4jOpDescriptors.findOp(mappingProcess.opName()) - return isOutputFrameworkAttributeName(name,nd4jOpDescriptor) } - - override fun argDescriptorType(name: String, mappingProcess: MappingProcess): OpNamespace.ArgDescriptor.ArgType { - val nd4jOpDescriptor = nd4jOpDescriptors.findOp(mappingProcess.opName()) - return argDescriptorType(name,nd4jOpDescriptor) - } - - override fun attributeValueTypeFor(name: String, mappingProcess: MappingProcess): AttributeValueType { - val onnxOp = onnxops.find { op -> op.name == mappingProcess.inputFrameworkOpName() }!! - return onnxAttributeTypeFor(name,onnxOp) - } - -} - -fun conditionalFieldValueIntIndexArrayRule(outputAttribute: String, - inputFrameworkAttributeName: String, - targetValue: String, - trueIndex: Int, - falseIndex: Int, - argumentIndex: Int): OnnxConditionalFieldValueIntIndexArrayRule { - return OnnxConditionalFieldValueIntIndexArrayRule( - mappingNamesToPerform = mutableMapOf(outputAttribute to inputFrameworkAttributeName), - transformerArgs = mapOf(outputAttribute to listOf(ArgDescriptor { - name = "targetValue" - stringValue = targetValue - argIndex = argumentIndex - }, - ArgDescriptor { - name = "trueIndex" - int32Value = trueIndex - argIndex = argumentIndex - }, - ArgDescriptor { - name = "falseIndex" - int32Value = falseIndex - argIndex = argumentIndex - })) - ) -} - -class OnnxValueMapping(mappingNamesToPerform: Map, transformerArgs: Map>) : ValueMapping(mappingNamesToPerform, transformerArgs) { - override fun createIRAttribute(name: String, attrDef: Onnx.AttributeProto, attributeValueType: Onnx.AttributeProto): IRAttribute { - return OnnxIRAttr(attrDef,attributeValueType) - } - - override fun convertAttributesReverse(allInputArguments: List, inputArgumentsToProcess: List): List> { - TODO("Not yet implemented") - } - - override fun 
isInputFrameworkTensorName(name: String, mappingProcess: MappingProcess): Boolean { - val onnxOp = onnxops.find { op -> op.name == mappingProcess.inputFrameworkOpName() }!! - return isOnnxTensorName(name,onnxOp) - } - - override fun isNd4jTensorName(name: String, mappingProcess: MappingProcess): Boolean { - val nd4jOpDescriptor = nd4jOpDescriptors.findOp(mappingProcess.opName()) - return isNd4jTensorName(name,nd4jOpDescriptor) - } - - override fun isInputFrameworkAttributeName(name: String, mappingProcess: MappingProcess): Boolean { - val onnxOp = onnxops.find { op -> op.name == mappingProcess.inputFrameworkOpName() }!! - return isOnnxAttributeName(name,onnxOp) - } - - override fun isOutputFrameworkAttributeName(name: String, mappingProcess: MappingProcess): Boolean { - val nd4jOpDescriptor = nd4jOpDescriptors.findOp(mappingProcess.opName()) - return isOutputFrameworkAttributeName(name,nd4jOpDescriptor) } - - override fun argDescriptorType(name: String, mappingProcess: MappingProcess): OpNamespace.ArgDescriptor.ArgType { - val nd4jOpDescriptor = nd4jOpDescriptors.findOp(mappingProcess.opName()) - return argDescriptorType(name,nd4jOpDescriptor) - } - - override fun attributeValueTypeFor(name: String, mappingProcess: MappingProcess): AttributeValueType { - val onnxOp = onnxops.find { op -> op.name == mappingProcess.inputFrameworkOpName() }!! 
- return onnxAttributeTypeFor(name,onnxOp) - } -} - -fun valueMappings(mappings: Map): OnnxValueMapping { - return OnnxValueMapping(mappingNamesToPerform = mappings,transformerArgs = emptyMap()) -} - -////NDArrayAttributeToNDArrayInput -class OnnxNDArrayAttributeToNDArrayInput(mappingNamesToPerform: Map, transformerArgs: Map>) : NDArrayAttributeToNDArrayInput(mappingNamesToPerform, transformerArgs) { - override fun createIRAttribute(name: String, attrDef: Onnx.AttributeProto, attributeValueType: Onnx.AttributeProto): IRAttribute { - return OnnxIRAttr(attrDef,attributeValueType) - } - - override fun convertAttributesReverse(allInputArguments: List, inputArgumentsToProcess: List): List> { - TODO("Not yet implemented") - } - - override fun isInputFrameworkTensorName(name: String, mappingProcess: MappingProcess): Boolean { - val onnxOp = onnxops.find { op -> op.name == mappingProcess.inputFrameworkOpName() }!! - return isOnnxTensorName(name,onnxOp) - } - - override fun isNd4jTensorName(name: String, mappingProcess: MappingProcess): Boolean { - val nd4jOpDescriptor = nd4jOpDescriptors.findOp(mappingProcess.opName()) - return isNd4jTensorName(name,nd4jOpDescriptor) - } - - override fun isInputFrameworkAttributeName(name: String, mappingProcess: MappingProcess): Boolean { - val onnxOp = onnxops.find { op -> op.name == mappingProcess.inputFrameworkOpName() }!! 
- return isOnnxAttributeName(name,onnxOp) - } - - override fun isOutputFrameworkAttributeName(name: String, mappingProcess: MappingProcess): Boolean { - val nd4jOpDescriptor = nd4jOpDescriptors.findOp(mappingProcess.opName()) - return isOutputFrameworkAttributeName(name,nd4jOpDescriptor) } - - override fun argDescriptorType(name: String, mappingProcess: MappingProcess): OpNamespace.ArgDescriptor.ArgType { - val nd4jOpDescriptor = nd4jOpDescriptors.findOp(mappingProcess.opName()) - return argDescriptorType(name,nd4jOpDescriptor) - } - - override fun attributeValueTypeFor(name: String, mappingProcess: MappingProcess): AttributeValueType { - val onnxOp = onnxops.find { op -> op.name == mappingProcess.inputFrameworkOpName() }!! - return onnxAttributeTypeFor(name,onnxOp) - } -} - -fun ndarrayAttributeToNDArrayInput(mappings: Map): OnnxNDArrayAttributeToNDArrayInput { - return OnnxNDArrayAttributeToNDArrayInput(mappingNamesToPerform = mappings,transformerArgs = emptyMap()) -} - -class OnnxInvertBooleanNumber(mappingNamesToPerform: Map, transformerArgs: Map>) : InvertBooleanNumber(mappingNamesToPerform, transformerArgs) { - override fun createIRAttribute(name: String, attrDef: Onnx.AttributeProto, attributeValueType: Onnx.AttributeProto): IRAttribute { - return OnnxIRAttr(attrDef,attributeValueType) - } - - override fun convertAttributesReverse(allInputArguments: List, inputArgumentsToProcess: List): List> { - TODO("Not yet implemented") - } - - - override fun isInputFrameworkTensorName(name: String, mappingProcess: MappingProcess): Boolean { - val onnxOp = onnxops.find { op -> op.name == mappingProcess.inputFrameworkOpName() }!! 
- return isOnnxTensorName(name,onnxOp) - } - - override fun isNd4jTensorName(name: String, mappingProcess: MappingProcess): Boolean { - val nd4jOpDescriptor = nd4jOpDescriptors.findOp(mappingProcess.opName()) - return isNd4jTensorName(name,nd4jOpDescriptor) - } - - override fun isInputFrameworkAttributeName(name: String, mappingProcess: MappingProcess): Boolean { - val onnxOp = onnxops.find { op -> op.name == mappingProcess.inputFrameworkOpName() }!! - return isOnnxAttributeName(name,onnxOp) - } - - override fun isOutputFrameworkAttributeName(name: String, mappingProcess: MappingProcess): Boolean { - val nd4jOpDescriptor = nd4jOpDescriptors.findOp(mappingProcess.opName()) - return isOutputFrameworkAttributeName(name,nd4jOpDescriptor) } - - override fun argDescriptorType(name: String, mappingProcess: MappingProcess): OpNamespace.ArgDescriptor.ArgType { - val nd4jOpDescriptor = nd4jOpDescriptors.findOp(mappingProcess.opName()) - return argDescriptorType(name,nd4jOpDescriptor) - } - - override fun attributeValueTypeFor(name: String, mappingProcess: MappingProcess): AttributeValueType { - val onnxOp = onnxops.find { op -> op.name == mappingProcess.inputFrameworkOpName() }!! 
- return onnxAttributeTypeFor(name,onnxOp) - } -} - -/** - * This will change a boolean to a number and a number to a boolean - */ -fun invertBooleanNumber(mappings: Map): OnnxInvertBooleanNumber { - return OnnxInvertBooleanNumber(mappingNamesToPerform = mappings,transformerArgs = emptyMap()) -} - - - - - -class OnnxDataTypeToInt(mappingNamesToPerform: Map, transformerArgs: Map>) : DataTypeToInt(mappingNamesToPerform, transformerArgs) { - override fun createIRAttribute(name: String, attrDef: Onnx.AttributeProto, attributeValueType: Onnx.AttributeProto): IRAttribute { - return OnnxIRAttr(attrDef,attributeValueType) - } - - override fun convertAttributesReverse(allInputArguments: List, inputArgumentsToProcess: List): List> { - TODO("Not yet implemented") - } - - - override fun isInputFrameworkTensorName(name: String, mappingProcess: MappingProcess): Boolean { - val onnxOp = onnxops.find { op -> op.name == mappingProcess.inputFrameworkOpName() }!! - return isOnnxTensorName(name,onnxOp) - } - - override fun isNd4jTensorName(name: String, mappingProcess: MappingProcess): Boolean { - val nd4jOpDescriptor = nd4jOpDescriptors.findOp(mappingProcess.opName()) - return isNd4jTensorName(name,nd4jOpDescriptor) - } - - override fun isInputFrameworkAttributeName(name: String, mappingProcess: MappingProcess): Boolean { - val onnxOp = onnxops.find { op -> op.name == mappingProcess.inputFrameworkOpName() }!! 
- return isOnnxAttributeName(name,onnxOp) - } - - override fun isOutputFrameworkAttributeName(name: String, mappingProcess: MappingProcess): Boolean { - val nd4jOpDescriptor = nd4jOpDescriptors.findOp(mappingProcess.opName()) - return isOutputFrameworkAttributeName(name,nd4jOpDescriptor) } - - override fun argDescriptorType(name: String, mappingProcess: MappingProcess): OpNamespace.ArgDescriptor.ArgType { - val nd4jOpDescriptor = nd4jOpDescriptors.findOp(mappingProcess.opName()) - return argDescriptorType(name,nd4jOpDescriptor) - } - - override fun attributeValueTypeFor(name: String, mappingProcess: MappingProcess): AttributeValueType { - val onnxOp = onnxops.find { op -> op.name == mappingProcess.inputFrameworkOpName() }!! - return onnxAttributeTypeFor(name,onnxOp) - } -} - -fun dataTypeToInt(mappings: Map): OnnxDataTypeToInt { - return OnnxDataTypeToInt(mappingNamesToPerform = mappings,transformerArgs = emptyMap()) -} - - - -class OnnxNDArraySizeAt(mappingNamesToPerform: Map, transformerArgs: Map>): NDArraySizeAtRule(mappingNamesToPerform, transformerArgs) { - - override fun createIRAttribute(name: String, attrDef: Onnx.AttributeProto, attributeValueType: Onnx.AttributeProto): - IRAttribute { - return OnnxIRAttr(attrDef, attributeValueType) - } - - override fun convertAttributesReverse(allInputArguments: List, inputArgumentsToProcess: List): List> { - TODO("Not yet implemented") - } - - - - override fun isInputFrameworkTensorName(name: String, mappingProcess: MappingProcess): Boolean { - val onnxOp = onnxops.find { op -> op.name == mappingProcess.inputFrameworkOpName() }!! 
- return isOnnxTensorName(name,onnxOp) - } - - override fun isNd4jTensorName(name: String, mappingProcess: MappingProcess): Boolean { - val nd4jOpDescriptor = nd4jOpDescriptors.findOp(mappingProcess.opName()) - return isNd4jTensorName(name,nd4jOpDescriptor) - } - - override fun isInputFrameworkAttributeName(name: String, mappingProcess: MappingProcess): Boolean { - val onnxOp = onnxops.find { op -> op.name == mappingProcess.inputFrameworkOpName() }!! - return isOnnxAttributeName(name,onnxOp) - } - - override fun isOutputFrameworkAttributeName(name: String, mappingProcess: MappingProcess): Boolean { - val nd4jOpDescriptor = nd4jOpDescriptors.findOp(mappingProcess.opName()) - return isOutputFrameworkAttributeName(name,nd4jOpDescriptor) } - - override fun argDescriptorType(name: String, mappingProcess: MappingProcess): OpNamespace.ArgDescriptor.ArgType { - val nd4jOpDescriptor = nd4jOpDescriptors.findOp(mappingProcess.opName()) - return argDescriptorType(name,nd4jOpDescriptor) - } - - override fun attributeValueTypeFor(name: String, mappingProcess: MappingProcess): AttributeValueType { - val onnxOp = onnxops.find { op -> op.name == mappingProcess.inputFrameworkOpName() }!! 
- return onnxAttributeTypeFor(name,onnxOp) - } -} - -fun sizeAtRule(dimensionIndex: Int, outputAttributeName: String, inputFrameworkAttributeName: String,argumentIndex: Int): OnnxNDArraySizeAt { - return OnnxNDArraySizeAt( - mappingNamesToPerform = mapOf(outputAttributeName to inputFrameworkAttributeName), - transformerArgs = mapOf(outputAttributeName to listOf(ArgDescriptor { - name = inputFrameworkAttributeName - int32Value = dimensionIndex - argIndex = argumentIndex - })) - ) -} - -class OnnxStringEqualsAdapterRule(mappingNamesToPerform: Map = emptyMap(), - transformerArgs: Map> = emptyMap()) : - StringEqualsAdapterRule - ( mappingNamesToPerform, transformerArgs) { - - override fun createIRAttribute(name: String, attrDef: Onnx.AttributeProto, attributeValueType: Onnx.AttributeProto): IRAttribute { - return OnnxIRAttr(attrDef, attributeValueType) - } - - override fun convertAttributesReverse(allInputArguments: List, inputArgumentsToProcess: List): - List> { - TODO("Not yet implemented") - } - - override fun isInputFrameworkTensorName(name: String, mappingProcess: MappingProcess): Boolean { - val onnxOp = onnxops.find { op -> op.name == mappingProcess.inputFrameworkOpName() }!! - return isOnnxTensorName(name,onnxOp) - } - - override fun isNd4jTensorName(name: String, mappingProcess: MappingProcess): Boolean { - val nd4jOpDescriptor = nd4jOpDescriptors.findOp(mappingProcess.opName()) - return isNd4jTensorName(name,nd4jOpDescriptor) - } - - override fun isInputFrameworkAttributeName(name: String, mappingProcess: MappingProcess): Boolean { - val onnxOp = onnxops.find { op -> op.name == mappingProcess.inputFrameworkOpName() }!! 
- return isOnnxAttributeName(name,onnxOp) - } - - override fun isOutputFrameworkAttributeName(name: String, mappingProcess: MappingProcess): Boolean { - val nd4jOpDescriptor = nd4jOpDescriptors.findOp(mappingProcess.opName()) - return isOutputFrameworkAttributeName(name,nd4jOpDescriptor) } - - override fun argDescriptorType(name: String, mappingProcess: MappingProcess): OpNamespace.ArgDescriptor.ArgType { - val nd4jOpDescriptor = nd4jOpDescriptors.findOp(mappingProcess.opName()) - return argDescriptorType(name,nd4jOpDescriptor) - } - - override fun attributeValueTypeFor(name: String, mappingProcess: MappingProcess): AttributeValueType { - val onnxOp = onnxops.find { op -> op.name == mappingProcess.inputFrameworkOpName() }!! - return onnxAttributeTypeFor(name,onnxOp) - } -} - -fun stringEqualsRule(outputAttribute: String, inputFrameworkAttributeName: String, valueToTest: String,argumentIndex: Int): OnnxStringEqualsAdapterRule { - return OnnxStringEqualsAdapterRule( - mappingNamesToPerform = mapOf(outputAttribute to inputFrameworkAttributeName), - transformerArgs = mapOf(outputAttribute to listOf(ArgDescriptor { - name = inputFrameworkAttributeName - stringValue = valueToTest - argIndex = argumentIndex - }))) -} - - -class OnnxStringContainsAdapterRule(mappingNamesToPerform: Map = emptyMap(), - transformerArgs: Map> = emptyMap()) : - StringContainsAdapterRule - ( mappingNamesToPerform, transformerArgs) { - - override fun createIRAttribute(name: String, attrDef: Onnx.AttributeProto, attributeValueType: Onnx.AttributeProto): IRAttribute { - return OnnxIRAttr(attrDef, attributeValueType) - } - - override fun convertAttributesReverse(allInputArguments: List, inputArgumentsToProcess: List): List> { - TODO("Not yet implemented") - } - - override fun isInputFrameworkTensorName(name: String, mappingProcess: MappingProcess): Boolean { - val onnxOp = onnxops.find { op -> op.name == mappingProcess.inputFrameworkOpName() }!! 
- return isOnnxTensorName(name,onnxOp) - } - - override fun isNd4jTensorName(name: String, mappingProcess: MappingProcess): Boolean { - val nd4jOpDescriptor = nd4jOpDescriptors.findOp(mappingProcess.opName()) - return isNd4jTensorName(name,nd4jOpDescriptor) - } - - override fun isInputFrameworkAttributeName(name: String, mappingProcess: MappingProcess): Boolean { - val onnxOp = onnxops.find { op -> op.name == mappingProcess.inputFrameworkOpName() }!! - return isOnnxAttributeName(name,onnxOp) - } - - override fun isOutputFrameworkAttributeName(name: String, mappingProcess: MappingProcess): Boolean { - val nd4jOpDescriptor = nd4jOpDescriptors.findOp(mappingProcess.opName()) - return isOutputFrameworkAttributeName(name,nd4jOpDescriptor) } - - override fun argDescriptorType(name: String, mappingProcess: MappingProcess): OpNamespace.ArgDescriptor.ArgType { - val nd4jOpDescriptor = nd4jOpDescriptors.findOp(mappingProcess.opName()) - return argDescriptorType(name,nd4jOpDescriptor) - } - - override fun attributeValueTypeFor(name: String, mappingProcess: MappingProcess): AttributeValueType { - val onnxOp = onnxops.find { op -> op.name == mappingProcess.inputFrameworkOpName() }!! 
-        return onnxAttributeTypeFor(name,onnxOp)
-    }
-
-}
-
-fun stringContainsRule(outputAttribute: String, inputFrameworkAttributeName: String, valueToTest: String,argumentIndex: Int): OnnxStringContainsAdapterRule {
-    return OnnxStringContainsAdapterRule(
-        mappingNamesToPerform = mapOf(outputAttribute to inputFrameworkAttributeName),
-        transformerArgs = mapOf(outputAttribute to listOf(ArgDescriptor {
-            name = inputFrameworkAttributeName
-            stringValue = valueToTest
-            argIndex = argumentIndex
-        })))
-}
-
-
-
-class OnnxStringNotEqualsAdapterRule(mappingNamesToPerform: Map = emptyMap(),
-                                     transformerArgs: Map> = emptyMap()) :
-    StringNotEqualsAdapterRule
-    ( mappingNamesToPerform, transformerArgs) {
-
-    override fun createIRAttribute(name: String, attrDef: Onnx.AttributeProto, attributeValueType: Onnx.AttributeProto): IRAttribute {
-        return OnnxIRAttr(attrDef, attributeValueType)
-    }
-
-    override fun convertAttributesReverse(allInputArguments: List, inputArgumentsToProcess: List):
-            List> {
-        TODO("Not yet implemented")
-    }
-
-
-    override fun isInputFrameworkTensorName(name: String, mappingProcess: MappingProcess): Boolean {
-        val onnxOp = onnxops.find { op -> op.name == mappingProcess.inputFrameworkOpName() }!!
-        return isOnnxAttributeName(name,onnxOp)
-    }
-
-    override fun isOutputFrameworkAttributeName(name: String, mappingProcess: MappingProcess): Boolean {
-        val nd4jOpDescriptor = nd4jOpDescriptors.findOp(mappingProcess.opName())
-        return isOutputFrameworkAttributeName(name,nd4jOpDescriptor)
-    }
-
-    override fun argDescriptorType(name: String, mappingProcess: MappingProcess): OpNamespace.ArgDescriptor.ArgType {
-        val nd4jOpDescriptor = nd4jOpDescriptors.findOp(mappingProcess.opName())
-        return argDescriptorType(name,nd4jOpDescriptor)
-    }
-
-    override fun attributeValueTypeFor(name: String, mappingProcess: MappingProcess): AttributeValueType {
-        val onnxOp = onnxops.find { op -> op.name == mappingProcess.inputFrameworkOpName() }!!
-        return onnxAttributeTypeFor(name,onnxOp)
-    }
-}
-
-fun stringNotEqualsRule(outputAttribute: String, inputFrameworkAttributeName: String, valueToTest: String,argumentIndex: Int): OnnxStringNotEqualsAdapterRule {
-    return OnnxStringNotEqualsAdapterRule(
-        mappingNamesToPerform = mapOf(outputAttribute to inputFrameworkAttributeName),
-        transformerArgs = mapOf(outputAttribute to listOf(ArgDescriptor {
-            name = inputFrameworkAttributeName
-            stringValue = valueToTest
-            argIndex = argumentIndex
-        })))
-}
-
-
-
-class OnnxNDArrayToIntAttributeValue(mappingNamesToPerform: Map) : NDArrayToIntAttributeValue(mappingNamesToPerform = mappingNamesToPerform,transformerArgs = emptyMap()) {
-
-    override fun createIRAttribute(name: String, attrDef: Onnx.AttributeProto, attributeValueType: Onnx.AttributeProto): IRAttribute {
-        return OnnxIRAttr(attrDef,attributeValueType)
-    }
-
-    override fun convertAttributesReverse(allInputArguments: List, inputArgumentsToProcess: List): List> {
-        TODO("Not yet implemented")
-    }
-
-
-    override fun isInputFrameworkTensorName(name: String, mappingProcess: MappingProcess): Boolean {
-        val onnxOp = onnxops.find { op -> op.name == mappingProcess.inputFrameworkOpName() }!!
-        return isOnnxTensorName(name,onnxOp)
-    }
-
-    override fun isNd4jTensorName(name: String, mappingProcess: MappingProcess): Boolean {
-        val nd4jOpDescriptor = nd4jOpDescriptors.findOp(mappingProcess.opName())
-        return isNd4jTensorName(name,nd4jOpDescriptor)
-    }
-
-    override fun isInputFrameworkAttributeName(name: String, mappingProcess: MappingProcess): Boolean {
-        val onnxOp = onnxops.find { op -> op.name == mappingProcess.inputFrameworkOpName() }!!
-        return isOnnxAttributeName(name,onnxOp)
-    }
-
-    override fun isOutputFrameworkAttributeName(name: String, mappingProcess: MappingProcess): Boolean {
-        val nd4jOpDescriptor = nd4jOpDescriptors.findOp(mappingProcess.opName())
-        return isOutputFrameworkAttributeName(name,nd4jOpDescriptor)
-    }
-
-    override fun argDescriptorType(name: String, mappingProcess: MappingProcess): OpNamespace.ArgDescriptor.ArgType {
-        val nd4jOpDescriptor = nd4jOpDescriptors.findOp(mappingProcess.opName())
-        return argDescriptorType(name,nd4jOpDescriptor)
-    }
-
-    override fun attributeValueTypeFor(name: String, mappingProcess: MappingProcess): AttributeValueType {
-        val onnxOp = onnxops.find { op -> op.name == mappingProcess.inputFrameworkOpName() }!!
-        return onnxAttributeTypeFor(name,onnxOp)
-    }
-}
-
-fun ndarrayToIntList(ndarrayNameToAttributeName: MutableMap): OnnxNDArrayToIntAttributeValue {
-    return OnnxNDArrayToIntAttributeValue(mappingNamesToPerform = ndarrayNameToAttributeName)
-}
-
-class OnnxSizeThresholdIntArrayIntIndexRule(mappingNamesToPerform: Map,
-                                            transformerArgs: Map>) : SizeThresholdIntArrayIntIndexRule(mappingNamesToPerform, transformerArgs) {
-
-    override fun createIRAttribute(name: String, attrDef: Onnx.AttributeProto, attributeValueType: Onnx.AttributeProto): IRAttribute {
-        return OnnxIRAttr(attrDef, attributeValueType)
-    }
-
-    override fun convertAttributesReverse(allInputArguments: List, inputArgumentsToProcess: List): List> {
-        TODO("Not yet implemented")
-    }
-
-
-    override fun isInputFrameworkTensorName(name: String, mappingProcess: MappingProcess): Boolean {
-        val onnxOp = onnxops.find { op -> op.name == mappingProcess.inputFrameworkOpName() }!!
-        return isOnnxTensorName(name,onnxOp)
-    }
-
-    override fun isNd4jTensorName(name: String, mappingProcess: MappingProcess): Boolean {
-        val nd4jOpDescriptor = nd4jOpDescriptors.findOp(mappingProcess.opName())
-        return isNd4jTensorName(name,nd4jOpDescriptor)
-    }
-
-    override fun isInputFrameworkAttributeName(name: String, mappingProcess: MappingProcess): Boolean {
-        val onnxOp = onnxops.find { op -> op.name == mappingProcess.inputFrameworkOpName() }!!
-        return isOnnxAttributeName(name,onnxOp)
-    }
-
-    override fun isOutputFrameworkAttributeName(name: String, mappingProcess: MappingProcess): Boolean {
-        val nd4jOpDescriptor = nd4jOpDescriptors.findOp(mappingProcess.opName())
-        return isOutputFrameworkAttributeName(name,nd4jOpDescriptor)
-    }
-
-    override fun argDescriptorType(name: String, mappingProcess: MappingProcess): OpNamespace.ArgDescriptor.ArgType {
-        val nd4jOpDescriptor = nd4jOpDescriptors.findOp(mappingProcess.opName())
-        return argDescriptorType(name,nd4jOpDescriptor)
-    }
-
-    override fun attributeValueTypeFor(name: String, mappingProcess: MappingProcess): AttributeValueType {
-        val onnxOp = onnxops.find { op -> op.name == mappingProcess.inputFrameworkOpName() }!!
-        return onnxAttributeTypeFor(name,onnxOp)
-    }
-
-}
-
-fun sizeThreshold(outputAttribute: String, inputFrameworkAttributeName: String, sizeThreshold: Long, index: Long, fallbackIndex: Long,argumentIndex: Int): OnnxSizeThresholdIntArrayIntIndexRule {
-    return OnnxSizeThresholdIntArrayIntIndexRule(mappingNamesToPerform = mapOf(outputAttribute to inputFrameworkAttributeName),
-        transformerArgs = mapOf(outputAttribute to listOf(
-            ArgDescriptor {
-                name = "index"
-                int64Value = index
-                argIndex = argumentIndex
-            },
-            ArgDescriptor {
-                name = "sizeThreshold"
-                int64Value = sizeThreshold
-                argIndex = argumentIndex
-            },
-            ArgDescriptor {
-                name = "fallbackIndex"
-                int64Value = fallbackIndex
-                argIndex = argumentIndex
-            })))
-}
-
-
-class OnnxStringToIndex(mappingNamesToPerform: Map, transformerArgs: Map>) : StringToInt(mappingNamesToPerform, transformerArgs) {
-
-    override fun createIRAttribute(name: String, attrDef: Onnx.AttributeProto, attributeValueType: Onnx.AttributeProto): IRAttribute {
-        return OnnxIRAttr(inputAttributeValue = attributeValueType,inputAttributeDef = attrDef)
-    }
-
-    override fun convertAttributesReverse(allInputArguments: List, inputArgumentsToProcess: List): List> {
-        TODO("Not yet implemented")
-    }
-
-    override fun isInputFrameworkTensorName(name: String, mappingProcess: MappingProcess): Boolean {
-        val onnxOp = onnxops.find { op -> op.name == mappingProcess.inputFrameworkOpName() }!!
-        return isOnnxTensorName(name,onnxOp)
-    }
-
-    override fun isNd4jTensorName(name: String, mappingProcess: MappingProcess): Boolean {
-        val nd4jOpDescriptor = nd4jOpDescriptors.findOp(mappingProcess.opName())
-        return isNd4jTensorName(name,nd4jOpDescriptor)
-    }
-
-    override fun isInputFrameworkAttributeName(name: String, mappingProcess: MappingProcess): Boolean {
-        val onnxOp = onnxops.find { op -> op.name == mappingProcess.inputFrameworkOpName() }!!
-        return isOnnxAttributeName(name,onnxOp)
-    }
-
-    override fun isOutputFrameworkAttributeName(name: String, mappingProcess: MappingProcess): Boolean {
-        val nd4jOpDescriptor = nd4jOpDescriptors.findOp(mappingProcess.opName())
-        return isOutputFrameworkAttributeName(name,nd4jOpDescriptor)
-    }
-
-    override fun argDescriptorType(name: String, mappingProcess: MappingProcess): OpNamespace.ArgDescriptor.ArgType {
-        val nd4jOpDescriptor = nd4jOpDescriptors.findOp(mappingProcess.opName())
-        return argDescriptorType(name,nd4jOpDescriptor)
-    }
-
-    override fun attributeValueTypeFor(name: String, mappingProcess: MappingProcess): AttributeValueType {
-        val onnxOp = onnxops.find { op -> op.name == mappingProcess.inputFrameworkOpName() }!!
-        return onnxAttributeTypeFor(name,onnxOp)
-    }
-
-}
-
-fun stringToIndex(outputAttributeValue: String, inputAttributeValue: String, listOfValues: List,argumentIndex: Int): OnnxStringToIndex {
-    return OnnxStringToIndex(mappingNamesToPerform = mapOf(outputAttributeValue to inputAttributeValue),transformerArgs =
-        mapOf(outputAttributeValue to listOfValues.map {
-            valueName -> ArgDescriptor {
-                name = outputAttributeValue
-                stringValue = valueName
-                argIndex = argumentIndex
-            }
-        }))
-}
-
-
-
-class OnnxMapStringToInt(mappingNamesToPerform: Map, transformerArgs: Map>) : MapStringToInt(mappingNamesToPerform, transformerArgs) {
-
-    override fun createIRAttribute(name: String, attrDef: Onnx.AttributeProto, attributeValueType: Onnx.AttributeProto): IRAttribute {
-        return OnnxIRAttr(inputAttributeValue = attributeValueType,inputAttributeDef = attrDef)
-    }
-
-    override fun convertAttributesReverse(allInputArguments: List, inputArgumentsToProcess: List): List> {
-        TODO("Not yet implemented")
-    }
-
-    override fun isInputFrameworkTensorName(name: String, mappingProcess: MappingProcess): Boolean {
-        val onnxOp = onnxops.find { op -> op.name == mappingProcess.inputFrameworkOpName() }!!
-        return isOnnxTensorName(name,onnxOp)
-    }
-
-    override fun isNd4jTensorName(name: String, mappingProcess: MappingProcess): Boolean {
-        val nd4jOpDescriptor = nd4jOpDescriptors.findOp(mappingProcess.opName())
-        return isNd4jTensorName(name,nd4jOpDescriptor)
-    }
-
-    override fun isInputFrameworkAttributeName(name: String, mappingProcess: MappingProcess): Boolean {
-        val onnxOp = onnxops.find { op -> op.name == mappingProcess.inputFrameworkOpName() }!!
-        return isOnnxAttributeName(name,onnxOp)
-    }
-
-    override fun isOutputFrameworkAttributeName(name: String, mappingProcess: MappingProcess): Boolean {
-        val nd4jOpDescriptor = nd4jOpDescriptors.findOp(mappingProcess.opName())
-        return isOutputFrameworkAttributeName(name,nd4jOpDescriptor)
-    }
-
-    override fun argDescriptorType(name: String, mappingProcess: MappingProcess): OpNamespace.ArgDescriptor.ArgType {
-        val nd4jOpDescriptor = nd4jOpDescriptors.findOp(mappingProcess.opName())
-        return argDescriptorType(name,nd4jOpDescriptor)
-    }
-
-    override fun attributeValueTypeFor(name: String, mappingProcess: MappingProcess): AttributeValueType {
-        val onnxOp = onnxops.find { op -> op.name == mappingProcess.inputFrameworkOpName() }!!
-        return onnxAttributeTypeFor(name,onnxOp)
-    }
-
-}
-
-fun mapStringToInt(outputAttributeValue: String, inputAttributeValue: String, mapOfValuesToInts: Map,argumentIndex: Int,lookupIndex: Int): OnnxMapStringToInt {
-    return OnnxMapStringToInt(mappingNamesToPerform = mapOf(outputAttributeValue to inputAttributeValue),transformerArgs =
-        mapOf(outputAttributeValue to mapOfValuesToInts.map {
-            entry -> ArgDescriptor {
-                name = entry.key
-                int64Value = entry.value.toLong()
-                argIndex = argumentIndex
-            }
-        },"index" to listOf(ArgDescriptor {
-            name = "index"
-            int64Value = lookupIndex.toLong()
-        })))
-}
-
-
-class OnnxListAttributeValueLookupToIndex(mappingNamesToPerform: Map, transformerArgs: Map>) : ListAttributeValueLookupToIndex(mappingNamesToPerform, transformerArgs) {
-
-    override fun createIRAttribute(name: String, attrDef: Onnx.AttributeProto, attributeValueType: Onnx.AttributeProto): IRAttribute {
-        return OnnxIRAttr(inputAttributeValue = attributeValueType,inputAttributeDef = attrDef)
-    }
-
-    override fun convertAttributesReverse(allInputArguments: List, inputArgumentsToProcess: List): List> {
-        TODO("Not yet implemented")
-    }
-
-    override fun isInputFrameworkTensorName(name: String, mappingProcess: MappingProcess): Boolean {
-        val onnxOp = onnxops.find { op -> op.name == mappingProcess.inputFrameworkOpName() }!!
-        return isOnnxTensorName(name,onnxOp)
-    }
-
-    override fun isNd4jTensorName(name: String, mappingProcess: MappingProcess): Boolean {
-        val nd4jOpDescriptor = nd4jOpDescriptors.findOp(mappingProcess.opName())
-        return isNd4jTensorName(name,nd4jOpDescriptor)
-    }
-
-    override fun isInputFrameworkAttributeName(name: String, mappingProcess: MappingProcess): Boolean {
-        val onnxOp = onnxops.find { op -> op.name == mappingProcess.inputFrameworkOpName() }!!
-        return isOnnxAttributeName(name,onnxOp)
-    }
-
-    override fun isOutputFrameworkAttributeName(name: String, mappingProcess: MappingProcess): Boolean {
-        val nd4jOpDescriptor = nd4jOpDescriptors.findOp(mappingProcess.opName())
-        return isOutputFrameworkAttributeName(name,nd4jOpDescriptor)
-    }
-
-    override fun argDescriptorType(name: String, mappingProcess: MappingProcess): OpNamespace.ArgDescriptor.ArgType {
-        val nd4jOpDescriptor = nd4jOpDescriptors.findOp(mappingProcess.opName())
-        return argDescriptorType(name,nd4jOpDescriptor)
-    }
-
-    override fun attributeValueTypeFor(name: String, mappingProcess: MappingProcess): AttributeValueType {
-        val onnxOp = onnxops.find { op -> op.name == mappingProcess.inputFrameworkOpName() }!!
-        return onnxAttributeTypeFor(name,onnxOp)
-    }
-
-}
-
-fun listAttributeValueLookup(outputAttributeValue: String, inputAttributeValue: String, indexValue: Int,argumentIndex: Int): OnnxListAttributeValueLookupToIndex {
-    return OnnxListAttributeValueLookupToIndex(mappingNamesToPerform = mapOf(outputAttributeValue to inputAttributeValue),
-        transformerArgs = mapOf(outputAttributeValue to listOf(ArgDescriptor {
-            name = inputAttributeValue
-            int64Value = indexValue.toLong()
-            argIndex = argumentIndex
-        })
-        ))
-}
-
-class OnnxListNumberToListNumber(mappingNamesToPerform: Map, transformerArgs: Map>) : ListNumberToListNumber(mappingNamesToPerform, transformerArgs) {
-
-    override fun createIRAttribute(name: String, attrDef: Onnx.AttributeProto, attributeValueType: Onnx.AttributeProto): IRAttribute {
-        return OnnxIRAttr(inputAttributeValue = attributeValueType,inputAttributeDef = attrDef)
-    }
-
-    override fun convertAttributesReverse(allInputArguments: List, inputArgumentsToProcess: List): List> {
-        TODO("Not yet implemented")
-    }
-
-
-    override fun isInputFrameworkTensorName(name: String, mappingProcess: MappingProcess): Boolean {
-        val onnxOp = onnxops.find { op -> op.name == mappingProcess.inputFrameworkOpName() }!!
-        return isOnnxTensorName(name,onnxOp)
-    }
-
-    override fun isNd4jTensorName(name: String, mappingProcess: MappingProcess): Boolean {
-        val nd4jOpDescriptor = nd4jOpDescriptors.findOp(mappingProcess.opName())
-        return isNd4jTensorName(name,nd4jOpDescriptor)
-    }
-
-    override fun isInputFrameworkAttributeName(name: String, mappingProcess: MappingProcess): Boolean {
-        val onnxOp = onnxops.find { op -> op.name == mappingProcess.inputFrameworkOpName() }!!
-        return isOnnxAttributeName(name,onnxOp)
-    }
-
-    override fun isOutputFrameworkAttributeName(name: String, mappingProcess: MappingProcess): Boolean {
-        val nd4jOpDescriptor = nd4jOpDescriptors.findOp(mappingProcess.opName())
-        return isOutputFrameworkAttributeName(name,nd4jOpDescriptor)
-    }
-
-    override fun argDescriptorType(name: String, mappingProcess: MappingProcess): OpNamespace.ArgDescriptor.ArgType {
-        val nd4jOpDescriptor = nd4jOpDescriptors.findOp(mappingProcess.opName())
-        return argDescriptorType(name,nd4jOpDescriptor)
-    }
-
-    override fun attributeValueTypeFor(name: String, mappingProcess: MappingProcess): AttributeValueType {
-        val onnxOp = onnxops.find { op -> op.name == mappingProcess.inputFrameworkOpName() }!!
-        return onnxAttributeTypeFor(name,onnxOp)
-    }
-
-}
-
-fun listNumberToListNumber(outputAttributeValue: String, inputAttributeValue: String): OnnxListNumberToListNumber {
-    return OnnxListNumberToListNumber(mappingNamesToPerform = mapOf(outputAttributeValue to inputAttributeValue),transformerArgs = emptyMap())
-}
-
-
-class OnnxStringAttributeToNDArray(mappingNamesToPerform: Map, transformerArgs: Map>) :
-    StringAttributeToNDArray(mappingNamesToPerform, transformerArgs) {
-
-    override fun createIRAttribute(name: String, attrDef: Onnx.AttributeProto, attributeValueType: Onnx.AttributeProto): IRAttribute {
-        return OnnxIRAttr(inputAttributeValue = attributeValueType,inputAttributeDef = attrDef)
-    }
-
-    override fun convertAttributesReverse(allInputArguments: List, inputArgumentsToProcess: List): List> {
-        TODO("Not yet implemented")
-    }
-
-
-
-    override fun isInputFrameworkTensorName(name: String, mappingProcess: MappingProcess): Boolean {
-        val onnxOp = onnxops.find { op -> op.name == mappingProcess.inputFrameworkOpName() }!!
-        return isOnnxTensorName(name,onnxOp)
-    }
-
-    override fun isNd4jTensorName(name: String, mappingProcess: MappingProcess): Boolean {
-        val nd4jOpDescriptor = nd4jOpDescriptors.findOp(mappingProcess.opName())
-        return isNd4jTensorName(name,nd4jOpDescriptor)
-    }
-
-    override fun isInputFrameworkAttributeName(name: String, mappingProcess: MappingProcess): Boolean {
-        val onnxOp = onnxops.find { op -> op.name == mappingProcess.inputFrameworkOpName() }!!
-        return isOnnxAttributeName(name,onnxOp)
-    }
-
-    override fun isOutputFrameworkAttributeName(name: String, mappingProcess: MappingProcess): Boolean {
-        val nd4jOpDescriptor = nd4jOpDescriptors.findOp(mappingProcess.opName())
-        return isOutputFrameworkAttributeName(name,nd4jOpDescriptor)
-    }
-
-    override fun argDescriptorType(name: String, mappingProcess: MappingProcess): OpNamespace.ArgDescriptor.ArgType {
-        val nd4jOpDescriptor = nd4jOpDescriptors.findOp(mappingProcess.opName())
-        return argDescriptorType(name,nd4jOpDescriptor)
-    }
-
-    override fun attributeValueTypeFor(name: String, mappingProcess: MappingProcess): AttributeValueType {
-        val onnxOp = onnxops.find { op -> op.name == mappingProcess.inputFrameworkOpName() }!!
-        return onnxAttributeTypeFor(name,onnxOp)
-    }
-}
-
-fun convertStringToNDArray(outputAttributeValue: String, inputAttributeValue: String): OnnxStringAttributeToNDArray {
-    return OnnxStringAttributeToNDArray(mappingNamesToPerform = mapOf(outputAttributeValue to inputAttributeValue),transformerArgs = emptyMap())
-}
-
-
-class OnnxAttributeNumberListNDArray(mappingNamesToPerform: Map, transformerArgs: Map>) :
-    AttributeNumberListNDArray(mappingNamesToPerform, transformerArgs) {
-
-    override fun createIRAttribute(name: String, attrDef: Onnx.AttributeProto, attributeValueType: Onnx.AttributeProto): IRAttribute {
-        return OnnxIRAttr(inputAttributeValue = attributeValueType,inputAttributeDef = attrDef)
-    }
-
-    override fun convertAttributesReverse(allInputArguments: List, inputArgumentsToProcess: List): List> {
-        TODO("Not yet implemented")
-    }
-
-
-
-    override fun isInputFrameworkTensorName(name: String, mappingProcess: MappingProcess): Boolean {
-        val onnxOp = onnxops.find { op -> op.name == mappingProcess.inputFrameworkOpName() }!!
-        return isOnnxTensorName(name,onnxOp)
-    }
-
-    override fun isNd4jTensorName(name: String, mappingProcess: MappingProcess): Boolean {
-        val nd4jOpDescriptor = nd4jOpDescriptors.findOp(mappingProcess.opName())
-        return isNd4jTensorName(name,nd4jOpDescriptor)
-    }
-
-    override fun isInputFrameworkAttributeName(name: String, mappingProcess: MappingProcess): Boolean {
-        val onnxOp = onnxops.find { op -> op.name == mappingProcess.inputFrameworkOpName() }!!
-        return isOnnxAttributeName(name,onnxOp)
-    }
-
-    override fun isOutputFrameworkAttributeName(name: String, mappingProcess: MappingProcess): Boolean {
-        val nd4jOpDescriptor = nd4jOpDescriptors.findOp(mappingProcess.opName())
-        return isOutputFrameworkAttributeName(name,nd4jOpDescriptor)
-    }
-
-    override fun argDescriptorType(name: String, mappingProcess: MappingProcess): OpNamespace.ArgDescriptor.ArgType {
-        val nd4jOpDescriptor = nd4jOpDescriptors.findOp(mappingProcess.opName())
-        return argDescriptorType(name,nd4jOpDescriptor)
-    }
-
-    override fun attributeValueTypeFor(name: String, mappingProcess: MappingProcess): AttributeValueType {
-        val onnxOp = onnxops.find { op -> op.name == mappingProcess.inputFrameworkOpName() }!!
-        return onnxAttributeTypeFor(name,onnxOp)
-    }
-}
-
-fun convertNumericalListToNDArray(outputAttributeValue: String, inputAttributeValue: String): OnnxAttributeNumberListNDArray {
-    return OnnxAttributeNumberListNDArray(mappingNamesToPerform = mapOf(outputAttributeValue to inputAttributeValue),transformerArgs = emptyMap())
-}
-
-
-//ListNumberToNDArray
-
-
-class OnnxListNumberToNDArray(mappingNamesToPerform: Map, transformerArgs: Map>) : ListNumberToNDArray(mappingNamesToPerform, transformerArgs) {
-
-    override fun createIRAttribute(name: String, attrDef: Onnx.AttributeProto, attributeValueType: Onnx.AttributeProto): IRAttribute {
-        return OnnxIRAttr(inputAttributeValue = attributeValueType,inputAttributeDef = attrDef)
-    }
-
-    override fun convertAttributesReverse(allInputArguments: List, inputArgumentsToProcess: List): List> {
-        TODO("Not yet implemented")
-    }
-
-    override fun isInputFrameworkTensorName(name: String, mappingProcess: MappingProcess): Boolean {
-        val onnxOp = onnxops.find { op -> op.name == mappingProcess.inputFrameworkOpName() }!!
-        return isOnnxTensorName(name,onnxOp)
-    }
-
-    override fun isNd4jTensorName(name: String, mappingProcess: MappingProcess): Boolean {
-        val nd4jOpDescriptor = nd4jOpDescriptors.findOp(mappingProcess.opName())
-        return isNd4jTensorName(name,nd4jOpDescriptor)
-    }
-
-    override fun isInputFrameworkAttributeName(name: String, mappingProcess: MappingProcess): Boolean {
-        val onnxOp = onnxops.find { op -> op.name == mappingProcess.inputFrameworkOpName() }!!
-        return isOnnxAttributeName(name,onnxOp)
-    }
-
-    override fun isOutputFrameworkAttributeName(name: String, mappingProcess: MappingProcess): Boolean {
-        val nd4jOpDescriptor = nd4jOpDescriptors.findOp(mappingProcess.opName())
-        return isOutputFrameworkAttributeName(name,nd4jOpDescriptor)
-    }
-
-    override fun argDescriptorType(name: String, mappingProcess: MappingProcess): OpNamespace.ArgDescriptor.ArgType {
-        val nd4jOpDescriptor = nd4jOpDescriptors.findOp(mappingProcess.opName())
-        return argDescriptorType(name,nd4jOpDescriptor)
-    }
-
-    override fun attributeValueTypeFor(name: String, mappingProcess: MappingProcess): AttributeValueType {
-        val onnxOp = onnxops.find { op -> op.name == mappingProcess.inputFrameworkOpName() }!!
-        return onnxAttributeTypeFor(name,onnxOp)
-    }
-}
-
-fun listNumberToNDarray(outputAttributeValue: String, inputAttributeValue: String): OnnxListNumberToNDArray {
-    return OnnxListNumberToNDArray(mappingNamesToPerform = mapOf(outputAttributeValue to inputAttributeValue),transformerArgs = emptyMap())
-}
-
-
-
-
-
-class OnnxNDArrayInputToNumericalAttribute(mappingNamesToPerform: Map, transformerArgs: Map>) : NDArrayInputToNumericalAttribute(mappingNamesToPerform, transformerArgs) {
-
-    override fun createIRAttribute(name: String, attrDef: Onnx.AttributeProto, attributeValueType: Onnx.AttributeProto): IRAttribute {
-        return OnnxIRAttr(inputAttributeValue = attributeValueType,inputAttributeDef = attrDef)
-    }
-
-    override fun convertAttributesReverse(allInputArguments: List, inputArgumentsToProcess: List): List> {
-        TODO("Not yet implemented")
-    }
-
-    override fun isInputFrameworkTensorName(name: String, mappingProcess: MappingProcess): Boolean {
-        val onnxOp = onnxops.find { op -> op.name == mappingProcess.inputFrameworkOpName() }!!
-        return isOnnxTensorName(name,onnxOp)
-    }
-
-    override fun isNd4jTensorName(name: String, mappingProcess: MappingProcess): Boolean {
-        val nd4jOpDescriptor = nd4jOpDescriptors.findOp(mappingProcess.opName())
-        return isNd4jTensorName(name,nd4jOpDescriptor)
-    }
-
-    override fun isInputFrameworkAttributeName(name: String, mappingProcess: MappingProcess): Boolean {
-        val onnxOp = onnxops.find { op -> op.name == mappingProcess.inputFrameworkOpName() }!!
-        return isOnnxAttributeName(name,onnxOp)
-    }
-
-    override fun isOutputFrameworkAttributeName(name: String, mappingProcess: MappingProcess): Boolean {
-        val nd4jOpDescriptor = nd4jOpDescriptors.findOp(mappingProcess.opName())
-        return isOutputFrameworkAttributeName(name,nd4jOpDescriptor)
-    }
-
-    override fun argDescriptorType(name: String, mappingProcess: MappingProcess): OpNamespace.ArgDescriptor.ArgType {
-        val nd4jOpDescriptor = nd4jOpDescriptors.findOp(mappingProcess.opName())
-        return argDescriptorType(name,nd4jOpDescriptor)
-    }
-
-    override fun attributeValueTypeFor(name: String, mappingProcess: MappingProcess): AttributeValueType {
-        val onnxOp = onnxops.find { op -> op.name == mappingProcess.inputFrameworkOpName() }!!
-        return onnxAttributeTypeFor(name,onnxOp)
-    }
-}
-
-fun convertNDArrayInputToScalarAttr(outputAttributeValue: String, inputAttributeValue: String): OnnxNDArrayInputToNumericalAttribute {
-    return OnnxNDArrayInputToNumericalAttribute(mappingNamesToPerform = mapOf(outputAttributeValue to inputAttributeValue),transformerArgs = emptyMap())
-}
-
-
-class OnnxAttributeNDArrayToScalarAttribute(mappingNamesToPerform: Map, transformerArgs: Map>) : AttributeNDArrayToScalarAttribute(mappingNamesToPerform, transformerArgs) {
-
-    override fun createIRAttribute(name: String, attrDef: Onnx.AttributeProto, attributeValueType: Onnx.AttributeProto): IRAttribute {
-        return OnnxIRAttr(inputAttributeValue = attributeValueType,inputAttributeDef = attrDef)
-    }
-
-    override fun convertAttributesReverse(allInputArguments: List, inputArgumentsToProcess: List): List> {
-        TODO("Not yet implemented")
-    }
-
-    override fun isInputFrameworkTensorName(name: String, mappingProcess: MappingProcess): Boolean {
-        val onnxOp = onnxops.find { op -> op.name == mappingProcess.inputFrameworkOpName() }!!
-        return isOnnxTensorName(name,onnxOp)
-    }
-
-    override fun isNd4jTensorName(name: String, mappingProcess: MappingProcess): Boolean {
-        val nd4jOpDescriptor = nd4jOpDescriptors.findOp(mappingProcess.opName())
-        return isNd4jTensorName(name,nd4jOpDescriptor)
-    }
-
-    override fun isInputFrameworkAttributeName(name: String, mappingProcess: MappingProcess): Boolean {
-        val onnxOp = onnxops.find { op -> op.name == mappingProcess.inputFrameworkOpName() }!!
-        return isOnnxAttributeName(name,onnxOp)
-    }
-
-    override fun isOutputFrameworkAttributeName(name: String, mappingProcess: MappingProcess): Boolean {
-        val nd4jOpDescriptor = nd4jOpDescriptors.findOp(mappingProcess.opName())
-        return isOutputFrameworkAttributeName(name,nd4jOpDescriptor)
-    }
-
-    override fun argDescriptorType(name: String, mappingProcess: MappingProcess): OpNamespace.ArgDescriptor.ArgType {
-        val nd4jOpDescriptor = nd4jOpDescriptors.findOp(mappingProcess.opName())
-        return argDescriptorType(name,nd4jOpDescriptor)
-    }
-
-    override fun attributeValueTypeFor(name: String, mappingProcess: MappingProcess): AttributeValueType {
-        val onnxOp = onnxops.find { op -> op.name == mappingProcess.inputFrameworkOpName() }!!
-        return onnxAttributeTypeFor(name,onnxOp)
-    }
-}
-
-fun ndarrayAttributeToScalarAttribute(outputAttributeValue: String, inputAttributeValue: String): OnnxAttributeNDArrayToScalarAttribute {
-    return OnnxAttributeNDArrayToScalarAttribute(mappingNamesToPerform = mapOf(outputAttributeValue to inputAttributeValue),transformerArgs = emptyMap())
-}
-
-class OnnxAttributeScalarNDArrayAttribute(mappingNamesToPerform: Map, transformerArgs: Map>) : AttributeScalarNDArrayAttribute(mappingNamesToPerform, transformerArgs) {
-
-    override fun createIRAttribute(name: String, attrDef: Onnx.AttributeProto, attributeValueType: Onnx.AttributeProto): IRAttribute {
-        return OnnxIRAttr(inputAttributeValue = attributeValueType,inputAttributeDef = attrDef)
-    }
-
-    override fun convertAttributesReverse(allInputArguments: List, inputArgumentsToProcess: List): List> {
-        TODO("Not yet implemented")
-    }
-
-
-    override fun isInputFrameworkTensorName(name: String, mappingProcess: MappingProcess): Boolean {
-        val onnxOp = onnxops.find { op -> op.name == mappingProcess.inputFrameworkOpName() }!!
-        return isOnnxTensorName(name,onnxOp)
-    }
-
-    override fun isNd4jTensorName(name: String, mappingProcess: MappingProcess): Boolean {
-        val nd4jOpDescriptor = nd4jOpDescriptors.findOp(mappingProcess.opName())
-        return isNd4jTensorName(name,nd4jOpDescriptor)
-    }
-
-    override fun isInputFrameworkAttributeName(name: String, mappingProcess: MappingProcess): Boolean {
-        val onnxOp = onnxops.find { op -> op.name == mappingProcess.inputFrameworkOpName() }!!
-        return isOnnxAttributeName(name,onnxOp)
-    }
-
-    override fun isOutputFrameworkAttributeName(name: String, mappingProcess: MappingProcess): Boolean {
-        val nd4jOpDescriptor = nd4jOpDescriptors.findOp(mappingProcess.opName())
-        return isOutputFrameworkAttributeName(name,nd4jOpDescriptor)
-    }
-
-    override fun argDescriptorType(name: String, mappingProcess: MappingProcess): OpNamespace.ArgDescriptor.ArgType {
-        val nd4jOpDescriptor = nd4jOpDescriptors.findOp(mappingProcess.opName())
-        return argDescriptorType(name,nd4jOpDescriptor)
-    }
-
-    override fun attributeValueTypeFor(name: String, mappingProcess: MappingProcess): AttributeValueType {
-        val onnxOp = onnxops.find { op -> op.name == mappingProcess.inputFrameworkOpName() }!!
-        return onnxAttributeTypeFor(name,onnxOp)
-    }
-
-}
-
-fun attributeScalarToNDArrayInput(outputAttributeValue: String, inputAttributeValue: String): OnnxAttributeScalarNDArrayAttribute {
-    return OnnxAttributeScalarNDArrayAttribute(mappingNamesToPerform = mapOf(outputAttributeValue to inputAttributeValue),transformerArgs = emptyMap())
-}
-
-
-
-
-class OnnxArgDescriptorConstant(mappingNamesToPerform: Map, transformerArgs: Map>) : ArgDescriptorConstant(mappingNamesToPerform, transformerArgs) {
-
-    override fun createIRAttribute(name: String, attrDef: Onnx.AttributeProto, attributeValueType: Onnx.AttributeProto): IRAttribute {
-        return OnnxIRAttr(inputAttributeValue = attributeValueType,inputAttributeDef = attrDef)
-    }
-
-    override fun convertAttributesReverse(allInputArguments: List, inputArgumentsToProcess: List): List> {
-        TODO("Not yet implemented")
-    }
-
-
-
-
-    override fun isInputFrameworkTensorName(name: String, mappingProcess: MappingProcess): Boolean {
-        val onnxOp = onnxops.find { op -> op.name == mappingProcess.inputFrameworkOpName() }!!
-        return isOnnxTensorName(name,onnxOp)
-    }
-
-    override fun isNd4jTensorName(name: String, mappingProcess: MappingProcess): Boolean {
-        val nd4jOpDescriptor = nd4jOpDescriptors.findOp(mappingProcess.opName())
-        return isNd4jTensorName(name,nd4jOpDescriptor)
-    }
-
-    override fun isInputFrameworkAttributeName(name: String, mappingProcess: MappingProcess): Boolean {
-        val onnxOp = onnxops.find { op -> op.name == mappingProcess.inputFrameworkOpName() }!!
-        return isOnnxAttributeName(name,onnxOp)
-    }
-
-    override fun isOutputFrameworkAttributeName(name: String, mappingProcess: MappingProcess): Boolean {
-        val nd4jOpDescriptor = nd4jOpDescriptors.findOp(mappingProcess.opName())
-        return isOutputFrameworkAttributeName(name,nd4jOpDescriptor)
-    }
-
-    override fun argDescriptorType(name: String, mappingProcess: MappingProcess): OpNamespace.ArgDescriptor.ArgType {
-        val nd4jOpDescriptor = nd4jOpDescriptors.findOp(mappingProcess.opName())
-        return argDescriptorType(name,nd4jOpDescriptor)
-    }
-
-    override fun attributeValueTypeFor(name: String, mappingProcess: MappingProcess): AttributeValueType {
-        val onnxOp = onnxops.find { op -> op.name == mappingProcess.inputFrameworkOpName() }!!
- return onnxAttributeTypeFor(name,onnxOp) - } -} - -fun argDescriptorConstant(argDescriptorConstants: List): OnnxArgDescriptorConstant { - return OnnxArgDescriptorConstant(mappingNamesToPerform = emptyMap(),transformerArgs = mapOf("value" to argDescriptorConstants)) -} - - - diff --git a/contrib/codegen-tools/codegen/src/main/kotlin/org/nd4j/codegen/ir/registry/ObjectRegistryHolder.kt b/contrib/codegen-tools/codegen/src/main/kotlin/org/nd4j/codegen/ir/registry/ObjectRegistryHolder.kt deleted file mode 100644 index e41039cfa..000000000 --- a/contrib/codegen-tools/codegen/src/main/kotlin/org/nd4j/codegen/ir/registry/ObjectRegistryHolder.kt +++ /dev/null @@ -1,70 +0,0 @@ -package org.nd4j.codegen.ir.registry - -import onnx.Onnx -import org.apache.commons.collections4.multimap.HashSetValuedHashMap -import org.nd4j.codegen.ir.MappingProcess -import org.nd4j.shade.protobuf.GeneratedMessageV3 -import org.nd4j.shade.protobuf.ProtocolMessageEnum -import org.tensorflow.framework.* - -object OpRegistryHolder { - - private val registeredOps = HashSetValuedHashMap>() - private val opDefLists = HashMap>() - - fun opMappingRegistryForName(name: String) : OpMappingRegistry{ - return registeredOps[name].first() as OpMappingRegistry - - } - - - fun onnx(): OpMappingRegistry { - return registeredOps["onnx"].first() as OpMappingRegistry - } - - fun tensorflow(): OpMappingRegistry { - return registeredOps["tensorflow"].first() as OpMappingRegistry - } - - fun opListForFramework(frameworkName: String): Map { - return opDefLists[frameworkName] as Map - } - - fun registerOpList(inputFrameworkName: String,opDefMap: Map) { - opDefLists[inputFrameworkName] = opDefMap - - } - - fun registerOpMappingRegistry(framework: String, registry: OpMappingRegistry) { - registeredOps.put(framework,registry) - } - - fun - registerMappingProcess(inputFrameworkOpName: String, processToRegister: MappingProcess) { - registeredOps.put(inputFrameworkOpName,processToRegister as OpMappingRegistry) - } - - fun - 
lookupOpMappingProcess(inputFrameworkName: String, inputFrameworkOpName: String): - MappingProcess { - val mappingRegistry = registeredOps[inputFrameworkName].first() - val lookup = mappingRegistry.lookupOpMappingProcess(inputFrameworkOpName = inputFrameworkOpName) - return lookup as MappingProcess - } -} diff --git a/contrib/codegen-tools/codegen/src/main/kotlin/org/nd4j/codegen/ir/registry/OpMappingRegistry.kt b/contrib/codegen-tools/codegen/src/main/kotlin/org/nd4j/codegen/ir/registry/OpMappingRegistry.kt deleted file mode 100644 index c389ca6d5..000000000 --- a/contrib/codegen-tools/codegen/src/main/kotlin/org/nd4j/codegen/ir/registry/OpMappingRegistry.kt +++ /dev/null @@ -1,151 +0,0 @@ -package org.nd4j.codegen.ir.registry - -import org.apache.commons.collections4.MultiSet -import org.nd4j.shade.protobuf.GeneratedMessageV3 -import org.nd4j.shade.protobuf.ProtocolMessageEnum -import org.apache.commons.collections4.MultiValuedMap -import org.apache.commons.collections4.multimap.HashSetValuedHashMap -import org.apache.commons.io.FileUtils -import org.nd4j.codegen.ir.MappingProcess -import org.nd4j.codegen.ir.findOp -import org.nd4j.codegen.ir.nd4jOpDescriptors -import org.nd4j.ir.MapperNamespace -import org.nd4j.ir.OpNamespace -import java.io.File -import java.lang.IllegalArgumentException -import java.nio.charset.Charset - - -class OpMappingRegistry(inputFrameworkName: String) { - - val registeredOps: MultiValuedMap> = HashSetValuedHashMap< - String,MappingProcess>() - - val opDefList = HashMap() - val nd4jOpDefs = HashMap() - val inputFrameworkName = inputFrameworkName - - - - fun mappedNd4jOpNames(): Set { - return registeredOps.values().map { input -> input.opName() }.toSortedSet()!! - } - - fun mappingProcessNames(): MultiSet { - return registeredOps.keys()!! 
- } - - fun nd4jOpNames(): Set { - return nd4jOpDefs.keys - } - - fun inputFrameworkOpNames(): Set { - return opDefList.keys - } - - fun lookupNd4jOpDef(name:String): OpNamespace.OpDescriptor { - return nd4jOpDefs[name]!! - } - - fun registerOpDefs(opDefList: Map) { - opDefList.forEach { (name,inputOpDef) -> - registerInputFrameworkOpDef(name,inputOpDef) - } - } - - fun registerNd4jOpDef(name:String, opDef: OpNamespace.OpDescriptor) { - nd4jOpDefs[name] = opDef - } - - fun lookupInputFrameworkOpDef(name:String): OP_DEF_TYPE { - if(opDefList.isEmpty()) { - val opList = OpRegistryHolder.opListForFramework(inputFrameworkName) - opList.forEach { name,opDefType -> - opDefList[name] = opDefType - } - } - return opDefList[name]!! - } - - fun registerInputFrameworkOpDef(name: String,opDef: OP_DEF_TYPE) { - opDefList[name] = opDef - } - - fun registerMappingProcess(inputFrameworkOpName: String, processToRegister: MappingProcess) { - registeredOps.put(inputFrameworkOpName,processToRegister) - } - - fun hasMappingOpProcess(inputFrameworkOpName: String): Boolean { - return registeredOps.containsKey(inputFrameworkOpName) - } - - - fun lookupOpMappingProcess(inputFrameworkOpName: String): MappingProcess< - GRAPH_TYPE, - OP_DEF_TYPE, - NODE_TYPE, - TENSOR_TYPE, - ATTRIBUTE_TYPE, - ATTRIBUTE_VALUE_TYPE, - DATA_TYPE> { - if(!registeredOps.containsKey(inputFrameworkOpName)) { - throw IllegalArgumentException("No import process defined for $inputFrameworkOpName") - } - return registeredOps[inputFrameworkOpName]!!.first() - } - - fun opTypeForName(nd4jOpName: String): OpNamespace.OpDescriptor.OpDeclarationType { - val descriptor = nd4jOpDescriptors.findOp(nd4jOpName) - return descriptor.opDeclarationType - } - - /** - * TODO: Make loading op mapping rules (both tensor and attribute), input framework op definitions casted as - * OP_DEF_TYPE and op descriptors file. 
- * - * TODO: Get rid of static global constants (onnxops,tensorflow ops) - * TODO: See if possible to genericize lists of ops - */ - fun loadFromFile(inputFrameworkOpDefsName: String,nd4jOpDescriptorsFile: String,inputFrameworkName: String,mapperDeclarationsFile: String) { - //Nd4j op descriptors: OpNamespace.OpDescriptorList TextFormat.merge(..) - // - } - - fun saveProcessesAndRuleSet() { - val mapperDeclarations = ArrayList() - val bufferToWrite = StringBuilder() - registeredOps.asMap().forEach { name, listOfMappingProcesses -> - listOfMappingProcesses.forEach { mappingProcess -> - mapperDeclarations.add(mappingProcess.serialize()) - } - } - - //write each serialized declaration exactly once, after collecting them all; appending inside the loop above would re-append earlier declarations on every iteration - mapperDeclarations.map { input -> input.toString() }.forEach { processString -> - bufferToWrite.append(processString + "\n") - } - - FileUtils.write(File("$inputFrameworkName-processes.pbtxt"),bufferToWrite.toString(), Charset.defaultCharset()) - - } - -} - - - - - - diff --git a/contrib/codegen-tools/codegen/src/main/kotlin/org/nd4j/codegen/ir/tensorflow/NDArrayMappingRule.kt b/contrib/codegen-tools/codegen/src/main/kotlin/org/nd4j/codegen/ir/tensorflow/NDArrayMappingRule.kt deleted file mode 100644 index 7198b62a0..000000000 --- a/contrib/codegen-tools/codegen/src/main/kotlin/org/nd4j/codegen/ir/tensorflow/NDArrayMappingRule.kt +++ /dev/null @@ -1,68 +0,0 @@ -package org.nd4j.codegen.ir.tensorflow - -import org.nd4j.codegen.ir.BaseNDArrayMappingRule -import org.nd4j.codegen.ir.MultiInputIndexMappingRule -import org.nd4j.codegen.ir.findOp -import org.nd4j.codegen.ir.nd4jOpDescriptors -import org.nd4j.ir.OpNamespace -import org.nd4j.ir.TensorNamespace -import org.tensorflow.framework.* - -class NDArrayMappingRule(mappingNamesToPerform: MutableMap, - transformerArgs: Map> = emptyMap()): - BaseNDArrayMappingRule(mappingNamesToPerform = mappingNamesToPerform, transformerArgs = transformerArgs) { - - - - override fun createTensorProto(input: TensorProto): TensorNamespace.TensorProto { - return
TensorflowIRTensor(input).toArgTensor() - } - - - override fun isInputTensorName(inputName: String): Boolean { - val tfOp = tensorflowOps.findOp(mappingProcess!!.inputFrameworkOpName()) - return tfOp.inputArgList.map { input -> input.name }.contains(inputName) - } - - override fun isOutputTensorName(outputName: String): Boolean { - val nd4jOpDescriptor = nd4jOpDescriptors.findOp(mappingProcess!!.opName()) - return nd4jOpDescriptor.argDescriptorList.filter { inputDescriptor -> inputDescriptor.argType == OpNamespace.ArgDescriptor.ArgType.INPUT_TENSOR } - .map {inputDescriptor -> inputDescriptor.name }.contains(outputName) - } -} - -fun mappingNDArrayInputs(inputs: MutableMap) : NDArrayMappingRule { - return NDArrayMappingRule( - mappingNamesToPerform = inputs - ) -} - -//MultiInputIndexMappingRule -class TensorflowMultiInputIndexMappingRule(mappingNamesToPerform: MutableMap, - transformerArgs: Map> = emptyMap()): - MultiInputIndexMappingRule(mappingNamesToPerform = mappingNamesToPerform, transformerArgs = transformerArgs) { - - - - override fun createTensorProto(input: TensorProto): TensorNamespace.TensorProto { - return TensorflowIRTensor(input).toArgTensor() - } - - - override fun isInputTensorName(inputName: String): Boolean { - val tfOp = tensorflowOps.findOp(mappingProcess!!.inputFrameworkOpName()) - return tfOp.inputArgList.map { input -> input.name }.contains(inputName) - } - - override fun isOutputTensorName(outputName: String): Boolean { - val nd4jOpDescriptor = nd4jOpDescriptors.findOp(mappingProcess!!.opName()) - return nd4jOpDescriptor.argDescriptorList.filter { inputDescriptor -> inputDescriptor.argType == OpNamespace.ArgDescriptor.ArgType.INPUT_TENSOR } - .map {inputDescriptor -> inputDescriptor.name }.contains(outputName) - } -} - -fun mappingListNDArrays(inputs: MutableMap) : TensorflowMultiInputIndexMappingRule { - return TensorflowMultiInputIndexMappingRule( - mappingNamesToPerform = inputs - ) -} \ No newline at end of file diff --git 
a/contrib/codegen-tools/codegen/src/main/kotlin/org/nd4j/codegen/ir/tensorflow/TFOpAuxillaryFunctions.kt b/contrib/codegen-tools/codegen/src/main/kotlin/org/nd4j/codegen/ir/tensorflow/TFOpAuxillaryFunctions.kt deleted file mode 100644 index 0c05cc9e5..000000000 --- a/contrib/codegen-tools/codegen/src/main/kotlin/org/nd4j/codegen/ir/tensorflow/TFOpAuxillaryFunctions.kt +++ /dev/null @@ -1,218 +0,0 @@ -package org.nd4j.codegen.ir.tensorflow - -import org.nd4j.codegen.ir.ArgDescriptor -import org.nd4j.codegen.ir.AttributeMappingRule -import org.nd4j.imports.graphmapper.tf.TFGraphMapper -import org.nd4j.ir.OpNamespace -import org.nd4j.linalg.api.ndarray.INDArray -import org.tensorflow.framework.* - -fun convertNDArrayToTensorflowTensor(arrayToConvert: INDArray): TensorProto { - if(arrayToConvert.data() == null) - return TensorProto.getDefaultInstance() - when(arrayToConvert.dataType()) { - org.nd4j.linalg.api.buffer.DataType.FLOAT -> { - return TensorProto { - FloatData(arrayToConvert.data().asFloat().toList()) - Shape(arrayToConvert.shape().toList()) - dtype = DataType.DT_FLOAT - } - } - org.nd4j.linalg.api.buffer.DataType.INT32 -> { - return TensorProto { - Int32Data(arrayToConvert.data().asInt().toList()) - Shape(arrayToConvert.shape().toList()) - dtype = DataType.DT_INT32 - } - } - org.nd4j.linalg.api.buffer.DataType.INT64 -> { - return TensorProto { - Int64Data(arrayToConvert.data().asLong().toList()) - Shape(arrayToConvert.shape().toList()) - dtype = DataType.DT_INT64 - } - } - org.nd4j.linalg.api.buffer.DataType.DOUBLE -> { - return TensorProto { - DoubleData(arrayToConvert.data().asDouble().toList()) - Shape(arrayToConvert.shape().toList()) - dtype = DataType.DT_DOUBLE - } - } - else -> { - return TensorProto { - dtype = convertNd4jDataTypeToTensorflow(arrayToConvert.dataType()) - RawData(arrayToConvert.data().asBytes()) - Shape(arrayToConvert.shape().toList()) - - } - } - - } -} - -fun convertNd4jDataTypeToTensorflow(dataType: 
org.nd4j.linalg.api.buffer.DataType) : DataType { - when(dataType) { - org.nd4j.linalg.api.buffer.DataType.DOUBLE -> return DataType.DT_DOUBLE - org.nd4j.linalg.api.buffer.DataType.FLOAT16 -> return DataType.DT_HALF - org.nd4j.linalg.api.buffer.DataType.FLOAT -> return DataType.DT_FLOAT - org.nd4j.linalg.api.buffer.DataType.INT32 -> return DataType.DT_INT32 - org.nd4j.linalg.api.buffer.DataType.UINT32 -> return DataType.DT_UINT32 - org.nd4j.linalg.api.buffer.DataType.INT64 -> return DataType.DT_INT64 - org.nd4j.linalg.api.buffer.DataType.UINT64 -> return DataType.DT_UINT64 - org.nd4j.linalg.api.buffer.DataType.BOOL -> return DataType.DT_BOOL - org.nd4j.linalg.api.buffer.DataType.INT8 -> return DataType.DT_INT8 - org.nd4j.linalg.api.buffer.DataType.INT16 -> return DataType.DT_INT16 - org.nd4j.linalg.api.buffer.DataType.BFLOAT16 -> return DataType.DT_BFLOAT16 - org.nd4j.linalg.api.buffer.DataType.UTF8 -> return DataType.DT_STRING - else -> { - return DataType.UNRECOGNIZED - } - } -} - -fun booleanConstant(inputName: String, constantValue: Boolean,argumentIndex: Int): List { - return listOf(argDescriptorConstant(listOf( - ArgDescriptor { - name = inputName - boolValue = constantValue - argType = OpNamespace.ArgDescriptor.ArgType.BOOL - argIndex = argumentIndex - } - ))) -} - -fun doubleConstant(inputName: String, constantValue: Double, argumentIndex: Int): List { - return listOf(argDescriptorConstant(listOf( - ArgDescriptor { - argType = OpNamespace.ArgDescriptor.ArgType.DOUBLE - name = inputName - doubleValue = constantValue - argIndex = argumentIndex - } - ))) -} - -fun intConstant(inputName: String, constantValue: Integer, argumentIndex: Int): List { - return listOf(argDescriptorConstant(listOf( - ArgDescriptor { - argType = OpNamespace.ArgDescriptor.ArgType.INT64 - name = inputName - int64Value = constantValue.toLong() - argIndex = argumentIndex - } - ))) -} - -fun mapSameName(names: List): List { - return listOf(mappingNDArrayInputs(names.map { name -> Pair(name, 
name) }.toMap().toMutableMap())) -} - -fun mapTensorNamesWithOp(inputFrameworkOpName: String, - opName: String, - tensorNames: MutableMap, - attributeMappingRules: List> = emptyList()): TensorflowMappingProcess { - return TensorflowMappingProcess( - opName = opName, - inputFrameworkOpName = inputFrameworkOpName, - opMappingRegistry = tensorflowOpRegistry, - tensorMappingRules = listOf(mappingNDArrayInputs(tensorNames)), - attributeMappingRules = attributeMappingRules - ) - -} - -fun multipleNameMapping(inputFrameworkOpNames: List, - opName: String, tensorNames: MutableMap, - attributeMappingRules: List> = emptyList()): - List { - return inputFrameworkOpNames.map { - mapTensorNamesWithOp( - inputFrameworkOpName = it, - opName = opName, - tensorNames = tensorNames, - attributeMappingRules = attributeMappingRules - ) - } -} - -fun defineBiasAdd(names :List = listOf("BiasAdd","BiasAddV1")) { - names.forEach { - TensorflowMappingProcess( - opName = "biasadd", - inputFrameworkOpName = it, - opMappingRegistry = tensorflowOpRegistry, - tensorMappingRules = listOf(mappingNDArrayInputs(mutableMapOf("input" to "value", "bias" to "bias"))), - attributeMappingRules = booleanConstant(inputName = "nchw", constantValue = false, argumentIndex = 0) - - ) - } -} - -fun defineTensorflowSingleTransform(inputOpName: String, inputFrameworkOpName: String): TensorflowMappingProcess { - return TensorflowMappingProcess( - opName = inputOpName, - inputFrameworkOpName = inputFrameworkOpName, tensorMappingRules = listOf( - NDArrayMappingRule( - mappingNamesToPerform = mutableMapOf("input" to "x") - ) - ), - attributeMappingRules = listOf(argDescriptorConstant( - listOf( - ArgDescriptor { - name = "inPlace" - boolValue = false - argType = OpNamespace.ArgDescriptor.ArgType.BOOL - argIndex = 0 - } - ) - )), - opMappingRegistry = tensorflowOpRegistry) - -} - -fun defineSingularReduce(inputFrameworkOpName: String, inputOpName: String): TensorflowMappingProcess { - return mapTensorNamesWithOp( - 
inputFrameworkOpName = inputFrameworkOpName, - opName = inputOpName, - attributeMappingRules = listOf( - valueMapping(mutableMapOf("keepDims" to "keep_dims")), - ndarrayToIntList(mutableMapOf("dimensions" to "reduction_indices")) - ), - tensorNames = mutableMapOf("input" to "input") - ) -} - -fun definePairWiseReduce(inputFrameworkOpName: String, inputOpName: String): TensorflowMappingProcess { - return mapTensorNamesWithOp( - inputFrameworkOpName = inputFrameworkOpName, - opName = inputOpName, - attributeMappingRules = listOf( - valueMapping(mutableMapOf("keepDims" to "keep_dims")), - ndarrayToIntList(mutableMapOf("dimensions" to "reduction_indices")) - ), - tensorNames = mutableMapOf("input" to "input") - ) -} - -fun defineTensorflowPairwiseTransforms(opName: String, inputFrameworkOpName: String, - firstOutputName: String = "input", - secondOutputName: String = "y", - firstInput: String = "x", secondInput: String = "y") : TensorflowMappingProcess { - return TensorflowMappingProcess( - opName = opName, - tensorMappingRules = listOf( - NDArrayMappingRule( - mappingNamesToPerform = mutableMapOf( - firstOutputName to firstInput, - secondOutputName to secondInput - ) - ) - ), - inputFrameworkOpName = inputFrameworkOpName, - inputFramework = "tensorflow", - attributeMappingRules = booleanConstant(inputName = "inPlace", constantValue = false, argumentIndex = 0), - opMappingRegistry = tensorflowOpRegistry - ) -} - diff --git a/contrib/codegen-tools/codegen/src/main/kotlin/org/nd4j/codegen/ir/tensorflow/TensorflowIR.kt b/contrib/codegen-tools/codegen/src/main/kotlin/org/nd4j/codegen/ir/tensorflow/TensorflowIR.kt deleted file mode 100644 index 1aeb08601..000000000 --- a/contrib/codegen-tools/codegen/src/main/kotlin/org/nd4j/codegen/ir/tensorflow/TensorflowIR.kt +++ /dev/null @@ -1,915 +0,0 @@ -package org.nd4j.codegen.ir.tensorflow - -import org.apache.commons.io.IOUtils -import org.nd4j.codegen.ir.* -import org.nd4j.common.io.ClassPathResource -import 
org.nd4j.imports.graphmapper.tf.tensors.TFTensorMappers -import org.nd4j.ir.OpNamespace -import org.nd4j.ir.TensorNamespace -import org.nd4j.linalg.api.ndarray.INDArray -import org.nd4j.linalg.factory.Nd4j -import org.nd4j.shade.protobuf.TextFormat -import org.nd4j.tensorflow.conversion.graphrunner.GraphRunner -import org.tensorflow.framework.* -import org.tensorflow.framework.OpDef.AttrDef -import java.nio.charset.Charset -import kotlin.collections.HashMap -import kotlin.math.min - -fun loadTensorflowOps(): OpList { - val string = IOUtils.toString(ClassPathResource("ops.proto").inputStream, Charset.defaultCharset()) - val tfListBuilder = OpList.newBuilder() - TextFormat.merge(string, tfListBuilder) - return tfListBuilder.build() -} - -val tensorflowOps = loadTensorflowOps() - - - - -class TensorflowIRTensor(input: TensorProto): IRTensor { - - val tensor = input - - - override fun shape(): List { - return tensor.tensorShape.dimList.map { it.size } - - } - - override fun stride(): List { - return Nd4j.getStrides(shape().toTypedArray().toLongArray(), 'c').asList() - } - - override fun dataType(): IRDataType { - return TensorflowIRDataType(tensor.dtype) - } - - override fun toArgTensor(): TensorNamespace.TensorProto { - val builder = TensorNamespace.TensorProto.newBuilder() - .setDataLocation(TensorNamespace.TensorProto.DataLocation.DEFAULT) - - for(i in 0 until tensor.tensorShape.dimCount) { - builder.addDims(tensor.tensorShape.getDim(i).size) - } - - when(tensor.dtype) { - DataType.DT_UINT64 -> builder.dataType = TensorNamespace.DataType.UINT64.ordinal - DataType.DT_UINT32 -> builder.dataType = TensorNamespace.DataType.UINT32.ordinal - DataType.DT_UINT16 -> builder.dataType = TensorNamespace.DataType.UINT16.ordinal - DataType.DT_HALF -> builder.dataType = TensorNamespace.DataType.FLOAT16.ordinal - DataType.DT_STRING -> builder.dataType = TensorNamespace.DataType.STRING.ordinal - DataType.DT_FLOAT -> builder.dataType = TensorNamespace.DataType.FLOAT.ordinal - 
DataType.DT_DOUBLE -> builder.dataType = TensorNamespace.DataType.DOUBLE.ordinal - DataType.DT_BOOL -> builder.dataType = TensorNamespace.DataType.BOOL.ordinal - DataType.DT_INT64 -> builder.dataType = TensorNamespace.DataType.INT64.ordinal - DataType.DT_INT32 -> builder.dataType = TensorNamespace.DataType.INT32.ordinal - DataType.DT_INT16 -> builder.dataType = TensorNamespace.DataType.INT16.ordinal - DataType.DT_BFLOAT16 -> builder.dataType = TensorNamespace.DataType.BFLOAT16.ordinal - DataType.DT_COMPLEX64 -> builder.dataType = TensorNamespace.DataType.COMPLEX64.ordinal - DataType.DT_COMPLEX128 -> builder.dataType = TensorNamespace.DataType.COMPLEX128.ordinal - DataType.UNRECOGNIZED -> builder.dataType = TensorNamespace.DataType.UNRECOGNIZED.ordinal - - } - - - if(tensor.doubleValList != null && tensor.doubleValCount > 0) { - builder.addAllDoubleData(tensor.doubleValList) - } - - if(tensor.stringValList != null && tensor.stringValCount > 0) { - builder.addAllStringData(tensor.stringValList) - } - - if(tensor.floatValList != null && tensor.floatValCount > 0) { - builder.addAllFloatData(tensor.floatValList) - } - - if(tensor.intValList != null && tensor.intValCount > 0) { - builder.addAllInt32Data(tensor.intValList) - } - - if(tensor.uint64ValList != null && tensor.uint64ValCount > 0) { - builder.addAllInt64Data(tensor.uint64ValList) - } - - if(tensor.int64ValList != null && tensor.int64ValCount > 0) { - builder.addAllInt64Data(tensor.int64ValList) - } - - if(tensor.tensorContent != null) { - builder.rawData = tensor.tensorContent - } - - - return builder.build() - } - - override fun rawValue(): TensorProto { - return tensor - } - - override fun toNd4jNDArray(): INDArray { - if(tensor.dtype == DataType.UNRECOGNIZED || tensor.dtype == DataType.DT_INVALID) - return Nd4j.empty() - return TFTensorMappers.newMapper(tensor).toNDArray() - } -} - -class TensorflowIRDataType(inputDataType: DataType): IRDataType { - val dataType = inputDataType - - override fun 
convertToDataType(input: DataType): IRDataTypeValue { - when(input) { - DataType.DT_BOOL,DataType.DT_BOOL_REF -> return IRDataTypeValue.DT_BOOL - DataType.DT_BFLOAT16,DataType.DT_BFLOAT16_REF -> return IRDataTypeValue.DT_BFLOAT16 - DataType.DT_COMPLEX128,DataType.DT_COMPLEX128_REF -> return IRDataTypeValue.DT_COMPLEX128 - DataType.DT_COMPLEX64,DataType.DT_COMPLEX64_REF -> return IRDataTypeValue.DT_COMPLEX64 - DataType.DT_DOUBLE, DataType.DT_DOUBLE_REF -> return IRDataTypeValue.DT_DOUBLE - DataType.DT_FLOAT,DataType.DT_FLOAT_REF -> return IRDataTypeValue.DT_FLOAT - DataType.DT_HALF,DataType.DT_HALF_REF -> return IRDataTypeValue.DT_HALF - DataType.DT_INT16,DataType.DT_INT16_REF -> return IRDataTypeValue.DT_INT16 - DataType.DT_INT32,DataType.DT_INT32_REF -> return IRDataTypeValue.DT_INT32 - DataType.DT_INT64, DataType.DT_INT64_REF -> return IRDataTypeValue.DT_INT64 - DataType.DT_QINT8,DataType.DT_QINT8_REF -> return IRDataTypeValue.DT_QINT8 - DataType.DT_QINT16, DataType.DT_QINT16_REF -> return IRDataTypeValue.DT_QINT16 - DataType.DT_QINT32, DataType.DT_QINT32_REF -> return IRDataTypeValue.DT_QINT32 - DataType.DT_STRING,DataType.DT_STRING_REF -> return IRDataTypeValue.DT_STRING - DataType.DT_UINT16, DataType.DT_UINT16_REF -> return IRDataTypeValue.DT_UINT16 - DataType.DT_UINT32,DataType.DT_UINT32_REF -> return IRDataTypeValue.DT_UINT32 - DataType.DT_UINT64,DataType.DT_UINT64_REF -> return IRDataTypeValue.DT_UINT64 - - } - - return IRDataTypeValue.DT_INVALID - } - - - - override fun dataType(): IRDataTypeValue { - return convertToDataType(this.dataType) - } - - override fun internalValue(): DataType { - return this.dataType - } - - override fun nd4jDataType(): org.nd4j.linalg.api.buffer.DataType { - when(this.dataType) { - DataType.DT_BOOL,DataType.DT_BOOL_REF -> return org.nd4j.linalg.api.buffer.DataType.BOOL - DataType.DT_FLOAT,DataType.DT_FLOAT_REF -> return org.nd4j.linalg.api.buffer.DataType.FLOAT - DataType.DT_STRING, DataType.DT_STRING_REF -> return 
org.nd4j.linalg.api.buffer.DataType.UTF8 - DataType.DT_BFLOAT16,DataType.DT_BFLOAT16_REF -> return org.nd4j.linalg.api.buffer.DataType.BFLOAT16 - DataType.DT_INT64,DataType.DT_INT64_REF -> return org.nd4j.linalg.api.buffer.DataType.INT64 - DataType.DT_HALF,DataType.DT_HALF_REF -> return org.nd4j.linalg.api.buffer.DataType.FLOAT16 - DataType.DT_INT16,DataType.DT_INT16_REF -> return org.nd4j.linalg.api.buffer.DataType.INT16 - DataType.DT_INT32,DataType.DT_INT32_REF -> return org.nd4j.linalg.api.buffer.DataType.INT32 - DataType.DT_DOUBLE,DataType.DT_DOUBLE_REF -> return org.nd4j.linalg.api.buffer.DataType.DOUBLE - DataType.DT_UINT16, DataType.DT_UINT16_REF -> return org.nd4j.linalg.api.buffer.DataType.UINT16 - DataType.DT_UINT32,DataType.DT_UINT32_REF -> return org.nd4j.linalg.api.buffer.DataType.UINT32 - DataType.DT_UINT64, DataType.DT_UINT64_REF -> return org.nd4j.linalg.api.buffer.DataType.UINT64 - } - - return org.nd4j.linalg.api.buffer.DataType.UNKNOWN - } - - override fun nameSpaceDataType(): TensorNamespace.DataType { - when(this.dataType) { - DataType.DT_BOOL,DataType.DT_BOOL_REF -> return TensorNamespace.DataType.BOOL - DataType.DT_FLOAT,DataType.DT_FLOAT_REF -> return TensorNamespace.DataType.FLOAT - DataType.DT_STRING,DataType.DT_STRING_REF -> return TensorNamespace.DataType.STRING - DataType.DT_BFLOAT16,DataType.DT_BFLOAT16_REF -> return TensorNamespace.DataType.BFLOAT16 - DataType.DT_INT64, DataType.DT_INT64_REF -> return TensorNamespace.DataType.INT64 - DataType.DT_HALF,DataType.DT_HALF_REF-> return TensorNamespace.DataType.FLOAT16 - DataType.DT_INT16,DataType.DT_INT16_REF -> return TensorNamespace.DataType.INT16 - DataType.DT_INT32,DataType.DT_INT32_REF -> return TensorNamespace.DataType.INT32 - DataType.DT_DOUBLE,DataType.DT_DOUBLE_REF -> return TensorNamespace.DataType.DOUBLE - DataType.DT_UINT16,DataType.DT_UINT16_REF -> return TensorNamespace.DataType.UINT16 - DataType.DT_UINT32, DataType.DT_UINT32_REF -> return TensorNamespace.DataType.UINT32 - 
DataType.DT_UINT64,DataType.DT_UINT64_REF -> return TensorNamespace.DataType.UINT64 - } - - return TensorNamespace.DataType.UNDEFINED - } - -} - -fun attrDefaultValue(): IRAttribute { - return TensorflowIRAttr(AttrDef.getDefaultInstance(), AttrValue.getDefaultInstance()) -} - -class TensorflowIRAttr(inputAttributeDef: AttrDef, inputAttributeValue: AttrValue): IRAttribute { - - private val attributeDef = inputAttributeDef - private val attributeValue = inputAttributeValue - - override fun name(): String { - return attributeDef.name - } - - override fun floatValue(): Float { - return attributeValue.f - } - - override fun listFloatValue(): List { - return attributeValue.list.fList - } - - - override fun intValue(): Long { - return attributeValue.i - } - - override fun listIntValue(): List { - return attributeValue.list.iList - } - - override fun boolValue(): Boolean { - return attributeValue.b - } - - override fun listBoolValue(): List { - return attributeValue.list.bList - } - - override fun attributeValueType(): AttributeValueType { - when(attributeDef.type) { - "list(bool)" -> return AttributeValueType.LIST_BOOL - "bool" -> return AttributeValueType.BOOL - "string" -> return AttributeValueType.STRING - "list(string)" -> return AttributeValueType.LIST_STRING - "int" -> return AttributeValueType.INT - "list(int)" -> return AttributeValueType.LIST_INT - "float" -> return AttributeValueType.FLOAT - "list(float)" -> return AttributeValueType.LIST_FLOAT - "tensor" -> return AttributeValueType.TENSOR - "list(tensor)" -> return AttributeValueType.LIST_TENSOR - "type" -> return AttributeValueType.DATA_TYPE - } - - return AttributeValueType.INVALID - } - - - - override fun internalAttributeDef(): AttrDef { - return attributeDef - } - - override fun internalAttributeValue(): AttrValue { - return attributeValue - } - - override fun listTensorValue(): List> { - return attributeValue.list.tensorList.map { input -> TensorflowIRTensor(input) - } - } - - override fun tensorValue(): 
IRTensor { - return TensorflowIRTensor(attributeValue.tensor) - } - - override fun stringValue(): String { - return attributeValue.s.toStringUtf8() - } - - override fun listStringValue(): List { - return attributeValue.list.sList.map { it.toStringUtf8() } - } - - override fun dataTataTypeValue(): IRDataType { - return TensorflowIRDataType(attributeValue.type) - } - -} - -class TensorflowIRArgDef(input: OpDef.ArgDef): IRArgDef { - private val argDefValue = input - - override fun dataType(): IRDataType { - //wrap the arg def's declared type directly; constructing another TensorflowIRArgDef here would recurse infinitely - return TensorflowIRDataType(argDefValue.type) - } - - override fun name(): String { - return argDefValue.name - } - - override fun description(): String { - return argDefValue.description - } - - override fun internalValue(): OpDef.ArgDef { - return argDefValue - } - - override fun indexOf(): Integer { - TODO("Not yet implemented") - } - -} - -class TensorflowIROp(input: OpDef): IROpDef { - - val opDef = input - - override fun attributes(): List> { - return opDef.attrList.map { - TensorflowIRAttr(it, AttrValue.getDefaultInstance()) - } - } - - override fun opName(): String { - return opDef.name - } - - override fun internalValue(): OpDef { - return opDef - } - - override fun inputArgs(): List> { - return opDef.inputArgList.map { - TensorflowIRArgDef(it) - } - } - - override fun outputArgs(): List> { - return opDef.outputArgList.map { - TensorflowIRArgDef(it) - } - } - -} - -class TensorflowIRNode(inputNode: NodeDef, inputOpDef: OpDef): IRNode { - - private val nodeDef = inputNode - private val opDef = inputOpDef - private val attrDefsMap = attrDefsByName(inputOpDef.attrList) - private val attrMap: Map> = initAttrMapFromNode(inputNode) - private val opDescriptor: OpNamespace.OpDescriptor - private val mappingProcess: MappingProcess = tensorflowOpRegistry.lookupOpMappingProcess(inputNode.op) - //private val inputs: List - //private val outputs: List - - init { - opDescriptor = nd4jOpDescriptors.findOp(mappingProcess.opName()) - // inputs =
opDescriptor.argDescriptorList.filter { argDescriptor -> argDescriptor.argType == OpNamespace.ArgDescriptor.ArgType.INPUT_TENSOR } - // outputs = opDescriptor.argDescriptorList.filter { argDescriptor -> argDescriptor.argType == OpNamespace.ArgDescriptor.ArgType.OUTPUT_TENSOR } - - } - - private fun attrDefsByName(input: List): Map { - val ret = HashMap() - input.forEach { - ret[it.name] = it - } - return ret - } - - private fun initAttrMapFromNode(input: NodeDef): Map> { - val ret = HashMap>() - input.attrMap.forEach { (key, value) -> - ret[key] = TensorflowIRAttr(attrDefsMap.getOrDefault(key, AttrDef.getDefaultInstance()), value) - } - - return ret - } - - override fun opName(): String { - return nodeDef.op - } - - override fun nodeName(): String { - return nodeDef.name - } - - override fun inputAt(index: Int): String { - if(mappingProcess.indexOverrides().containsKey(index)) - return nodeDef.getInput(mappingProcess.indexOverrides()[index]!!) - return nodeDef.getInput(index) - } - - override fun outputAt(index: Int): String { - return opDef.outputArgList[index].name - } - - - - override fun hasAttribute(inputName: String): Boolean { - return nodeDef.containsAttr(inputName) - } - - override fun attributeMap(): Map> { - return attrMap - } - - override fun createInputsFrom(inputData: List): List> { - return inputData.map { TensorflowIRTensor(it) } - } - - override fun createOutputsFrom(inputValues: List): List> { - return inputValues.map { TensorflowIRTensor(it) } - } - - override fun getAttribute(inputName: String): IRAttribute { - return attrMap.getOrDefault(inputName, attrDefaultValue()) - } - - override fun internalValue(): NodeDef { - return nodeDef - } - - override fun numInputs(): Int { - return nodeDef.inputCount - } - - override fun numOutputs(): Int { - return opDef.outputArgCount - } - - override fun inputs(): List { - return nodeDef.inputList - } - - override fun outputs(): List { - return opDef.outputArgList.map { input -> input.name } - } - - /** - * 
Get the list of tensors given an OpDef name (note: this is not the name of the input, but instead the op name; we use this to look up - the number attribute value and thus the number of inputs for a particular definition name.) - Tensorflow also allows multiple sets of lists of tensors as inputs, so we need to make sure to disambiguate which list of inputs we are looking up. - */ - override fun numInputsForListOfTensors(name: String): Int { - return nodeDef.getAttrOrThrow(opDef.inputArgList.first { input -> input.name == name }.numberAttr).i.toInt() - } - - override fun inputNamesForListOfInputValues(inputListName: String): List { - val inputArgNames = opDef.inputArgList.map { argDef -> argDef.name } - val indexOfDef = inputArgNames.indexOf(inputListName) - if(indexOfDef < 0) - return emptyList() - var totalAmount: Long = 0 - for(i in 0 .. indexOfDef) { - if(opDef.getInputArg(i).numberAttr.isNotEmpty()) { - val numToAdd = nodeDef.getAttrOrDefault(opDef.getInputArg(i).numberAttr, AttrValue { - LongVal(1) - }).i - totalAmount += numToAdd - } - else - totalAmount++ - } - //note: this is inclusive - return nodeDef.inputList.subList(indexOfDef,totalAmount.toInt()) - } - - override fun computeAdjustedOffsetForInput( - nd4jName: String, - inputFrameworkName: String, - tensorInputMappings: Map - ): Int { - val baseIndex = lookupIndexForArgDescriptor( - argDescriptorName = nd4jName, - opDescriptorName = this.opDescriptor.name, - argDescriptorType = OpNamespace.ArgDescriptor.ArgType.INPUT_TENSOR - ) - - val inputs = opDescriptor.argDescriptorList.filter { input -> input.argType == OpNamespace.ArgDescriptor.ArgType.INPUT_TENSOR } - var totalAmount: Long = 0 - for(i in 0 until baseIndex) { - val nd4jNameAtIndex = inputs.first {descriptor -> descriptor.argType == OpNamespace.ArgDescriptor.ArgType.INPUT_TENSOR && descriptor.argIndex == i}.name - val inputFrameworkName = tensorInputMappings[nd4jNameAtIndex]!!
- val totalNames = inputNamesForListOfInputValues(inputFrameworkName).size - totalAmount += totalNames - } - - if(totalAmount < 1) - return baseIndex - return (baseIndex + totalAmount.toInt()) - 1 - } - - override fun nd4jInputs(tensorMappings: Map): List { - val ret = ArrayList() - val indicesToNames = HashMap>() - tensorMappings.forEach { (nd4jName,inputFrameworkName) -> - val idx = computeAdjustedOffsetForInput(nd4jName,inputFrameworkName,tensorMappings) - val inputNamesForCurrInput = inputNamesForListOfInputValues(inputFrameworkName) - indicesToNames[idx] = inputNamesForCurrInput - } - - indicesToNames.toSortedMap().forEach { idx, names -> - ret.addAll(names.filter {!ret.contains(it)}) - } - - return ret - } - -} - - -class TensorflowIRGraphRunner(irGraph: TensorflowIRGraph,inputNames: List,outputNames: List): IRGraphRunner { - - val irGraph = irGraph - val graphRunner: GraphRunner - init { - graphRunner = GraphRunner.builder() - .graphBytes(irGraph.graphDef.toByteArray()) - .inputNames(inputNames) - .outputNames(outputNames) - .build() - } - - - override fun graph(): IRGraph { - return irGraph - } - - override fun run(inputs: Map): Map { - return graphRunner.run(inputs) - } - -} - -class TensorflowIRGraph(graphDef: GraphDef, opDef: OpList): IRGraph< - GraphDef, - NodeDef, - OpDef, - TensorProto, - AttrDef, - AttrValue, - DataType> { - - val graphDef = graphDef - val opDef = opDef - override fun nodeByName(input: String): NodeDef { - return graphDef.nodeByName(input) - } - - - override fun nodeList(): List> { - return graphDef.nodeList.map { - inputNode -> TensorflowIRNode(inputNode, tensorflowOps.findOp(inputNode.op)) - } - } - - override fun internalValue(): GraphDef { - return graphDef - } - - - - override fun createMappingContext( - opDef: OpDef, - node: NodeDef, - dynamicVariables: Map - ): MappingContext { - return TensorflowMappingContext(opDef = opDef,graph = this,node = node,dynamicVariables = dynamicVariables) - } - - override fun frameworkName(): 
String { - return "tensorflow" - } - - override fun nd4jNameForInternalOpName(name: String): String { - return tensorflowOpRegistry.lookupOpMappingProcess(name).opName() - } - - override fun isConstantOpName(name: String): Boolean { - return name == "Const" || name == "Placeholder" - } - - override fun isConstant(opName: String): Boolean { - return opName == "Const" - } - - override fun isPlaceHolder(opName: String): Boolean { - return opName == "Placeholder" || opName == "PlaceholderWithDefault" - } - - override fun shapeOfInput(varName: String): LongArray? { - val attrMap = nodeByName(varName).attrMap - val shapeAvailable = attrMap.containsKey("shape") - var shape: LongArray? - shape = if (shapeAvailable) { - attrMap["shape"]!!.list.iList.toLongArray() - - } else { - //Some placeholders don't have any shape restrictions - i.e., accept anything... - null - } - - return shape - } - - override fun dataTypeForVariable(varName: String): IRDataType { - val node = nodeByName(varName) - val attrMap = node.attrMap - if(!attrMap.containsKey("dtype")) { - val retSet = attrMap.values.filter { attrValue -> attrValue.type != DataType.DT_INVALID } - if(retSet.isEmpty()) { - return TensorflowIRDataType(DataType.DT_INVALID) - } else { - return TensorflowIRDataType(attrMap.values.filter { attrValue -> attrValue.type != DataType.DT_INVALID }.first().type) - } - } else if(attrMap.containsKey("dtype")) { - return TensorflowIRDataType(attrMap["dtype"]!!.type) - } - - return TensorflowIRDataType(DataType.DT_INVALID) - } - - override fun importInfoForEachNode(dynamicVariables: Map): Map, OpNamespace.OpDescriptor>> { - return importInfoForEachNodeInGraph(graph = this,dynamicVariables = dynamicVariables) - } - - override fun nodeIsPlaceHolder(nodeName: String): Boolean { - return isPlaceHolder(nodeByName(nodeName).op) - } - - -} - - - - -fun convertToDataType(dataType: org.nd4j.linalg.api.buffer.DataType): DataType { - return when (dataType) { - org.nd4j.linalg.api.buffer.DataType.UINT16 
-> DataType.DT_UINT16 - org.nd4j.linalg.api.buffer.DataType.UINT32 -> DataType.DT_UINT32 - org.nd4j.linalg.api.buffer.DataType.UINT64 -> DataType.DT_UINT64 - org.nd4j.linalg.api.buffer.DataType.BOOL -> DataType.DT_BOOL - org.nd4j.linalg.api.buffer.DataType.BFLOAT16 -> DataType.DT_BFLOAT16 - org.nd4j.linalg.api.buffer.DataType.FLOAT -> DataType.DT_FLOAT - org.nd4j.linalg.api.buffer.DataType.INT -> DataType.DT_INT32 - org.nd4j.linalg.api.buffer.DataType.LONG -> DataType.DT_INT64 - org.nd4j.linalg.api.buffer.DataType.BYTE -> DataType.DT_INT8 - org.nd4j.linalg.api.buffer.DataType.SHORT -> DataType.DT_INT16 - org.nd4j.linalg.api.buffer.DataType.DOUBLE -> DataType.DT_DOUBLE - org.nd4j.linalg.api.buffer.DataType.UBYTE -> DataType.DT_UINT8 - org.nd4j.linalg.api.buffer.DataType.HALF -> DataType.DT_HALF - org.nd4j.linalg.api.buffer.DataType.UTF8 -> DataType.DT_STRING - else -> throw UnsupportedOperationException("Unknown nd4j data type: [" + dataType.name + "]") - } -} - - -class TensorflowMappingContext(opDef: OpDef, node: NodeDef, graph: IRGraph,dynamicVariables: Map) : - AbstractMappingContext(opDef, node, graph,dynamicVariables) { - - override fun attrDef(name: String): AttrDef { - if(opDef().attrCount < 1) { - throw IllegalArgumentException("No attributes found for op def with name ${opDef.name}") - } - - val ret = opDef().attrList.firstOrNull { it.name == name } ?: error("No attribute found with name $name") - return ret - } - - override fun irAttributeValueForNode(valueName: String): IRAttribute { - val attrDef = attrDef(valueName) - val attrValue = node.getAttrOrDefault(valueName, attrDef.defaultValue) - return TensorflowIRAttr(inputAttributeDef = attrDef, inputAttributeValue = attrValue) - - } - - override fun tensorInputFor(name: String): IRTensor { - var foundIndex = -1 - /** - * Use op definition name as 1 unified reference name in rules for static purposes, but - * look up via index for specific node mappings.
- * - * This is equivalent to the tf input position attribute value in the previous tensorflow import. - */ - var baseIndexOffset: Int = 0 - opDef.inputArgList.forEachIndexed { index, argDef -> - if(argDef.numberAttr.isNotEmpty()) { - var totalNum = node.getAttrOrDefault(argDef.numberAttr,AttrValue { - i = 0 - }) - - baseIndexOffset += totalNum.i.toInt() - } - - if(argDef.name == name) - foundIndex = min(index + baseIndexOffset,node.inputCount - 1) - } - - - if(foundIndex < 0) { - throw java.lang.IllegalArgumentException("Node with name ${nodeName()} for opdef with name ${opDef.name} did not contain a tensor with name ${name}") - } - - val graphNode = node.getInput(foundIndex) - return tensorInputFromInputFrameworkName(graphNode) - } - - override fun opName(): String { - return node.op - } - - override fun nodeName(): String { - return node.name - } - - override fun nd4jDataTypeFor(input: IRTensor): org.nd4j.linalg.api.buffer.DataType { - return input.dataType().nd4jDataType() - } - - override fun createIRTensorFromNDArray(ndarray: INDArray): IRTensor { - val tensorProto = TensorProto { - RawData(ndarray.data().asBytes()) - Shape(ndarray.shape().toList()) - DataType(convertToDataType(ndarray.dataType())) - } - - return TensorflowIRTensor(tensorProto) - } - - override fun tensorAttributeFor(name: String): IRTensor { - return TensorflowIRTensor(node.getAttrOrThrow(name).tensor) - } - - override fun irNode(): IRNode { - return TensorflowIRNode(node, tensorflowOps.findOp(node.op)) - } - - override fun tensorInputFromInputFrameworkName(name: String): IRTensor { - val searchedNode = graph.nodeByName(stripVarSuffix(name)) - //no value to be found on placeholder, return default instance - //if no value exists it's an output from another node - if("Placeholder" in searchedNode.op || !searchedNode.containsAttr("value")) { - println("Value for node $name is not a constant! This method only works for constants. Consider replacing the Placeholder node with a Constant node. 
This will return an empty tensor.") - if(!dynamicVariables.containsKey(name)) - return TensorflowIRTensor(TensorProto.getDefaultInstance()) - else { - val toConvert = dynamicVariables[name]!! - return TensorflowIRTensor(toConvert) - } - } - - //value nodes are the values of attributes that are input nodes in a frozen graph - return TensorflowIRTensor(searchedNode.getAttrOrThrow("value").tensor) - } - - -} - -fun tensorflowAttributeValueTypeFor(attributeName: String, opDef: OpDef): AttributeValueType { - val names = opDef.attrList.map { attrDef -> attrDef.name } - if(!names.contains(attributeName) && !isTensorflowTensorName(attributeName,opDef)) { - throw java.lang.IllegalArgumentException("Tensorflow op ${opDef.name} does not have attribute name $attributeName") - } else if(isTensorflowTensorName(attributeName,opDef)) { - //note: we allow tensors here since sometimes input tensors in tensorflow become attributes in nd4j - return AttributeValueType.TENSOR - } - val attrDef = opDef.attrList.first { attrDef -> attrDef.name == attributeName } - return TensorflowIRAttr(attrDef, AttrValue.getDefaultInstance()).attributeValueType() -} - - - -fun isTensorflowTensorName(name: String, opDef: OpDef): Boolean { - return opDef.inputArgList.map {inputDef -> inputDef.name }.contains(name) -} - - -fun isTensorflowAttributeName(name: String, opDef: OpDef): Boolean { - return opDef.attrList.map { attrDef -> attrDef.name }.contains(name) -} - -/** - * fun initAttributes( -df: DifferentialFunction, -applied: Pair, OpNamespace.OpDescriptor>, -mappingContext: MappingContext, -sd: SameDiff -) - */ -//fun initAttributesTensorflow() - - - - - - - -/** - * Get the shape from a TensorShapeProto - * - * @param tensorShapeProto Shape - * @return Shape as long[] - */ -private fun shapeFromShapeProto(tensorShapeProto: TensorShapeProto): LongArray?
{ - val shape = LongArray(tensorShapeProto.dimList.size) - for (i in shape.indices) { - shape[i] = tensorShapeProto.getDim(i).size - } - return shape -} - -/** - * Convert from TF proto datatype to ND4J datatype - * - * @param tfType TF datatype - * @return ND4J datatype - */ -fun convertType(tfType: DataType?): org.nd4j.linalg.api.buffer.DataType { - return when (tfType) { - DataType.DT_DOUBLE -> org.nd4j.linalg.api.buffer.DataType.DOUBLE - DataType.DT_FLOAT -> org.nd4j.linalg.api.buffer.DataType.FLOAT - DataType.DT_HALF -> org.nd4j.linalg.api.buffer.DataType.HALF - DataType.DT_BFLOAT16 -> org.nd4j.linalg.api.buffer.DataType.BFLOAT16 - DataType.DT_INT8 -> org.nd4j.linalg.api.buffer.DataType.BYTE - DataType.DT_INT16 -> org.nd4j.linalg.api.buffer.DataType.SHORT - DataType.DT_INT32 -> org.nd4j.linalg.api.buffer.DataType.INT - DataType.DT_INT64 -> org.nd4j.linalg.api.buffer.DataType.LONG - DataType.DT_UINT8 -> org.nd4j.linalg.api.buffer.DataType.UBYTE - DataType.DT_STRING -> org.nd4j.linalg.api.buffer.DataType.UTF8 - DataType.DT_BOOL -> org.nd4j.linalg.api.buffer.DataType.BOOL - else -> org.nd4j.linalg.api.buffer.DataType.UNKNOWN - } -} - -/** - * @return True if the specified name represents a control dependency (starts with "^") - */ -fun isControlDep(name: String): Boolean { - return name.startsWith("^") -} - -/** - * @return The specified name without the leading "^" character (if any) that appears for control dependencies - */ -fun stripControl(name: String): String { - return if (name.startsWith("^")) { - name.substring(1) - } else name -} - -/** - * Remove the ":1" etc suffix for a variable name to get the op name - * - * @param varName Variable name - * @return Variable name without any number suffix - */ -fun stripVarSuffix(varName: String): String { - if (varName.matches(regex = Regex(".*:\\d+"))) { - val idx = varName.lastIndexOf(':') - return varName.substring(0, idx) - } - return varName -} - -/** - * Convert the tensor to an NDArray (if possible and if 
array is available) - * - * @param node Node to get NDArray from - * @return NDArray - */ -fun getNDArrayFromTensor(node: NodeDef): INDArray? { - //placeholder of some kind - if (!node.attrMap.containsKey("value")) { - return null - } - val tfTensor = node.getAttrOrThrow("value").tensor - return mapTensorProto(tfTensor) -} - -/** - * Convert a TensorProto to an INDArray - * - * @param tfTensor Tensor proto - * @return INDArray - */ -fun mapTensorProto(tfTensor: TensorProto): INDArray { - val m = TFTensorMappers.newMapper(tfTensor) ?: throw RuntimeException("Not implemented datatype: " + tfTensor.dtype) - return m.toNDArray() -} - -fun attributeValueTypeForTensorflowAttribute(attributeDef: AttrDef): AttributeValueType { - when(attributeDef.type) { - "list(bool)" -> return AttributeValueType.LIST_BOOL - "bool" -> return AttributeValueType.BOOL - "string" -> return AttributeValueType.STRING - "list(string)" -> return AttributeValueType.LIST_STRING - "int" -> return AttributeValueType.INT - "list(int)" -> return AttributeValueType.LIST_INT - "float" -> return AttributeValueType.FLOAT - "list(float)" -> return AttributeValueType.LIST_FLOAT - "tensor" -> return AttributeValueType.TENSOR - "list(tensor)" -> return AttributeValueType.LIST_TENSOR - "type" -> return AttributeValueType.DATA_TYPE - } - - return AttributeValueType.INVALID -} - - diff --git a/contrib/codegen-tools/codegen/src/main/kotlin/org/nd4j/codegen/ir/tensorflow/TensorflowMappingProcess.kt b/contrib/codegen-tools/codegen/src/main/kotlin/org/nd4j/codegen/ir/tensorflow/TensorflowMappingProcess.kt deleted file mode 100644 index 1bb72260a..000000000 --- a/contrib/codegen-tools/codegen/src/main/kotlin/org/nd4j/codegen/ir/tensorflow/TensorflowMappingProcess.kt +++ /dev/null @@ -1,53 +0,0 @@ -package org.nd4j.codegen.ir.tensorflow - -import org.nd4j.codegen.ir.AbstractMappingProcess -import org.nd4j.codegen.ir.AttributeMappingRule -import org.nd4j.codegen.ir.AttributeValueType -import 
org.nd4j.codegen.ir.TensorMappingRule -import org.nd4j.codegen.ir.registry.OpMappingRegistry -import org.nd4j.codegen.ir.registry.OpRegistryHolder -import org.nd4j.common.base.Preconditions -import org.tensorflow.framework.* - -open class TensorflowMappingProcess(inputFramework: String = "tensorflow", - frameworkVersion: String = "2.3", - inputFrameworkOpName: String, - opName: String, - opMappingRegistry: OpMappingRegistry, - tensorMappingRules: List> = emptyList(), - attributeMappingRules: List> = emptyList(), - inputIndexOverrides: Map = emptyMap()) - : AbstractMappingProcess( - inputFramework, - frameworkVersion, - inputFrameworkOpName, - inputIndexOverrides, - opName, - opMappingRegistry, - tensorMappingRules, - attributeMappingRules) { - override fun inputOpDefValueTypes(): Map { - Preconditions.checkNotNull(inputFrameworkOpName,"No input framework op def name found!") - val opDef = opMappingRegistry.lookupInputFrameworkOpDef(inputFrameworkOpName) - val retMap = HashMap() - opDef.attrList.forEach { attrDef -> - retMap[attrDef.name] = attributeValueTypeForTensorflowAttribute(attrDef) - } - - return retMap - - } - -} - - diff --git a/contrib/codegen-tools/codegen/src/main/kotlin/org/nd4j/codegen/ir/tensorflow/TensorflowOpDeclarations.kt b/contrib/codegen-tools/codegen/src/main/kotlin/org/nd4j/codegen/ir/tensorflow/TensorflowOpDeclarations.kt deleted file mode 100644 index 6388a3c6d..000000000 --- a/contrib/codegen-tools/codegen/src/main/kotlin/org/nd4j/codegen/ir/tensorflow/TensorflowOpDeclarations.kt +++ /dev/null @@ -1,1916 +0,0 @@ -package org.nd4j.codegen.ir.tensorflow - -import org.nd4j.codegen.ir.ArgDescriptor -import org.nd4j.codegen.ir.nameSpaceTensorFromNDarray -import org.nd4j.codegen.ir.nd4jOpDescriptors -import org.nd4j.codegen.ir.onnx.* -import org.nd4j.codegen.ir.registry.OpMappingRegistry -import org.nd4j.codegen.ir.registry.OpRegistryHolder -import org.nd4j.ir.OpNamespace -import org.nd4j.linalg.factory.Nd4j -import org.tensorflow.framework.* - 
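The name-handling helpers defined earlier in this file set (`isControlDep`, `stripControl`, `stripVarSuffix`) encode TensorFlow's node-naming conventions: control-dependency inputs are prefixed with `^`, and tensor outputs carry an `:N` index suffix on the producing op's name. A minimal standalone sketch of that behavior, re-implemented here outside the framework types purely for illustration:

```kotlin
// Illustrative re-implementations of the TF name helpers above.

// True if the input name denotes a control dependency ("^nodeName").
fun isControlDep(name: String): Boolean = name.startsWith("^")

// Drop the leading "^" of a control-dependency name, if present.
fun stripControl(name: String): String =
    if (name.startsWith("^")) name.substring(1) else name

// Remove the ":N" output-index suffix to recover the producing op's name.
fun stripVarSuffix(varName: String): String {
    if (varName.matches(Regex(".*:\\d+"))) {
        val idx = varName.lastIndexOf(':')
        return varName.substring(0, idx)
    }
    return varName
}

fun main() {
    check(isControlDep("^init"))
    check(stripControl("^init") == "init")
    check(stripVarSuffix("conv1/weights:0") == "conv1/weights")
    check(stripVarSuffix("conv1/weights") == "conv1/weights")
    println("ok")
}
```

Note that `stripVarSuffix` only strips a trailing numeric suffix, so names containing `:` elsewhere (none occur in valid TF graphs) are left alone.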
-val tensorflowOpRegistry = OpMappingRegistry("tensorflow") - -val singleTransformArgs = mapOf( - "Abs" to "abs", - "Acos" to "acos", - "Acosh" to "acosh", - "Asin" to "asin", - "Asinh" to "asinh", - "Atan" to "atan", - "Atanh" to "atanh", - "Ceil" to "ceil", - "Cos" to "cos", - "Cosh" to "cosh", - "Erf" to "erf", - "Erfc" to "erfc", - "Exp" to "exp", - "Expm1" to "expm1", - "Floor" to "floor", - "Log" to "log", - "Log1p" to "log1p", - "Neg" to "neg", - "Rint" to "rint", - "Round" to "round", - "Rsqrt" to "rsqrt", - "Sigmoid" to "sigmoid", - "Sign" to "sign", - "Sin" to "sin", - "Sinh" to "sinh", - "Square" to "square", - "Sqrt" to "sqrt", - "Tan" to "tan", - "Tanh" to "tanh" -) - -val elementWiseTransformOps = mapOf( - "Add" to "add", - "AddV2" to "add", - "Div" to "divide", - "Greater" to "greater", - "GreaterEqual" to "greater_equal", - "Less" to "less", - "LessEqual" to "less_equal", - "Mul" to "multiply", - "Maximum" to "maximum", - "FloorDiv" to "floordiv", - "Mod" to "mod", - "FloorMod" to "fmod", - "SquaredDifference" to "squaredsubtract", - "NotEqual" to "not_equals", - "RealDiv" to "realdiv", - "RightShift" to "rshift_bits", - "Atan2" to "tf_atan2", - "TruncateDiv" to "truncatediv" -) - - -val reduceOps = mapOf( - //"AccumulateNV2" to "mergeadd", - "All" to "all", - "Any" to "any", - "Mean" to "reduce_mean", - "Prod" to "reduce_prod", - "Sum" to "reduce_sum", - "Min" to "reduce_min", - "Max" to "reduce_max", - - ) - - -val pairwiseReduceOps = mapOf( - "EuclideanNorm" to "euclidean" -) - - -val addN = TensorflowMappingProcess( - inputFrameworkOpName = "AddN", - opName = "mergesum", - tensorMappingRules = listOf(mappingNDArrayInputs(mutableMapOf("input" to "inputs"))), - opMappingRegistry = tensorflowOpRegistry -) - - -val assert = mapTensorNamesWithOp(inputFrameworkOpName = "Assert",opName = "Assert",tensorNames = mutableMapOf("input" to "condition")) - - -/** - * - * Note that angle only supports complex inputs and outputs. 
- * We don't support complex in nd4j so we just output zeros. - */ -/*val angleRule = TensorflowMappingProcess( - inputFrameworkOpName = "Angle", - opName = "zeros_like", - tensorMappingRules = listOf(mappingNDArrayInputs(mutableMapOf("input" to "input"))), - opMappingRegistry = tensorflowOpRegistry -)*/ - -/* -val approxEqualRule = TensorflowMappingProcess( - inputFrameworkOpName = "Equal", - opName = "equals_with_eps", - opMappingRegistry = tensorflowOpRegistry, - tensorMappingRules = listOf(mappingNDArrayInputs(mutableMapOf("input" to "x","y" to "y"))), - attributeMappingRules = listOf(valueMapping(mapOf("eps" to "tolerance")), - //TODO: note dimensions isn't on the TF op, need to investigate if there is a better default here - intConstant(inputName = "dimensions",constantValue = 0 as Integer,argumentIndex = 0)[0], - booleanConstant(inputName = "keepDims",constantValue = false,argumentIndex = 0)[0])) -*/ - - -val argMaxRule = TensorflowMappingProcess( - inputFrameworkOpName = "ArgMax", - opName = "argmax", - opMappingRegistry = tensorflowOpRegistry, - tensorMappingRules = listOf(mappingNDArrayInputs(mutableMapOf("input" to "input"))), - attributeMappingRules = listOf( - argDescriptorConstant(listOf(ArgDescriptor { - argIndex = 0 - argType = OpNamespace.ArgDescriptor.ArgType.INT64 - name = "dimensions" - int64Value = -1 - })), - booleanConstant(inputName = "keepDims",constantValue = false,argumentIndex = 0)[0]) - -) - -val argMinRule = TensorflowMappingProcess( - inputFrameworkOpName = "ArgMin", - opName = "argmin", - opMappingRegistry = tensorflowOpRegistry, - tensorMappingRules = listOf(mappingNDArrayInputs(mutableMapOf("input" to "input"))), - attributeMappingRules = listOf( - argDescriptorConstant(listOf(ArgDescriptor { - argIndex = 0 - argType = OpNamespace.ArgDescriptor.ArgType.INT64 - name = "dimensions" - int64Value = -1 - })), - booleanConstant(inputName = "keepDims",constantValue = false,argumentIndex = 0)[0]) - -) -/* -val reduceLogSumExp = 
TensorflowMappingProcess( - inputFrameworkOpName = "CumulativeLogsumexp", - opName = "reduce_logsumexp", - opMappingRegistry = tensorflowOpRegistry, - tensorMappingRules = listOf(mappingNDArrayInputs(mutableMapOf("input" to "x"))), - attributeMappingRules = listOf( - ndarrayToIntList(mutableMapOf("dimensions" to "axis")), - booleanConstant(inputName = "keepDims",constantValue = true,argumentIndex = 0)[0]) - -)*/ - -/** - * Note: Assign uses variables, not tensors. We will not test this. - */ -val assignOp = TensorflowMappingProcess( - inputFrameworkOpName = "Assign", - opName = "assign", - opMappingRegistry = tensorflowOpRegistry, - tensorMappingRules = listOf(mappingNDArrayInputs(mutableMapOf("input" to "ref","y" to "value"))) -) - -val adjustHue = TensorflowMappingProcess( - inputFrameworkOpName = "AdjustHue", - opName = "adjust_hue", - opMappingRegistry = tensorflowOpRegistry, - tensorMappingRules = listOf(mappingNDArrayInputs(mutableMapOf("input" to "images","delta" to "delta"))), - attributeMappingRules = listOf(convertNDArrayInputToNumericalAttr(mutableMapOf("factor" to "delta"))) -) - -val adjustSaturation = TensorflowMappingProcess( - inputFrameworkOpName = "AdjustSaturation", - opName = "adjust_saturation", - opMappingRegistry = tensorflowOpRegistry, - tensorMappingRules = listOf(mappingNDArrayInputs(mutableMapOf("input" to "images","factor" to "scale"))), - attributeMappingRules = listOf(convertNDArrayInputToNumericalAttr(mutableMapOf("factor" to "scale"))) -) - - -val avgPool = TensorflowMappingProcess( - inputFrameworkOpName = "AvgPool", - opName = "avgpool2d", - opMappingRegistry = tensorflowOpRegistry, - tensorMappingRules = listOf(mappingNDArrayInputs(mutableMapOf("input" to "value"))), - attributeMappingRules = listOf( - stringNotEqualsRule(outputAttribute = "isNCHW",inputFrameworkAttributeName = "data_format",valueToTest = "NCHW",argumentIndex = 10), - stringEqualsRule(outputAttribute = "isSameMode",inputFrameworkAttributeName = 
"padding",valueToTest = "SAME",argumentIndex = 8), - conditionalFieldValueIntIndexArrayRule(outputAttribute = "sH", attributeNameOfListAttribute = "strides", targetValue = "NCHW", trueIndex = 2, - falseIndex = 1,inputFrameworkStringNameToTest = "data_format",argumentIndex = 2), - conditionalFieldValueIntIndexArrayRule(outputAttribute = "sW", attributeNameOfListAttribute = "strides", targetValue = "NCHW", trueIndex = 3, - falseIndex = 2,inputFrameworkStringNameToTest = "data_format",argumentIndex = 3), - conditionalFieldValueIntIndexArrayRule(outputAttribute = "kH", attributeNameOfListAttribute = "ksize", targetValue = "NCHW", trueIndex = 2, - falseIndex = 1,inputFrameworkStringNameToTest = "data_format",argumentIndex = 0), - conditionalFieldValueIntIndexArrayRule(outputAttribute = "kW", attributeNameOfListAttribute = "ksize", targetValue = "NCHW", trueIndex = 3, - falseIndex = 2,inputFrameworkStringNameToTest = "data_format",argumentIndex = 1), - argDescriptorConstant(listOf( - ArgDescriptor { - argType = OpNamespace.ArgDescriptor.ArgType.INT64 - name = "pH" - int64Value = 0 - argIndex = 4 - }, - ArgDescriptor { - argType = OpNamespace.ArgDescriptor.ArgType.INT64 - name = "pW" - int64Value = 0 - argIndex = 5 - }, - ArgDescriptor { - argType = OpNamespace.ArgDescriptor.ArgType.INT64 - name = "dW" - int64Value = 1 - argIndex = 6 - }, - ArgDescriptor { - argType = OpNamespace.ArgDescriptor.ArgType.INT64 - name = "dH" - int64Value = 1 - argIndex = 7 - }, - ArgDescriptor { - argType = OpNamespace.ArgDescriptor.ArgType.INT64 - name = "extraParam0" - int64Value = 1 - argIndex = 9 - } - )) - ) -) - -val avgPool3d = TensorflowMappingProcess( - inputFrameworkOpName = "AvgPool3D", - opName = "avgpool3dnew", - opMappingRegistry = tensorflowOpRegistry, - tensorMappingRules = listOf(mappingNDArrayInputs(mutableMapOf("input" to "input"))), - attributeMappingRules = listOf( - intConstant(inputName = "extraParam0",constantValue = 0 as Integer,argumentIndex = 13)[0], - 
intConstant(inputName = "pD",constantValue = 1 as Integer,argumentIndex = 6)[0], - intConstant(inputName = "pH",constantValue = 1 as Integer,argumentIndex = 7)[0], - intConstant(inputName = "pW",constantValue = 1 as Integer,argumentIndex = 8)[0], - intConstant(inputName = "dD",constantValue = 1 as Integer,argumentIndex = 9)[0], - intConstant(inputName = "dH",constantValue = 1 as Integer,argumentIndex = 10)[0], - intConstant(inputName = "dW",constantValue = 1 as Integer,argumentIndex = 11)[0], - stringEqualsRule(outputAttribute = "isNCDHW",inputFrameworkAttributeName = "data_format",valueToTest = "NDHWC",argumentIndex = 14), - stringEqualsRule(outputAttribute = "isSameMode",inputFrameworkAttributeName = "padding",valueToTest = "SAME",argumentIndex = 12), - conditionalFieldValueIntIndexArrayRule(outputAttribute = "kH", attributeNameOfListAttribute = "ksize", targetValue = "NDHWC", trueIndex = 2, - falseIndex = 4,inputFrameworkStringNameToTest = "data_format",argumentIndex = 1), - conditionalFieldValueIntIndexArrayRule(outputAttribute = "kW", attributeNameOfListAttribute = "ksize", targetValue = "NDHWC", trueIndex = 4, - falseIndex = 5,inputFrameworkStringNameToTest = "data_format",argumentIndex = 2), - conditionalFieldValueIntIndexArrayRule(outputAttribute = "kD", attributeNameOfListAttribute = "ksize", targetValue = "NDHWC", trueIndex = 1, - falseIndex = 2,inputFrameworkStringNameToTest = "data_format",argumentIndex = 0), - conditionalFieldValueIntIndexArrayRule(outputAttribute = "sH", attributeNameOfListAttribute = "strides", targetValue = "NDHWC", trueIndex = 2, - falseIndex = 4,inputFrameworkStringNameToTest = "data_format",argumentIndex = 4), - conditionalFieldValueIntIndexArrayRule(outputAttribute = "sW", attributeNameOfListAttribute = "strides", targetValue = "NDHWC", trueIndex = 4, - falseIndex = 5,inputFrameworkStringNameToTest = "data_format",argumentIndex = 5), - conditionalFieldValueIntIndexArrayRule(outputAttribute = "sD", attributeNameOfListAttribute = 
"strides", targetValue = "NDHWC", trueIndex = 1, - falseIndex = 2,inputFrameworkStringNameToTest = "data_format",argumentIndex = 3) - - - ) -) - -val batchToSpace = TensorflowMappingProcess( - opName = "batch_to_space", - inputFrameworkOpName = "BatchToSpace", - opMappingRegistry = tensorflowOpRegistry, - attributeMappingRules = listOf(valueMapping(mapOf("blockSize" to "block_size"))), - tensorMappingRules = listOf(mappingNDArrayInputs(mutableMapOf("input" to "input","crop" to "crops"))) -) - -val batchToSpaceND = TensorflowMappingProcess( - opName = "batch_to_space_nd", - inputFrameworkOpName = "BatchToSpaceND", - opMappingRegistry = tensorflowOpRegistry, - attributeMappingRules = listOf(ndarrayToIntList(mutableMapOf("blocks" to "block_shape")), - booleanConstant(inputName = "inPlace",constantValue = false,argumentIndex = 0)[0]), - tensorMappingRules = listOf(mappingNDArrayInputs(mutableMapOf("input" to "input","crop" to "crops","blockShape" to "block_shape"))) -) - -val betaInc = TensorflowMappingProcess( - opName = "betainc", - inputFrameworkOpName = "Betainc", - opMappingRegistry = tensorflowOpRegistry, - tensorMappingRules = listOf(mappingNDArrayInputs(mutableMapOf("a" to "a","b" to "b","input" to "x"))), - attributeMappingRules = emptyList() -) - -val biasAddResult = defineBiasAdd() - -val binCount = TensorflowMappingProcess( - opMappingRegistry = tensorflowOpRegistry, - opName = "bincount", - inputFrameworkOpName = "Bincount", - tensorMappingRules = listOf(mappingNDArrayInputs(mutableMapOf("weights" to "weights","values" to "arr"))), - attributeMappingRules = listOf( - argDescriptorConstant(listOf( - ArgDescriptor { - name = "minLength" - argIndex = 0 - argType = OpNamespace.ArgDescriptor.ArgType.INT64 - int64Value = 0 - } - )), - convertNDArrayInputToNumericalAttr(mutableMapOf("maxLength" to "size")), - valueMapping(mutableMapOf("outputType" to "T"))), - inputIndexOverrides = mapOf(1 to 2,2 to 1)) - - -val bitCast = TensorflowMappingProcess( - opName = 
"bitcast", - opMappingRegistry = tensorflowOpRegistry, - inputFrameworkOpName = "Bitcast", - tensorMappingRules = listOf(mappingNDArrayInputs(mutableMapOf("input" to "input"))), - attributeMappingRules = listOf(dataTypeToInt(mutableMapOf("newType" to "type"))) -) - -val bitwiseAnd = TensorflowMappingProcess( - opName = "bitwise_and", - opMappingRegistry = tensorflowOpRegistry, - inputFrameworkOpName = "BitwiseAnd", - tensorMappingRules = listOf(mappingNDArrayInputs(mutableMapOf("input" to "x","y" to "y"))), - attributeMappingRules = booleanConstant(inputName = "inPlace",constantValue = false,argumentIndex = 0) -) - -val bitwiseOr = TensorflowMappingProcess( - opName = "bitwise_or", - opMappingRegistry = tensorflowOpRegistry, - inputFrameworkOpName = "BitwiseOr", - tensorMappingRules = listOf(mappingNDArrayInputs(mutableMapOf("input" to "x","y" to "y"))), - attributeMappingRules = booleanConstant(inputName = "inPlace",constantValue = false,argumentIndex = 0) -) - - - -val bitwiseXOr = TensorflowMappingProcess( - opName = "bitwise_xor", - opMappingRegistry = tensorflowOpRegistry, - inputFrameworkOpName = "BitwiseXor", - tensorMappingRules = listOf(mappingNDArrayInputs(mutableMapOf("input" to "x","y" to "y"))), - attributeMappingRules = booleanConstant(inputName = "inPlace",constantValue = false,argumentIndex = 0) -) - -val broadcastDynamicShape = TensorflowMappingProcess( - opName = "broadcast_dynamic_shape", - opMappingRegistry = tensorflowOpRegistry, - inputFrameworkOpName = "BroadcastArgs", - tensorMappingRules = listOf(mappingNDArrayInputs(mutableMapOf("input" to "s0","y" to "s1"))) -) - -//TODO: not implemented yet - -/*val broadcastCatGradientArgs = TensorflowMappingProcess( - opName = "broadcastgradientargs", - inputFrameworkOpName = "BroadcastGradientArgs", - opMappingRegistry = tensorflowOpRegistry, - attributeMappingRules = listOf(booleanConstant(inputName = "inPlace",constantValue = false,argumentIndex = 0)[0], - intConstant(inputName = 
"dimension",constantValue = 0 as Integer,argumentIndex = 0)[0]), - tensorMappingRules = listOf(mappingNDArrayInputs(mutableMapOf("input" to "s0","y" to "s1"))) -)*/ - -val broadcastTo = TensorflowMappingProcess( - opName = "broadcast_to", - inputFrameworkOpName = "BroadcastTo", - opMappingRegistry = tensorflowOpRegistry, - tensorMappingRules = listOf(mappingNDArrayInputs(mutableMapOf("input" to "input","shape" to "shape"))) -) - - -val copy2 = multipleNameMapping( - inputFrameworkOpNames = listOf("Copy"), - opName = "copy", - attributeMappingRules = booleanConstant(inputName = "inPlace",constantValue = false,argumentIndex = 0), - tensorNames = mutableMapOf("input" to "input") -) - -val checkNumerics = TensorflowMappingProcess( - opName = "check_numerics", - inputFrameworkOpName = "CheckNumerics", - opMappingRegistry = tensorflowOpRegistry, - attributeMappingRules = listOf(convertStringToInputNDArray(mutableMapOf("message" to "message"))), - tensorMappingRules = listOf(mappingNDArrayInputs(mutableMapOf("input" to "tensor"))) -) - -//only exists in tf2, tf-java can't run it - -val checkNumericsV2 = TensorflowMappingProcess( - opName = "check_numerics", - inputFrameworkOpName = "CheckNumericsV2", - opMappingRegistry = tensorflowOpRegistry, - attributeMappingRules = listOf(convertStringToInputNDArray(mutableMapOf("message" to "message"))), - tensorMappingRules = listOf(mappingNDArrayInputs(mutableMapOf("input" to "tensor"))) -) - - -val variable = mapTensorNamesWithOp(inputFrameworkOpName = "Variable", - opName = "identity", - tensorNames = mutableMapOf(), - attributeMappingRules = listOf( - booleanConstant(inputName = "inPlace",constantValue = false,argumentIndex = 0)[0])) - -val variableV2 = mapTensorNamesWithOp(inputFrameworkOpName = "VariableV2", - opName = "identity", - tensorNames = mutableMapOf(), - attributeMappingRules = listOf( - booleanConstant(inputName = "inPlace",constantValue = false,argumentIndex = 0)[0])) - - - -val identity2 = 
mapTensorNamesWithOp(inputFrameworkOpName = "Identity", - opName = "identity", - tensorNames = mutableMapOf("input" to "input"), - attributeMappingRules = listOf( - booleanConstant(inputName = "inPlace",constantValue = false,argumentIndex = 0)[0])) - -val const = mapTensorNamesWithOp(inputFrameworkOpName = "Const", - opName = "identity", - tensorNames = mutableMapOf(), - attributeMappingRules = listOf(ndArrayAttributeToNDarrayInput(mutableMapOf("input" to "value")), - booleanConstant(inputName = "inPlace",constantValue = false,argumentIndex = 0)[0])) - - -val cholesky = TensorflowMappingProcess( - opName = "cholesky", - inputFrameworkOpName = "Cholesky", - opMappingRegistry = tensorflowOpRegistry, - tensorMappingRules = mapSameName(listOf("input")) -) - - -val clipByValue = TensorflowMappingProcess( - opName = "ClipByValue", - opMappingRegistry = tensorflowOpRegistry, - inputFrameworkOpName = "ClipByValue", - tensorMappingRules = listOf(mappingNDArrayInputs(mutableMapOf("input" to "t"))), - attributeMappingRules = listOf( - convertNDArrayInputToNumericalAttr(mutableMapOf("clipValueMin" to "clip_value_min","clipValueMax" to "clip_value_max")), - booleanConstant(inputName = "inPlace",constantValue = false,argumentIndex = 0)[0]) -) - - -//TODO: our compare and bit pack operation seems to do something different than TFs? 
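Several of the pooling mappings above use `conditionalFieldValueIntIndexArrayRule` to pick a list index based on the layout string: for `AvgPool`, when `data_format` is `"NCHW"` the stride height is read from index 2 of `strides`, otherwise (NHWC) from index 1. A self-contained sketch of that selection idea only (the helper name here is hypothetical; the real rule emits ArgDescriptors rather than returning values):

```kotlin
// Hypothetical standalone version of the conditional index-selection idea:
// pick trueIndex when the tested attribute equals targetValue, else falseIndex.
fun conditionalIndexValue(
    list: List<Long>,
    attributeValue: String,
    targetValue: String,
    trueIndex: Int,
    falseIndex: Int
): Long = if (attributeValue == targetValue) list[trueIndex] else list[falseIndex]

fun main() {
    val strides = listOf(1L, 2L, 3L, 4L)
    // NCHW: strides are [batch, channel, height, width] -> sH at index 2
    check(conditionalIndexValue(strides, "NCHW", "NCHW", 2, 1) == 3L)
    // NHWC: strides are [batch, height, width, channel] -> sH at index 1
    check(conditionalIndexValue(strides, "NHWC", "NCHW", 2, 1) == 2L)
    println("ok")
}
```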
-/* -val compareAndBitPack = TensorflowMappingProcess( - opName = "compare_and_bitpack", - opMappingRegistry = tensorflowOpRegistry, - inputFrameworkOpName = "CompareAndBitpack", - attributeMappingRules = listOf(convertNDArrayInputToNumericalAttr(mutableMapOf("threshold" to "threshold"))), - tensorMappingRules = listOf(mappingNDArrayInputs(mutableMapOf("input" to "input","y" to "threshold"))) -) -*/ - - -val concat = TensorflowMappingProcess( - opMappingRegistry = tensorflowOpRegistry, - opName = "concat", - inputFrameworkOpName = "Concat", - tensorMappingRules = listOf(mappingListNDArrays(mutableMapOf("input" to "values","concatDimension" to "concat_dim"))), - attributeMappingRules = listOf(convertNDArrayInputToNumericalAttr(mutableMapOf("concatDimension" to "concat_dim")), - booleanConstant(inputName = "isDynamicAxis",constantValue = true,argumentIndex = 0)[0]) -) - -val concatv2 = TensorflowMappingProcess( - opMappingRegistry = tensorflowOpRegistry, - opName = "concat", - inputFrameworkOpName = "ConcatV2", - tensorMappingRules = listOf(mappingListNDArrays(mutableMapOf("input" to "values","concatDimension" to "axis"))), - attributeMappingRules = listOf(convertNDArrayInputToNumericalAttr(mutableMapOf("concatDimension" to "axis")), - booleanConstant(inputName = "isDynamicAxis",constantValue = true,argumentIndex = 0)[0])) - - -/*val parallelConcat = TensorflowMappingProcess( - opMappingRegistry = tensorflowOpRegistry, - opName = "concat", - inputFrameworkOpName = "ParallelConcat", - tensorMappingRules = listOf(mappingListNDArrays(mutableMapOf("input" to "values"))), - attributeMappingRules = listOf( - intConstant(inputName = "concatDimension",constantValue = 0 as Integer,argumentIndex = 0)[0], - booleanConstant(inputName = "isDynamicAxis",constantValue = true,argumentIndex = 0)[0]) -)*/ - -//TODO Reference ImportClassMapping.java -//TODO: ParallelDynamicStitch, map to dynamic stitch -//TODO: Polygamma, map to pairwise transforms -//TODO: map QR - -val cropAndResize
= TensorflowMappingProcess( - opMappingRegistry = tensorflowOpRegistry, - opName = "crop_and_resize", - inputFrameworkOpName = "CropAndResize", - tensorMappingRules = listOf(mappingNDArrayInputs(mutableMapOf( - "image" to "image", - "boxes" to "boxes", - "boxIndexes" to "box_ind", - "newImageSize" to "crop_size"))), - attributeMappingRules = listOf( - ndarrayStringToIndex(outputAttributeValue = "method",inputAttributeValue = "method",listOfValues = listOf("bilinear","nearest"),argumentIndex = 0), - valueMapping(mapOf("extrapolationVal" to "extrapolation_value"))) -) - -val cumProd = TensorflowMappingProcess( - opMappingRegistry = tensorflowOpRegistry, - opName = "cumprod", - inputFrameworkOpName = "Cumprod", - tensorMappingRules = listOf(mappingNDArrayInputs(mutableMapOf("input" to "x","dimensions" to "axis"))), - attributeMappingRules = listOf(invertBooleanNumber(mutableMapOf("exclusive" to "exclusive","reverse" to "reverse")))) - - - - -val cumSum = TensorflowMappingProcess( - opMappingRegistry = tensorflowOpRegistry, - opName = "cumsum", - inputFrameworkOpName = "Cumsum", - tensorMappingRules = listOf(mappingNDArrayInputs(mutableMapOf("input" to "x","dimensions" to "axis"))), - attributeMappingRules = listOf( - invertBooleanNumber(mutableMapOf("exclusive" to "exclusive", - "reverse" to "reverse")))) - - - - -val cross = TensorflowMappingProcess( - opMappingRegistry = tensorflowOpRegistry, - opName = "cross", - inputFrameworkOpName = "Cross", - tensorMappingRules = mapSameName(listOf("a","b")) -) - -val depthToSpace = TensorflowMappingProcess( - opName = "depth_to_space", - inputFrameworkOpName = "DepthToSpace", - tensorMappingRules = listOf(mappingNDArrayInputs(mutableMapOf("input" to "input"))), - attributeMappingRules = listOf(valueMapping(mapOf("block_size" to "block_size")), - stringEqualsRule("isNHWC" - ,inputFrameworkAttributeName = "data_format",valueToTest = "NHWC",argumentIndex = 1)), - opMappingRegistry = tensorflowOpRegistry -) - -/** - * depth_conv - 
*/ -val depthWiseConv2d = TensorflowMappingProcess( - opName = "depthwise_conv2d", - inputFrameworkOpName = "DepthwiseConv2dNative", - opMappingRegistry = tensorflowOpRegistry, - tensorMappingRules = listOf(mappingNDArrayInputs(mutableMapOf( - "input" to "input","weights" to "filter"))), - attributeMappingRules = listOf( - stringNotEqualsRule(outputAttribute = "isNCHW",inputFrameworkAttributeName = "data_format",valueToTest = "NCHW",argumentIndex = 9), - stringEqualsRule(outputAttribute = "isSameMode",inputFrameworkAttributeName = "padding",valueToTest = "SAME",argumentIndex = 8), - conditionalFieldValueIntIndexArrayRule(outputAttribute = "sH", attributeNameOfListAttribute = "strides", targetValue = "NCHW", trueIndex = 2, - falseIndex = 1,inputFrameworkStringNameToTest = "data_format",argumentIndex = 2), - conditionalFieldValueIntIndexArrayRule(outputAttribute = "sW", attributeNameOfListAttribute = "strides", targetValue = "NCHW", trueIndex = 3, - falseIndex = 2,inputFrameworkStringNameToTest = "data_format",argumentIndex = 3), - conditionalFieldValueIntIndexArrayRule(outputAttribute = "dH", attributeNameOfListAttribute = "dilations", targetValue = "NCHW", trueIndex = 2, - falseIndex = 1,inputFrameworkStringNameToTest = "data_format",argumentIndex = 6), - conditionalFieldValueIntIndexArrayRule(outputAttribute = "dW", attributeNameOfListAttribute = "dilations", targetValue = "NCHW", trueIndex = 3, - falseIndex = 2,inputFrameworkStringNameToTest = "data_format",argumentIndex = 7), - //NOTE: This is a dynamically resolved attribute at runtime. 
- sizeAtRule(outputAttributeName = "kH",dimensionIndex = 0,inputFrameworkAttributeName = "filter",argumentIndex = 0), - sizeAtRule(outputAttributeName = "kW",dimensionIndex = 1,inputFrameworkAttributeName = "filter",argumentIndex = 1), - argDescriptorConstant(listOf( - ArgDescriptor { - name = "pH" - int64Value = 1 - argType = OpNamespace.ArgDescriptor.ArgType.INT64 - argIndex = 4 - }, - ArgDescriptor { - name = "pW" - int64Value = 1 - argType = OpNamespace.ArgDescriptor.ArgType.INT64 - argIndex = 5 - }, - ArgDescriptor { - name = "wFormat" - int64Value = 0 - argType = OpNamespace.ArgDescriptor.ArgType.INT64 - argIndex = 10 - } - ))) -) - - -val diag = TensorflowMappingProcess( - inputFrameworkOpName = "Diag", - opName = "diag", - attributeMappingRules = booleanConstant(inputName = "inPlace",constantValue = false,argumentIndex = 0), - tensorMappingRules = listOf(mappingNDArrayInputs(mutableMapOf("input" to "diagonal"))), - opMappingRegistry = tensorflowOpRegistry -) - - -val diagPart = TensorflowMappingProcess( - inputFrameworkOpName = "DiagPart", - opName = "diag_part", - attributeMappingRules = booleanConstant(inputName = "inPlace",constantValue = false,argumentIndex = 0), - tensorMappingRules = listOf(mappingNDArrayInputs(mutableMapOf("input" to "input"))), - opMappingRegistry = tensorflowOpRegistry -) - -val lGamma = TensorflowMappingProcess( - inputFrameworkOpName = "Lgamma", - opName = "lgamma", - tensorMappingRules = listOf(mappingNDArrayInputs(mutableMapOf("input" to "x"))), - opMappingRegistry = tensorflowOpRegistry -) - - -val diGamma = TensorflowMappingProcess( - inputFrameworkOpName = "Digamma", - opName = "digamma", - tensorMappingRules = listOf(mappingNDArrayInputs(mutableMapOf("input" to "x"))), - opMappingRegistry = tensorflowOpRegistry -) - -val iGamma = TensorflowMappingProcess( - inputFrameworkOpName = "Igamma", - opName = "igamma", - tensorMappingRules = listOf(mappingNDArrayInputs(mutableMapOf("input" to "a","y" to "x"))), - opMappingRegistry = 
tensorflowOpRegistry -) - -val iGammaC = TensorflowMappingProcess( - inputFrameworkOpName = "Igammac", - opName = "igammac", - tensorMappingRules = listOf(mappingNDArrayInputs(mutableMapOf("input" to "a","y" to "x"))), - opMappingRegistry = tensorflowOpRegistry -) - -val dilation2D = TensorflowMappingProcess( - opName = "dilation2d", - inputFrameworkOpName = "Dilation2D", - opMappingRegistry = tensorflowOpRegistry, - tensorMappingRules = listOf(mappingNDArrayInputs(mutableMapOf( - "input" to "input","weights" to "filter"))), - attributeMappingRules = listOf( - stringEqualsRule(outputAttribute = "isSameMode",inputFrameworkAttributeName = "padding",valueToTest = "SAME",argumentIndex = 0), - listNumberToListNumber(outputAttributeValue = "rates",inputAttributeValue = "rates"), - listNumberToListNumber(outputAttributeValue = "strides", - inputAttributeValue = "strides")) -) - -val drawBoundingBoxes = TensorflowMappingProcess( - inputFrameworkOpName = "DrawBoundingBoxesV2", - inputFramework = "tensorflow", - opName = "draw_bounding_boxes", - opMappingRegistry = tensorflowOpRegistry, - tensorMappingRules = listOf(mappingNDArrayInputs(mutableMapOf("images" to "images","boxes" to "boxes","colors" to "colors"))) -) - - -val conv2d = TensorflowMappingProcess( - inputFramework = "tensorflow", - inputFrameworkOpName = "Conv2D", - opName = "conv2d", - tensorMappingRules = listOf(mappingNDArrayInputs(mutableMapOf( - "input" to "input","weights" to "filter"))), - attributeMappingRules = listOf( - intConstant(inputName = "pH",constantValue = 0 as Integer,argumentIndex = 4)[0], - intConstant(inputName = "pW",constantValue = 0 as Integer,argumentIndex = 5)[0], - intConstant(inputName = "wFormat",constantValue = 0 as Integer,argumentIndex = 10)[0], - stringNotEqualsRule(outputAttribute = "isNCHW",inputFrameworkAttributeName = "data_format",valueToTest = "NCHW",argumentIndex = 9), - stringEqualsRule(outputAttribute = "isSameMode",inputFrameworkAttributeName = "padding",valueToTest =
"SAME",argumentIndex = 8), - conditionalFieldValueIntIndexArrayRule(outputAttribute = "sH", attributeNameOfListAttribute = "strides", targetValue = "NCHW", trueIndex = 2, - falseIndex = 1,inputFrameworkStringNameToTest = "data_format",argumentIndex = 2), - conditionalFieldValueIntIndexArrayRule(outputAttribute = "sW", attributeNameOfListAttribute = "strides", targetValue = "NCHW", trueIndex = 3, - falseIndex = 2,inputFrameworkStringNameToTest = "data_format",argumentIndex = 3), - conditionalFieldValueIntIndexArrayRule(outputAttribute = "dH", attributeNameOfListAttribute = "dilations", targetValue = "NCHW", trueIndex = 2, - falseIndex = 1,inputFrameworkStringNameToTest = "data_format",argumentIndex = 6), - conditionalFieldValueIntIndexArrayRule(outputAttribute = "dW", attributeNameOfListAttribute = "dilations", targetValue = "NCHW", trueIndex = 3, - falseIndex = 2,inputFrameworkStringNameToTest = "data_format",argumentIndex = 7), - //NOTE: This is a dynamically resolved attribute at runtime. 
- sizeAtRule(outputAttributeName = "kH",dimensionIndex = 0,inputFrameworkAttributeName = "filter",argumentIndex = 0), - sizeAtRule(outputAttributeName = "kW",dimensionIndex = 1,inputFrameworkAttributeName = "filter",argumentIndex = 1) - ),opMappingRegistry = tensorflowOpRegistry) - -/** - * TODO: verify the amounts - */ -val conv3d = TensorflowMappingProcess( - inputFrameworkOpName = "Conv3D", - opName = "conv3dnew", - tensorMappingRules = listOf(mappingNDArrayInputs(mutableMapOf( - "input" to "input","weights" to "filter"))), - attributeMappingRules = listOf( - stringEqualsRule(outputAttribute = "isNCDHW",inputFrameworkAttributeName = "data_format",valueToTest = "NDHWC",argumentIndex = 13), - stringEqualsRule(outputAttribute = "paddingMode",inputFrameworkAttributeName = "padding",valueToTest = "SAME",argumentIndex = 12), - intConstant(inputName = "pD",constantValue = 1 as Integer,argumentIndex = 6)[0], - intConstant(inputName = "pH",constantValue = 1 as Integer,argumentIndex = 7)[0], - intConstant(inputName = "pW",constantValue = 1 as Integer,argumentIndex = 8)[0], - sizeAtRule(outputAttributeName = "kH",dimensionIndex = 1,inputFrameworkAttributeName = "filter",argumentIndex = 1), - sizeAtRule(outputAttributeName = "kW",dimensionIndex = 2,inputFrameworkAttributeName = "filter",argumentIndex = 2), - sizeAtRule(outputAttributeName = "kD",dimensionIndex = 0,inputFrameworkAttributeName = "filter",argumentIndex = 0), - conditionalFieldValueIntIndexArrayRule(outputAttribute = "sH", attributeNameOfListAttribute = "strides", targetValue = "NDHWC", trueIndex = 2, - falseIndex = 4,inputFrameworkStringNameToTest = "data_format",argumentIndex = 4), - conditionalFieldValueIntIndexArrayRule(outputAttribute = "sW", attributeNameOfListAttribute = "strides", targetValue = "NDHWC", trueIndex = 4, - falseIndex = 5,inputFrameworkStringNameToTest = "data_format",argumentIndex = 5), - conditionalFieldValueIntIndexArrayRule(outputAttribute = "sD", attributeNameOfListAttribute = 
"strides", targetValue = "NDHWC", trueIndex = 1, - falseIndex = 2,inputFrameworkStringNameToTest = "data_format",argumentIndex = 3), - conditionalFieldValueIntIndexArrayRule(outputAttribute = "dW", attributeNameOfListAttribute = "dilations", targetValue = "NDHWC", trueIndex = 2, - falseIndex = 4,inputFrameworkStringNameToTest = "data_format",argumentIndex = 11), - conditionalFieldValueIntIndexArrayRule(outputAttribute = "dH", attributeNameOfListAttribute = "dilations", targetValue = "NDHWC", trueIndex = 4, - falseIndex = 5,inputFrameworkStringNameToTest = "data_format",argumentIndex = 10), - conditionalFieldValueIntIndexArrayRule(outputAttribute = "dD", attributeNameOfListAttribute = "dilations", targetValue = "NDHWC", trueIndex = 1, - falseIndex = 2,inputFrameworkStringNameToTest = "data_format",argumentIndex = 9) - - - ),opMappingRegistry = tensorflowOpRegistry) - - - - -val divideNoNan = TensorflowMappingProcess( - opName = "divide_no_nan", - inputFrameworkOpName = "DivNoNan", - tensorMappingRules = listOf(mappingNDArrayInputs(mutableMapOf("input" to "x","y" to "y"))), - opMappingRegistry = tensorflowOpRegistry -) - -val dynamicPartition = TensorflowMappingProcess( - opName = "dynamic_partition", - tensorMappingRules = listOf(mappingNDArrayInputs(mutableMapOf("input" to "data","indices" to "partitions"))), - inputFrameworkOpName = "DynamicPartition", - opMappingRegistry = tensorflowOpRegistry, - attributeMappingRules = listOf(valueMapping(mapOf("numPartitions" to "num_partitions"))) -) - - -/** - * TODO: check if n attribute has value for tensorflow - */ -val dynamicStitch = TensorflowMappingProcess( - opName = "dynamic_stitch", - tensorMappingRules = listOf(mappingNDArrayInputs(mutableMapOf("index" to "data","input" to "indices"))), - attributeMappingRules = listOf(valueMapping(mutableMapOf("numPartitions" to "N"))), - inputFrameworkOpName = "DynamicStitch", - opMappingRegistry = tensorflowOpRegistry -) - -val empty = TensorflowMappingProcess( - opName = 
"create", - tensorMappingRules = listOf(mappingNDArrayInputs(mutableMapOf("input" to "shape"))), - inputFrameworkOpName = "Empty", - attributeMappingRules = listOf(valueMapping(mapOf("init" to "init","outputType" to "dtype")), - dataTypeToInt(mutableMapOf("outputType" to "dtype")), - intConstant(inputName = "order",constantValue = 'c'.toInt() as Integer,argumentIndex = 0)[0]), - opMappingRegistry = tensorflowOpRegistry -) - - -val elu = mapTensorNamesWithOp(inputFrameworkOpName = "Elu",opName = "elu",tensorNames = mutableMapOf("input" to "features"), - attributeMappingRules = listOf(argDescriptorConstant( - listOf( - ArgDescriptor { - name = "alpha" - doubleValue = 1.0 - argType = OpNamespace.ArgDescriptor.ArgType.DOUBLE - argIndex = 0 - } - ) - ))) - -val enter = TensorflowMappingProcess( - opName = "enter", - tensorMappingRules = listOf(mappingNDArrayInputs(mutableMapOf("input" to "data"))), - inputFrameworkOpName = "Enter", - attributeMappingRules = listOf(valueMapping(mapOf("isConstant" to "is_constant"))), - opMappingRegistry = tensorflowOpRegistry -) - -val equal = TensorflowMappingProcess( - opName = "equals", - tensorMappingRules = listOf(mappingNDArrayInputs(mutableMapOf("input" to "x","y" to "y"))), - inputFrameworkOpName = "Equal", - attributeMappingRules = booleanConstant(inputName = "inPlace",constantValue = false,argumentIndex = 0), - opMappingRegistry = tensorflowOpRegistry -) - -val approxEqual = TensorflowMappingProcess( - opName = "equals", - tensorMappingRules = listOf(mappingNDArrayInputs(mutableMapOf("input" to "x","y" to "y"))), - inputFrameworkOpName = "ApproximateEqual", - attributeMappingRules = listOf(booleanConstant(inputName = "inPlace",constantValue = false,argumentIndex = 0)[0]), - opMappingRegistry = tensorflowOpRegistry -) - -val exit = TensorflowMappingProcess( - opName = "exit", - tensorMappingRules = listOf(mappingNDArrayInputs(mutableMapOf("input" to "data"))), - inputFrameworkOpName = "Exit", - opMappingRegistry = 
tensorflowOpRegistry -) - -val expandDims = TensorflowMappingProcess( - opName = "expand_dims", - tensorMappingRules = listOf(mappingNDArrayInputs(mutableMapOf("input" to "input"))), - inputFrameworkOpName = "ExpandDims", - opMappingRegistry = tensorflowOpRegistry, - attributeMappingRules = listOf(ndarrayToIntList(ndarrayNameToAttributeName = mutableMapOf("dimensions" to "dim")) - )) - -val extractImagesPatches = TensorflowMappingProcess( - opName = "extract_image_patches", - tensorMappingRules = listOf(mappingNDArrayInputs(mutableMapOf("input" to "images"))), - inputFrameworkOpName = "ExtractImagePatches", - opMappingRegistry = tensorflowOpRegistry, - attributeMappingRules = listOf( - listAttributeValueLookupToIndex(outputAttributeValue = "ksizeRows",inputAttributeValue = "ksizes",idx = 0,argumentIndex = 0), - listAttributeValueLookupToIndex(outputAttributeValue = "ksizeCols",inputAttributeValue = "ksizes",idx = 1,argumentIndex = 1), - listAttributeValueLookupToIndex(outputAttributeValue = "kstrideRows",inputAttributeValue = "strides",idx = 0,argumentIndex = 2), - listAttributeValueLookupToIndex(outputAttributeValue = "kstrideCols",inputAttributeValue = "strides",idx = 1,argumentIndex = 3), - listAttributeValueLookupToIndex(outputAttributeValue = "krateRows",inputAttributeValue = "rates",idx = 1,argumentIndex = 4), - listAttributeValueLookupToIndex(outputAttributeValue = "krateCols",inputAttributeValue = "rates",idx = 1,argumentIndex = 5), - stringEqualsRule(outputAttribute = "isSameMode",inputFrameworkAttributeName = "padding",valueToTest = "SAME",argumentIndex = 0)) -) - - - - -val fusedBatchnormV2 = TensorflowMappingProcess( - opName = "fused_batch_norm", - tensorMappingRules = listOf(mappingNDArrayInputs(mutableMapOf("input" to "x","scale" to "scale", - "offset" to "offset","mean" to "mean","variance" to "variance"))), - inputFrameworkOpName = "FusedBatchNormV2", - opMappingRegistry = tensorflowOpRegistry, - attributeMappingRules = 
listOf(valueMapping(mutableMapOf("epsilon" to "epsilon")), - invertBooleanNumber(mutableMapOf("isTraining" to "is_training")), - stringEqualsRule(outputAttribute = "dataFormat",inputFrameworkAttributeName = "data_format",valueToTest = "NCHW",argumentIndex = 0)) -) - -//tf2 op -val fusedBatchnormV3 = TensorflowMappingProcess( - opName = "fused_batch_norm", - tensorMappingRules = listOf(mappingNDArrayInputs(mutableMapOf("input" to "x","scale" to "scale", - "offset" to "offset","mean" to "mean","variance" to "variance"))), - inputFrameworkOpName = "FusedBatchNormV3", - opMappingRegistry = tensorflowOpRegistry, - attributeMappingRules = listOf(valueMapping(mutableMapOf("epsilon" to "epsilon")), - invertBooleanNumber(mutableMapOf("isTraining" to "is_training")), - stringEqualsRule(outputAttribute = "dataFormat",inputFrameworkAttributeName = "data_format",valueToTest = "NCHW",argumentIndex = 0)) -) - - - -val gather = TensorflowMappingProcess( - opName = "gather", - tensorMappingRules = listOf(mappingNDArrayInputs(mutableMapOf("input" to "params","indices" to "indices"))), - attributeMappingRules = listOf(ndarrayToIntList(mutableMapOf()), - booleanConstant(inputName = "inPlace",constantValue = false,argumentIndex = 0)[0], - intConstant(inputName = "dimensions",constantValue = 0 as Integer,argumentIndex = 0)[0]), - inputFrameworkOpName = "Gather", - opMappingRegistry = tensorflowOpRegistry -) - -val gatherNd = TensorflowMappingProcess( - opName = "gather_nd", - tensorMappingRules = listOf(mappingNDArrayInputs(mutableMapOf("input" to "params","indices" to "indices"))), - attributeMappingRules = listOf(ndarrayToIntList(mutableMapOf()), - booleanConstant(inputName = "checkIndices",constantValue = false,argumentIndex = 0)[0]), - inputFrameworkOpName = "GatherNd", - opMappingRegistry = tensorflowOpRegistry -) - -val histogramFixedWidth = TensorflowMappingProcess( - opName = "histogram_fixed_width", - tensorMappingRules = listOf(mappingNDArrayInputs(mutableMapOf("input" to 
"values","range" to "value_range","numBins" to "nbins"))), - inputFrameworkOpName = "HistogramFixedWidth", - opMappingRegistry = tensorflowOpRegistry, - attributeMappingRules = listOf(ndarrayToIntList(mutableMapOf("nbins" to "nbins"))) -) - -val identity = multipleNameMapping( - opName = "identity", - inputFrameworkOpNames = listOf("DeepCopy"), - tensorNames = mutableMapOf("input" to "x"), - attributeMappingRules = booleanConstant(inputName = "inPlace",constantValue = false,argumentIndex = 0)) - - -val identityCopyToHost = multipleNameMapping( - opName = "identity", - inputFrameworkOpNames = listOf("CopyHost"), - tensorNames = mutableMapOf("input" to "input"), - attributeMappingRules = booleanConstant(inputName = "inPlace",constantValue = false,argumentIndex = 0)) - -val identityN = TensorflowMappingProcess( - opName = "identity_n", - inputFrameworkOpName = "IdentityN", - opMappingRegistry = tensorflowOpRegistry, - tensorMappingRules = listOf(mappingNDArrayInputs(mutableMapOf("input" to "input"))) -) - -val ifOp = TensorflowMappingProcess( - opName = "Switch", - inputFrameworkOpName = "If", - opMappingRegistry = tensorflowOpRegistry, - tensorMappingRules = listOf(mappingNDArrayInputs(mutableMapOf("input" to "input","condition" to "cond"))) -) - - - -val reciprocal = TensorflowMappingProcess( - opName = "Reciprocal", - inputFrameworkOpName = "Inv", - opMappingRegistry = tensorflowOpRegistry, - tensorMappingRules = listOf(mappingNDArrayInputs(mutableMapOf("input" to "x"))) -) - -val inTopKResults = multipleNameMapping(inputFrameworkOpNames = listOf("InTopK"), - opName = "in_top_k", - tensorNames = mutableMapOf("target" to "targets","predictions" to "predictions"), - attributeMappingRules = listOf(valueMapping(mutableMapOf("k" to "k")), - booleanConstant(inputName = "sorted",constantValue = true,argumentIndex = 0)[0])) - - -val inTopKResults2 = multipleNameMapping(inputFrameworkOpNames = listOf("InTopKV2"), - opName = "in_top_k", - tensorNames = mutableMapOf("target" 
to "targets","predictions" to "predictions"), - attributeMappingRules = listOf(ndarrayToIntList(mutableMapOf("k" to "k")), - booleanConstant(inputName = "sorted",constantValue = true,argumentIndex = 0)[0])) - -val invert = mapTensorNamesWithOp(inputFrameworkOpName = "Invert",opName = "toggle_bits",tensorNames = mutableMapOf("input" to "x")) -val invertPermutation = mapTensorNamesWithOp(inputFrameworkOpName = "InvertPermutation", - opName = "invert_permutation",tensorNames = mutableMapOf("input" to "x"), - attributeMappingRules = booleanConstant(inputName = "inPlace",constantValue = false,argumentIndex = 0)) -val isFinite = mapTensorNamesWithOp(inputFrameworkOpName = "IsFinite",opName = "isfinite",tensorNames = mutableMapOf("input" to "x"), - attributeMappingRules = booleanConstant(inputName = "inPlace",constantValue = false,argumentIndex = 0)) -val isInf = mapTensorNamesWithOp(inputFrameworkOpName = "IsInf",opName = "isinf", - tensorNames = mutableMapOf("input" to "x"), - attributeMappingRules = booleanConstant(inputName = "inPlace",constantValue = false,argumentIndex = 0)) -val isNan = mapTensorNamesWithOp(inputFrameworkOpName = "IsNan",opName = "isnan", - tensorNames = mutableMapOf("input" to "x"),attributeMappingRules = booleanConstant(inputName = "inPlace",constantValue = false,argumentIndex = 0)) -//TODO: weird parameter values with config.getBias( and other similar names -val lrn = mapTensorNamesWithOp(inputFrameworkOpName = "LRN",opName = "lrn", - tensorNames = mutableMapOf("input" to "input"), - attributeMappingRules = listOf(valueMapping(mutableMapOf("depth" to "depth_radius","alpha" to "alpha", - "bias" to "bias","beta" to "beta")), - booleanConstant(inputName = "inPlace",constantValue = false,argumentIndex = 0)[0])) - -val leakyRelu = mapTensorNamesWithOp(inputFrameworkOpName = "LeakyRelu",opName = "leakyrelu", - attributeMappingRules = listOf(valueMapping(mappings = mutableMapOf("alpha" to "alpha"))), - tensorNames = mutableMapOf("input" to "features")) 
-//TODO: no input values found -val leftShift = mapTensorNamesWithOp(inputFrameworkOpName = "LeftShift",opName = "shift_bits", - tensorNames = mutableMapOf("input" to "x","y" to "y"), - attributeMappingRules = booleanConstant(inputName = "inPlace",constantValue = false,argumentIndex = 0)) - -val linspace = mapTensorNamesWithOp(inputFrameworkOpName = "LinSpace",opName = "lin_space", - tensorNames = mutableMapOf("start" to "start","finish" to "stop","numOfElements" to "num"), - attributeMappingRules = listOf( - convertNDArrayInputToNumericalAttr(mutableMapOf( - "start" to "start", - "stop" to "stop")), - valueMapping(mutableMapOf("dataType" to "T")) - )) - -//0=tanh, 1=relu, 2=sigmoid, 3=affine, 4=leaky relu, 5= thresholded relu, 6=scaled tanh, 7=hard sigmoid, 8=ELU, 9=softsign, 10=softplus - -val lstmActivationMap = mapOf( - "Relu" to 1, - "Tanh" to 0, - "Sigmoid" to 2, - "Affine" to 3, - "LeakyRelu" to 4, - "ThresholdedRelu" to 5, - "ScaledTanh" to 6, - "HardSigmoid" to 7, - "Elu" to 8, - "Softsign" to 9, - "Softplus" to 10 -) - -val lstmBlock = TensorflowMappingProcess( - opMappingRegistry = tensorflowOpRegistry, - inputFrameworkOpName = "BlockLSTM", - opName = "lstmBlock", - tensorMappingRules = listOf( - mappingNDArrayInputs(mutableMapOf( - "maxTSLength" to "seq_len_max", - "input" to "x", - "cLast" to "cs_prev", - "yLast" to "h_prev", - "W" to "w", - "Wci" to "wci", - "Wcf" to "wcf", - "Wco" to "wco", - "b" to "b")) - ), - attributeMappingRules = listOf( - valueMapping(mutableMapOf("forgetBias" to "forget_bias","clippingCellValue" to "cell_clip")), - invertBooleanNumber(mutableMapOf("peephole" to "use_peephole")), - intConstant(inputName = "dataFormat",constantValue = 0 as Integer,argumentIndex = 0)[0]) -) - -val lstmBlockV2 = TensorflowMappingProcess( - opMappingRegistry = tensorflowOpRegistry, - inputFrameworkOpName = "BlockLSTMV2", - opName = "lstmBlock", - tensorMappingRules = listOf( - mappingNDArrayInputs(mutableMapOf( - "maxTSLength" to "seq_len_max", - 
"input" to "x", - "cLast" to "cs_prev", - "yLast" to "h_prev", - "W" to "w", - "Wci" to "wci", - "Wcf" to "wcf", - "Wco" to "wco", - "b" to "b")) - ), - attributeMappingRules = listOf( - valueMapping(mutableMapOf("clippingCellValue" to "cell_clip")), - invertBooleanNumber(mutableMapOf("peephole" to "use_peephole")), - doubleConstant(inputName = "forgetBias",constantValue = 3.0,argumentIndex = 0)[0], - intConstant(inputName = "dataFormat",constantValue = 0 as Integer,argumentIndex = 0)[0]) -) - -val lstmBlockCell = TensorflowMappingProcess( - opMappingRegistry = tensorflowOpRegistry, - inputFrameworkOpName = "LSTMBlockCell", - opName = "lstmBlockCell", - tensorMappingRules = listOf( - mappingNDArrayInputs(mutableMapOf( - "xt" to "x", - "cLast" to "cs_prev", - "yLast" to "h_prev", - "W" to "w", - "Wci" to "wci", - "Wcf" to "wcf", - "Wco" to "wco", - "b" to "b")) - ), - attributeMappingRules = listOf( - valueMapping(mutableMapOf("forgetBias" to "forget_bias","clippingCellValue" to "cell_clip")), - invertBooleanNumber(mutableMapOf("peephole" to "use_peephole"))) -) - -val gruCell = TensorflowMappingProcess( - opMappingRegistry = tensorflowOpRegistry, - inputFrameworkOpName = "GRUBlockCell", - opName = "gruCell", - tensorMappingRules = listOf( - mappingNDArrayInputs(mutableMapOf( - "input" to "x", - "hLast" to "h_prev", - "Wru" to "w_ru", - "Wc" to "w_c", - "bru" to "b_ru", - "bc" to "b_c")) - ) -) - -val listDiff = mapTensorNamesWithOp(inputFrameworkOpName = "ListDiff",opName = "listdiff",tensorNames = mutableMapOf("values" to "x","keep" to "y")) -val logMatrixDeterminant = mapTensorNamesWithOp( - inputFrameworkOpName = "LogMatrixDeterminant", - opName = "log_matrix_determinant", - tensorNames = mutableMapOf("input" to "input"), - attributeMappingRules = booleanConstant(inputName = "inPlace",constantValue = false,argumentIndex = 0)) - -val logicalAnd = mapTensorNamesWithOp(inputFrameworkOpName = "LogicalAnd",opName = "boolean_and",tensorNames = mutableMapOf("input" to 
"x","y" to "y")) -val logicalNot = mapTensorNamesWithOp(inputFrameworkOpName = "LogicalNot",opName = "boolean_not",tensorNames = mutableMapOf("input" to "x")) - -val lu = mapTensorNamesWithOp(inputFrameworkOpName = "Lu",opName = "lu",tensorNames = mutableMapOf("input" to "input")) -val gemm = multipleNameMapping(inputFrameworkOpNames = listOf("MatMul"),opName = "matmul", - tensorNames = mutableMapOf("input" to "a","y" to "b"), - attributeMappingRules = - listOf(doubleConstant(inputName = "alpha",constantValue = 1.0,argumentIndex = 0)[0], - doubleConstant(inputName = "beta",constantValue = 1.0,argumentIndex = 1)[0], - invertBooleanNumber(mutableMapOf("transX" to "transpose_a","transY" to "transpose_b")), - intConstant(inputName = "transZ",constantValue = 0 as Integer,argumentIndex = 2)[0])) - - -val matrixSetDiag = multipleNameMapping(inputFrameworkOpNames = listOf("MatrixSetDiag","BatchMatrixSetDiag"), - opName = "matrix_set_diag", - tensorNames = mutableMapOf("input" to "input","diagonal" to "diagonal"), - attributeMappingRules = booleanConstant(inputName = "inPlace",constantValue = false,argumentIndex = 0)) -val matrixSetDiagPart = multipleNameMapping(inputFrameworkOpNames = listOf("MatrixDiagPart"), - opName = "matrix_diag_part", - attributeMappingRules = booleanConstant(inputName = "inPlace",constantValue = false,argumentIndex = 0), - tensorNames = mutableMapOf("input" to "input")) - -val matrixSolve = mapTensorNamesWithOp(inputFrameworkOpName = "MatrixSolve",opName = "solve",tensorNames = mutableMapOf("a" to "matrix","b" to "rhs"), - attributeMappingRules = listOf(valueMapping(mapOf("useAdjoint" to "adjoint")))) -val matrixTriangularSolve = mapTensorNamesWithOp(inputFrameworkOpName = "MatrixTriangularSolve",opName = "triangular_solve",tensorNames = -mutableMapOf("a" to "matrix","b" to "rhs"), - attributeMappingRules = listOf(valueMapping(mapOf("useAdjoint" to "adjoint","isLower" to "lower")))) - - -val matrixDeterminant = 
multipleNameMapping(inputFrameworkOpNames = listOf("BatchMatrixDeterminant","MatrixDeterminant"),opName = "matrix_determinant", - tensorNames = mutableMapOf("input" to "input"), - attributeMappingRules = booleanConstant(inputName = "inPlace",constantValue = false,argumentIndex = 0)) - -val minPairWise = mapTensorNamesWithOp(inputFrameworkOpName = "Minimum", - opName = "min_pairwise", - tensorNames = mutableMapOf("input" to "x","y" to "y")) - - -val maxPool = multipleNameMapping( - inputFrameworkOpNames = listOf("MaxPool"), - opName = "maxpool2d", - tensorNames = mutableMapOf("input" to "input"), - attributeMappingRules = listOf( - intConstant(inputName = "pH",constantValue = 0 as Integer,argumentIndex = 4)[0], - intConstant(inputName = "pW",constantValue = 0 as Integer,argumentIndex = 5)[0], - intConstant(inputName = "dW",constantValue = 1 as Integer,argumentIndex = 6)[0], - intConstant(inputName = "dH",constantValue = 1 as Integer,argumentIndex = 7)[0], - intConstant(inputName = "extraParam0",constantValue = 0 as Integer,argumentIndex = 9)[0], - stringNotEqualsRule(outputAttribute = "isNCHW",inputFrameworkAttributeName = "data_format",valueToTest = "NCHW",argumentIndex = 10), - stringEqualsRule(outputAttribute = "isSameMode",inputFrameworkAttributeName = "padding",valueToTest = "SAME",argumentIndex = 8), - conditionalFieldValueIntIndexArrayRule(outputAttribute = "sH", attributeNameOfListAttribute = "strides", targetValue = "NCHW", trueIndex = 2, - falseIndex = 1,inputFrameworkStringNameToTest = "data_format",argumentIndex = 2), - conditionalFieldValueIntIndexArrayRule(outputAttribute = "sW", attributeNameOfListAttribute = "strides", targetValue = "NCHW", trueIndex = 3, - falseIndex = 2,inputFrameworkStringNameToTest = "data_format",argumentIndex = 3), - conditionalFieldValueIntIndexArrayRule(outputAttribute = "kH", attributeNameOfListAttribute = "ksize", targetValue = "NCHW", trueIndex = 2, - falseIndex = 1,inputFrameworkStringNameToTest = 
"data_format",argumentIndex = 0), - conditionalFieldValueIntIndexArrayRule(outputAttribute = "kW", attributeNameOfListAttribute = "ksize", targetValue = "NCHW", trueIndex = 3, - falseIndex = 2,inputFrameworkStringNameToTest = "data_format",argumentIndex = 1) - ) -) - -val maxPoolV2 = multipleNameMapping( - inputFrameworkOpNames = listOf("MaxPoolV2"), - opName = "maxpool2d", - tensorNames = mutableMapOf("input" to "input"), - attributeMappingRules = listOf( - intConstant(inputName = "extraParam0",constantValue = 0 as Integer,argumentIndex = 9)[0], - intConstant(inputName = "pH",constantValue = 0 as Integer,argumentIndex = 4)[0], - intConstant(inputName = "pW",constantValue = 0 as Integer,argumentIndex = 5)[0], - intConstant(inputName = "dW",constantValue = 1 as Integer,argumentIndex = 6)[0], - intConstant(inputName = "dH",constantValue = 1 as Integer,argumentIndex = 7)[0], - stringNotEqualsRule(outputAttribute = "isNCHW",inputFrameworkAttributeName = "data_format",valueToTest = "NCHW",argumentIndex = 10), - stringEqualsRule(outputAttribute = "isSameMode",inputFrameworkAttributeName = "padding",valueToTest = "SAME",argumentIndex = 8), - conditionalFieldValueIntIndexNDArrayRule(outputAttribute = "sH", attributeNameOfListAttribute = "strides", targetValue = "NCHW", trueIndex = 2, - falseIndex = 1,inputFrameworkStringNameToTest = "data_format",argumentIndex = 2), - conditionalFieldValueIntIndexNDArrayRule(outputAttribute = "sW", attributeNameOfListAttribute = "strides", targetValue = "NCHW", trueIndex = 3, - falseIndex = 2,inputFrameworkStringNameToTest = "data_format",argumentIndex = 3), - conditionalFieldValueIntIndexNDArrayRule(outputAttribute = "kH", attributeNameOfListAttribute = "ksize", targetValue = "NCHW", trueIndex = 2, - falseIndex = 1,inputFrameworkStringNameToTest = "data_format",argumentIndex = 0), - conditionalFieldValueIntIndexNDArrayRule(outputAttribute = "kW", attributeNameOfListAttribute = "ksize", targetValue = "NCHW", trueIndex = 3, - falseIndex = 
2,inputFrameworkStringNameToTest = "data_format",argumentIndex = 1) - ) -) - - -val maxPool3d = TensorflowMappingProcess( - inputFrameworkOpName = "MaxPool3D", - opName = "maxpool3dnew", - opMappingRegistry = tensorflowOpRegistry, - tensorMappingRules = listOf(mappingNDArrayInputs(mutableMapOf("input" to "input"))), - attributeMappingRules = listOf( - stringEqualsRule(outputAttribute = "isNCDHW",inputFrameworkAttributeName = "data_format",valueToTest = "NDHWC",argumentIndex = 14), - stringEqualsRule(outputAttribute = "isSameMode",inputFrameworkAttributeName = "padding",valueToTest = "SAME",argumentIndex = 12), - intConstant(inputName = "pD",constantValue = 1 as Integer,argumentIndex = 6)[0], - intConstant(inputName = "pH",constantValue = 1 as Integer,argumentIndex = 7)[0], - intConstant(inputName = "pW",constantValue = 1 as Integer,argumentIndex = 8)[0], - intConstant(inputName = "dD",constantValue = 1 as Integer,argumentIndex = 9)[0], - intConstant(inputName = "dH",constantValue = 1 as Integer,argumentIndex = 10)[0], - intConstant(inputName = "dW",constantValue = 1 as Integer,argumentIndex = 11)[0], - intConstant(inputName = "extraParam0",constantValue = 0 as Integer,argumentIndex = 13)[0], - - conditionalFieldValueIntIndexArrayRule(outputAttribute = "sH", attributeNameOfListAttribute = "strides", targetValue = "NDHWC", trueIndex = 2, - falseIndex = 4,inputFrameworkStringNameToTest = "data_format",argumentIndex = 4), - conditionalFieldValueIntIndexArrayRule(outputAttribute = "sW", attributeNameOfListAttribute = "strides", targetValue = "NDHWC", trueIndex = 4, - falseIndex = 5,inputFrameworkStringNameToTest = "data_format",argumentIndex = 5), - conditionalFieldValueIntIndexArrayRule(outputAttribute = "sD", attributeNameOfListAttribute = "ksize", targetValue = "NDHWC", trueIndex = 1, - falseIndex = 2,inputFrameworkStringNameToTest = "data_format",argumentIndex = 3), - conditionalFieldValueIntIndexArrayRule(outputAttribute = "kH", attributeNameOfListAttribute = 
"ksize", targetValue = "NDHWC", trueIndex = 2, - falseIndex = 4,inputFrameworkStringNameToTest = "data_format",argumentIndex = 1), - conditionalFieldValueIntIndexArrayRule(outputAttribute = "kW", attributeNameOfListAttribute = "ksize", targetValue = "NDHWC", trueIndex = 4, - falseIndex = 5,inputFrameworkStringNameToTest = "data_format",argumentIndex = 2), - conditionalFieldValueIntIndexArrayRule(outputAttribute = "kD", attributeNameOfListAttribute = "ksize", targetValue = "NDHWC", trueIndex = 1, - falseIndex = 2,inputFrameworkStringNameToTest = "data_format",argumentIndex = 0) - ) -) -//TODO: potentially need more features to be compatible? -/* -val maxPoolWithArgMax = multipleNameMapping( - inputFrameworkOpNames = listOf("MaxPoolWithArgmax"), - opName = "max_pool_with_argmax", - tensorNames = mutableMapOf("input" to "input"), - attributeMappingRules = listOf( - stringEqualsRule(outputAttribute = "isNHWC",inputFrameworkAttributeName = "data_format",valueToTest = "NWHC"), - stringEqualsRule(outputAttribute = "isSameMode",inputFrameworkAttributeName = "padding",valueToTest = "SAME"), - conditionalFieldValueIntIndexArrayRule(outputAttribute = "sH", attributeNameOfListAttribute = "strides", targetValue = "NCHW", trueIndex = 2, falseIndex = 1,inputFrameworkStringNameToTest = "data_format"), - conditionalFieldValueIntIndexArrayRule(outputAttribute = "sW", attributeNameOfListAttribute = "strides", targetValue = "NCHW", trueIndex = 3, falseIndex = 2,inputFrameworkStringNameToTest = "data_format"), - conditionalFieldValueIntIndexArrayRule(outputAttribute = "kH", attributeNameOfListAttribute = "ksize", targetValue = "NCHW", trueIndex = 2, falseIndex = 1,inputFrameworkStringNameToTest = "data_format"), - conditionalFieldValueIntIndexArrayRule(outputAttribute = "kW", attributeNameOfListAttribute = "ksize", targetValue = "NCHW", trueIndex = 3, falseIndex = 2,inputFrameworkStringNameToTest = "data_format") - ) -)*/ - -//TODO: Not likely correct. Need to figure out true mapping. 
Likely an implicit control flow op? -val loopCond = mapTensorNamesWithOp(inputFrameworkOpName = "LoopCond",opName = "loop_cond",tensorNames = mutableMapOf()) -val merge = mapTensorNamesWithOp(inputFrameworkOpName = "Merge",opName = "merge",tensorNames = mutableMapOf("a" to "inputs","b" to "inputs")) - -val mirrorPadding = mapTensorNamesWithOp(inputFrameworkOpName = "MirrorPad",opName = "mirror_pad", - tensorNames = mutableMapOf("input" to "input","paddings" to "paddings"), - attributeMappingRules = listOf(stringNotEqualsRule(outputAttribute = "mode", - inputFrameworkAttributeName = "mode",valueToTest = "REFLECT",argumentIndex = 0), - booleanConstant(inputName = "isSymmetric",constantValue = true,argumentIndex = 0)[0])) - -/** - * TODO: Need to add a constant mapping or something for NonMaxSuppression - * v1 and 2 which do not have a scoreThreshold to map. V3 does. - */ - -val nonMaxSuppressionV1 = multipleNameMapping(inputFrameworkOpNames = listOf("NonMaxSuppression"), - opName = "non_max_suppression", - tensorNames = mutableMapOf("boxes" to "boxes","scales" to "scores", - "maxOutputSize" to "max_output_size"), - attributeMappingRules = listOf( - argDescriptorConstant(listOf( - ArgDescriptor { - doubleValue = 0.5 - name = "scoreThreshold" - argType = OpNamespace.ArgDescriptor.ArgType.DOUBLE - argIndex = 1 - } - )), - valueMapping(mutableMapOf("iouThreshold" to "iou_threshold")), - convertNDArrayInputToNumericalAttr(mutableMapOf("maxOutputSize" to "max_output_size")))) - - - -val nonMaxSuppressionV2 = multipleNameMapping(inputFrameworkOpNames = listOf("NonMaxSuppressionV2"), - opName = "non_max_suppression", - tensorNames = mutableMapOf("boxes" to "boxes","scales" to "scores", - "overlayThreshold" to "iou_threshold","maxOutputSize" to "max_output_size"), - attributeMappingRules = listOf( - argDescriptorConstant(listOf( - ArgDescriptor { - doubleValue = 0.5 - name = "scoreThreshold" - argType = OpNamespace.ArgDescriptor.ArgType.DOUBLE - argIndex = 1 - } - )), - 
convertNDArrayInputToNumericalAttr(mutableMapOf( - "maxOutputSize" to "max_output_size" - )))) - - -val nonMaxSuppressionV3 = multipleNameMapping(inputFrameworkOpNames = listOf("NonMaxSuppressionV3","NonMaxSuppressionV4"), - opName = "non_max_suppression_v3", - tensorNames = mutableMapOf("boxes" to "boxes","scales" to "scores", - "maxOutSize" to "max_output_size", "iouThreshold" to "iou_threshold", "scoreThreshold" to "score_threshold"), - attributeMappingRules = listOf( - convertNDArrayInputToNumericalAttr(mutableMapOf( - "maxOutputSize" to "max_output_size" - )))) - - -val matrixInverse = multipleNameMapping(inputFrameworkOpNames = listOf("MatrixInverse","BatchMatrixInverse"),opName = "matrix_inverse", - attributeMappingRules = booleanConstant(inputName = "inPlace",constantValue = true,argumentIndex = 0), - tensorNames = mutableMapOf("input" to "input")) - -//TODO: There might be a subtle difference in the way max threshold is interpreted. -//Tensorflow gives exact number back, whereas we may give back less. 
-//See the non_max_suppression_overlaps test case in TestTensorflowIR -val nonMaxSuppressionOverlaps = multipleNameMapping(inputFrameworkOpNames = listOf("NonMaxSuppressionWithOverlaps"), - opName = "non_max_suppression_overlaps", - tensorNames = mutableMapOf("scales" to "scores","boxes" to "overlaps"), - attributeMappingRules = listOf( - convertNDArrayInputToNumericalAttr(mutableMapOf( - "maxOutputSize" to "max_output_size", - "overlapThreshold" to "overlap_threshold", - "scoreThreshold" to "score_threshold")))) - -val nthElement = mapTensorNamesWithOp(inputFrameworkOpName = "NthElement",opName = "nth_element", - tensorNames = mutableMapOf("n" to "n","input" to "input"), - attributeMappingRules = listOf(invertBooleanNumber(mapOf("reverse" to "reverse")))) - -val oneHot = mapTensorNamesWithOp(inputFrameworkOpName = "OneHot",opName = "onehot", - tensorNames = mutableMapOf("input" to "indices"), - attributeMappingRules = listOf( - convertNDArrayInputToNumericalAttr(mutableMapOf("on" to "on_value","off" to "off_value" - ,"depth" to "depth")), - valueMapping(mutableMapOf("dimensions" to "axis","dataType" to "T")))) - - -val or = mapTensorNamesWithOp(inputFrameworkOpName = "LogicalOr",opName = "boolean_or", - tensorNames = mutableMapOf("input" to "x","y" to "y")) - -val onesLike = mapTensorNamesWithOp(inputFrameworkOpName = "OnesLike", - opName = "ones_as", - attributeMappingRules = listOf(valueMapping(mutableMapOf("dataType" to "T"))), - tensorNames = mutableMapOf("input" to "x")) - - - -val pow = mapTensorNamesWithOp(inputFrameworkOpName = "Pow",opName = "pow", - attributeMappingRules = listOf(convertNDArrayInputToNumericalAttr(mutableMapOf("pow" to "y"))), - tensorNames = mutableMapOf("input" to "x","pow" to "y") -) - -val rank = mapTensorNamesWithOp(inputFrameworkOpName = "Rank", opName = "rank",tensorNames = mutableMapOf("input" to "input"), - attributeMappingRules = listOf(argDescriptorConstant(listOf(ArgDescriptor { - name = "inPlace" - boolValue = false - 
argType = OpNamespace.ArgDescriptor.ArgType.BOOL - argIndex = 0 - - })))) - -val relu6 = multipleNameMapping(inputFrameworkOpNames = listOf("Relu6"),opName = "relu6", - attributeMappingRules = listOf(argDescriptorConstant( - listOf(ArgDescriptor { - name = "inPlace" - boolValue = false - argType = OpNamespace.ArgDescriptor.ArgType.BOOL - argIndex = 0 - }, - ArgDescriptor { - name = "cutoff" - doubleValue = 0.0 - argType = OpNamespace.ArgDescriptor.ArgType.DOUBLE - argIndex = 0 - }) - )), - tensorNames = mutableMapOf("input" to "features")) - -val stack = multipleNameMapping(inputFrameworkOpNames = listOf("Pack"),opName = "stack", - attributeMappingRules = listOf(valueMapping(mutableMapOf("dimensions" to "axis"))), - tensorNames = mutableMapOf("input" to "values")) - -/** - * // in case of REFLECT and SYMMETRIC modes paddings must obey additional shape requirements -if (INT_ARG(0) == 0) { // CONSTANT mode -if(block.width() > 2) { -REQUIRE_TRUE(input->dataType() == INPUT_VARIABLE(2)->dataType(), 0, "PAD op: data types of input and padValue arrays should be the same but got %i and %i correspondingly !", input->dataType(), INPUT_VARIABLE(2)->dataType()); -padValue.assign(INPUT_VARIABLE(2)->e(0)); -} -else if (!block.getTArguments()->empty()) -padValue = T_ARG(0); -} -else if(INT_ARG(0) == 1) { // REFLECT mode -for(int dim=0; dim < rank; ++dim) -REQUIRE_TRUE(paddings->e(dim,0) <= (input->shapeOf()[dim]-1) && paddings->e(dim,1) <= (input->shapeOf()[dim]-1), 0, "PAD op: wrong content of paddings array for REFLECT mode !"); -} -if(INT_ARG(0) == 2) { // SYMMETRIC mode -for(int dim=0; dim < rank; ++dim) -REQUIRE_TRUE(paddings->e(dim,0) <= input->shapeOf()[dim] && paddings->e(dim,1) <= input->shapeOf()[dim], 0, "PAD op: wrong content of paddings array for SYMMETRIC mode !"); -} - */ -val pad = multipleNameMapping(inputFrameworkOpNames = listOf("Pad"), - opName = "pad",tensorNames = mutableMapOf("input" to "input","paddings" to "paddings"),attributeMappingRules = - 
listOf(argDescriptorConstant(listOf( - ArgDescriptor { - //note: tensorflow only supports constant mode - name = "mode" - int64Value = 0 - argType = OpNamespace.ArgDescriptor.ArgType.INT64 - argIndex = 0 - }, - ArgDescriptor { - name = "padValue" - doubleValue = 0.0 - argType = OpNamespace.ArgDescriptor.ArgType.DOUBLE - argIndex = 0 - } - )))) - - -val padV2 = multipleNameMapping(inputFrameworkOpNames = listOf("PadV2"), - opName = "pad",tensorNames = mutableMapOf("input" to "input","paddings" to "paddings"), - attributeMappingRules = - listOf(convertNDArrayInputToNumericalAttr(mutableMapOf("padValue" to "constant_values")), - argDescriptorConstant(listOf( - ArgDescriptor { - //note: tensorflow only supports constant mode - name = "mode" - int64Value = 0 - argType = OpNamespace.ArgDescriptor.ArgType.INT64 - argIndex = 0 - } - )))) - - -val randomCrop = mapTensorNamesWithOp(inputFrameworkOpName = "RandomCrop",opName = "random_crop",tensorNames = mutableMapOf("input" to "image","shape" to "size"), - attributeMappingRules = listOf(valueMapping(mutableMapOf("seed" to "seed")))) - -val placeHolder = mapTensorNamesWithOp(inputFrameworkOpName = "Placeholder",opName = "placeholder",tensorNames = mutableMapOf()) - -val randomGamma = mapTensorNamesWithOp(inputFrameworkOpName = "RandomGamma",opName = "random_gamma",tensorNames = mutableMapOf("shape" to "shape","alpha" to "alpha"), - attributeMappingRules = listOf(valueMapping(mutableMapOf("seed" to "seed")))) - - -val rgbToHsv = mapTensorNamesWithOp(inputFrameworkOpName = "RGBToHSV",opName = "rgb_to_hsv",tensorNames = mutableMapOf("input" to "images"), - attributeMappingRules = intConstant(inputName = "dimC",constantValue = 0 as Integer,argumentIndex = 0)) - -val randomPoisson = multipleNameMapping(inputFrameworkOpNames = listOf("RandomPoisson","RandomPoissonV2"),opName = "random_poisson", - attributeMappingRules = listOf(valueMapping(mutableMapOf("seed" to "seed"))), - tensorNames = mutableMapOf("shape" to
"shape","lambda" to "rate")) - -val randomShuffle = mapTensorNamesWithOp(inputFrameworkOpName = "RandomShuffle",opName = "random_shuffle", - tensorNames = mutableMapOf("input" to "value"), - attributeMappingRules = listOf(valueMapping(mutableMapOf("seeds" to "seed")))) - -//TODO: Look at extra arguments generated like T_ARG(1)); -val randomStandardNormal = multipleNameMapping(inputFrameworkOpNames = listOf("RandomStandardNormal"),opName = "random_normal", - tensorNames = mutableMapOf("input" to "shape")) - -//note: tensorflow hard codes the value at 0 to 1 while we allow customization here -val randomUniformHardCoded = multipleNameMapping( - inputFrameworkOpNames = listOf("RandomUniform","StatelessRandomUniform"), - opName = "randomuniform", - tensorNames = mutableMapOf("shape" to "shape"), - attributeMappingRules = listOf( - dataTypeToInt(mutableMapOf("dataType" to "T")), - valueMapping(mutableMapOf("dtype" to "T")), - argDescriptorConstant(listOf( - ArgDescriptor { - name = "min" - doubleValue = 0.0 - argType = OpNamespace.ArgDescriptor.ArgType.DOUBLE - argIndex = 0 - }, - ArgDescriptor { - name = "max" - doubleValue = 1.0 - argType = OpNamespace.ArgDescriptor.ArgType.DOUBLE - argIndex = 1 - }, - ArgDescriptor { - name = "min" - argIndex = 1 - inputValue = nameSpaceTensorFromNDarray(Nd4j.scalar(1.0)) - argType = OpNamespace.ArgDescriptor.ArgType.INPUT_TENSOR - }, - ArgDescriptor { - name = "max" - argIndex = 2 - inputValue = nameSpaceTensorFromNDarray(Nd4j.scalar(1.0)) - argType = OpNamespace.ArgDescriptor.ArgType.INPUT_TENSOR - } - ))) -) - -val randomUniformInt = TensorflowMappingProcess( - inputFrameworkOpName = "RandomUniformInt", - opName = "randomuniform", - tensorMappingRules = listOf(mappingNDArrayInputs(mutableMapOf("shape" to "shape","min" to "minval","max" to "maxval"))), - attributeMappingRules = listOf( - convertNDArrayInputToNumericalAttr(mutableMapOf("min" to "minval","max" to "maxval")), - dataTypeToInt(mutableMapOf("dataType" to "T")) - ), - 
opMappingRegistry = tensorflowOpRegistry -) - - -val range = multipleNameMapping(inputFrameworkOpNames = listOf("Range"),opName = "range", - attributeMappingRules = listOf( - convertNDArrayInputToNumericalAttr(mutableMapOf("from" to "start", - "to" to "limit","step" to "delta"))), - tensorNames = mutableMapOf("from" to "start","to" to "limit","step" to "delta")) - -val relu = mapTensorNamesWithOp(inputFrameworkOpName = "Relu",opName = "relu",tensorNames = mutableMapOf("input" to "features"), - attributeMappingRules = listOf(doubleConstant(inputName = "cutoff",constantValue = 0.0,argumentIndex = 0)[0], - booleanConstant(inputName = "inPlace",constantValue = false,argumentIndex = 0)[0])) - - -val reshape = multipleNameMapping(inputFrameworkOpNames = listOf("Reshape"),opName = "reshape", - tensorNames = mutableMapOf("input" to "tensor","shape" to "shape"), - attributeMappingRules = listOf(ndarrayToIntList(mutableMapOf("shapeArr" to "shape")))) - -val resizeArea = multipleNameMapping(inputFrameworkOpNames = listOf("ResizeArea"),opName = "resize_area", - attributeMappingRules = listOf(valueMapping(mutableMapOf("alignCorners" to "align_corners"))), - tensorNames = mutableMapOf("image" to "images","size" to "size")) - -val resizeBiCubic = multipleNameMapping(inputFrameworkOpNames = listOf("ResizeBicubic"),opName = "resize_bicubic", - attributeMappingRules = listOf(valueMapping(mutableMapOf("alignCorners" to "align_corners")), - booleanConstant(inputName = "alignPixelCenters",constantValue = false,argumentIndex = 1)[0]), - tensorNames = mutableMapOf("image" to "images","size" to "size")) - -val resizeBiLinear = multipleNameMapping(inputFrameworkOpNames = listOf("ResizeBilinear"),opName = "resize_bilinear", - attributeMappingRules = listOf(valueMapping(mutableMapOf("alignCorners" to "align_corners")), - booleanConstant(inputName = "halfPixelCenter",constantValue = false,argumentIndex = 1)[0]), - tensorNames = mutableMapOf("image" to "images","newImageSize" to "size")) - 
-val resizeNearestNeighbor = multipleNameMapping(inputFrameworkOpNames = listOf("ResizeNearestNeighbor"),opName = "resize_nearest_neighbor", - attributeMappingRules = listOf(valueMapping(mutableMapOf("alignCorners" to "align_corners")), - booleanConstant(inputName = "halfPixelCenter",constantValue = false,argumentIndex = 1)[0]), - tensorNames = mutableMapOf("image" to "images","newImageSize" to "size")) - -val reverse = multipleNameMapping(inputFrameworkOpNames = listOf("ReverseV2"),opName = "reverse", - attributeMappingRules = listOf(ndarrayToIntList(mutableMapOf("dimensions" to "axis"))), - tensorNames = mutableMapOf("input" to "tensor")) - -val reverseSequence = multipleNameMapping(inputFrameworkOpNames = listOf("ReverseSequence"),opName = "reverse_sequence", - attributeMappingRules = listOf(valueMapping(mutableMapOf("batchDim" to "batch_dim","seqDim" to "seq_dim"))), - tensorNames = mutableMapOf("input" to "input","seqLengths" to "seq_lengths")) - -val roll = multipleNameMapping(inputFrameworkOpNames = listOf("Roll"),opName = "roll", - attributeMappingRules = listOf(ndarrayToIntList(mutableMapOf("shift" to "shift"))), - tensorNames = mutableMapOf("input" to "input","dimensions" to "axis","shiftsI" to "shift")) - -//TODO: verify usingLocking property, it's not showing up in descriptors -val scatterAdd = multipleNameMapping(inputFrameworkOpNames = listOf("TensorScatterAdd"),opName = "scatter_add", - tensorNames = mutableMapOf("input" to "tensor","indices" to "indices","updates" to "updates"), - attributeMappingRules = - listOf(booleanConstant(inputName = "lock",constantValue = false,argumentIndex = 0)[0], - booleanConstant(inputName = "checkIndices",constantValue = false,argumentIndex = 1)[0])) - -/* -val scatterDiv = multipleNameMapping(inputFrameworkOpNames = listOf("TensorScatterDiv"),opName = "scatter_div", - tensorNames = mutableMapOf("input" to "tensor","indices" to "indices","updates" to "updates")) -*/ - -/* -val scatterMax = multipleNameMapping(inputFrameworkOpNames = 
listOf("TensorScatterMax"),opName = "scatter_max", - tensorNames = mutableMapOf("input" to "tensor","indices" to "indices","updates" to "updates")) -*/ - - -/* -val scatterMin = multipleNameMapping(inputFrameworkOpNames = listOf("TensorScatterMin"),opName = "scatter_min", - tensorNames = mutableMapOf("input" to "tensor","indices" to "indices","updates" to "updates")) -*/ - -/* -val scatterMul = multipleNameMapping(inputFrameworkOpNames = listOf("TensorScatterMul"),opName = "scatter_mul", - tensorNames = mutableMapOf("input" to "tensor","indices" to "indices","updates" to "updates")) -*/ - -val scatterNd = multipleNameMapping(inputFrameworkOpNames = listOf("ScatterNd"),opName = "scatter_nd", - tensorNames = mutableMapOf("indices" to "indices","updates" to "updates","shape" to "shape"), - attributeMappingRules = listOf( - booleanConstant(inputName = "lock",constantValue = false,argumentIndex = 0)[0], - booleanConstant(inputName = "checkIndices",constantValue = false,argumentIndex = 1)[0])) - -/* -val scatterNdAdd = multipleNameMapping(inputFrameworkOpNames = listOf("TensorScatterNdAdd"),opName = "scatter_nd_add", - tensorNames = mutableMapOf("indices" to "indices","updates" to "updates","input" to "tensor")) -*/ - -/* -val scatterNdSub = multipleNameMapping(inputFrameworkOpNames = listOf("TensorScatterNdSub"),opName = "scatter_nd_sub", - tensorNames = mutableMapOf("indices" to "indices","updates" to "updates","input" to "tensor")) -*/ - -/* -val scatterNdUpdate = multipleNameMapping(inputFrameworkOpNames = listOf("TensorScatterNdUpdate"),opName = "scatter_nd_update", - tensorNames = mutableMapOf("indices" to "indices","updates" to "updates","input" to "tensor")) -*/ - - -val scatterSub = multipleNameMapping(inputFrameworkOpNames = listOf("TensorScatterSub"), - opName = "scatter_sub", - tensorNames = mutableMapOf("indices" to "indices", - "updates" to "updates","input" to "tensor"), - attributeMappingRules = listOf( - booleanConstant(inputName = "lock",constantValue = 
false, - argumentIndex = 0)[0], - booleanConstant(inputName = "checkIndices",constantValue = false, - argumentIndex = 1)[0])) - -//TODO: note: TF expects indices, we don't support them? -val scatterUpdate = multipleNameMapping(inputFrameworkOpNames = listOf("TensorScatterUpdate"),opName = "scatter_update", - attributeMappingRules = listOf(intConstant(inputName = "dimension",constantValue = 0 as Integer,argumentIndex = 1)[0], - ndarrayToIntList(mutableMapOf("indices" to "indices"))), - tensorNames = mutableMapOf("operand" to "tensor","updates" to "updates")) - -val select = mapTensorNamesWithOp(inputFrameworkOpName = "Select",opName = "select",tensorNames = mutableMapOf("cond" to "condition","input" to "t","y" to "e")) - -val segmentMean = multipleNameMapping(inputFrameworkOpNames = listOf("SegmentMean"),opName = "segment_mean", - tensorNames = mutableMapOf("input" to "data","idxSegments" to "segment_ids")) - -val segmentMin = multipleNameMapping(inputFrameworkOpNames = listOf("SegmentMin"),opName = "segment_min", - tensorNames = mutableMapOf("input" to "data","idxSegments" to "segment_ids")) - - -val segmentMax = multipleNameMapping(inputFrameworkOpNames = listOf("SegmentMax"),opName = "segment_max", - tensorNames = mutableMapOf("input" to "data","idxSegments" to "segment_ids")) - - -val segmentProd = multipleNameMapping(inputFrameworkOpNames = listOf("SegmentProd"),opName = "segment_prod", - tensorNames = mutableMapOf("input" to "data","idxSegments" to "segment_ids")) - -val segmentSum = multipleNameMapping(inputFrameworkOpNames = listOf("SegmentSum"),opName = "segment_sum", - tensorNames = mutableMapOf("input" to "data","idxSegments" to "segment_ids")) - -val size = TensorflowMappingProcess( - opMappingRegistry = tensorflowOpRegistry, - inputFrameworkOpName = "Size", - opName = "size", - tensorMappingRules = listOf(mappingNDArrayInputs(mutableMapOf("input" to "input"))) -) - -val slice = mapTensorNamesWithOp(inputFrameworkOpName = "Slice",opName = "slice", - 
tensorNames = mutableMapOf("input" to "input","b" to "begin","e" to "size"), - attributeMappingRules = listOf(ndarrayToIntList(mutableMapOf("size" to "size")))) - -val selu = mapTensorNamesWithOp(inputFrameworkOpName = "Selu",opName = "selu",tensorNames = mutableMapOf("input" to "features"), - attributeMappingRules = - booleanConstant(inputName = "inPlace",constantValue = false,argumentIndex = 0)) - -val shapeOf = mapTensorNamesWithOp(inputFrameworkOpName = "Shape", - opName = "shape_of", - tensorNames = mutableMapOf("input" to "input"), - attributeMappingRules = listOf( - booleanConstant(inputName = "inPlace",constantValue = false,argumentIndex = 0)[0], - valueMapping(mutableMapOf("dtype" to "out_type")))) - -val softPlus = mapTensorNamesWithOp(inputFrameworkOpName = "Softplus",opName = "softplus",tensorNames = mutableMapOf("input" to "features"), - attributeMappingRules = booleanConstant(inputName = "inPlace",constantValue = false,argumentIndex = 0)) -val softSign = mapTensorNamesWithOp(inputFrameworkOpName = "Softsign",opName = "softsign",tensorNames = mutableMapOf("input" to "features"), - attributeMappingRules = booleanConstant(inputName = "inPlace",constantValue = false,argumentIndex = 0)) - -val shapeN = mapTensorNamesWithOp(inputFrameworkOpName = "ShapeN",opName = "shapes_of",tensorNames = mutableMapOf("input" to "input"), - attributeMappingRules = booleanConstant(inputName = "inPlace",constantValue = false,argumentIndex = 0)) - -val softMax = mapTensorNamesWithOp(inputFrameworkOpName = "Softmax",opName = "softmax",tensorNames = mutableMapOf("input" to "logits"),attributeMappingRules = -listOf(argDescriptorConstant( - listOf( - ArgDescriptor { - name = "dimension" - int64Value = 1 - argType = OpNamespace.ArgDescriptor.ArgType.INT64 - argIndex = 0 - }, - ArgDescriptor { - name = "inPlace" - boolValue = false - argType = OpNamespace.ArgDescriptor.ArgType.BOOL - argIndex = 0 - } - ) -))) - - - - -val spaceToBatch = TensorflowMappingProcess( - opName = 
"space_to_batch", - inputFrameworkOpName = "SpaceToBatch", - opMappingRegistry = tensorflowOpRegistry, - attributeMappingRules = listOf( - valueMapping(mapOf("blockSize" to "block_size"))), - tensorMappingRules = listOf(mappingNDArrayInputs(mutableMapOf("input" to "input","padding" to "paddings"))) -) - -val spaceToBatchNd = TensorflowMappingProcess( - opName = "space_to_batch_nd", - inputFrameworkOpName = "SpaceToBatchND", - opMappingRegistry = tensorflowOpRegistry, - attributeMappingRules = listOf( - ndarrayToIntList(mutableMapOf("blocks" to "block_shape")), - argDescriptorConstant(listOf( - ArgDescriptor { - name = "inPlace" - boolValue = false - argType = OpNamespace.ArgDescriptor.ArgType.BOOL - argIndex = 0 - - } - ))), - tensorMappingRules = listOf(mappingNDArrayInputs(mutableMapOf("input" to "input","blockShape" to "block_shape","padding" to "paddings"))) -) - -val spaceToDepth = TensorflowMappingProcess( - opName = "space_to_depth", - inputFrameworkOpName = "SpaceToDepth", - tensorMappingRules = listOf(mappingNDArrayInputs(mutableMapOf("input" to "input"))), - attributeMappingRules = listOf(valueMapping(mapOf("block_size" to "block_size")), - stringEqualsRule("isNHWC",inputFrameworkAttributeName = "data_format",valueToTest = "NHWC",argumentIndex = 1)), - opMappingRegistry = tensorflowOpRegistry -) - -val split = TensorflowMappingProcess( - opName = "split", - inputFrameworkOpName = "Split", - tensorMappingRules = listOf(mappingNDArrayInputs(mutableMapOf("a" to "split_dim","b" to "value"))), - attributeMappingRules = listOf(valueMapping(mapOf("numSplit" to "num_split")) - , ndarrayToIntList(mutableMapOf("dimensions" to "split_dim"))), - opMappingRegistry = tensorflowOpRegistry -) - - -val splitV = TensorflowMappingProcess( - opName = "split_v", - inputFrameworkOpName = "SplitV", - tensorMappingRules = listOf(mappingNDArrayInputs(mutableMapOf( - "input" to "value", - "sizes" to "size_splits", - "_a" to "split_dim"))), - attributeMappingRules = listOf( - 
valueMapping(mutableMapOf("numSplit" to "num_split")), - convertNDArrayInputToNumericalAttr(mutableMapOf("dimensions" to "split_dim")), - ndarrayToIntList(mutableMapOf("dimensions" to "split_dim"))), - opMappingRegistry = tensorflowOpRegistry -) - -val squeeze = TensorflowMappingProcess( - opName = "squeeze", - inputFrameworkOpName = "Squeeze", - tensorMappingRules = listOf(mappingNDArrayInputs(mutableMapOf("input" to "input"))), - attributeMappingRules = listOf( - listNumberToNDarray(mutableMapOf("a" to "squeeze_dims")), - listNumberToListNumber(outputAttributeValue = "_a",inputAttributeValue = "squeeze_dims")), - opMappingRegistry = tensorflowOpRegistry -) - -val stridedSlice = TensorflowMappingProcess( - opName = "strided_slice", - inputFrameworkOpName = "StridedSlice", - opMappingRegistry = tensorflowOpRegistry, - tensorMappingRules = listOf(mappingNDArrayInputs(mutableMapOf("input" to "input", - "v_begin" to "begin", - "v_end" to "end", - "v_stride" to "strides"))), - attributeMappingRules = listOf( - valueMapping(mutableMapOf("begin_mask" to "begin_mask","end_mask" to "end_mask", - "ellipsis_mask" to "ellipsis_mask","new_axis_mask" to "new_axis_mask", - "shrink_axis_mask" to "shrink_axis_mask"))) -) - - -/* -val svd = TensorflowMappingProcess( - opName = "svd", - inputFrameworkOpName = "Svd", - opMappingRegistry = tensorflowOpRegistry, - tensorMappingRules = listOf(mappingNDArrayInputs(mutableMapOf("input" to "input"))), - attributeMappingRules = listOf(valueMapping(mutableMapOf("computeUv" to "compute_uv","fullMatrices" to "full_matrices"))) -) -*/ - -val switch = TensorflowMappingProcess( - opName = "Switch", - inputFrameworkOpName = "Switch", - opMappingRegistry = tensorflowOpRegistry, - tensorMappingRules = listOf(mappingNDArrayInputs(mutableMapOf("input" to "data","condition" to "pred"))) -) - - -//TODO: revisit this, not sure why the ops are off -val tensorArrayConcat = multipleNameMapping(inputFrameworkOpNames = listOf("TensorArrayConcat", 
"TensorArrayConcatV2", "TensorArrayConcatV3"), - opName = "stack_list", - tensorNames = mutableMapOf("list" to "flow_in")) - -//TODO: revisit this, not sure why the ops are off -val tensorArrayGather = multipleNameMapping(inputFrameworkOpNames = listOf("TensorArrayGather", "TensorArrayGatherV2", "TensorArrayGatherV3"), - opName = "gather_list", - tensorNames = mutableMapOf("indices" to "indices","list" to "flow_in")) -//TODO: revisit this, not sure why the ops are off -/*val tensorArrayPack = multipleNameMapping(inputFrameworkOpNames = listOf("TensorArrayPack", "TensorArrayPackV2", "TensorArrayPackV3"), - opName = "tensorarraypackv3", - tensorNames = mutableMapOf("indices" to "indices"))*/ -//TODO: revisit this, not sure why the ops are off - -val tensorArrayRead = multipleNameMapping(inputFrameworkOpNames = listOf("TensorArrayRead", "TensorArrayReadV2", "TensorArrayReadV3"), - opName = "read_list", - attributeMappingRules = listOf(ndarrayToIntList(mutableMapOf("index" to "index"))), - tensorNames = mutableMapOf("list" to "flow_in")) -//TODO: revisit this, not sure why the ops are off - -val tensorArrayScatter = multipleNameMapping(inputFrameworkOpNames = listOf("TensorArrayScatter", "TensorArrayScatterV2", "TensorArrayScatterV3"), - opName = "scatter_list", - tensorNames = mutableMapOf("array" to "value","sizes" to "indices","list" to "flow_in")) - -//TODO: revisit this, not sure why the ops are off - -val tensorArraySize = multipleNameMapping(inputFrameworkOpNames = listOf("TensorArraySize", "TensorArraySizeV2", "TensorArraySizeV3"), - opName = "size_list", - tensorNames = mutableMapOf("list" to "handle","list" to "flow_in")) - -//TODO: revisit this, not sure why the ops are off - -val tensorArraySplit = multipleNameMapping(inputFrameworkOpNames = listOf("TensorArraySplit", "TensorArraySplitV2", "TensorArraySplitV3"), - opName = "split_list", - tensorNames = mutableMapOf("sizes" to "lengths","list" to "value")) - -val tile = 
mapTensorNamesWithOp(inputFrameworkOpName = "Tile",opName = "tile", - attributeMappingRules = listOf(intConstant(inputName = "dimensions",constantValue = 0 as Integer,argumentIndex = 0)[0], - booleanConstant(inputName = "is_static_reps",constantValue = true,argumentIndex = 0)[0]), - tensorNames = mutableMapOf("input" to "input","reps_vector" to "multiples")) - -val topk = multipleNameMapping(inputFrameworkOpNames = listOf("TopK"),opName = "top_k", - tensorNames = mutableMapOf("input" to "input"), - attributeMappingRules = listOf(valueMapping(mutableMapOf("needSort" to "sorted","k" to "k")))) - -val topkV2 = multipleNameMapping(inputFrameworkOpNames = listOf("TopKV2"),opName = "top_k", - tensorNames = mutableMapOf("input" to "input"), - attributeMappingRules = listOf(valueMapping(mutableMapOf("needSort" to "sorted")), - convertNDArrayInputToNumericalAttr(mutableMapOf("k" to "k")))) - -val transpose = mapTensorNamesWithOp( - inputFrameworkOpName = "Transpose", - opName = "transpose", - tensorNames = mutableMapOf("input" to "x","permuteDims" to "perm")) - - -//note we don't allow unique with an axis argument -val unique = multipleNameMapping( - inputFrameworkOpNames = listOf("Unique","UniqueV2"), - opName = "unique", - tensorNames = mutableMapOf("input" to "x") -) - - -/** - * NOTE: Ours only supports vectors, not 2d. 
- */
-val uniqueWithCounts = multipleNameMapping(
-    inputFrameworkOpNames = listOf("UniqueWithCounts","UniqueWithCountsV2"),
-    opName = "unique_with_counts",
-    tensorNames = mutableMapOf("input" to "x")
-)
-
-val unpack = multipleNameMapping(inputFrameworkOpNames = listOf("Unpack"),
-    opName = "unstack",
-    tensorNames = mutableMapOf("input" to "value"),
-    attributeMappingRules = listOf(valueMapping(mutableMapOf("dimensions" to "axis","num" to "num"))))
-
-
-val unsortedSegmentMax = mapTensorNamesWithOp(inputFrameworkOpName = "UnsortedSegmentMax",
-    opName = "unsorted_segment_max",
-    attributeMappingRules = listOf(
-        convertNDArrayInputToNumericalAttr(mutableMapOf("numSegments" to "num_segments"))),
-    tensorNames = mutableMapOf("input" to "data","idxSegments" to "segment_ids"))
-
-val unsortedSegmentMin = mapTensorNamesWithOp(inputFrameworkOpName = "UnsortedSegmentMin",
-    opName = "unsorted_segment_min",
-    attributeMappingRules = listOf(convertNDArrayInputToNumericalAttr(mutableMapOf("numSegments" to "num_segments"))),
-    tensorNames = mutableMapOf("input" to "data","idxSegments" to "segment_ids"))
-
-val unsortedSegmentProd = mapTensorNamesWithOp(inputFrameworkOpName = "UnsortedSegmentProd",
-    opName = "unsorted_segment_prod",
-    attributeMappingRules = listOf(
-        convertNDArrayInputToNumericalAttr(mutableMapOf("numSegments" to "num_segments"))),
-    tensorNames = mutableMapOf("input" to "data","idxSegments" to "segment_ids"))
-
-
-val unsortedSegmentSum = mapTensorNamesWithOp(inputFrameworkOpName = "UnsortedSegmentSum",
-    opName = "unsorted_segment_sum",
-    attributeMappingRules = listOf(convertNDArrayInputToNumericalAttr(mutableMapOf("numSegments" to "num_segments"))),
-    tensorNames = mutableMapOf("input" to "data","idxSegments" to "segment_ids"))
-
-//TODO: Figure out if need to map
-val nextIteration = mapTensorNamesWithOp(inputFrameworkOpName = "NextIteration",opName = "next_iteration",
-    tensorNames = mutableMapOf("input" to "data"))
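The declarations above all follow the same pattern: `multipleNameMapping` registers one nd4j op under several TensorFlow op-name variants (e.g. `TensorArrayConcat`/`V2`/`V3` all map to `stack_list`), sharing a single tensor-name map. A minimal, self-contained sketch of that idea, using toy stand-ins (`ToyMappingProcess`, `toyMultipleNameMapping` are hypothetical names, not the real registry classes, which are parameterized over the TensorFlow protobuf types):

```kotlin
// Toy stand-in for a mapping process: which framework op maps to which nd4j op,
// and how the framework's input tensor names map onto nd4j argument names.
data class ToyMappingProcess(
    val inputFrameworkOpName: String,
    val opName: String,
    val tensorNames: Map<String, String>
)

// Mirrors the multipleNameMapping helper: one nd4j op name, many TF op-name
// variants, one shared tensor-name map.
fun toyMultipleNameMapping(
    inputFrameworkOpNames: List<String>,
    opName: String,
    tensorNames: Map<String, String>
): List<ToyMappingProcess> =
    inputFrameworkOpNames.map { ToyMappingProcess(it, opName, tensorNames) }

fun main() {
    val concat = toyMultipleNameMapping(
        inputFrameworkOpNames = listOf("TensorArrayConcat", "TensorArrayConcatV2", "TensorArrayConcatV3"),
        opName = "stack_list",
        tensorNames = mapOf("list" to "flow_in")
    )
    check(concat.size == 3)
    check(concat.all { it.opName == "stack_list" })
    println("ok")
}
```

One practical note visible in the real declarations: because `mutableMapOf` keeps only the last entry for a duplicated key, a tensor-name map with a repeated key (as in the `tensorArraySize` declaration) silently drops the earlier binding.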
- -val noOp = mapTensorNamesWithOp(inputFrameworkOpName = "NoOp",opName = "noop",tensorNames = mutableMapOf()) - -val where = mapTensorNamesWithOp(inputFrameworkOpName = "Where",opName = "Where", - tensorNames = mutableMapOf("condition" to "input") -) - -val whileOp = mapTensorNamesWithOp(inputFrameworkOpName = "While",opName = "While", - tensorNames = mutableMapOf("condition" to "input"), - attributeMappingRules = booleanConstant(inputName = "isConstant",constantValue = false,argumentIndex = 0) -) - -val zerosLike = mapTensorNamesWithOp(inputFrameworkOpName = "ZerosLike",opName = "zeroslike", - tensorNames = mutableMapOf("input" to "x"), - attributeMappingRules = listOf( - booleanConstant(inputName = "inPlace",constantValue = false,argumentIndex = 0)[0], - valueMapping(mutableMapOf("dataType" to "T")) - )) - -val zeta = mapTensorNamesWithOp(inputFrameworkOpName = "Zeta",opName = "zeta", - tensorNames = mutableMapOf("input" to "x","q" to "q"), - attributeMappingRules = booleanConstant(inputName = "inPlace",constantValue = false,argumentIndex = 0)) - - -object TensorflowOpDeclarations { - init { - val groupedOps = tensorflowOps.opList.groupBy { input -> input.name } - val singleGroupedOps = HashMap() - groupedOps.forEach { name, node -> - singleGroupedOps[name] = node[0] - } - - OpRegistryHolder.registerOpList("tensorflow", singleGroupedOps) - tensorflowOps.opList.forEach { - tensorflowOpRegistry.registerInputFrameworkOpDef(it.name,it) - } - - nd4jOpDescriptors.opListList.forEach { - tensorflowOpRegistry.registerNd4jOpDef(it.name,it) - } - - reduceOps.forEach { tensorflowOpName, nd4jOpName -> - defineSingularReduce(inputFrameworkOpName = tensorflowOpName,inputOpName = nd4jOpName) - } - - - singleTransformArgs.forEach { - defineTensorflowSingleTransform(inputFrameworkOpName = it.key,inputOpName = it.value) - } - - elementWiseTransformOps.forEach { - defineTensorflowPairwiseTransforms(opName = it.value,inputFrameworkOpName = it.key) - } - - 
OpRegistryHolder.registerOpMappingRegistry("tensorflow", tensorflowOpRegistry) - - - - } -} - - diff --git a/contrib/codegen-tools/codegen/src/main/kotlin/org/nd4j/codegen/ir/tensorflow/TensorflowProtobufExtensions.kt b/contrib/codegen-tools/codegen/src/main/kotlin/org/nd4j/codegen/ir/tensorflow/TensorflowProtobufExtensions.kt deleted file mode 100644 index 8fdbb7bf7..000000000 --- a/contrib/codegen-tools/codegen/src/main/kotlin/org/nd4j/codegen/ir/tensorflow/TensorflowProtobufExtensions.kt +++ /dev/null @@ -1,169 +0,0 @@ -package org.nd4j.codegen.ir.tensorflow - -import org.nd4j.shade.protobuf.ByteString -import org.tensorflow.framework.* -import java.nio.charset.Charset - -fun GraphDef.nodeByName(name: String): NodeDef { - val nodeNames = nodeList.map { node -> node.name } - return nodeList.first { it.name == name }!! -} - -fun ListAttrValue(vararg i: Long): AttrValue.ListValue = - AttrValue.ListValue.newBuilder().apply { - i.forEach { addI(it) } - }.build() - -fun TensorProto(block: TensorProto.Builder.() -> Unit): TensorProto { - return TensorProto.newBuilder().apply(block).build() -} - -fun TensorProto.Builder.RawData(byteArray: ByteArray) { - this.tensorContent = ByteString.copyFrom(byteArray) -} - -fun TensorProto.Builder.Shape(shape: List) { - this.tensorShape = TensorShapeProto { - Dims(shape) - } -} - -fun TensorProto.Builder.DataType(value: DataType) { - this.dtype = value -} - -fun TensorProto.Builder.String(value: String) { - this.addStringVal(ByteString.copyFrom(value.toByteArray(Charset.defaultCharset()))) -} - -fun TensorProto.Builder.StringData(value: List) { - this.addAllStringVal(value.map { value -> ByteString.copyFrom(value.toByteArray(Charset.defaultCharset())) }) -} - -fun TensorProto.Builder.Boolean(value: Boolean) { - this.addBoolVal(value) -} - -fun TensorProto.Builder.BooleanData(value: List) { - this.addAllBoolVal(value) -} - -fun TensorProto.Builder.Double(value: Double) { - this.addDoubleVal(value) -} - -fun 
TensorProto.Builder.Int64Data(value: List) { - this.addAllInt64Val(value) -} - - -fun TensorProto.Builder.Int32Data(value: List) { - this.addAllIntVal(value) -} - -fun TensorProto.Builder.DoubleData(value: List) { - this.addAllDoubleVal(value) -} - -fun TensorProto.Builder.Float(value: Float) { - this.addFloatVal(value) -} - -fun TensorProto.Builder.FloatData(value: List) { - this.addAllFloatVal(value) -} - -fun TensorShapeProto.Builder.Dim(name: String, size: Long) { - this.addDim(TensorShapeProto.Dim.newBuilder().setName(name).setSize(size).build()) -} - -fun Dim(block: TensorShapeProto.Dim.Builder.() -> Unit): TensorShapeProto.Dim { - return TensorShapeProto.Dim.newBuilder().apply(block).build() -} - -fun TensorShapeProto.Builder.Dims(shape: List) { - shape.forEachIndexed { index, value -> this.addDim( - Dim { - name = index.toString() - size = value - }) - } -} - -fun TensorShapeProto(block: TensorShapeProto.Builder.() -> Unit): TensorShapeProto { - return TensorShapeProto.newBuilder().apply(block).build() -} - -fun AttrValue(block: AttrValue.Builder.() -> Unit): AttrValue { - return AttrValue.newBuilder().apply(block).build() -} - - - -fun AttrValue.Builder.ListDataType(listDataTypes: List) { - this.listBuilder.addAllType(listDataTypes) -} - -fun AttrValue.Builder.ListInts(listInts: List) { - this.listBuilder.addAllI(listInts) -} - -fun AttrValue.Builder.LongVal(intVal: Long) { - this.i = intVal -} - -fun AttrValue.Builder.ListFloats(listFloats: List) { - this.listBuilder.addAllF(listFloats) -} - - - -fun GraphDef(block: GraphDef.Builder.() -> Unit): GraphDef { - return GraphDef.newBuilder().apply(block).build() -} - -fun GraphDef.Builder.Node(inputNode: NodeDef) { - this.addNode(inputNode) -} - -fun String.toByteString() = ByteString.copyFrom(this, Charset.defaultCharset()) - -fun OpDef(block: OpDef.Builder.() -> Unit): OpDef { - return OpDef.newBuilder().apply(block).build() -} - -fun NodeDef(block: NodeDef.Builder.() -> Unit): NodeDef { - return 
NodeDef.newBuilder().apply(block).build() -} - -fun ListValue(block: AttrValue.ListValue.Builder.() -> Unit): AttrValue.ListValue { - return AttrValue.ListValue.newBuilder().apply(block).build() -} - -fun AttrValue.ListValue.Builder.LongItems(value: List) { - this.addAllI(value) -} - -fun AttrValue.ListValue.Builder.IntItems(value: List) { - this.addAllI(value.map { it.toLong() }) -} - -fun AttrValue.ListValue.Builder.IntItem(value: Long) { - this.addI(value) -} - -fun NodeDef.Builder.Input(name: String) { - this.addInput(name) -} - -fun NodeDef.Builder.Attribute(name: String, value: AttrValue) { - this.putAttr(name, value) -} - -fun OpList.findOp(name: String): OpDef { - if(!this.opList.map { input -> input.name }.contains(name)) { - throw IllegalArgumentException("Op $name not found!") - } - return this.opList.first { it.name == name }!! -} - diff --git a/contrib/codegen-tools/codegen/src/main/kotlin/org/nd4j/codegen/ir/tensorflow/TensorflowRuleDeclarations.kt b/contrib/codegen-tools/codegen/src/main/kotlin/org/nd4j/codegen/ir/tensorflow/TensorflowRuleDeclarations.kt deleted file mode 100644 index e9f9d1470..000000000 --- a/contrib/codegen-tools/codegen/src/main/kotlin/org/nd4j/codegen/ir/tensorflow/TensorflowRuleDeclarations.kt +++ /dev/null @@ -1,1222 +0,0 @@ -package org.nd4j.codegen.ir.tensorflow - -import org.nd4j.codegen.ir.* -import org.nd4j.ir.OpNamespace -import org.tensorflow.framework.* - - -class TensorflowConditionalFieldValueIntIndexNDArrayRule - (mappingNamesToPerform: Map, transformerArgs: Map>) : - ConditionalFieldValueIntIndexNDArrayRule - (mappingNamesToPerform, transformerArgs) { - - override fun createIRAttribute(name: String, attrDef: OpDef.AttrDef, attributeValueType: AttrValue): IRAttribute { - return TensorflowIRAttr(attrDef, attributeValueType) - } - - override fun convertAttributesReverse(allInputArguments: List, inputArgumentsToProcess: List): List> { - TODO("Not yet implemented") - } - - override fun isInputFrameworkTensorName(name: 
String, mappingProcess: MappingProcess): Boolean { - val opDef = tensorflowOps.findOp(mappingProcess.inputFrameworkOpName()) - return isTensorflowTensorName(name,opDef) - } - - override fun isNd4jTensorName(name: String, mappingProcess: MappingProcess): Boolean { - val nd4jOpDescriptor = nd4jOpDescriptors.findOp(mappingProcess.opName()) - return isNd4jTensorName(name,nd4jOpDescriptor) - } - - override fun isInputFrameworkAttributeName(name: String, mappingProcess: MappingProcess< - GraphDef,OpDef, NodeDef, TensorProto, OpDef.AttrDef, AttrValue, DataType>): Boolean { - val opDef = tensorflowOps.findOp(mappingProcess.inputFrameworkOpName()) - return isTensorflowAttributeName(name,opDef) - } - - override fun isOutputFrameworkAttributeName(name: String, mappingProcess: MappingProcess): Boolean { - val nd4jOpDescriptor = nd4jOpDescriptors.findOp(mappingProcess.opName()) - return isOutputFrameworkAttributeName(name,nd4jOpDescriptor) - } - - override fun argDescriptorType(name: String, mappingProcess: MappingProcess): OpNamespace.ArgDescriptor.ArgType { - val nd4jOpDescriptor = nd4jOpDescriptors.findOp(mappingProcess.opName()) - return argDescriptorType(name,nd4jOpDescriptor) - } - - override fun attributeValueTypeFor(name: String, mappingProcess: MappingProcess): AttributeValueType { - val opDef = tensorflowOps.findOp(mappingProcess.inputFrameworkOpName()) - return tensorflowAttributeValueTypeFor(attributeName = name,opDef = opDef) - } - - -} - -fun conditionalFieldValueIntIndexNDArrayRule(outputAttribute: String, - inputFrameworkStringNameToTest: String, - targetValue: String, - trueIndex: Int, - falseIndex: Int, - attributeNameOfListAttribute: String, - argumentIndex: Int): TensorflowConditionalFieldValueIntIndexNDArrayRule { - return TensorflowConditionalFieldValueIntIndexNDArrayRule( - mappingNamesToPerform = mapOf(outputAttribute to inputFrameworkStringNameToTest), - transformerArgs = mapOf(outputAttribute to listOf( - ArgDescriptor { - name = "targetValue" - 
stringValue = targetValue - argIndex = argumentIndex - }, - ArgDescriptor { - name = "trueIndex" - int32Value = trueIndex - argIndex = argumentIndex - }, - ArgDescriptor { - name = "falseIndex" - int32Value = falseIndex - argIndex = argumentIndex - }, - ArgDescriptor { - name = "attributeNameOfListAttribute" - stringValue = attributeNameOfListAttribute - argIndex = argumentIndex - })) - ) -} - - - - - -class TensorflowConditionalFieldValueIntIndexArrayRule - (mappingNamesToPerform: Map, transformerArgs: Map>) : - ConditionalFieldValueIntIndexArrayRule - (mappingNamesToPerform, transformerArgs) { - - override fun createIRAttribute(name: String, attrDef: OpDef.AttrDef, attributeValueType: AttrValue): IRAttribute { - return TensorflowIRAttr(attrDef, attributeValueType) - } - - override fun convertAttributesReverse(allInputArguments: List, inputArgumentsToProcess: List): List> { - TODO("Not yet implemented") - } - - override fun isInputFrameworkTensorName(name: String, mappingProcess: MappingProcess): Boolean { - val opDef = tensorflowOps.findOp(mappingProcess.inputFrameworkOpName()) - return isTensorflowTensorName(name,opDef) - } - - override fun isNd4jTensorName(name: String, mappingProcess: MappingProcess): Boolean { - val nd4jOpDescriptor = nd4jOpDescriptors.findOp(mappingProcess.opName()) - return isNd4jTensorName(name,nd4jOpDescriptor) - } - - override fun isInputFrameworkAttributeName(name: String, mappingProcess: MappingProcess< - GraphDef,OpDef, NodeDef, TensorProto, OpDef.AttrDef, AttrValue, DataType>): Boolean { - val opDef = tensorflowOps.findOp(mappingProcess.inputFrameworkOpName()) - return isTensorflowAttributeName(name,opDef) - } - - override fun isOutputFrameworkAttributeName(name: String, mappingProcess: MappingProcess): Boolean { - val nd4jOpDescriptor = nd4jOpDescriptors.findOp(mappingProcess.opName()) - return isOutputFrameworkAttributeName(name,nd4jOpDescriptor) - } - - override fun argDescriptorType(name: String, mappingProcess: MappingProcess): 
OpNamespace.ArgDescriptor.ArgType { - val nd4jOpDescriptor = nd4jOpDescriptors.findOp(mappingProcess.opName()) - return argDescriptorType(name,nd4jOpDescriptor) - } - - override fun attributeValueTypeFor(name: String, mappingProcess: MappingProcess): AttributeValueType { - val opDef = tensorflowOps.findOp(mappingProcess.inputFrameworkOpName()) - return tensorflowAttributeValueTypeFor(attributeName = name,opDef = opDef) - } - - -} - -fun conditionalFieldValueIntIndexArrayRule(outputAttribute: String, - inputFrameworkStringNameToTest: String, - targetValue: String, - trueIndex: Int, - falseIndex: Int, - attributeNameOfListAttribute: String, - argumentIndex: Int): TensorflowConditionalFieldValueIntIndexArrayRule { - return TensorflowConditionalFieldValueIntIndexArrayRule( - mappingNamesToPerform = mapOf(outputAttribute to inputFrameworkStringNameToTest), - transformerArgs = mapOf(outputAttribute to listOf( - ArgDescriptor { - name = "targetValue" - stringValue = targetValue - argIndex = argIndex - }, - ArgDescriptor { - name = "trueIndex" - int32Value = trueIndex - argIndex = argumentIndex - }, - ArgDescriptor { - name = "falseIndex" - int32Value = falseIndex - argIndex = argumentIndex - }, - ArgDescriptor { - name = "attributeNameOfListAttribute" - stringValue = attributeNameOfListAttribute - argIndex = argumentIndex - })) - ) -} - -class TensorflowNDArraySizeAt(mappingNamesToPerform: Map, transformerArgs: Map>): - NDArraySizeAtRule(mappingNamesToPerform, transformerArgs) { - - override fun createIRAttribute(name: String, attrDef: OpDef.AttrDef, attributeValueType: AttrValue): IRAttribute { - return TensorflowIRAttr(attrDef, attributeValueType) - } - - override fun convertAttributesReverse(allInputArguments: List, inputArgumentsToProcess: List): List> { - TODO("Not yet implemented") - } - override fun isInputFrameworkTensorName(name: String, mappingProcess: MappingProcess): Boolean { - val opDef = tensorflowOps.findOp(mappingProcess.inputFrameworkOpName()) - return 
isTensorflowTensorName(name,opDef) - } - - override fun isNd4jTensorName(name: String, mappingProcess: MappingProcess): Boolean { - val nd4jOpDescriptor = nd4jOpDescriptors.findOp(mappingProcess.opName()) - return isNd4jTensorName(name,nd4jOpDescriptor) - } - - override fun isInputFrameworkAttributeName(name: String, mappingProcess: MappingProcess): Boolean { - val opDef = tensorflowOps.findOp(mappingProcess.inputFrameworkOpName()) - return isTensorflowAttributeName(name,opDef) - } - - override fun isOutputFrameworkAttributeName(name: String, mappingProcess: MappingProcess): Boolean { - val nd4jOpDescriptor = nd4jOpDescriptors.findOp(mappingProcess.opName()) - return isOutputFrameworkAttributeName(name,nd4jOpDescriptor) - } - - override fun argDescriptorType(name: String, mappingProcess: MappingProcess): OpNamespace.ArgDescriptor.ArgType { - val nd4jOpDescriptor = nd4jOpDescriptors.findOp(mappingProcess.opName()) - return argDescriptorType(name,nd4jOpDescriptor) - } - - override fun attributeValueTypeFor(name: String, mappingProcess: MappingProcess): AttributeValueType { - val opDef = tensorflowOps.findOp(mappingProcess.inputFrameworkOpName()) - return tensorflowAttributeValueTypeFor(attributeName = name,opDef = opDef) - } -} - -fun sizeAtRule(dimensionIndex: Int, - outputAttributeName: String, - inputFrameworkAttributeName: String, - argumentIndex: Int): TensorflowNDArraySizeAt { - return TensorflowNDArraySizeAt( - mappingNamesToPerform = mapOf(outputAttributeName to inputFrameworkAttributeName), - transformerArgs = mapOf(outputAttributeName to listOf(OpNamespace.ArgDescriptor.newBuilder().apply { - name = inputFrameworkAttributeName - int32Value = dimensionIndex - argIndex = argumentIndex - }.build())) - ) -} - -class TensorflowNDArrayExtractScalarValue(mappingNamesToPerform: Map = emptyMap(), - transformerArgs: Map> = emptyMap()) : - NDArrayExtractScalarValue - ( mappingNamesToPerform, transformerArgs) { - - override fun createIRAttribute(name: String, attrDef: 
OpDef.AttrDef, attributeValueType: AttrValue): IRAttribute { - return TensorflowIRAttr(attrDef, attributeValueType) - } - - override fun convertAttributesReverse(allInputArguments: List, inputArgumentsToProcess: List): List> { - TODO("Not yet implemented") - } - override fun isInputFrameworkTensorName(name: String, mappingProcess: MappingProcess): Boolean { - val opDef = tensorflowOps.findOp(mappingProcess.inputFrameworkOpName()) - return isTensorflowTensorName(name,opDef) - } - - override fun isNd4jTensorName(name: String, mappingProcess: MappingProcess): Boolean { - val nd4jOpDescriptor = nd4jOpDescriptors.findOp(mappingProcess.opName()) - return isNd4jTensorName(name,nd4jOpDescriptor) - } - - override fun isInputFrameworkAttributeName(name: String, mappingProcess: MappingProcess): Boolean { - val opDef = tensorflowOps.findOp(mappingProcess.inputFrameworkOpName()) - return isTensorflowAttributeName(name,opDef) - } - - override fun isOutputFrameworkAttributeName(name: String, mappingProcess: MappingProcess): Boolean { - val nd4jOpDescriptor = nd4jOpDescriptors.findOp(mappingProcess.opName()) - return isOutputFrameworkAttributeName(name,nd4jOpDescriptor) - } - - override fun argDescriptorType(name: String, mappingProcess: MappingProcess): OpNamespace.ArgDescriptor.ArgType { - val nd4jOpDescriptor = nd4jOpDescriptors.findOp(mappingProcess.opName()) - return argDescriptorType(name,nd4jOpDescriptor) - } - - override fun attributeValueTypeFor(name: String, mappingProcess: MappingProcess): AttributeValueType { - val opDef = tensorflowOps.findOp(mappingProcess.inputFrameworkOpName()) - return tensorflowAttributeValueTypeFor(attributeName = name,opDef = opDef) - } -} - -fun ndarrayExtractScalarValue(outputAttribute: String, - inputFrameworkAttributeName: String, - argumentIndex: Int, - scalarIndex: Int): TensorflowNDArrayExtractScalarValue { - return TensorflowNDArrayExtractScalarValue( - mappingNamesToPerform = mapOf(outputAttribute to inputFrameworkAttributeName), - 
transformerArgs = mapOf(outputAttribute to listOf( - ArgDescriptor { - name = outputAttribute - int64Value = scalarIndex.toLong() - argType = OpNamespace.ArgDescriptor.ArgType.INT64 - argIndex = argumentIndex - }))) -} - - - - -class TensorflowStringEqualsAdapterRule(mappingNamesToPerform: Map = emptyMap(), - transformerArgs: Map> = emptyMap()) : - StringEqualsAdapterRule - ( mappingNamesToPerform, transformerArgs) { - - override fun createIRAttribute(name: String, attrDef: OpDef.AttrDef, attributeValueType: AttrValue): IRAttribute { - return TensorflowIRAttr(attrDef, attributeValueType) - } - - override fun convertAttributesReverse(allInputArguments: List, inputArgumentsToProcess: List): List> { - TODO("Not yet implemented") - } - override fun isInputFrameworkTensorName(name: String, mappingProcess: MappingProcess): Boolean { - val opDef = tensorflowOps.findOp(mappingProcess.inputFrameworkOpName()) - return isTensorflowTensorName(name,opDef) - } - - override fun isNd4jTensorName(name: String, mappingProcess: MappingProcess): Boolean { - val nd4jOpDescriptor = nd4jOpDescriptors.findOp(mappingProcess.opName()) - return isNd4jTensorName(name,nd4jOpDescriptor) - } - - override fun isInputFrameworkAttributeName(name: String, mappingProcess: MappingProcess): Boolean { - val opDef = tensorflowOps.findOp(mappingProcess.inputFrameworkOpName()) - return isTensorflowAttributeName(name,opDef) - } - - override fun isOutputFrameworkAttributeName(name: String, mappingProcess: MappingProcess): Boolean { - val nd4jOpDescriptor = nd4jOpDescriptors.findOp(mappingProcess.opName()) - return isOutputFrameworkAttributeName(name,nd4jOpDescriptor) - } - - override fun argDescriptorType(name: String, mappingProcess: MappingProcess): OpNamespace.ArgDescriptor.ArgType { - val nd4jOpDescriptor = nd4jOpDescriptors.findOp(mappingProcess.opName()) - return argDescriptorType(name,nd4jOpDescriptor) - } - - override fun attributeValueTypeFor(name: String, mappingProcess: MappingProcess): 
AttributeValueType { - val opDef = tensorflowOps.findOp(mappingProcess.inputFrameworkOpName()) - return tensorflowAttributeValueTypeFor(attributeName = name,opDef = opDef) - } -} - -fun stringEqualsRule(outputAttribute: String, - inputFrameworkAttributeName: String, - valueToTest: String, - argumentIndex: Int): TensorflowStringEqualsAdapterRule { - return TensorflowStringEqualsAdapterRule( - mappingNamesToPerform = mapOf(outputAttribute to inputFrameworkAttributeName), - transformerArgs = mapOf(outputAttribute to listOf( - ArgDescriptor { - name = inputFrameworkAttributeName - stringValue = valueToTest - argType = OpNamespace.ArgDescriptor.ArgType.STRING - argIndex = argumentIndex - }))) -} - - -class TensorflowStringNotEqualsAdapterRule(mappingNamesToPerform: Map = emptyMap(), - transformerArgs: Map> = emptyMap()) : - StringNotEqualsAdapterRule - ( mappingNamesToPerform, transformerArgs) { - - override fun createIRAttribute(name: String, attrDef: OpDef.AttrDef, attributeValueType: AttrValue): IRAttribute { - return TensorflowIRAttr(attrDef, attributeValueType) - } - - override fun convertAttributesReverse(allInputArguments: List, inputArgumentsToProcess: List): List> { - TODO("Not yet implemented") - } - override fun isInputFrameworkTensorName(name: String, mappingProcess: MappingProcess): Boolean { - val opDef = tensorflowOps.findOp(mappingProcess.inputFrameworkOpName()) - return isTensorflowTensorName(name,opDef) - } - - override fun isNd4jTensorName(name: String, mappingProcess: MappingProcess): Boolean { - val nd4jOpDescriptor = nd4jOpDescriptors.findOp(mappingProcess.opName()) - return isNd4jTensorName(name,nd4jOpDescriptor) - } - - override fun isInputFrameworkAttributeName(name: String, - mappingProcess: MappingProcess): Boolean { - val opDef = tensorflowOps.findOp(mappingProcess.inputFrameworkOpName()) - return isTensorflowAttributeName(name,opDef) - } - - override fun isOutputFrameworkAttributeName(name: String, - mappingProcess: MappingProcess): Boolean 
{ - val nd4jOpDescriptor = nd4jOpDescriptors.findOp(mappingProcess.opName()) - return isOutputFrameworkAttributeName(name,nd4jOpDescriptor) - } - - override fun argDescriptorType(name: String, mappingProcess: MappingProcess): OpNamespace.ArgDescriptor.ArgType { - val nd4jOpDescriptor = nd4jOpDescriptors.findOp(mappingProcess.opName()) - return argDescriptorType(name,nd4jOpDescriptor) - } - - override fun attributeValueTypeFor(name: String, mappingProcess: MappingProcess): AttributeValueType { - val opDef = tensorflowOps.findOp(mappingProcess.inputFrameworkOpName()) - return tensorflowAttributeValueTypeFor(attributeName = name,opDef = opDef) - } -} - -fun stringNotEqualsRule(outputAttribute: String, inputFrameworkAttributeName: String, valueToTest: String,argumentIndex: Int): TensorflowStringNotEqualsAdapterRule { - return TensorflowStringNotEqualsAdapterRule( - mappingNamesToPerform = mapOf(outputAttribute to inputFrameworkAttributeName), - transformerArgs = mapOf(outputAttribute to listOf(OpNamespace.ArgDescriptor.newBuilder().apply { - name = inputFrameworkAttributeName - stringValue = valueToTest - argIndex = argumentIndex - }.build()))) -} - - -class TensorflowStringContainsAdapterRule(mappingNamesToPerform: Map = emptyMap(), - transformerArgs: Map> = emptyMap()) : - StringContainsAdapterRule - ( mappingNamesToPerform, transformerArgs) { - - override fun createIRAttribute(name: String, attrDef: OpDef.AttrDef, attributeValueType: AttrValue): IRAttribute { - return TensorflowIRAttr(attrDef, attributeValueType) - } - - override fun convertAttributesReverse(allInputArguments: List, inputArgumentsToProcess: List): List> { - TODO("Not yet implemented") - } - override fun isInputFrameworkTensorName(name: String, mappingProcess: MappingProcess): Boolean { - val opDef = tensorflowOps.findOp(mappingProcess.inputFrameworkOpName()) - return isTensorflowTensorName(name,opDef) - } - - override fun isNd4jTensorName(name: String, mappingProcess: MappingProcess): Boolean { - 
val nd4jOpDescriptor = nd4jOpDescriptors.findOp(mappingProcess.opName()) - return isNd4jTensorName(name,nd4jOpDescriptor) - } - - override fun isInputFrameworkAttributeName(name: String, mappingProcess: MappingProcess): Boolean { - val opDef = tensorflowOps.findOp(mappingProcess.inputFrameworkOpName()) - return isTensorflowAttributeName(name,opDef) - } - - override fun isOutputFrameworkAttributeName(name: String, mappingProcess: MappingProcess): Boolean { - val nd4jOpDescriptor = nd4jOpDescriptors.findOp(mappingProcess.opName()) - return isOutputFrameworkAttributeName(name,nd4jOpDescriptor) - } - - override fun argDescriptorType(name: String, mappingProcess: MappingProcess): OpNamespace.ArgDescriptor.ArgType { - val nd4jOpDescriptor = nd4jOpDescriptors.findOp(mappingProcess.opName()) - return argDescriptorType(name,nd4jOpDescriptor) - } - - override fun attributeValueTypeFor(name: String, mappingProcess: MappingProcess): AttributeValueType { - val opDef = tensorflowOps.findOp(mappingProcess.inputFrameworkOpName()) - return tensorflowAttributeValueTypeFor(attributeName = name,opDef = opDef) - } -} - -fun stringContainsRule(outputAttribute: String, inputFrameworkAttributeName: String, valueToTest: String): TensorflowStringContainsAdapterRule { - return TensorflowStringContainsAdapterRule( - mappingNamesToPerform = mapOf(outputAttribute to inputFrameworkAttributeName), - transformerArgs = mapOf(outputAttribute to listOf(OpNamespace.ArgDescriptor.newBuilder().apply { - name = inputFrameworkAttributeName - stringValue = valueToTest - }.build()))) -} - - -class TensorflowAttributeScalarNDArrayAttribute(mappingNamesToPerform: Map = emptyMap(), - transformerArgs: Map> = emptyMap()) : - AttributeScalarNDArrayAttribute - ( mappingNamesToPerform, transformerArgs) { - - override fun createIRAttribute(name: String, attrDef: OpDef.AttrDef, attributeValueType: AttrValue): IRAttribute { - return TensorflowIRAttr(attrDef, attributeValueType) - } - - override fun 
convertAttributesReverse(allInputArguments: List, inputArgumentsToProcess: List): List> { - TODO("Not yet implemented") - } - override fun isInputFrameworkTensorName(name: String, mappingProcess: MappingProcess): Boolean { - val opDef = tensorflowOps.findOp(mappingProcess.inputFrameworkOpName()) - return isTensorflowTensorName(name,opDef) - } - - override fun isNd4jTensorName(name: String, mappingProcess: MappingProcess): Boolean { - val nd4jOpDescriptor = nd4jOpDescriptors.findOp(mappingProcess.opName()) - return isNd4jTensorName(name,nd4jOpDescriptor) - } - - override fun isInputFrameworkAttributeName(name: String, mappingProcess: MappingProcess): Boolean { - val opDef = tensorflowOps.findOp(mappingProcess.inputFrameworkOpName()) - return isTensorflowAttributeName(name,opDef) - } - - override fun isOutputFrameworkAttributeName(name: String, mappingProcess: MappingProcess): Boolean { - val nd4jOpDescriptor = nd4jOpDescriptors.findOp(mappingProcess.opName()) - return isOutputFrameworkAttributeName(name,nd4jOpDescriptor) - } - - override fun argDescriptorType(name: String, mappingProcess: MappingProcess): OpNamespace.ArgDescriptor.ArgType { - val nd4jOpDescriptor = nd4jOpDescriptors.findOp(mappingProcess.opName()) - return argDescriptorType(name,nd4jOpDescriptor) - } - - override fun attributeValueTypeFor(name: String, mappingProcess: MappingProcess): AttributeValueType { - val opDef = tensorflowOps.findOp(mappingProcess.inputFrameworkOpName()) - return tensorflowAttributeValueTypeFor(attributeName = name,opDef = opDef) - } -} - -fun attributeScalarToNDArrayInput(outputAttribute: String, inputFrameworkAttributeName: String): TensorflowAttributeScalarNDArrayAttribute { - return TensorflowAttributeScalarNDArrayAttribute( - mappingNamesToPerform = mapOf(outputAttribute to inputFrameworkAttributeName)) -} - - - - -class TensorflowValueMappingRule(mappingNamesToPerform: Map, transformerArgs: Map>) : - ValueMapping(mappingNamesToPerform, transformerArgs) { - - override 
fun createIRAttribute(name: String, attrDef: OpDef.AttrDef, attributeValueType: AttrValue): IRAttribute { - return TensorflowIRAttr(attrDef, attributeValueType) - } - - override fun convertAttributesReverse(allInputArguments: List, inputArgumentsToProcess: List): List> { - TODO("Not yet implemented") - } - override fun isInputFrameworkTensorName(name: String, mappingProcess: MappingProcess): Boolean { - val opDef = tensorflowOps.findOp(mappingProcess.inputFrameworkOpName()) - return isTensorflowTensorName(name,opDef) - } - - override fun isNd4jTensorName(name: String, mappingProcess: MappingProcess): Boolean { - val nd4jOpDescriptor = nd4jOpDescriptors.findOp(mappingProcess.opName()) - return isNd4jTensorName(name,nd4jOpDescriptor) - } - - override fun isInputFrameworkAttributeName(name: String, mappingProcess: MappingProcess): Boolean { - val opDef = tensorflowOps.findOp(mappingProcess.inputFrameworkOpName()) - return isTensorflowAttributeName(name,opDef) - } - - override fun isOutputFrameworkAttributeName(name: String, mappingProcess: MappingProcess): Boolean { - val nd4jOpDescriptor = nd4jOpDescriptors.findOp(mappingProcess.opName()) - return isOutputFrameworkAttributeName(name,nd4jOpDescriptor) - } - - override fun argDescriptorType(name: String, mappingProcess: MappingProcess): OpNamespace.ArgDescriptor.ArgType { - val nd4jOpDescriptor = nd4jOpDescriptors.findOp(mappingProcess.opName()) - return argDescriptorType(name,nd4jOpDescriptor) - } - - override fun attributeValueTypeFor(name: String, mappingProcess: MappingProcess): AttributeValueType { - val opDef = tensorflowOps.findOp(mappingProcess.inputFrameworkOpName()) - return tensorflowAttributeValueTypeFor(attributeName = name,opDef = opDef) - } -} - -fun valueMapping(mappings: Map): TensorflowValueMappingRule { - return TensorflowValueMappingRule(mappingNamesToPerform = mappings,transformerArgs = emptyMap()) -} - -class TensorflowInvertBooleanNumber(mappingNamesToPerform: Map, transformerArgs: Map>) : - 
InvertBooleanNumber(mappingNamesToPerform, transformerArgs) { - - override fun createIRAttribute(name: String, attrDef: OpDef.AttrDef, attributeValueType: AttrValue): IRAttribute { - return TensorflowIRAttr(attrDef, attributeValueType) - } - - override fun convertAttributesReverse(allInputArguments: List, inputArgumentsToProcess: List): List> { - TODO("Not yet implemented") - } - override fun isInputFrameworkTensorName(name: String, mappingProcess: MappingProcess): Boolean { - val opDef = tensorflowOps.findOp(mappingProcess.inputFrameworkOpName()) - return isTensorflowTensorName(name,opDef) - } - - override fun isNd4jTensorName(name: String, mappingProcess: MappingProcess): Boolean { - val nd4jOpDescriptor = nd4jOpDescriptors.findOp(mappingProcess.opName()) - return isNd4jTensorName(name,nd4jOpDescriptor) - } - - override fun isInputFrameworkAttributeName(name: String, mappingProcess: MappingProcess): Boolean { - val opDef = tensorflowOps.findOp(mappingProcess.inputFrameworkOpName()) - return isTensorflowAttributeName(name,opDef) - } - - override fun isOutputFrameworkAttributeName(name: String, mappingProcess: MappingProcess): Boolean { - val nd4jOpDescriptor = nd4jOpDescriptors.findOp(mappingProcess.opName()) - return isOutputFrameworkAttributeName(name,nd4jOpDescriptor) - } - - override fun argDescriptorType(name: String, mappingProcess: MappingProcess): OpNamespace.ArgDescriptor.ArgType { - val nd4jOpDescriptor = nd4jOpDescriptors.findOp(mappingProcess.opName()) - return argDescriptorType(name,nd4jOpDescriptor) - } - - override fun attributeValueTypeFor(name: String, mappingProcess: MappingProcess): AttributeValueType { - val opDef = tensorflowOps.findOp(mappingProcess.inputFrameworkOpName()) - return tensorflowAttributeValueTypeFor(attributeName = name,opDef = opDef) - } -} - -fun invertBooleanNumber(mappings: Map): TensorflowInvertBooleanNumber { - return TensorflowInvertBooleanNumber(mappingNamesToPerform = mappings,transformerArgs = emptyMap()) -} - - - - - 
-class TensorflowNDArrayToIntAttributeValue(mappingNamesToPerform: Map) : NDArrayToIntAttributeValue(mappingNamesToPerform = mappingNamesToPerform,transformerArgs = emptyMap()) { - - override fun createIRAttribute(name: String, attrDef: OpDef.AttrDef, attributeValueType: AttrValue): IRAttribute { - return TensorflowIRAttr(attrDef,attributeValueType) - } - - override fun convertAttributesReverse(allInputArguments: List, inputArgumentsToProcess: List): List> { - TODO("Not yet implemented") - } - override fun isInputFrameworkTensorName(name: String, mappingProcess: MappingProcess): Boolean { - val opDef = tensorflowOps.findOp(mappingProcess.inputFrameworkOpName()) - return isTensorflowTensorName(name,opDef) - } - - override fun isNd4jTensorName(name: String, mappingProcess: MappingProcess): Boolean { - val nd4jOpDescriptor = nd4jOpDescriptors.findOp(mappingProcess.opName()) - return isNd4jTensorName(name,nd4jOpDescriptor) - } - - override fun isInputFrameworkAttributeName(name: String, mappingProcess: MappingProcess): Boolean { - val opDef = tensorflowOps.findOp(mappingProcess.inputFrameworkOpName()) - return isTensorflowAttributeName(name,opDef) - } - - override fun isOutputFrameworkAttributeName(name: String, mappingProcess: MappingProcess): Boolean { - val nd4jOpDescriptor = nd4jOpDescriptors.findOp(mappingProcess.opName()) - return isOutputFrameworkAttributeName(name,nd4jOpDescriptor) - } - - override fun argDescriptorType(name: String, mappingProcess: MappingProcess): OpNamespace.ArgDescriptor.ArgType { - val nd4jOpDescriptor = nd4jOpDescriptors.findOp(mappingProcess.opName()) - return argDescriptorType(name,nd4jOpDescriptor) - } - - override fun attributeValueTypeFor(name: String, mappingProcess: MappingProcess): AttributeValueType { - val opDef = tensorflowOps.findOp(mappingProcess.inputFrameworkOpName()) - return tensorflowAttributeValueTypeFor(attributeName = name,opDef = opDef) - } -} - -fun ndarrayToIntList(ndarrayNameToAttributeName: MutableMap): 
TensorflowNDArrayToIntAttributeValue { - return TensorflowNDArrayToIntAttributeValue(mappingNamesToPerform = ndarrayNameToAttributeName) -} - -class TensorflowNdArrayToStringIndex(mappingNamesToPerform: Map, transformerArgs: Map>) : StringToInt(mappingNamesToPerform, transformerArgs) { - - override fun createIRAttribute(name: String, attrDef: OpDef.AttrDef, attributeValueType: AttrValue): IRAttribute { - return TensorflowIRAttr(inputAttributeValue = attributeValueType,inputAttributeDef = attrDef) - } - - override fun convertAttributesReverse(allInputArguments: List, inputArgumentsToProcess: List): List> { - TODO("Not yet implemented") - } - override fun isInputFrameworkTensorName(name: String, mappingProcess: MappingProcess): Boolean { - val opDef = tensorflowOps.findOp(mappingProcess.inputFrameworkOpName()) - return isTensorflowTensorName(name,opDef) - } - - override fun isNd4jTensorName(name: String, mappingProcess: MappingProcess): Boolean { - val nd4jOpDescriptor = nd4jOpDescriptors.findOp(mappingProcess.opName()) - return isNd4jTensorName(name,nd4jOpDescriptor) - } - - override fun isInputFrameworkAttributeName(name: String, mappingProcess: MappingProcess): Boolean { - val opDef = tensorflowOps.findOp(mappingProcess.inputFrameworkOpName()) - return isTensorflowAttributeName(name,opDef) - } - - override fun isOutputFrameworkAttributeName(name: String, mappingProcess: MappingProcess): Boolean { - val nd4jOpDescriptor = nd4jOpDescriptors.findOp(mappingProcess.opName()) - return isOutputFrameworkAttributeName(name,nd4jOpDescriptor) - } - - override fun argDescriptorType(name: String, mappingProcess: MappingProcess): OpNamespace.ArgDescriptor.ArgType { - val nd4jOpDescriptor = nd4jOpDescriptors.findOp(mappingProcess.opName()) - return argDescriptorType(name,nd4jOpDescriptor) - } - - override fun attributeValueTypeFor(name: String, mappingProcess: MappingProcess): AttributeValueType { - val opDef = tensorflowOps.findOp(mappingProcess.inputFrameworkOpName()) - return 
tensorflowAttributeValueTypeFor(attributeName = name,opDef = opDef) - } -} - -fun ndarrayStringToIndex(outputAttributeValue: String,inputAttributeValue: String, listOfValues: List,argumentIndex: Int): TensorflowNdArrayToStringIndex { - return TensorflowNdArrayToStringIndex(mappingNamesToPerform = mapOf(outputAttributeValue to inputAttributeValue),transformerArgs = mapOf(outputAttributeValue to listOfValues.map { - valueName -> ArgDescriptor { - name = valueName - stringValue = valueName - argIndex = argumentIndex - } - })) -} - - -class TensorflowMapStringToInt(mappingNamesToPerform: Map, transformerArgs: Map>) : MapStringToInt(mappingNamesToPerform, transformerArgs) { - - override fun createIRAttribute(name: String, attrDef: OpDef.AttrDef, attributeValueType: AttrValue): IRAttribute { - return TensorflowIRAttr(inputAttributeValue = attributeValueType,inputAttributeDef = attrDef) - } - - override fun convertAttributesReverse(allInputArguments: List, inputArgumentsToProcess: List): List> { - TODO("Not yet implemented") - } - override fun isInputFrameworkTensorName(name: String, mappingProcess: MappingProcess): Boolean { - val opDef = tensorflowOps.findOp(mappingProcess.inputFrameworkOpName()) - return isTensorflowTensorName(name,opDef) - } - - override fun isNd4jTensorName(name: String, mappingProcess: MappingProcess): Boolean { - val nd4jOpDescriptor = nd4jOpDescriptors.findOp(mappingProcess.opName()) - return isNd4jTensorName(name,nd4jOpDescriptor) - } - - override fun isInputFrameworkAttributeName(name: String, mappingProcess: MappingProcess): Boolean { - val opDef = tensorflowOps.findOp(mappingProcess.inputFrameworkOpName()) - return isTensorflowAttributeName(name,opDef) - } - - override fun isOutputFrameworkAttributeName(name: String, mappingProcess: MappingProcess): Boolean { - val nd4jOpDescriptor = nd4jOpDescriptors.findOp(mappingProcess.opName()) - return isOutputFrameworkAttributeName(name,nd4jOpDescriptor) - } - - override fun argDescriptorType(name: 
String, mappingProcess: MappingProcess): OpNamespace.ArgDescriptor.ArgType { - val nd4jOpDescriptor = nd4jOpDescriptors.findOp(mappingProcess.opName()) - return argDescriptorType(name,nd4jOpDescriptor) - } - - override fun attributeValueTypeFor(name: String, mappingProcess: MappingProcess): AttributeValueType { - val opDef = tensorflowOps.findOp(mappingProcess.inputFrameworkOpName()) - return tensorflowAttributeValueTypeFor(attributeName = name,opDef = opDef) - } -} - -fun mapStringToInt(outputAttributeValue: String, inputAttributeValue: String, mapOfValuesToInts: Map,argumentIndex: Int,lookupIndex:Int): TensorflowMapStringToInt { - return TensorflowMapStringToInt(mappingNamesToPerform = mapOf(outputAttributeValue to inputAttributeValue),transformerArgs = - mapOf(outputAttributeValue to mapOfValuesToInts.map { - entry -> ArgDescriptor { - name = entry.key - int64Value = entry.value.toLong() - argIndex = argumentIndex - } - },"index" to listOf(ArgDescriptor { - name = "index" - int64Value = lookupIndex.toLong() - }))) -} - - - - -class TensorflowListNumberToListNumber(mappingNamesToPerform: Map, transformerArgs: Map>) : ListNumberToListNumber(mappingNamesToPerform, transformerArgs) { - - override fun createIRAttribute(name: String, attrDef: OpDef.AttrDef, attributeValueType: AttrValue): IRAttribute { - return TensorflowIRAttr(inputAttributeValue = attributeValueType,inputAttributeDef = attrDef) - } - - override fun convertAttributesReverse(allInputArguments: List, inputArgumentsToProcess: List): List> { - TODO("Not yet implemented") - } - override fun isInputFrameworkTensorName(name: String, mappingProcess: MappingProcess): Boolean { - val opDef = tensorflowOps.findOp(mappingProcess.inputFrameworkOpName()) - return isTensorflowTensorName(name,opDef) - } - - override fun isNd4jTensorName(name: String, mappingProcess: MappingProcess): Boolean { - val nd4jOpDescriptor = nd4jOpDescriptors.findOp(mappingProcess.opName()) - return isNd4jTensorName(name,nd4jOpDescriptor) - 
} - - override fun isInputFrameworkAttributeName(name: String, mappingProcess: MappingProcess): Boolean { - val opDef = tensorflowOps.findOp(mappingProcess.inputFrameworkOpName()) - return isTensorflowAttributeName(name,opDef) - } - - override fun isOutputFrameworkAttributeName(name: String, mappingProcess: MappingProcess): Boolean { - val nd4jOpDescriptor = nd4jOpDescriptors.findOp(mappingProcess.opName()) - return isOutputFrameworkAttributeName(name,nd4jOpDescriptor) - } - - override fun argDescriptorType(name: String, mappingProcess: MappingProcess): OpNamespace.ArgDescriptor.ArgType { - val nd4jOpDescriptor = nd4jOpDescriptors.findOp(mappingProcess.opName()) - return argDescriptorType(name,nd4jOpDescriptor) - } - - override fun attributeValueTypeFor(name: String, mappingProcess: MappingProcess): AttributeValueType { - val opDef = tensorflowOps.findOp(mappingProcess.inputFrameworkOpName()) - return tensorflowAttributeValueTypeFor(attributeName = name,opDef = opDef) - } -} - -fun listNumberToListNumber(outputAttributeValue: String, inputAttributeValue: String): TensorflowListNumberToListNumber { - return TensorflowListNumberToListNumber(mappingNamesToPerform = mapOf(outputAttributeValue to inputAttributeValue),transformerArgs = emptyMap()) -} - -class TensorflowStringAttributeToNDArray(mappingNamesToPerform: Map, transformerArgs: Map>) : StringAttributeToNDArray(mappingNamesToPerform, transformerArgs) { - - override fun createIRAttribute(name: String, attrDef: OpDef.AttrDef, attributeValueType: AttrValue): IRAttribute { - return TensorflowIRAttr(inputAttributeValue = attributeValueType,inputAttributeDef = attrDef) - } - - override fun convertAttributesReverse(allInputArguments: List, inputArgumentsToProcess: List): List> { - TODO("Not yet implemented") - } - override fun isInputFrameworkTensorName(name: String, mappingProcess: MappingProcess): Boolean { - val opDef = tensorflowOps.findOp(mappingProcess.inputFrameworkOpName()) - return 
isTensorflowTensorName(name,opDef) - } - - override fun isNd4jTensorName(name: String, mappingProcess: MappingProcess): Boolean { - val nd4jOpDescriptor = nd4jOpDescriptors.findOp(mappingProcess.opName()) - return isNd4jTensorName(name,nd4jOpDescriptor) - } - - override fun isInputFrameworkAttributeName(name: String, mappingProcess: MappingProcess): Boolean { - val opDef = tensorflowOps.findOp(mappingProcess.inputFrameworkOpName()) - return isTensorflowAttributeName(name,opDef) - } - - override fun isOutputFrameworkAttributeName(name: String, mappingProcess: MappingProcess): Boolean { - val nd4jOpDescriptor = nd4jOpDescriptors.findOp(mappingProcess.opName()) - return isOutputFrameworkAttributeName(name,nd4jOpDescriptor) - } - - override fun argDescriptorType(name: String, mappingProcess: MappingProcess): OpNamespace.ArgDescriptor.ArgType { - val nd4jOpDescriptor = nd4jOpDescriptors.findOp(mappingProcess.opName()) - return argDescriptorType(name,nd4jOpDescriptor) - } - - override fun attributeValueTypeFor(name: String, mappingProcess: MappingProcess): AttributeValueType { - val opDef = tensorflowOps.findOp(mappingProcess.inputFrameworkOpName()) - return tensorflowAttributeValueTypeFor(attributeName = name,opDef = opDef) - } -} - -fun convertStringToInputNDArray(mappings: Map): TensorflowStringAttributeToNDArray { - return TensorflowStringAttributeToNDArray(mappingNamesToPerform = mappings,transformerArgs = emptyMap()) -} - - - - - - - - -class TensorflowAttributeNumberListNDArray(mappingNamesToPerform: Map, transformerArgs: Map>) : AttributeNumberListNDArray(mappingNamesToPerform, transformerArgs) { - - override fun createIRAttribute(name: String, attrDef: OpDef.AttrDef, attributeValueType: AttrValue): IRAttribute { - return TensorflowIRAttr(inputAttributeValue = attributeValueType,inputAttributeDef = attrDef) - } - - override fun convertAttributesReverse(allInputArguments: List, inputArgumentsToProcess: List): List> { - TODO("Not yet implemented") - } - override 
fun isInputFrameworkTensorName(name: String, mappingProcess: MappingProcess): Boolean { - val opDef = tensorflowOps.findOp(mappingProcess.inputFrameworkOpName()) - return isTensorflowTensorName(name,opDef) - } - - override fun isNd4jTensorName(name: String, mappingProcess: MappingProcess): Boolean { - val nd4jOpDescriptor = nd4jOpDescriptors.findOp(mappingProcess.opName()) - return isNd4jTensorName(name,nd4jOpDescriptor) - } - - override fun isInputFrameworkAttributeName(name: String, mappingProcess: MappingProcess): Boolean { - val opDef = tensorflowOps.findOp(mappingProcess.inputFrameworkOpName()) - return isTensorflowAttributeName(name,opDef) - } - - override fun isOutputFrameworkAttributeName(name: String, mappingProcess: MappingProcess): Boolean { - val nd4jOpDescriptor = nd4jOpDescriptors.findOp(mappingProcess.opName()) - return isOutputFrameworkAttributeName(name,nd4jOpDescriptor) - } - - override fun argDescriptorType(name: String, mappingProcess: MappingProcess): OpNamespace.ArgDescriptor.ArgType { - val nd4jOpDescriptor = nd4jOpDescriptors.findOp(mappingProcess.opName()) - return argDescriptorType(name,nd4jOpDescriptor) - } - - override fun attributeValueTypeFor(name: String, mappingProcess: MappingProcess): AttributeValueType { - val opDef = tensorflowOps.findOp(mappingProcess.inputFrameworkOpName()) - return tensorflowAttributeValueTypeFor(attributeName = name,opDef = opDef) - } -} - -fun convertNumberListToInputNDArray(outputAttributeValue: String, inputAttributeValue: String): TensorflowAttributeNumberListNDArray { - return TensorflowAttributeNumberListNDArray(mappingNamesToPerform = mapOf(outputAttributeValue to inputAttributeValue),transformerArgs = emptyMap()) -} - - -class TensorflowListAttributeValueLookupToIndex(mappingNamesToPerform: Map, transformerArgs: Map>) : ListAttributeValueLookupToIndex(mappingNamesToPerform, transformerArgs) { - - override fun createIRAttribute(name: String, attrDef: OpDef.AttrDef, attributeValueType: AttrValue): 
IRAttribute { - return TensorflowIRAttr(inputAttributeValue = attributeValueType,inputAttributeDef = attrDef) - } - - override fun convertAttributesReverse(allInputArguments: List, inputArgumentsToProcess: List): List> { - TODO("Not yet implemented") - } - override fun isInputFrameworkTensorName(name: String, mappingProcess: MappingProcess): Boolean { - val opDef = tensorflowOps.findOp(mappingProcess.inputFrameworkOpName()) - return isTensorflowTensorName(name,opDef) - } - - override fun isNd4jTensorName(name: String, mappingProcess: MappingProcess): Boolean { - val nd4jOpDescriptor = nd4jOpDescriptors.findOp(mappingProcess.opName()) - return isNd4jTensorName(name,nd4jOpDescriptor) - } - - override fun isInputFrameworkAttributeName(name: String, mappingProcess: MappingProcess): Boolean { - val opDef = tensorflowOps.findOp(mappingProcess.inputFrameworkOpName()) - return isTensorflowAttributeName(name,opDef) - } - - override fun isOutputFrameworkAttributeName(name: String, mappingProcess: MappingProcess): Boolean { - val nd4jOpDescriptor = nd4jOpDescriptors.findOp(mappingProcess.opName()) - return isOutputFrameworkAttributeName(name,nd4jOpDescriptor) - } - - override fun argDescriptorType(name: String, mappingProcess: MappingProcess): OpNamespace.ArgDescriptor.ArgType { - val nd4jOpDescriptor = nd4jOpDescriptors.findOp(mappingProcess.opName()) - return argDescriptorType(name,nd4jOpDescriptor) - } - - override fun attributeValueTypeFor(name: String, mappingProcess: MappingProcess): AttributeValueType { - val opDef = tensorflowOps.findOp(mappingProcess.inputFrameworkOpName()) - return tensorflowAttributeValueTypeFor(attributeName = name,opDef = opDef) - } -} - -fun listAttributeValueLookupToIndex(outputAttributeValue: String, inputAttributeValue: String, idx: Int,argumentIndex: Int): TensorflowListAttributeValueLookupToIndex { - return TensorflowListAttributeValueLookupToIndex(mappingNamesToPerform = mapOf(outputAttributeValue to inputAttributeValue), - transformerArgs 
= mapOf(outputAttributeValue to listOf(ArgDescriptor { - argType = OpNamespace.ArgDescriptor.ArgType.INT64 - int64Value = idx.toLong() - name = "index" - argIndex = argumentIndex - }))) -} - - - - - -class TensorflowDataTypeToInt(mappingNamesToPerform: Map, transformerArgs: Map>) : - DataTypeToInt(mappingNamesToPerform, transformerArgs) { - - override fun createIRAttribute(name: String, attrDef: OpDef.AttrDef, attributeValueType: AttrValue): IRAttribute { - return TensorflowIRAttr(inputAttributeValue = attributeValueType,inputAttributeDef = attrDef) - } - - override fun convertAttributesReverse(allInputArguments: List, inputArgumentsToProcess: List): List> { - TODO("Not yet implemented") - } - override fun isInputFrameworkTensorName(name: String, mappingProcess: MappingProcess): Boolean { - val opDef = tensorflowOps.findOp(mappingProcess.inputFrameworkOpName()) - return isTensorflowTensorName(name,opDef) - } - - override fun isNd4jTensorName(name: String, mappingProcess: MappingProcess): Boolean { - val nd4jOpDescriptor = nd4jOpDescriptors.findOp(mappingProcess.opName()) - return isNd4jTensorName(name,nd4jOpDescriptor) - } - - override fun isInputFrameworkAttributeName(name: String, mappingProcess: MappingProcess): Boolean { - val opDef = tensorflowOps.findOp(mappingProcess.inputFrameworkOpName()) - return isTensorflowAttributeName(name,opDef) - } - - override fun isOutputFrameworkAttributeName(name: String, mappingProcess: MappingProcess): Boolean { - val nd4jOpDescriptor = nd4jOpDescriptors.findOp(mappingProcess.opName()) - return isOutputFrameworkAttributeName(name,nd4jOpDescriptor) - } - - override fun argDescriptorType(name: String, mappingProcess: MappingProcess): OpNamespace.ArgDescriptor.ArgType { - val nd4jOpDescriptor = nd4jOpDescriptors.findOp(mappingProcess.opName()) - return argDescriptorType(name,nd4jOpDescriptor) - } - - override fun attributeValueTypeFor(name: String, mappingProcess: MappingProcess): AttributeValueType { - val opDef = 
tensorflowOps.findOp(mappingProcess.inputFrameworkOpName()) - return tensorflowAttributeValueTypeFor(attributeName = name,opDef = opDef) - } -} - -fun dataTypeToInt(mutableMap: MutableMap): TensorflowDataTypeToInt { - return TensorflowDataTypeToInt(mappingNamesToPerform = mutableMap,transformerArgs = emptyMap()) -} - - - - -class TensorflowNDArrayInputToNumericalAttribute(mappingNamesToPerform: Map, transformerArgs: Map>) : - NDArrayInputToNumericalAttribute(mappingNamesToPerform, transformerArgs) { - - override fun createIRAttribute(name: String, attrDef: OpDef.AttrDef, attributeValueType: AttrValue): IRAttribute { - return TensorflowIRAttr(inputAttributeValue = attributeValueType,inputAttributeDef = attrDef) - } - - override fun convertAttributesReverse(allInputArguments: List, inputArgumentsToProcess: List): List> { - TODO("Not yet implemented") - } - override fun isInputFrameworkTensorName(name: String, mappingProcess: MappingProcess): Boolean { - val opDef = tensorflowOps.findOp(mappingProcess.inputFrameworkOpName()) - return isTensorflowTensorName(name,opDef) - } - - override fun isNd4jTensorName(name: String, mappingProcess: MappingProcess): Boolean { - val nd4jOpDescriptor = nd4jOpDescriptors.findOp(mappingProcess.opName()) - return isNd4jTensorName(name,nd4jOpDescriptor) - } - - override fun isInputFrameworkAttributeName(name: String, mappingProcess: MappingProcess): Boolean { - val opDef = tensorflowOps.findOp(mappingProcess.inputFrameworkOpName()) - return isTensorflowAttributeName(name,opDef) - } - - override fun isOutputFrameworkAttributeName(name: String, mappingProcess: MappingProcess): Boolean { - val nd4jOpDescriptor = nd4jOpDescriptors.findOp(mappingProcess.opName()) - return isOutputFrameworkAttributeName(name,nd4jOpDescriptor) - } - - override fun argDescriptorType(name: String, mappingProcess: MappingProcess): OpNamespace.ArgDescriptor.ArgType { - val nd4jOpDescriptor = nd4jOpDescriptors.findOp(mappingProcess.opName()) - return 
argDescriptorType(name,nd4jOpDescriptor) - } - - override fun attributeValueTypeFor(name: String, mappingProcess: MappingProcess): AttributeValueType { - val opDef = tensorflowOps.findOp(mappingProcess.inputFrameworkOpName()) - return tensorflowAttributeValueTypeFor(attributeName = name,opDef = opDef) - } -} - -fun convertNDArrayInputToNumericalAttr(mutableMap: MutableMap): TensorflowNDArrayInputToNumericalAttribute { - return TensorflowNDArrayInputToNumericalAttribute(mappingNamesToPerform = mutableMap,transformerArgs = emptyMap()) -} - -class TensorflowListNumberToNDArray(mappingNamesToPerform: Map, transformerArgs: Map>) : - ListNumberToNDArray(mappingNamesToPerform, transformerArgs) { - - override fun createIRAttribute(name: String, attrDef: OpDef.AttrDef, attributeValueType: AttrValue): IRAttribute { - return TensorflowIRAttr(inputAttributeValue = attributeValueType,inputAttributeDef = attrDef) - } - - override fun convertAttributesReverse(allInputArguments: List, inputArgumentsToProcess: List): List> { - TODO("Not yet implemented") - } - override fun isInputFrameworkTensorName(name: String, mappingProcess: MappingProcess): Boolean { - val opDef = tensorflowOps.findOp(mappingProcess.inputFrameworkOpName()) - return isTensorflowTensorName(name,opDef) - } - - override fun isNd4jTensorName(name: String, mappingProcess: MappingProcess): Boolean { - val nd4jOpDescriptor = nd4jOpDescriptors.findOp(mappingProcess.opName()) - return isNd4jTensorName(name,nd4jOpDescriptor) - } - - override fun isInputFrameworkAttributeName(name: String, mappingProcess: MappingProcess): Boolean { - val opDef = tensorflowOps.findOp(mappingProcess.inputFrameworkOpName()) - return isTensorflowAttributeName(name,opDef) - } - - override fun isOutputFrameworkAttributeName(name: String, mappingProcess: MappingProcess): Boolean { - val nd4jOpDescriptor = nd4jOpDescriptors.findOp(mappingProcess.opName()) - return isOutputFrameworkAttributeName(name,nd4jOpDescriptor) - } - - override fun 
argDescriptorType(name: String, mappingProcess: MappingProcess): OpNamespace.ArgDescriptor.ArgType { - val nd4jOpDescriptor = nd4jOpDescriptors.findOp(mappingProcess.opName()) - return argDescriptorType(name,nd4jOpDescriptor) - } - - override fun attributeValueTypeFor(name: String, mappingProcess: MappingProcess): AttributeValueType { - val opDef = tensorflowOps.findOp(mappingProcess.inputFrameworkOpName()) - return tensorflowAttributeValueTypeFor(attributeName = name,opDef = opDef) - } -} - -fun listNumberToNDarray(mutableMap: MutableMap): TensorflowListNumberToNDArray { - return TensorflowListNumberToNDArray(mappingNamesToPerform = mutableMap,transformerArgs = emptyMap()) -} - - -class TensorflowNDArrayAttributeToNDArrayInput(mappingNamesToPerform: Map, transformerArgs: Map>) : - NDArrayAttributeToNDArrayInput(mappingNamesToPerform, transformerArgs) { - - override fun createIRAttribute(name: String, attrDef: OpDef.AttrDef, attributeValueType: AttrValue): IRAttribute { - return TensorflowIRAttr(inputAttributeValue = attributeValueType,inputAttributeDef = attrDef) - } - - override fun convertAttributesReverse(allInputArguments: List, inputArgumentsToProcess: List): List> { - TODO("Not yet implemented") - } - override fun isInputFrameworkTensorName(name: String, mappingProcess: MappingProcess): Boolean { - val opDef = tensorflowOps.findOp(mappingProcess.inputFrameworkOpName()) - return isTensorflowTensorName(name,opDef) - } - - override fun isNd4jTensorName(name: String, mappingProcess: MappingProcess): Boolean { - val nd4jOpDescriptor = nd4jOpDescriptors.findOp(mappingProcess.opName()) - return isNd4jTensorName(name,nd4jOpDescriptor) - } - - override fun isInputFrameworkAttributeName(name: String, mappingProcess: MappingProcess): Boolean { - val opDef = tensorflowOps.findOp(mappingProcess.inputFrameworkOpName()) - return isTensorflowAttributeName(name,opDef) - } - - override fun isOutputFrameworkAttributeName(name: String, mappingProcess: MappingProcess): Boolean { 
- val nd4jOpDescriptor = nd4jOpDescriptors.findOp(mappingProcess.opName()) - return isOutputFrameworkAttributeName(name,nd4jOpDescriptor) - } - - override fun argDescriptorType(name: String, mappingProcess: MappingProcess): OpNamespace.ArgDescriptor.ArgType { - val nd4jOpDescriptor = nd4jOpDescriptors.findOp(mappingProcess.opName()) - return argDescriptorType(name,nd4jOpDescriptor) - } - - override fun attributeValueTypeFor(name: String, mappingProcess: MappingProcess): AttributeValueType { - val opDef = tensorflowOps.findOp(mappingProcess.inputFrameworkOpName()) - return tensorflowAttributeValueTypeFor(attributeName = name,opDef = opDef) - } -} - -fun ndArrayAttributeToNDarrayInput(mutableMap: MutableMap): TensorflowNDArrayAttributeToNDArrayInput { - return TensorflowNDArrayAttributeToNDArrayInput(mappingNamesToPerform = mutableMap,transformerArgs = emptyMap()) -} - - -class TensorflowArgDescriptorConstant(mappingNamesToPerform: Map, transformerArgs: Map>) - : ArgDescriptorConstant(mappingNamesToPerform, transformerArgs) { - - override fun createIRAttribute(name: String, attrDef: OpDef.AttrDef, attributeValueType: AttrValue): IRAttribute { - return TensorflowIRAttr(inputAttributeValue = attributeValueType,inputAttributeDef = attrDef) - } - - override fun convertAttributesReverse(allInputArguments: List, inputArgumentsToProcess: List): List> { - TODO("Not yet implemented") - } - override fun isInputFrameworkTensorName(name: String, mappingProcess: MappingProcess): Boolean { - val opDef = tensorflowOps.findOp(mappingProcess.inputFrameworkOpName()) - return isTensorflowTensorName(name,opDef) - } - - override fun isNd4jTensorName(name: String, mappingProcess: MappingProcess): Boolean { - val nd4jOpDescriptor = nd4jOpDescriptors.findOp(mappingProcess.opName()) - return isNd4jTensorName(name,nd4jOpDescriptor) - } - - override fun isInputFrameworkAttributeName(name: String, mappingProcess: MappingProcess): Boolean { - val opDef = 
tensorflowOps.findOp(mappingProcess.inputFrameworkOpName()) - return isTensorflowAttributeName(name,opDef) - } - - override fun isOutputFrameworkAttributeName(name: String, mappingProcess: MappingProcess): Boolean { - val nd4jOpDescriptor = nd4jOpDescriptors.findOp(mappingProcess.opName()) - return isOutputFrameworkAttributeName(name,nd4jOpDescriptor) - } - - override fun argDescriptorType(name: String, mappingProcess: MappingProcess): OpNamespace.ArgDescriptor.ArgType { - val nd4jOpDescriptor = nd4jOpDescriptors.findOp(mappingProcess.opName()) - return argDescriptorType(name,nd4jOpDescriptor) - } - - override fun attributeValueTypeFor(name: String, mappingProcess: MappingProcess): AttributeValueType { - val opDef = tensorflowOps.findOp(mappingProcess.inputFrameworkOpName()) - return tensorflowAttributeValueTypeFor(attributeName = name,opDef = opDef) - } -} - -fun argDescriptorConstant(argDescriptorConstants: List): TensorflowArgDescriptorConstant { - return TensorflowArgDescriptorConstant(mappingNamesToPerform = emptyMap(),transformerArgs = mapOf("value" to argDescriptorConstants)) -} - - -class TensorflowAttributeNDArrayToScalarAttribute(mappingNamesToPerform: Map, transformerArgs: Map>) - : AttributeNDArrayToScalarAttribute(mappingNamesToPerform, transformerArgs) { - - override fun createIRAttribute(name: String, attrDef: OpDef.AttrDef, attributeValueType: AttrValue): IRAttribute { - return TensorflowIRAttr(inputAttributeValue = attributeValueType,inputAttributeDef = attrDef) - } - - override fun convertAttributesReverse(allInputArguments: List, inputArgumentsToProcess: List): List> { - TODO("Not yet implemented") - } - override fun isInputFrameworkTensorName(name: String, mappingProcess: MappingProcess): Boolean { - val opDef = tensorflowOps.findOp(mappingProcess.inputFrameworkOpName()) - return isTensorflowTensorName(name,opDef) - } - - override fun isNd4jTensorName(name: String, mappingProcess: MappingProcess): Boolean { - val nd4jOpDescriptor = 
nd4jOpDescriptors.findOp(mappingProcess.opName()) - return isNd4jTensorName(name,nd4jOpDescriptor) - } - - override fun isInputFrameworkAttributeName(name: String, mappingProcess: MappingProcess): Boolean { - val opDef = tensorflowOps.findOp(mappingProcess.inputFrameworkOpName()) - return isTensorflowAttributeName(name,opDef) - } - - override fun isOutputFrameworkAttributeName(name: String, mappingProcess: MappingProcess): Boolean { - val nd4jOpDescriptor = nd4jOpDescriptors.findOp(mappingProcess.opName()) - return isOutputFrameworkAttributeName(name,nd4jOpDescriptor) - } - - override fun argDescriptorType(name: String, mappingProcess: MappingProcess): OpNamespace.ArgDescriptor.ArgType { - val nd4jOpDescriptor = nd4jOpDescriptors.findOp(mappingProcess.opName()) - return argDescriptorType(name,nd4jOpDescriptor) - } - - override fun attributeValueTypeFor(name: String, mappingProcess: MappingProcess): AttributeValueType { - val opDef = tensorflowOps.findOp(mappingProcess.inputFrameworkOpName()) - return tensorflowAttributeValueTypeFor(attributeName = name,opDef = opDef) - } -} - -fun ndarrayAttributeToScalarAttribute(argDescriptorConstants: List): TensorflowAttributeNDArrayToScalarAttribute { - return TensorflowAttributeNDArrayToScalarAttribute(mappingNamesToPerform = emptyMap(),transformerArgs = mapOf("value" to argDescriptorConstants)) -} \ No newline at end of file diff --git a/contrib/codegen-tools/codegen/src/main/kotlin/org/nd4j/codegen/util/GenerateOps.kt b/contrib/codegen-tools/codegen/src/main/kotlin/org/nd4j/codegen/util/GenerateOps.kt index cfb123be7..e57c1f879 100644 --- a/contrib/codegen-tools/codegen/src/main/kotlin/org/nd4j/codegen/util/GenerateOps.kt +++ b/contrib/codegen-tools/codegen/src/main/kotlin/org/nd4j/codegen/util/GenerateOps.kt @@ -1,3 +1,23 @@ +/* + * ****************************************************************************** + * * + * * + * * This program and the accompanying materials are made available under the + * * terms of the 
Apache License, Version 2.0 which is available at + * * https://www.apache.org/licenses/LICENSE-2.0. + * * + * * See the NOTICE file distributed with this work for additional + * * information regarding copyright ownership. + * * Unless required by applicable law or agreed to in writing, software + * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT + * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the + * * License for the specific language governing permissions and limitations + * * under the License. + * * + * * SPDX-License-Identifier: Apache-2.0 + * ***************************************************************************** + */ + package org.nd4j.codegen.util import org.nd4j.codegen.impl.java.JavaPoetGenerator diff --git a/contrib/codegen-tools/codegen/src/main/kotlin/org/nd4j/codegen/util/extract/ExtractFromExisting.kt b/contrib/codegen-tools/codegen/src/main/kotlin/org/nd4j/codegen/util/extract/ExtractFromExisting.kt index b883fc934..41f9b87fd 100644 --- a/contrib/codegen-tools/codegen/src/main/kotlin/org/nd4j/codegen/util/extract/ExtractFromExisting.kt +++ b/contrib/codegen-tools/codegen/src/main/kotlin/org/nd4j/codegen/util/extract/ExtractFromExisting.kt @@ -1,3 +1,23 @@ +/* + * ****************************************************************************** + * * + * * + * * This program and the accompanying materials are made available under the + * * terms of the Apache License, Version 2.0 which is available at + * * https://www.apache.org/licenses/LICENSE-2.0. + * * + * * See the NOTICE file distributed with this work for additional + * * information regarding copyright ownership. + * * Unless required by applicable law or agreed to in writing, software + * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT + * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. 
See the + * * License for the specific language governing permissions and limitations + * * under the License. + * * + * * SPDX-License-Identifier: Apache-2.0 + * ***************************************************************************** + */ + package org.nd4j.codegen.util.extract import java.io.File diff --git a/contrib/codegen-tools/codegen/src/main/kotlin/org/nd4j/codegen/util/extract/FindUsedParameterTypes.kt b/contrib/codegen-tools/codegen/src/main/kotlin/org/nd4j/codegen/util/extract/FindUsedParameterTypes.kt index afc7a18d9..d388cd3e4 100644 --- a/contrib/codegen-tools/codegen/src/main/kotlin/org/nd4j/codegen/util/extract/FindUsedParameterTypes.kt +++ b/contrib/codegen-tools/codegen/src/main/kotlin/org/nd4j/codegen/util/extract/FindUsedParameterTypes.kt @@ -1,3 +1,23 @@ +/* + * ****************************************************************************** + * * + * * + * * This program and the accompanying materials are made available under the + * * terms of the Apache License, Version 2.0 which is available at + * * https://www.apache.org/licenses/LICENSE-2.0. + * * + * * See the NOTICE file distributed with this work for additional + * * information regarding copyright ownership. + * * Unless required by applicable law or agreed to in writing, software + * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT + * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the + * * License for the specific language governing permissions and limitations + * * under the License. 
+ * * + * * SPDX-License-Identifier: Apache-2.0 + * ***************************************************************************** + */ + package org.nd4j.codegen.util.extract import java.io.File diff --git a/contrib/codegen-tools/codegen/src/main/ops/org/nd4j/codegen/mixins/Mixins.kt b/contrib/codegen-tools/codegen/src/main/ops/org/nd4j/codegen/mixins/Mixins.kt index 031ee4993..880b7558b 100644 --- a/contrib/codegen-tools/codegen/src/main/ops/org/nd4j/codegen/mixins/Mixins.kt +++ b/contrib/codegen-tools/codegen/src/main/ops/org/nd4j/codegen/mixins/Mixins.kt @@ -1,3 +1,23 @@ +/* + * ****************************************************************************** + * * + * * + * * This program and the accompanying materials are made available under the + * * terms of the Apache License, Version 2.0 which is available at + * * https://www.apache.org/licenses/LICENSE-2.0. + * * + * * See the NOTICE file distributed with this work for additional + * * information regarding copyright ownership. + * * Unless required by applicable law or agreed to in writing, software + * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT + * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the + * * License for the specific language governing permissions and limitations + * * under the License. 
+ * * + * * SPDX-License-Identifier: Apache-2.0 + * ***************************************************************************** + */ + package org.nd4j.codegen.mixins import org.nd4j.codegen.api.AtLeast diff --git a/contrib/codegen-tools/codegen/src/main/ops/org/nd4j/codegen/ops/Bitwise.kt b/contrib/codegen-tools/codegen/src/main/ops/org/nd4j/codegen/ops/Bitwise.kt index 536687fe7..4ae35ab97 100644 --- a/contrib/codegen-tools/codegen/src/main/ops/org/nd4j/codegen/ops/Bitwise.kt +++ b/contrib/codegen-tools/codegen/src/main/ops/org/nd4j/codegen/ops/Bitwise.kt @@ -1,3 +1,23 @@ +/* + * ****************************************************************************** + * * + * * + * * This program and the accompanying materials are made available under the + * * terms of the Apache License, Version 2.0 which is available at + * * https://www.apache.org/licenses/LICENSE-2.0. + * * + * * See the NOTICE file distributed with this work for additional + * * information regarding copyright ownership. + * * Unless required by applicable law or agreed to in writing, software + * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT + * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the + * * License for the specific language governing permissions and limitations + * * under the License. 
+ * * + * * SPDX-License-Identifier: Apache-2.0 + * ***************************************************************************** + */ + package org.nd4j.codegen.ops import org.nd4j.codegen.api.DataType.INT diff --git a/contrib/codegen-tools/codegen/src/main/ops/org/nd4j/codegen/ops/CNN.kt b/contrib/codegen-tools/codegen/src/main/ops/org/nd4j/codegen/ops/CNN.kt index 6c968b49e..1684fd051 100644 --- a/contrib/codegen-tools/codegen/src/main/ops/org/nd4j/codegen/ops/CNN.kt +++ b/contrib/codegen-tools/codegen/src/main/ops/org/nd4j/codegen/ops/CNN.kt @@ -1,3 +1,23 @@ +/* + * ****************************************************************************** + * * + * * + * * This program and the accompanying materials are made available under the + * * terms of the Apache License, Version 2.0 which is available at + * * https://www.apache.org/licenses/LICENSE-2.0. + * * + * * See the NOTICE file distributed with this work for additional + * * information regarding copyright ownership. + * * Unless required by applicable law or agreed to in writing, software + * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT + * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the + * * License for the specific language governing permissions and limitations + * * under the License. 
+ * * + * * SPDX-License-Identifier: Apache-2.0 + * ***************************************************************************** + */ + package org.nd4j.codegen.ops import org.nd4j.codegen.api.AtLeast diff --git a/contrib/codegen-tools/codegen/src/main/ops/org/nd4j/codegen/ops/Image.kt b/contrib/codegen-tools/codegen/src/main/ops/org/nd4j/codegen/ops/Image.kt index a682d67b8..78d21cc3f 100644 --- a/contrib/codegen-tools/codegen/src/main/ops/org/nd4j/codegen/ops/Image.kt +++ b/contrib/codegen-tools/codegen/src/main/ops/org/nd4j/codegen/ops/Image.kt @@ -1,3 +1,23 @@ +/* + * ****************************************************************************** + * * + * * + * * This program and the accompanying materials are made available under the + * * terms of the Apache License, Version 2.0 which is available at + * * https://www.apache.org/licenses/LICENSE-2.0. + * * + * * See the NOTICE file distributed with this work for additional + * * information regarding copyright ownership. + * * Unless required by applicable law or agreed to in writing, software + * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT + * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the + * * License for the specific language governing permissions and limitations + * * under the License. 
+ * * + * * SPDX-License-Identifier: Apache-2.0 + * ***************************************************************************** + */ + package org.nd4j.codegen.ops import org.nd4j.codegen.api.AtLeast diff --git a/contrib/codegen-tools/codegen/src/main/ops/org/nd4j/codegen/ops/Linalg.kt b/contrib/codegen-tools/codegen/src/main/ops/org/nd4j/codegen/ops/Linalg.kt index 1ed0f8dcc..c4b9c0a0a 100644 --- a/contrib/codegen-tools/codegen/src/main/ops/org/nd4j/codegen/ops/Linalg.kt +++ b/contrib/codegen-tools/codegen/src/main/ops/org/nd4j/codegen/ops/Linalg.kt @@ -1,3 +1,23 @@ +/* + * ****************************************************************************** + * * + * * + * * This program and the accompanying materials are made available under the + * * terms of the Apache License, Version 2.0 which is available at + * * https://www.apache.org/licenses/LICENSE-2.0. + * * + * * See the NOTICE file distributed with this work for additional + * * information regarding copyright ownership. + * * Unless required by applicable law or agreed to in writing, software + * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT + * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the + * * License for the specific language governing permissions and limitations + * * under the License. 
+ * * + * * SPDX-License-Identifier: Apache-2.0 + * ***************************************************************************** + */ + package org.nd4j.codegen.ops import org.nd4j.codegen.api.DataType diff --git a/contrib/codegen-tools/codegen/src/main/ops/org/nd4j/codegen/ops/Math.kt b/contrib/codegen-tools/codegen/src/main/ops/org/nd4j/codegen/ops/Math.kt index 79b188a76..a15c01554 100644 --- a/contrib/codegen-tools/codegen/src/main/ops/org/nd4j/codegen/ops/Math.kt +++ b/contrib/codegen-tools/codegen/src/main/ops/org/nd4j/codegen/ops/Math.kt @@ -1,3 +1,23 @@ +/* + * ****************************************************************************** + * * + * * + * * This program and the accompanying materials are made available under the + * * terms of the Apache License, Version 2.0 which is available at + * * https://www.apache.org/licenses/LICENSE-2.0. + * * + * * See the NOTICE file distributed with this work for additional + * * information regarding copyright ownership. + * * Unless required by applicable law or agreed to in writing, software + * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT + * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the + * * License for the specific language governing permissions and limitations + * * under the License. 
+ * * + * * SPDX-License-Identifier: Apache-2.0 + * ***************************************************************************** + */ + /** * Generated using ExtractFromExisting.kt */ diff --git a/contrib/codegen-tools/codegen/src/main/ops/org/nd4j/codegen/ops/NeuralNetwork.kt b/contrib/codegen-tools/codegen/src/main/ops/org/nd4j/codegen/ops/NeuralNetwork.kt index 0f974682f..faf6a47be 100644 --- a/contrib/codegen-tools/codegen/src/main/ops/org/nd4j/codegen/ops/NeuralNetwork.kt +++ b/contrib/codegen-tools/codegen/src/main/ops/org/nd4j/codegen/ops/NeuralNetwork.kt @@ -1,3 +1,23 @@ +/* + * ****************************************************************************** + * * + * * + * * This program and the accompanying materials are made available under the + * * terms of the Apache License, Version 2.0 which is available at + * * https://www.apache.org/licenses/LICENSE-2.0. + * * + * * See the NOTICE file distributed with this work for additional + * * information regarding copyright ownership. + * * Unless required by applicable law or agreed to in writing, software + * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT + * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the + * * License for the specific language governing permissions and limitations + * * under the License. 
+ * * + * * SPDX-License-Identifier: Apache-2.0 + * ***************************************************************************** + */ + package org.nd4j.codegen.ops import org.nd4j.codegen.api.AtLeast diff --git a/contrib/codegen-tools/codegen/src/main/ops/org/nd4j/codegen/ops/RNN.kt b/contrib/codegen-tools/codegen/src/main/ops/org/nd4j/codegen/ops/RNN.kt index 7559965ea..aff02b9ab 100644 --- a/contrib/codegen-tools/codegen/src/main/ops/org/nd4j/codegen/ops/RNN.kt +++ b/contrib/codegen-tools/codegen/src/main/ops/org/nd4j/codegen/ops/RNN.kt @@ -1,3 +1,23 @@ +/* + * ****************************************************************************** + * * + * * + * * This program and the accompanying materials are made available under the + * * terms of the Apache License, Version 2.0 which is available at + * * https://www.apache.org/licenses/LICENSE-2.0. + * * + * * See the NOTICE file distributed with this work for additional + * * information regarding copyright ownership. + * * Unless required by applicable law or agreed to in writing, software + * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT + * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the + * * License for the specific language governing permissions and limitations + * * under the License. 
+ * * + * * SPDX-License-Identifier: Apache-2.0 + * ***************************************************************************** + */ + package org.nd4j.codegen.ops import org.nd4j.codegen.api.AtLeast diff --git a/contrib/codegen-tools/codegen/src/main/ops/org/nd4j/codegen/ops/Random.kt b/contrib/codegen-tools/codegen/src/main/ops/org/nd4j/codegen/ops/Random.kt index 21246acfa..b6777f19f 100644 --- a/contrib/codegen-tools/codegen/src/main/ops/org/nd4j/codegen/ops/Random.kt +++ b/contrib/codegen-tools/codegen/src/main/ops/org/nd4j/codegen/ops/Random.kt @@ -1,3 +1,23 @@ +/* + * ****************************************************************************** + * * + * * + * * This program and the accompanying materials are made available under the + * * terms of the Apache License, Version 2.0 which is available at + * * https://www.apache.org/licenses/LICENSE-2.0. + * * + * * See the NOTICE file distributed with this work for additional + * * information regarding copyright ownership. + * * Unless required by applicable law or agreed to in writing, software + * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT + * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the + * * License for the specific language governing permissions and limitations + * * under the License. 
+ * * + * * SPDX-License-Identifier: Apache-2.0 + * ***************************************************************************** + */ + package org.nd4j.codegen.ops import org.nd4j.codegen.api.AtLeast diff --git a/contrib/codegen-tools/codegen/src/main/ops/org/nd4j/codegen/ops/SDBaseOps.kt b/contrib/codegen-tools/codegen/src/main/ops/org/nd4j/codegen/ops/SDBaseOps.kt index e7fb0caba..28a4d6887 100644 --- a/contrib/codegen-tools/codegen/src/main/ops/org/nd4j/codegen/ops/SDBaseOps.kt +++ b/contrib/codegen-tools/codegen/src/main/ops/org/nd4j/codegen/ops/SDBaseOps.kt @@ -1,3 +1,23 @@ +/* + * ****************************************************************************** + * * + * * + * * This program and the accompanying materials are made available under the + * * terms of the Apache License, Version 2.0 which is available at + * * https://www.apache.org/licenses/LICENSE-2.0. + * * + * * See the NOTICE file distributed with this work for additional + * * information regarding copyright ownership. + * * Unless required by applicable law or agreed to in writing, software + * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT + * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the + * * License for the specific language governing permissions and limitations + * * under the License. 
+ * * + * * SPDX-License-Identifier: Apache-2.0 + * ***************************************************************************** + */ + /** * Generated using ExtractFromExisting.kt */ diff --git a/contrib/codegen-tools/codegen/src/main/ops/org/nd4j/codegen/ops/SDLoss.kt b/contrib/codegen-tools/codegen/src/main/ops/org/nd4j/codegen/ops/SDLoss.kt index 9884dd3e9..d8706bd02 100644 --- a/contrib/codegen-tools/codegen/src/main/ops/org/nd4j/codegen/ops/SDLoss.kt +++ b/contrib/codegen-tools/codegen/src/main/ops/org/nd4j/codegen/ops/SDLoss.kt @@ -1,3 +1,23 @@ +/* + * ****************************************************************************** + * * + * * + * * This program and the accompanying materials are made available under the + * * terms of the Apache License, Version 2.0 which is available at + * * https://www.apache.org/licenses/LICENSE-2.0. + * * + * * See the NOTICE file distributed with this work for additional + * * information regarding copyright ownership. + * * Unless required by applicable law or agreed to in writing, software + * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT + * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the + * * License for the specific language governing permissions and limitations + * * under the License. + * * + * * SPDX-License-Identifier: Apache-2.0 + * ***************************************************************************** + */ + /** * Generated using ExtractFromExisting.kt */ diff --git a/contrib/codegen-tools/codegen/src/main/resources/java/utilClasses/NDValidation.java b/contrib/codegen-tools/codegen/src/main/resources/java/utilClasses/NDValidation.java deleted file mode 100644 index 1493e61bc..000000000 --- a/contrib/codegen-tools/codegen/src/main/resources/java/utilClasses/NDValidation.java +++ /dev/null @@ -1,122 +0,0 @@ -/******************************************************************************* - * Copyright (c) 2015-2019 Skymind, Inc. 
- * Copyright (c) 2019 Konduit, KK. - * - * This program and the accompanying materials are made available under the - * terms of the Apache License, Version 2.0 which is available at - * https://www.apache.org/licenses/LICENSE-2.0. - * - * Unless required by applicable law or agreed to in writing, software - * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * License for the specific language governing permissions and limitations - * under the License. - * - * SPDX-License-Identifier: Apache-2.0 - ******************************************************************************/ - -package org.nd4j.linalg.api.ops.experimental; - -import org.nd4j.linalg.api.ndarray.INDArray; -import org.nd4j.linalg.api.buffer.DataType; - -import java.util.Arrays; - -public class NDValidation { - - private NDValidation() { - } - - /** - * Validate that the operation is being applied on a numerical INDArray (not boolean or utf8). 
- * Some operations (such as sum, norm2, add(Number) etc don't make sense when applied to boolean/utf8 arrays - * - * @param opName Operation name to print in the exception - * @param v Variable to perform operation on - */ - protected static void validateNumerical(String opName, INDArray v, String inputName) { - if (v == null) - return; - if (v.dataType() == DataType.BOOL || v.dataType() == DataType.UTF8) - throw new IllegalStateException("Cannot apply operation \"" + opName + "\" to input \"" + inputName + "\" with non-numerical data type " + v.dataType()); - } - - - /** - * Validate that the operation is being applied on an integer type INDArray - * - * @param opName Operation name to print in the exception - * @param v Variable to validate datatype for (input to operation) - */ - protected static void validateInteger(String opName, INDArray v, String inputName) { - if (v == null) - return; - if (!v.dataType().isIntType()) - throw new IllegalStateException("Cannot apply operation \"" + opName + "\" to input \"" + inputName + "\" with non-integer data type " + v.dataType()); - } - - - /** - * Validate that the operation is being applied on an floating point type INDArray - * - * @param opName Operation name to print in the exception - * @param v Variable to validate datatype for (input to operation) - */ - protected static void validateFloatingPoint(String opName, INDArray v, String inputName) { - if (v == null) - return; - if (!v.dataType().isFPType()) - throw new IllegalStateException("Cannot apply operation \"" + opName + "\" to input \"" + inputName + "\" with non-floating point data type " + v.dataType()); - } - - - /** - * Validate that the operation is being applied on a boolean type INDArray - * - * @param opName Operation name to print in the exception - * @param v Variable to validate datatype for (input to operation) - */ - protected static void validateBool(String opName, INDArray v, String inputName) { - if (v == null) - return; - if (v.dataType() != 
DataType.BOOL) - throw new IllegalStateException("Cannot apply operation \"" + opName + "\" to inputName \"" + inputName + "\" with non-boolean point data type " + v.dataType()); - } - - /** - * Validate that the operation is being applied on array with the exact same datatypes - * - * @param opName Operation name to print in the exception - * @param vars Variable to perform operation on - */ - protected static void validateSameType(String opName, INDArray... vars) { - if (isSameType(vars)){ - return; - } - else{ - DataType[] dtypes = new DataType[vars.length]; - for (int j = 0; j < vars.length; j++) { - dtypes[j] = vars[j].dataType(); - } - throw new IllegalStateException("Cannot perform operation \"" + opName + "\" to inputs with different datatypes:" + - " datatypes " + Arrays.toString(dtypes)); - } - } - - /** - * Is the operation being applied on array with the exact same datatypes? - * - * @param vars Variable to perform operation on - */ - protected static boolean isSameType(INDArray... vars) { - if (vars.length > 1) { - DataType first = vars[0].dataType(); - for (int i = 1; i < vars.length; i++) { - if (first != vars[i].dataType()) { - return false; - } - } - } - return true; - } -} diff --git a/contrib/codegen-tools/codegen/src/test/java/org/nd4j/codegen/dsl/DocsGeneratorTest.java b/contrib/codegen-tools/codegen/src/test/java/org/nd4j/codegen/dsl/DocsGeneratorTest.java index 802c71348..5d8e12885 100644 --- a/contrib/codegen-tools/codegen/src/test/java/org/nd4j/codegen/dsl/DocsGeneratorTest.java +++ b/contrib/codegen-tools/codegen/src/test/java/org/nd4j/codegen/dsl/DocsGeneratorTest.java @@ -1,3 +1,23 @@ +/* + * ****************************************************************************** + * * + * * + * * This program and the accompanying materials are made available under the + * * terms of the Apache License, Version 2.0 which is available at + * * https://www.apache.org/licenses/LICENSE-2.0. 
+ * * + * * See the NOTICE file distributed with this work for additional + * * information regarding copyright ownership. + * * Unless required by applicable law or agreed to in writing, software + * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT + * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the + * * License for the specific language governing permissions and limitations + * * under the License. + * * + * * SPDX-License-Identifier: Apache-2.0 + * ***************************************************************************** + */ + package org.nd4j.codegen.dsl; import org.apache.commons.lang3.StringUtils; diff --git a/contrib/codegen-tools/codegen/src/test/java/org/nd4j/codegen/dsl/TestGeneration.java b/contrib/codegen-tools/codegen/src/test/java/org/nd4j/codegen/dsl/TestGeneration.java index 41d76de16..6c0ad1d25 100644 --- a/contrib/codegen-tools/codegen/src/test/java/org/nd4j/codegen/dsl/TestGeneration.java +++ b/contrib/codegen-tools/codegen/src/test/java/org/nd4j/codegen/dsl/TestGeneration.java @@ -1,3 +1,23 @@ +/* + * ****************************************************************************** + * * + * * + * * This program and the accompanying materials are made available under the + * * terms of the Apache License, Version 2.0 which is available at + * * https://www.apache.org/licenses/LICENSE-2.0. + * * + * * See the NOTICE file distributed with this work for additional + * * information regarding copyright ownership. + * * Unless required by applicable law or agreed to in writing, software + * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT + * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the + * * License for the specific language governing permissions and limitations + * * under the License. 
+ * * + * * SPDX-License-Identifier: Apache-2.0 + * ***************************************************************************** + */ + package org.nd4j.codegen.dsl; import org.apache.commons.io.FileUtils; diff --git a/contrib/codegen-tools/codegen/src/test/kotlin/org/nd4j/codegen/dsl/ConfigTest.kt b/contrib/codegen-tools/codegen/src/test/kotlin/org/nd4j/codegen/dsl/ConfigTest.kt index 474e9f42b..8b5b80798 100644 --- a/contrib/codegen-tools/codegen/src/test/kotlin/org/nd4j/codegen/dsl/ConfigTest.kt +++ b/contrib/codegen-tools/codegen/src/test/kotlin/org/nd4j/codegen/dsl/ConfigTest.kt @@ -1,3 +1,23 @@ +/* + * ****************************************************************************** + * * + * * + * * This program and the accompanying materials are made available under the + * * terms of the Apache License, Version 2.0 which is available at + * * https://www.apache.org/licenses/LICENSE-2.0. + * * + * * See the NOTICE file distributed with this work for additional + * * information regarding copyright ownership. + * * Unless required by applicable law or agreed to in writing, software + * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT + * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the + * * License for the specific language governing permissions and limitations + * * under the License. 
+ * * + * * SPDX-License-Identifier: Apache-2.0 + * ***************************************************************************** + */ + package org.nd4j.codegen.dsl import org.junit.jupiter.api.Test diff --git a/contrib/codegen-tools/codegen/src/test/kotlin/org/nd4j/codegen/dsl/ConstraintTest.kt b/contrib/codegen-tools/codegen/src/test/kotlin/org/nd4j/codegen/dsl/ConstraintTest.kt index 0abde7720..8660cce8b 100644 --- a/contrib/codegen-tools/codegen/src/test/kotlin/org/nd4j/codegen/dsl/ConstraintTest.kt +++ b/contrib/codegen-tools/codegen/src/test/kotlin/org/nd4j/codegen/dsl/ConstraintTest.kt @@ -1,3 +1,23 @@ +/* + * ****************************************************************************** + * * + * * + * * This program and the accompanying materials are made available under the + * * terms of the Apache License, Version 2.0 which is available at + * * https://www.apache.org/licenses/LICENSE-2.0. + * * + * * See the NOTICE file distributed with this work for additional + * * information regarding copyright ownership. + * * Unless required by applicable law or agreed to in writing, software + * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT + * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the + * * License for the specific language governing permissions and limitations + * * under the License. 
+ * * + * * SPDX-License-Identifier: Apache-2.0 + * ***************************************************************************** + */ + package org.nd4j.codegen.dsl import org.junit.jupiter.api.Test diff --git a/contrib/codegen-tools/codegen/src/test/kotlin/org/nd4j/codegen/dsl/NamespaceInvariantTest.kt b/contrib/codegen-tools/codegen/src/test/kotlin/org/nd4j/codegen/dsl/NamespaceInvariantTest.kt index 1fdd88717..f2eab81f5 100644 --- a/contrib/codegen-tools/codegen/src/test/kotlin/org/nd4j/codegen/dsl/NamespaceInvariantTest.kt +++ b/contrib/codegen-tools/codegen/src/test/kotlin/org/nd4j/codegen/dsl/NamespaceInvariantTest.kt @@ -1,3 +1,23 @@ +/* + * ****************************************************************************** + * * + * * + * * This program and the accompanying materials are made available under the + * * terms of the Apache License, Version 2.0 which is available at + * * https://www.apache.org/licenses/LICENSE-2.0. + * * + * * See the NOTICE file distributed with this work for additional + * * information regarding copyright ownership. + * * Unless required by applicable law or agreed to in writing, software + * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT + * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the + * * License for the specific language governing permissions and limitations + * * under the License. 
+ * * + * * SPDX-License-Identifier: Apache-2.0 + * ***************************************************************************** + */ + package org.nd4j.codegen.dsl import org.junit.jupiter.api.Assertions.assertEquals diff --git a/contrib/codegen-tools/codegen/src/test/kotlin/org/nd4j/codegen/dsl/OpBuilderTest.kt b/contrib/codegen-tools/codegen/src/test/kotlin/org/nd4j/codegen/dsl/OpBuilderTest.kt index 111fe875c..ff9776371 100644 --- a/contrib/codegen-tools/codegen/src/test/kotlin/org/nd4j/codegen/dsl/OpBuilderTest.kt +++ b/contrib/codegen-tools/codegen/src/test/kotlin/org/nd4j/codegen/dsl/OpBuilderTest.kt @@ -1,3 +1,23 @@ +/* + * ****************************************************************************** + * * + * * + * * This program and the accompanying materials are made available under the + * * terms of the Apache License, Version 2.0 which is available at + * * https://www.apache.org/licenses/LICENSE-2.0. + * * + * * See the NOTICE file distributed with this work for additional + * * information regarding copyright ownership. + * * Unless required by applicable law or agreed to in writing, software + * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT + * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the + * * License for the specific language governing permissions and limitations + * * under the License. 
+ * * + * * SPDX-License-Identifier: Apache-2.0 + * ***************************************************************************** + */ + package org.nd4j.codegen.dsl import org.apache.commons.io.FileUtils diff --git a/contrib/codegen-tools/codegen/src/test/kotlin/org/nd4j/codegen/dsl/OpInvariantTest.kt b/contrib/codegen-tools/codegen/src/test/kotlin/org/nd4j/codegen/dsl/OpInvariantTest.kt index ab7882bef..396871842 100644 --- a/contrib/codegen-tools/codegen/src/test/kotlin/org/nd4j/codegen/dsl/OpInvariantTest.kt +++ b/contrib/codegen-tools/codegen/src/test/kotlin/org/nd4j/codegen/dsl/OpInvariantTest.kt @@ -1,3 +1,23 @@ +/* + * ****************************************************************************** + * * + * * + * * This program and the accompanying materials are made available under the + * * terms of the Apache License, Version 2.0 which is available at + * * https://www.apache.org/licenses/LICENSE-2.0. + * * + * * See the NOTICE file distributed with this work for additional + * * information regarding copyright ownership. + * * Unless required by applicable law or agreed to in writing, software + * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT + * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the + * * License for the specific language governing permissions and limitations + * * under the License. 
+ * * + * * SPDX-License-Identifier: Apache-2.0 + * ***************************************************************************** + */ + package org.nd4j.codegen.dsl import org.junit.jupiter.api.Assertions.assertSame diff --git a/contrib/codegen-tools/codegen/src/test/kotlin/org/nd4j/codegen/ir/TestIR.kt b/contrib/codegen-tools/codegen/src/test/kotlin/org/nd4j/codegen/ir/TestIR.kt deleted file mode 100644 index f38b862da..000000000 --- a/contrib/codegen-tools/codegen/src/test/kotlin/org/nd4j/codegen/ir/TestIR.kt +++ /dev/null @@ -1,22 +0,0 @@ -package org.nd4j.codegen.ir - -import com.google.common.reflect.TypeToken -import org.junit.jupiter.api.Test -import kotlin.test.assertTrue -import org.apache.commons.lang3.reflect.TypeUtils -import org.nd4j.codegen.ir.registry.OpMappingRegistry -import org.tensorflow.framework.* - - -class TestIR { - @Test - fun testLoadOpDescriptors() { - val outputType = TypeUtils.parameterize(OpMappingRegistry::class.java,GraphDef::class.java, NodeDef::class.java, OpDef::class.java, - TensorProto::class.java,DataType::class.java, OpDef.AttrDef::class.java,AttrValue::class.java) - val rawType = TypeToken.of(outputType).rawType - println(rawType) - val createdRegistry = rawType.getConstructor(String::class.java).newInstance("tensorflow") - println(createdRegistry) - } -} - diff --git a/contrib/codegen-tools/codegen/src/test/kotlin/org/nd4j/codegen/ir/onnx/TestOnnxIR.kt b/contrib/codegen-tools/codegen/src/test/kotlin/org/nd4j/codegen/ir/onnx/TestOnnxIR.kt deleted file mode 100644 index a001b397b..000000000 --- a/contrib/codegen-tools/codegen/src/test/kotlin/org/nd4j/codegen/ir/onnx/TestOnnxIR.kt +++ /dev/null @@ -1,1133 +0,0 @@ -package org.nd4j.codegen.ir.onnx - -import junit.framework.Assert -import junit.framework.Assert.* -import onnx.Onnx -import org.junit.jupiter.api.Test -import org.nd4j.codegen.ir.ImportGraph -import org.nd4j.codegen.ir.registry.OpRegistryHolder -import org.nd4j.ir.OpNamespace -import 
org.nd4j.linalg.api.buffer.DataType -import org.nd4j.linalg.api.ndarray.INDArray -import org.nd4j.linalg.factory.Nd4j -import org.nd4j.shade.protobuf.ByteString -import java.nio.charset.Charset -import kotlin.test.assertTrue - -data class OnnxGraphInput(val graphDef: Onnx.GraphProto, val inputNames: List<String>, val outputNames: List<String>, - val inputArrays: Map<String, INDArray>, val dynamicArrays: Map<String, INDArray>) - - -class TestOnnxIR { - val declarations = OnnxOpDeclarations - - - - @Test - fun testInputOutputNames() { - val onnxOpNames = onnxOpRegistry.inputFrameworkOpNames() - val nd4jOpNames = onnxOpRegistry.nd4jOpNames() - onnxOpRegistry.mappingProcessNames().map { - onnxOpRegistry.lookupOpMappingProcess(it) - }.forEach { - println("Beginning processing of op ${it.inputFrameworkOpName()} and nd4j op ${it.opName()}") - assertTrue(onnxOpNames.contains(it.inputFrameworkOpName())) - assertTrue(nd4jOpNames.contains(it.opName())) - val nd4jOpDef = onnxOpRegistry.lookupNd4jOpDef(it.opName()) - val onnxOpDef = onnxOpRegistry.lookupInputFrameworkOpDef(it.inputFrameworkOpName()) - val inputNameArgDefs = nd4jOpDef.argDescriptorList.filter { - argDef -> argDef.argType == OpNamespace.ArgDescriptor.ArgType.INPUT_TENSOR - }.map { argDef -> argDef.name } - - val inputFrameworkOpDefNames = onnxOpDef.inputList - - val nd4jArgDefNames = nd4jOpDef.argDescriptorList.map { nd4jArgDef -> nd4jArgDef.name } - val onnxAttrNames = onnxOpDef.attributeList.map { onnxAttr -> onnxAttr.name } - it.tensorMappingRules().forEach { tensorRules -> - println("Running tensor mapping rule ${tensorRules.name()} for op ${it.inputFrameworkOpName()} and nd4j op name ${it.opName()}") - run { - tensorRules.mappingNamesToPerform().forEach { tensorRule -> - run { - println("Testing assertion for nd4j name ${tensorRule.key} and input name ${tensorRule.value}") - assertTrue(inputNameArgDefs.contains(tensorRule.key)) ?: error("Failed on inputArgName ${tensorRule.key}") - assertTrue(inputFrameworkOpDefNames.contains(tensorRule.value)) ?: 
error("Failed on inputArgName ${tensorRule.value}") - } - - } - } - - } - - println("Running attribute mapping rules for ${it.opName()} and input op name ${it.inputFrameworkOpName()}") - it.attributeMappingRules().forEach { attrRule -> - run { - attrRule.mappingNamesToPerform().forEach { attrMapping -> - run { - println("Testing nd4j name ${attrMapping.key} and input framework name ${attrMapping.value}") - assertTrue(nd4jArgDefNames.contains(attrMapping.key) || inputNameArgDefs.contains(attrMapping.key)) - assertTrue(onnxAttrNames.contains(attrMapping.value) || inputFrameworkOpDefNames.contains(attrMapping.value)) - - } - - } - } - } - - } - } - - - @Test - fun testOpOrdering() { - val onnxOpNames = onnxOpRegistry.inputFrameworkOpNames() - //TODO: list ops need to work and TopK has a data type conversion issue with the k ndarray input - val bannedOps = setOf("Constant","Squeeze","ArgMax","Split", - "ReduceLogSumExp","AveragePool","TopK","RandomUniform") - val importGraph = ImportGraph() - - onnxOpNames.forEach { opName -> - if(onnxOpRegistry.hasMappingOpProcess(opName)) { - val opDef = onnxOpRegistry.lookupInputFrameworkOpDef(opName) - println("Processing op name $opName") - - val nodeBuilder = Onnx.NodeProto.newBuilder() - nodeBuilder.name = opName - val graphBuilder = Onnx.GraphProto.newBuilder() - nodeBuilder.opType = opName - val attrNames = opDef.attributeList.map {attrDef -> attrDef.name } - - //convert to a default case + return graph in new method - opDef.inputList.forEach { inputArgDef -> - //val inputNumberAttr = inputArgDef.numberAttr - val numAttributeValue = 1 - val typeAttrName = "$inputArgDef-types" - val typeAttrValue = opDef.attributeList.filter { attributeProto -> attributeProto.name == typeAttrName } - for(i in 0 until numAttributeValue) { - val listOfFloats = mutableListOf() - val listOfInts = mutableListOf() - val listOfDoubles = mutableListOf() - val listOfBools = mutableListOf() - val listOfLongs = mutableListOf() - val listOfStrings = 
mutableListOf() - //the largest tensors we're likely to touch are 5d - for(i in 0 until (1 * 2 * 3 * 4 * 5 * 6)) { - listOfFloats.add(i.toFloat()) - listOfInts.add(i) - listOfDoubles.add(i.toDouble()) - listOfBools.add(true) - listOfLongs.add(i.toLong()) - listOfStrings.add("$i") - } - - val nodeName = if(i <= 0) inputArgDef else inputArgDef + "$i" - nodeBuilder.addInput(nodeName) - - when(typeAttrValue[0].stringsList[0].toStringUtf8()) { - "double" -> { - val onnxTensorProto = Onnx.TensorProto.newBuilder() - onnxTensorProto.name = nodeName - onnxTensorProto.dataType = Onnx.TensorProto.DataType.DOUBLE - onnxTensorProto.addAllDoubleData(listOfDoubles) - onnxTensorProto.addAllDims(listOf(1,2,3,4,5,6)) - graphBuilder.addInitializer(onnxTensorProto.build()) - val onnxNodeToAdd = Onnx.NodeProto.newBuilder() - onnxNodeToAdd.name = nodeName - onnxNodeToAdd.opType = "Constant" - val attrValue = Onnx.AttributeProto.newBuilder() - attrValue.name = "value" - attrValue.addTensors(onnxTensorProto.build()) - onnxNodeToAdd.addAttribute(attrValue.build()) - graphBuilder.addNode(onnxNodeToAdd) - } - - "bool" -> { - val onnxTensorProto = Onnx.TensorProto.newBuilder() - onnxTensorProto.name = nodeName - onnxTensorProto.dataType = Onnx.TensorProto.DataType.BOOL - onnxTensorProto.addAllInt32Data(listOfInts) - onnxTensorProto.addAllDims(listOf(1,2,3,4,5,6)) - graphBuilder.addInitializer(onnxTensorProto.build()) - - val onnxNodeToAdd = Onnx.NodeProto.newBuilder() - onnxNodeToAdd.name = nodeName - onnxNodeToAdd.opType = "Constant" - val attrValue = Onnx.AttributeProto.newBuilder() - attrValue.name = "value" - attrValue.addTensors(onnxTensorProto.build()) - onnxNodeToAdd.addAttribute(attrValue.build()) - graphBuilder.addNode(onnxNodeToAdd) - } - - "float" -> { - val onnxTensorProto = Onnx.TensorProto.newBuilder() - onnxTensorProto.name = nodeName - onnxTensorProto.dataType = Onnx.TensorProto.DataType.FLOAT - onnxTensorProto.addAllFloatData(listOfFloats) - 
onnxTensorProto.addAllDims(listOf(1,2,3,4,5,6)) - graphBuilder.addInitializer(onnxTensorProto.build()) - - val onnxNodeToAdd = Onnx.NodeProto.newBuilder() - onnxNodeToAdd.name = nodeName - onnxNodeToAdd.opType = "Constant" - val attrValue = Onnx.AttributeProto.newBuilder() - attrValue.name = "value" - attrValue.addTensors(onnxTensorProto.build()) - onnxNodeToAdd.addAttribute(attrValue.build()) - graphBuilder.addNode(onnxNodeToAdd) - } - - - "int16","uint16" -> { - val onnxTensorProto = Onnx.TensorProto.newBuilder() - onnxTensorProto.name = nodeName - onnxTensorProto.dataType = Onnx.TensorProto.DataType.INT16 - onnxTensorProto.addAllInt32Data(listOfInts) - onnxTensorProto.addAllDims(listOf(1,2,3,4,5,6)) - graphBuilder.addInitializer(onnxTensorProto.build()) - val onnxNodeToAdd = Onnx.NodeProto.newBuilder() - onnxNodeToAdd.name = nodeName - onnxNodeToAdd.opType = "Constant" - val attrValue = Onnx.AttributeProto.newBuilder() - attrValue.name = "value" - attrValue.addTensors(onnxTensorProto.build()) - onnxNodeToAdd.addAttribute(attrValue.build()) - graphBuilder.addNode(onnxNodeToAdd) - } - - "int32","uint32" -> { - val onnxTensorProto = Onnx.TensorProto.newBuilder() - onnxTensorProto.name = nodeName - onnxTensorProto.dataType = Onnx.TensorProto.DataType.INT32 - onnxTensorProto.addAllDims(listOf(1,2,3,4,5,6)) - onnxTensorProto.addAllInt32Data(listOfInts) - graphBuilder.addInitializer(onnxTensorProto.build()) - val onnxNodeToAdd = Onnx.NodeProto.newBuilder() - onnxNodeToAdd.name = nodeName - onnxNodeToAdd.opType = "Constant" - val attrValue = Onnx.AttributeProto.newBuilder() - attrValue.name = "value" - attrValue.addTensors(onnxTensorProto.build()) - onnxNodeToAdd.addAttribute(attrValue.build()) - graphBuilder.addNode(onnxNodeToAdd) - } - - "int64","uint64" -> { - val onnxTensorProto = Onnx.TensorProto.newBuilder() - onnxTensorProto.name = nodeName - onnxTensorProto.addAllDims(listOf(1,2,3,4,5,6)) - onnxTensorProto.dataType = Onnx.TensorProto.DataType.INT64 - 
onnxTensorProto.addAllInt64Data(listOfLongs) - graphBuilder.addInitializer(onnxTensorProto.build()) - val onnxNodeToAdd = Onnx.NodeProto.newBuilder() - onnxNodeToAdd.name = nodeName - onnxNodeToAdd.opType = "Constant" - val attrValue = Onnx.AttributeProto.newBuilder() - attrValue.name = "value" - attrValue.addTensors(onnxTensorProto.build()) - onnxNodeToAdd.addAttribute(attrValue.build()) - graphBuilder.addNode(onnxNodeToAdd) - } - - "string" -> { - val onnxTensorProto = Onnx.TensorProto.newBuilder() - onnxTensorProto.name = nodeName - onnxTensorProto.dataType = Onnx.TensorProto.DataType.STRING - onnxTensorProto.addAllDims(listOf(1,2,3,4,5,6)) - onnxTensorProto.addAllStringData(listOfStrings.map { input -> ByteString.copyFrom(input.toByteArray( - Charset.defaultCharset())) }) - graphBuilder.addInitializer(onnxTensorProto.build()) - val onnxNodeToAdd = Onnx.NodeProto.newBuilder() - onnxNodeToAdd.name = nodeName - onnxNodeToAdd.opType = "Constant" - val attrValue = Onnx.AttributeProto.newBuilder() - attrValue.name = "value" - attrValue.addTensors(onnxTensorProto.build()) - onnxNodeToAdd.addAttribute(attrValue.build()) - graphBuilder.addNode(onnxNodeToAdd) - } - } - } - - } - - - opDef.attributeList.forEach { attr -> - when(attr.type) { - Onnx.AttributeProto.AttributeType.INTS -> { - //replace empty value with default ints for convolutions - val attrBuilder = Onnx.AttributeProto.newBuilder() - attrBuilder.addAllInts(listOf(1,1,1,1)) - attrBuilder.name = attr.name - nodeBuilder.addAttribute(attrBuilder.build()) - } - - Onnx.AttributeProto.AttributeType.FLOATS -> { - //replace empty value with default ints for convolutions - val attrBuilder = Onnx.AttributeProto.newBuilder() - attrBuilder.addAllFloats(listOf(1.0f,1.0f,1.0f,1.0f)) - attrBuilder.name = attr.name - nodeBuilder.addAttribute(attrBuilder.build()) - } - - - Onnx.AttributeProto.AttributeType.STRINGS -> { - //replace empty value with default ints for convolutions - val attrBuilder = 
Onnx.AttributeProto.newBuilder() - if(opName != "LSTM") - attrBuilder.addAllStrings(listOf("1","2","3","4").map { input -> ByteString.copyFrom(input.toByteArray( - Charset.defaultCharset())) - }) - else { - attrBuilder.addAllStrings(listOf("Relu","Tanh","Sigmoid","Relu").map { input -> ByteString.copyFrom(input.toByteArray( - Charset.defaultCharset())) - }) - } - attrBuilder.name = attr.name - nodeBuilder.addAttribute(attrBuilder.build()) - } - - Onnx.AttributeProto.AttributeType.TENSOR -> { - val attrBuilder = Onnx.AttributeProto.newBuilder() - attrBuilder.t = Onnx.TensorProto.newBuilder() - .addAllDims(listOf(1,1)).setDataType(Onnx.TensorProto.DataType.DOUBLE) - .addAllDoubleData(listOf(1.0)) - .build() - attrBuilder.name = attr.name - nodeBuilder.addAttribute(attrBuilder.build()) - } - - - - else -> { - nodeBuilder.addAttribute(attr) - } - } - - } - - - graphBuilder.addNode(nodeBuilder.build()) - val graph = graphBuilder.build() - - - - - if(!bannedOps.contains(opName)) { - val mappingProcess = onnxOpRegistry.lookupOpMappingProcess(opName) - val irGraph = OnnxIRGraph(graphDef = graph) - val mappingContext = OnnxMappingContext(opDef = opDef,node = nodeBuilder.build(),graph = irGraph,dynamicVariables = emptyMap()) - val mapResult = mappingProcess.applyProcess(mappingContext) - val groupedByArgType = mapResult.second.argDescriptorList.groupBy { keySelector -> keySelector.argType } - val sortedGroups = HashMap<OpNamespace.ArgDescriptor.ArgType, List<OpNamespace.ArgDescriptor>>() - groupedByArgType.forEach { (argType, argDescriptors) -> - sortedGroups[argType] = argDescriptors.sortedBy { argDescriptor -> argDescriptor.argIndex } - } - - //NOTE: Bitcast is in this list for examination outside of list offsets for assertions. We don't currently support data types for the test nodes. 
- sortedGroups.values.forEach { list -> run { - val namesEncountered = HashSet<String>() - list.forEachIndexed { index, argDescriptor -> - //don't validate a name encountered more than once, this is probably an array - //note that we skip some ops here due to this assumption breaking for list types, we will test list types separately - if(!namesEncountered.contains(argDescriptor.name) - && !bannedOps.contains(opName)) { - assertEquals( - "Arg index $index for arg descriptor name ${argDescriptor.name} for nd4j op ${mappingContext.nd4jOpName()} when arg index was actually ${argDescriptor.argIndex}. Full arg descriptor was ${argDescriptor}.", - argDescriptor.argIndex, index - ) - namesEncountered.add(argDescriptor.name) - } - } - } - - val sameDiffResult = importGraph.importGraph(irGraph = irGraph,importOverride = null,opFilter = null,opMappingRegistry = OpRegistryHolder.onnx()) - println("Processed op name $opName") - - } - } - - - } - } - } - - - - @Test - fun testOpsMapped() { - val onnxOpNames = onnxOpRegistry.inputFrameworkOpNames().filter { onnxOpRegistry.registeredOps.containsKey(it) } - val nd4jOpNames = onnxOpRegistry.nd4jOpNames() - /** - * TODO: Assert each op is mapped. - * - * Assert all attributes in nd4j are mapped. - * If not, let's document what isn't and why for each op. - * - * Create an op generation tool that allows random generation of test cases - * based on existing mapped ops between nd4j and tensorflow. 
- */ - onnxOpNames.map { onnxOpName -> onnxOpRegistry.lookupOpMappingProcess(onnxOpName)} - .forEach { - val onnxNamesMapped = HashSet<String>() - val nd4jNamesMapped = HashSet<String>() - //we can ignore dtype for now - nd4jNamesMapped.add("dtype") - val opDef = onnxOpRegistry.lookupNd4jOpDef(it.opName()) - val onnxOpDef = onnxOpRegistry.lookupInputFrameworkOpDef(it.inputFrameworkOpName()) - val onnxAssertionNames = HashSet<String>() - onnxAssertionNames.addAll(onnxOpDef.inputList.map { arg -> arg.toString() }) - onnxAssertionNames.addAll(onnxOpDef.attributeList.map { attr -> attr.name }) - val nd4jOpDefAssertions = HashSet<String>() - nd4jOpDefAssertions.addAll(opDef.argDescriptorList.map { argDescriptor -> argDescriptor.name }) - val numRequiredInputs = onnxOpDef.inputCount - val nd4jInputs = opDef.argDescriptorList.filter { arg -> arg.argType == OpNamespace.ArgDescriptor.ArgType.INPUT_TENSOR }.count() - /** - * TODO: Grab total collection of mapped nd4j names - * as outputs and mapped tensorflow names as inputs. - * Compare the mapped names to the op definitions - * in nd4j and tensorflow respectively. 
- */ - it.tensorMappingRules().forEach { mappingRule -> - mappingRule.mappingNamesToPerform().forEach { mappingName -> - onnxNamesMapped.add(mappingName.value) - nd4jNamesMapped.add(mappingName.key) - } - } - - it.attributeMappingRules().forEach { mappingRule -> - mappingRule.mappingNamesToPerform().forEach { mappingName -> - onnxNamesMapped.add(mappingName.value) - nd4jNamesMapped.add(mappingName.key) - } - - mappingRule.mappingTransformerArgs().forEach {transformerArg -> - run { - transformerArg.value.forEach { argValue -> - nd4jNamesMapped.add(argValue.name) - - } - } - } - - } - - - onnxOpDef.inputList.forEach { inputName -> - Assert.assertTrue(onnxAssertionNames.contains(inputName)) - } - - onnxOpDef.attributeList.map { attrDef -> attrDef.name }.forEach { attrName -> - Assert.assertTrue(onnxAssertionNames.contains(attrName)) - } - - - - opDef.argDescriptorList.forEach { argDef -> - //only require it when the - - when(argDef.argType) { - OpNamespace.ArgDescriptor.ArgType.INPUT_TENSOR -> { - /** - * Nd4j typically has many optional inputs that can also double as attributes - * We need to allow for a bit of flexibility in how we handle op definitions. If they're not mapped 1 to 1, - * we just log a warning for unmapped inputs. Otherwise we can do an assertion. 
- */ - if(numRequiredInputs == nd4jInputs) - assertTrue("Nd4j op name ${opDef.name} with onnx mapping ${onnxOpDef.name} has missing mapping ${argDef.name}", nd4jNamesMapped.contains(argDef.name)) - else if(!nd4jNamesMapped.contains(argDef.name)) { - println("Warning: Nd4j op name ${opDef.name} with onnx mapping ${onnxOpDef.name} has missing mapping ${argDef.name}") - } - } - OpNamespace.ArgDescriptor.ArgType.INT32,OpNamespace.ArgDescriptor.ArgType.INT64 -> { - assertTrue("Nd4j op name ${opDef.name} with onnx mapping ${onnxOpDef.name} has missing mapping ${argDef.name}", nd4jNamesMapped.contains(argDef.name)) - } - OpNamespace.ArgDescriptor.ArgType.DOUBLE, OpNamespace.ArgDescriptor.ArgType.FLOAT -> { - assertTrue("Nd4j op name ${opDef.name} with onnx mapping ${onnxOpDef.name} has missing mapping ${argDef.name}", nd4jNamesMapped.contains(argDef.name)) - } - OpNamespace.ArgDescriptor.ArgType.BOOL -> { - assertTrue("Nd4j op name ${opDef.name} with onnx mapping ${onnxOpDef.name} has missing mapping ${argDef.name}", nd4jNamesMapped.contains(argDef.name)) - } - } - - } - - } - } - - @Test - fun testOpExecution() { - val scalarInputs = mapOf( - "abs" to -1.0, - "copy" to 1.0, - "erfc" to 1.0, - "exp" to 1.0, - "identity" to 1.0, - "neg" to 1.0, - "ones_as" to 1.0, - "relu6" to 1.0, - "round" to 1.0, - "sign" to 1.0, - "sin" to 1.0, - "square" to 1.0, - "sqrt" to 1.0) - - val scalarFloatOps = mapOf( - "acos" to 1.0f, - "asin" to 1.0f, - "acosh" to 1.0f, - "asinh" to 1.0f, - "atan" to 1.0f, - "atanh" to 0.5f, - "ceil" to 1.0f, - "cosh" to 1.0f, - "cos" to 1.0f, - "erf" to 1.0f, - "hard_sigmoid" to 1.0f, - "floor" to 1.0f, - "log" to 1.0f, - "round" to 1.0f, - "relu" to 1.0f, - "selu" to 1.0f, - "sinh" to 1.0f, - "sigmoid" to 1.0f, - "softplus" to 1.0f, - "softsign" to 1.0f, - "tan" to 1.0f, - "tanh" to 1.0f - ) - - - val singleInputOps = scalarInputs.keys - val singleInputBooleanOps = mapOf( - "not" to false - ) - - val singleOutputBooleanOps = mapOf( - "isfinite" to 1.0f, - 
"isinf" to 1.0f, - "isnan" to 1.0f, - ) - - val pairWiseBooleanOps = mapOf( - "min" to listOf(1.0,2.0), - "max" to listOf(1.0,2.0), - "equals" to listOf(2.0,2.0), - "greater" to listOf(2.0,1.0), - "greater_equal" to listOf(2.0,1.0), - "less" to listOf(2.0,1.0), - "less_equal" to listOf(2.0,1.0)) - - - val singleInputIntOutput = mapOf( - "size" to Nd4j.linspace(1,4,4).reshape(2,2), - "shape_of" to Nd4j.linspace(1,4,4).reshape(2,2) - ) - - val pairWiseBooleanInputs = mapOf( - "or" to listOf(true,false), - "and" to listOf(false,false), - "xor" to listOf(false,true) - ) - - - val singleReduceOps = mapOf( - "reduce_mean" to Nd4j.linspace(1,4,4).reshape(2,2), - "reduce_max" to Nd4j.linspace(1,4,4).reshape(2,2), - "reduce_sum" to Nd4j.linspace(1,4,4).reshape(2,2), - "reduce_prod" to Nd4j.linspace(1,4,4).reshape(2,2), - "reduce_norm1" to Nd4j.linspace(1,4,4).reshape(2,2), - "reduce_norm2" to Nd4j.linspace(1,4,4).reshape(2,2) - // "reduce_logsumexp" to Nd4j.linspace(1,4,4).reshape(2,2) - ) - - - val pairwise = mapOf( - "add" to listOf(1.0,1.0), - "subtract" to listOf(2.0,1.0), - "multiply" to listOf(2.0,1.0), - "divide" to listOf(2.0,1.0), - "pow" to listOf(2.0,1.0) - ) - - val mappedOps = setOf("elu","transpose","argmin","argmax","leakyrelu","prelu","non_max_suppression_v3")//,"top_k") - - /** - * NOTE WHEN WRITING TESTS, IF YOU SEE AN ERROR like: - * java.lang.RuntimeException: Could not find an implementation for the node output:Cos(7) - * - * Check the supported data types for each op here: - * https://github.com/microsoft/onnxruntime/blob/master/docs/OperatorKernels.md - */ - - val importGraph = ImportGraph() - val finishedOps = HashSet() - onnxOpRegistry.mappingProcessNames() - .filter { onnxOpRegistry.hasMappingOpProcess(it) } - .map { onnxOpRegistry.lookupOpMappingProcess(it) }.forEach { mappingProcess -> - val nd4jOpDef = onnxOpRegistry.lookupNd4jOpDef(mappingProcess.opName()) - val onnxOpDef = 
onnxOpRegistry.lookupInputFrameworkOpDef(mappingProcess.inputFrameworkOpName()) - if(scalarInputs.containsKey(nd4jOpDef.name)) { - print("Running op $nd4jOpDef.name") - val input = Nd4j.scalar(scalarInputs[mappingProcess.opName()]).castTo(org.nd4j.linalg.api.buffer.DataType.DOUBLE) - val graphToRun = GraphProto { - Input(createValueInfoFromTensor(input,"input")) - //Initializer(convertedTensor) - Node(NodeProto { - name = "output" - opType = onnxOpDef.opType - Input("input") - Output("output") - - }) - - Output(createValueInfoFromTensor(input,"output")) - } - - - val onnxIRGraph = OnnxIRGraph(graphToRun) - val onnxGraphRunner = OnnxIRGraphRunner(onnxIRGraph,listOf("input"),listOf("output")) - val importedGraph = importGraph.importGraph(onnxIRGraph,null,null,emptyMap(),OpRegistryHolder.onnx()) - val inputs = mapOf("input" to input) - val assertion = onnxGraphRunner.run(inputs) - val result = importedGraph.output(inputs,"output") - assertEquals("Function ${nd4jOpDef.name} failed with input $input",assertion["output"]!!.reshape(1,1),result["output"]!!.reshape(1,1)) - finishedOps.add(nd4jOpDef.name) - - } else if(scalarFloatOps.containsKey(nd4jOpDef.name)) { - print("Running op $nd4jOpDef.name") - val input = Nd4j.scalar(scalarFloatOps[mappingProcess.opName()]).castTo(org.nd4j.linalg.api.buffer.DataType.FLOAT) - - val graphToRun = GraphProto { - Input(createValueInfoFromTensor(input,"input")) - //Initializer(convertedTensor) - Node(NodeProto { - name = "output" - opType = onnxOpDef.opType - Input("input") - Output("output") - - }) - - Output(createValueInfoFromTensor(input,"output")) - } - - - val onnxIRGraph = OnnxIRGraph(graphToRun) - val onnxGraphRunner = OnnxIRGraphRunner(onnxIRGraph,listOf("input"),listOf("output")) - val importedGraph = importGraph.importGraph(onnxIRGraph,null,null,emptyMap(),OpRegistryHolder.onnx()) - val inputs = mapOf("input" to input) - val assertion = onnxGraphRunner.run(inputs) - val result = importedGraph.output(inputs,"output") - 
assertEquals("Function ${nd4jOpDef.name} failed with input $input",assertion["output"]!!.reshape(1,1),result["output"]!!.reshape(1,1)) - finishedOps.add(nd4jOpDef.name) - - } - - else if(singleOutputBooleanOps.containsKey(nd4jOpDef.name)) { - print("Running op $nd4jOpDef.name") - val input = Nd4j.scalar(singleOutputBooleanOps[mappingProcess.opName()]).castTo(org.nd4j.linalg.api.buffer.DataType.FLOAT) - val convertedTensor = convertToOnnxTensor(input,"input") - val convertedOutputTensor = convertToOnnxTensor(input,"output") - - val graphToRun = GraphProto { - Input(createValueInfoFromTensor(input,"input")) - //Initializer(convertedTensor) - Node(NodeProto { - name = "output" - opType = onnxOpDef.opType - Input("input") - Output("output") - - }) - - Output(createValueInfoFromTensor(Nd4j.create(booleanArrayOf(true)).reshape(),"output")) - } - - - val onnxIRGraph = OnnxIRGraph(graphToRun) - val onnxGraphRunner = OnnxIRGraphRunner(onnxIRGraph,listOf("input"),listOf("output")) - val importedGraph = importGraph.importGraph(onnxIRGraph,null,null,emptyMap(),OpRegistryHolder.onnx()) - val inputs = mapOf("input" to input) - val assertion = onnxGraphRunner.run(inputs) - val result = importedGraph.output(inputs,"output") - assertEquals("Function ${nd4jOpDef.name} failed with input $input",assertion["output"]!!.reshape(1,1),result["output"]!!.reshape(1,1)) - finishedOps.add(nd4jOpDef.name) - - } - - - else if(pairwise.containsKey(nd4jOpDef.name)) { - print("Running op def $nd4jOpDef.name") - val x = Nd4j.scalar(pairwise[mappingProcess.opName()]!![0]!!).castTo(org.nd4j.linalg.api.buffer.DataType.DOUBLE) - val y = Nd4j.scalar(pairwise[mappingProcess.opName()]!![1]!!).castTo(org.nd4j.linalg.api.buffer.DataType.DOUBLE) - - val graphToRun = GraphProto { - Input(createValueInfoFromTensor(x,"x")) - Input(createValueInfoFromTensor(y,"y")) - //Initializer(convertedTensor) - Node(NodeProto { - name = "output" - opType = onnxOpDef.opType - Input("x") - Input("y") - Output("output") - - }) 
- - Output(createValueInfoFromTensor(x,"output")) - } - - - val onnxIRGraph = OnnxIRGraph(graphToRun) - val onnxGraphRunner = OnnxIRGraphRunner(onnxIRGraph,listOf("x","y"),listOf("output")) - val importedGraph = importGraph.importGraph(onnxIRGraph,null,null, - mapOf("x" to convertToOnnxTensor(x,"x"),"y" to convertToOnnxTensor(y,"y")),OpRegistryHolder.onnx()) - val inputs = mapOf("x" to x,"y" to y) - val result = importedGraph.output(inputs,"output") - val assertion = onnxGraphRunner.run(inputs) - assertEquals("Function ${nd4jOpDef.name} failed with input $x $y",assertion["output"]!!.getDouble(0),result["output"]!!.getDouble(0)) - finishedOps.add(nd4jOpDef.name) - - } else if(pairWiseBooleanInputs.containsKey(nd4jOpDef.name)) { - print("Running op def $nd4jOpDef.name") - val x = Nd4j.scalar(pairWiseBooleanInputs[mappingProcess.opName()]!![0]!!).castTo(org.nd4j.linalg.api.buffer.DataType.BOOL) - val y = Nd4j.scalar(pairWiseBooleanInputs[mappingProcess.opName()]!![1]!!).castTo(org.nd4j.linalg.api.buffer.DataType.BOOL) - - val graphToRun = GraphProto { - Input(createValueInfoFromTensor(x,"x")) - Input(createValueInfoFromTensor(y,"y")) - Node(NodeProto { - name = "output" - opType = onnxOpDef.opType - Input("x") - Input("y") - Output("output") - - }) - - Output(createValueInfoFromTensor(x,"output")) - } - - - val onnxIRGraph = OnnxIRGraph(graphToRun) - val onnxGraphRunner = OnnxIRGraphRunner(onnxIRGraph,listOf("x","y"),listOf("output")) - val importedGraph = importGraph.importGraph(onnxIRGraph,null,null, - mapOf("x" to convertToOnnxTensor(x,"x"),"y" to convertToOnnxTensor(y,"y")),OpRegistryHolder.onnx()) - val inputs = mapOf("x" to x,"y" to y) - val assertion = onnxGraphRunner.run(inputs) - val result = importedGraph.output(inputs,"output") - assertEquals("Function ${nd4jOpDef.name} failed with input $x $y",assertion["output"]!!.getDouble(0),result["output"]!!.getDouble(0)) - finishedOps.add(nd4jOpDef.name) - - } else if(pairWiseBooleanOps.containsKey(nd4jOpDef.name)) { 
- print("Running op def $nd4jOpDef.name") - val x = Nd4j.scalar(pairWiseBooleanOps[mappingProcess.opName()]!![0]!!).castTo(org.nd4j.linalg.api.buffer.DataType.FLOAT) - val y = Nd4j.scalar(pairWiseBooleanOps[mappingProcess.opName()]!![1]!!).castTo(org.nd4j.linalg.api.buffer.DataType.FLOAT) - val output = Nd4j.scalar(pairWiseBooleanOps[mappingProcess.opName()]!![1]!!).castTo(org.nd4j.linalg.api.buffer.DataType.BOOL) - - val graphToRun = GraphProto { - Input(createValueInfoFromTensor(x,"x")) - Input(createValueInfoFromTensor(y,"y")) - //Initializer(convertedTensor) - Node(NodeProto { - name = "output" - opType = onnxOpDef.opType - Input("x") - Input("y") - Output("output") - - }) - - Output(createValueInfoFromTensor(output,"output")) - } - - - val onnxIRGraph = OnnxIRGraph(graphToRun) - val onnxGraphRunner = OnnxIRGraphRunner(onnxIRGraph,listOf("x","y"),listOf("output")) - val importedGraph = importGraph.importGraph(onnxIRGraph,null,null, - mapOf("x" to convertToOnnxTensor(x,"x"),"y" to convertToOnnxTensor(y,"y")),OpRegistryHolder.onnx()) - val inputs = mapOf("x" to x,"y" to y) - val assertion = onnxGraphRunner.run(inputs) - val result = importedGraph.output(inputs,"output") - assertEquals("Function ${nd4jOpDef.name} failed with input $x $y",assertion["output"]!!.getDouble(0),result["output"]!!.getDouble(0)) - finishedOps.add(nd4jOpDef.name) - - } - - else if(singleInputBooleanOps.containsKey(nd4jOpDef.name)) { - print("Running op def $nd4jOpDef.name") - val x = Nd4j.create(booleanArrayOf(singleInputBooleanOps[mappingProcess.opName()]!!)).castTo(org.nd4j.linalg.api.buffer.DataType.BOOL) - val output = Nd4j.create(booleanArrayOf(singleInputBooleanOps[mappingProcess.opName()]!!)).castTo(org.nd4j.linalg.api.buffer.DataType.BOOL) - - val graphToRun = GraphProto { - Input(createValueInfoFromTensor(x,"x")) - //Initializer(convertedTensor) - Node(NodeProto { - name = "output" - opType = onnxOpDef.opType - Input("x") - Output("output") - - }) - - 
Output(createValueInfoFromTensor(output,"output")) - } - - - val onnxIRGraph = OnnxIRGraph(graphToRun) - val onnxGraphRunner = OnnxIRGraphRunner(onnxIRGraph,listOf("x"),listOf("output")) - val importedGraph = importGraph.importGraph(onnxIRGraph,null,null,mapOf("x" to convertToOnnxTensor(x,"x")),OpRegistryHolder.onnx()) - val inputs = mapOf("x" to x) - val assertion = onnxGraphRunner.run(inputs) - val result = importedGraph.output(inputs,"output") - finishedOps.add(nd4jOpDef.name) - - //assertEquals("Function ${nd4jOpDef.name} failed with input $x",assertion["output"]!!.reshape(1,1),result["output"]!!.reshape(1,1)) - } - - else if(singleReduceOps.containsKey(nd4jOpDef.name)) { - print("Running op def $nd4jOpDef.name") - val x = singleReduceOps[mappingProcess.opName()]!!.castTo(org.nd4j.linalg.api.buffer.DataType.FLOAT) - val output = x.mean(0).reshape(2) - .castTo(org.nd4j.linalg.api.buffer.DataType.FLOAT) - - - val graphToRun = GraphProto { - Input(createValueInfoFromTensor(x,"x")) - //Initializer(convertedTensor) - Node(NodeProto { - name = "output" - opType = onnxOpDef.opType - Input("x") - Output("output") - Attribute(Onnx.AttributeProto.newBuilder() - .setType(Onnx.AttributeProto.AttributeType.INTS) - .setName("axes").addInts(0).build()) - Attribute(Onnx.AttributeProto.newBuilder() - .setType(Onnx.AttributeProto.AttributeType.INT) - .setI(0) - .setName("keepdims").build()) - - }) - - Output(createValueInfoFromTensor(output,"output")) - } - - - val onnxIRGraph = OnnxIRGraph(graphToRun) - val inputs = mapOf("x" to x) - val importedGraph = importGraph.importGraph(onnxIRGraph,null,null,mapOf("x" to convertToOnnxTensor(x,"x")),OpRegistryHolder.onnx()) - val result = importedGraph.output(inputs,"output") - val onnxGraphRunner = OnnxIRGraphRunner(onnxIRGraph,listOf("x"),listOf("output")) - val assertion = onnxGraphRunner.run(inputs) - assertEquals("Function ${nd4jOpDef.name} failed with input $x",assertion["output"]!!.reshape(1,2),result["output"]!!.reshape(1,2)) - 
finishedOps.add(nd4jOpDef.name) - - } else if(mappedOps.contains(nd4jOpDef.name)){ - val graphForOp = graphForOp(nd4jOpDef.name) - graphForOp.forEach { graph -> - val onnxIRGraph = OnnxIRGraph(graph.graphDef) - val inputs =graph.inputArrays - val convertedArrays = HashMap() - graph.inputArrays.forEach { name, arr -> - convertedArrays[name] = convertToOnnxTensor(arr,name) - } - val importedGraph = importGraph.importGraph(onnxIRGraph,null,null,convertedArrays,OpRegistryHolder.onnx()) - val onnxGraphRunner = OnnxIRGraphRunner(onnxIRGraph,graph.inputNames,graph.outputNames) - val assertion = onnxGraphRunner.run(inputs) - val result = importedGraph.output(inputs,graph.outputNames) - assertEquals(assertion.keys,result.keys) - result.forEach { name,arr -> - if(arr.length().toInt() == 1) { - assertEquals("Function ${nd4jOpDef.name} failed with input ${graph.inputNames}",assertion[name]!!.getDouble(0),arr.getDouble(0),1e-3) - } - else { - assertEquals("Function ${nd4jOpDef.name} failed with input ${graph.inputNames}",assertion[name],arr) - } - } - - finishedOps.add(nd4jOpDef.name) - - - } - - - } else if(singleInputIntOutput.containsKey(nd4jOpDef.name)) { - print("Running op $nd4jOpDef.name") - val input = singleInputIntOutput[mappingProcess.opName()]!!.castTo(org.nd4j.linalg.api.buffer.DataType.INT64) - val graphToRun = GraphProto { - Input(createValueInfoFromTensor(input,"input")) - //Initializer(convertedTensor) - Node(NodeProto { - name = "output" - opType = onnxOpDef.opType - Input("input") - Output("output") - - }) - - Output(createValueInfoFromTensor(input,"output",false )) - } - - - val onnxIRGraph = OnnxIRGraph(graphToRun) - val onnxGraphRunner = OnnxIRGraphRunner(onnxIRGraph,listOf("input"),listOf("output")) - val importedGraph = importGraph.importGraph(onnxIRGraph,null,null,emptyMap(),OpRegistryHolder.onnx()) - val inputs = mapOf("input" to input) - val assertion = onnxGraphRunner.run(inputs) - val result = importedGraph.output(inputs,"output") - 
if(assertion["output"]!!.length() == 1L) - assertEquals("Function ${nd4jOpDef.name} failed with input $input",assertion["output"]!!.reshape(1,1),result["output"]!!.reshape(1,1)) - else - assertEquals("Function ${nd4jOpDef.name} failed with input $input",assertion["output"]!!.ravel(),result["output"]!!.ravel()) - finishedOps.add(nd4jOpDef.name) - - } - } - - println("Finished ops totaling ${finishedOps.size} out of ${onnxOpRegistry.mappedNd4jOpNames().size}") - } - - - - fun graphForOp(opName: String): List { - when(opName) { - "non_max_suppression_v3" -> { - /** - * TODO: Add pre and post processing for each node. - * Our NMS requires 2d, but onnx is 3d. Attempt to see - * if generalized pre/post processing node additions as part of a mapping process can work. - * - */ - print("Running op def $opName") - val boxesVal = Nd4j.create(arrayOf( - floatArrayOf(0f,0f,1f,1f), - floatArrayOf(0f,0.1f,1f,1.1f), - floatArrayOf(0f,-0.1f,1f,0.9f), - floatArrayOf(0f,10f,1f,11f) - )).reshape(1,4,4).castTo(org.nd4j.linalg.api.buffer.DataType.FLOAT) - - val scoresVal = Nd4j.create(listOf(0.9f,0.75f,0.6f,0.95f).toFloatArray()) - .reshape(1,1,4) - .castTo(org.nd4j.linalg.api.buffer.DataType.FLOAT) - val maxOutputSize = Nd4j.scalar(4.0).castTo(DataType.INT64) - val iouThreshold = Nd4j.scalar(0.5).castTo(DataType.FLOAT) - val scoreThreshold = Nd4j.scalar(0.0).castTo(DataType.FLOAT) - - val inputs = mapOf("boxes" to boxesVal,"scores" to scoresVal,"max_output_boxes_per_class" to maxOutputSize, - "iou_threshold" to iouThreshold,"score_threshold" to scoreThreshold) - val output = Nd4j.scalar(1) - .castTo(org.nd4j.linalg.api.buffer.DataType.INT64) - - - val graphToRun = GraphProto { - Input(createValueInfoFromTensor(boxesVal,"boxes",false)) - Input(createValueInfoFromTensor(scoresVal,"scores",false)) - Input(createValueInfoFromTensor(maxOutputSize,"max_output_boxes_per_class",false)) - Input(createValueInfoFromTensor(iouThreshold,"iou_threshold",false)) - 
Input(createValueInfoFromTensor(scoreThreshold,"score_threshold",false)) - - //Initializer(convertedTensor) - Node(NodeProto { - Input("boxes") - Input("scores") - Input("max_output_boxes_per_class") - Input("iou_threshold") - Input("score_threshold") - Output("output") - name = "output" - opType = "NonMaxSuppression" - - - - }) - - Output(createValueInfoFromTensor(output,"output",false)) - } - - return listOf(OnnxGraphInput(graphToRun,listOf("boxes","scores","max_output_boxes_per_class","iou_threshold","score_threshold"),listOf("output"),inputs,inputs)) - } - "argmin","argmax" -> { - print("Running op def $opName") - val x = Nd4j.linspace(1,4,4).reshape(2,2).castTo(org.nd4j.linalg.api.buffer.DataType.FLOAT) - val output = x.mean(0).reshape(2) - .castTo(org.nd4j.linalg.api.buffer.DataType.INT64) - - - val graphToRun = GraphProto { - Input(createValueInfoFromTensor(x,"x")) - //Initializer(convertedTensor) - Node(NodeProto { - name = "output" - opType = if(opName == "argmin") "ArgMin" else "ArgMax" - Input("x") - Output("output") - Attribute(Onnx.AttributeProto.newBuilder() - .setType(Onnx.AttributeProto.AttributeType.INT) - .setName("axis").setI(0).build()) - Attribute(Onnx.AttributeProto.newBuilder() - .setType(Onnx.AttributeProto.AttributeType.INT) - .setI(0) - .setName("keepdims").build()) - Attribute(Onnx.AttributeProto.newBuilder() - .setType(Onnx.AttributeProto.AttributeType.INT) - .setI(1) - .setName("select_last_index").build()) - - }) - - Output(createValueInfoFromTensor(output,"output",false)) - } - - val inputMap = mapOf("x" to x) - return listOf(OnnxGraphInput(graphToRun,listOf("x"),listOf("output"),inputMap,inputMap)) - } - "top_k" -> { - val input = Nd4j.linspace(1,4,4).reshape(2,2).castTo(org.nd4j.linalg.api.buffer.DataType.FLOAT) - val k = Nd4j.scalar(2.0).castTo(DataType.INT64).reshape(1) - val output = Nd4j.linspace(1,4,4).reshape(2,2).castTo(org.nd4j.linalg.api.buffer.DataType.INT64) - - val graphToRun = GraphProto { - 
Input(createValueInfoFromTensor(input,"input")) - Input(createValueInfoFromTensor(k,"k")) - Node(NodeProto { - name = "output" - opType = "TopK" - Input("input") - Input("k") - Output("output") - Output("indices") - Attribute(Onnx.AttributeProto.newBuilder() - .setType(Onnx.AttributeProto.AttributeType.INT) - .setI(0) - .setName("axis").build()) - Attribute(Onnx.AttributeProto.newBuilder() - .setType(Onnx.AttributeProto.AttributeType.INT) - .setI(1) - .setName("sorted").build()) - Attribute(Onnx.AttributeProto.newBuilder() - .setType(Onnx.AttributeProto.AttributeType.INT) - .setI(1) - .setName("largest").build()) - - }) - - - Output(createValueInfoFromTensor(input,"output",false)) - Output(createValueInfoFromTensor(output,"indices",false)) - } - - val inputMap = mapOf("input" to input,"k" to k) - return listOf(OnnxGraphInput(graphToRun,listOf("input","k"),listOf("output","indices"),inputMap,inputMap)) - - } - "transpose" -> { - val input = Nd4j.linspace(1,6,6).reshape(3,2).castTo(org.nd4j.linalg.api.buffer.DataType.FLOAT) - val output = Nd4j.linspace(1,6,6).reshape(2,3).castTo(org.nd4j.linalg.api.buffer.DataType.FLOAT) - - val graphToRun = GraphProto { - Input(createValueInfoFromTensor(input,"input")) - //Initializer(convertedTensor) - Node(NodeProto { - name = "output" - opType = "Transpose" - Input("input") - Output("output") - Attribute(Onnx.AttributeProto.newBuilder() - .setType(Onnx.AttributeProto.AttributeType.INTS) - .addInts(1).addInts(0) - .setName("perm").build()) - - }) - - Output(createValueInfoFromTensor(output,"output")) - } - - val inputMap = mapOf("input" to input) - return listOf(OnnxGraphInput(graphToRun,listOf("input"),listOf("output"),inputMap,inputMap)) - - } - "prelu" -> { - val input = Nd4j.randn(3,4,5).castTo(org.nd4j.linalg.api.buffer.DataType.FLOAT) - val alpha = Nd4j.zeros(1,1,5).addi(0.1).castTo(DataType.FLOAT) - val graphToRun = GraphProto { - Input(createValueInfoFromTensor(input,"input",false)) - 
Input(createValueInfoFromTensor(alpha, "slope", false))
                //Initializer(convertedTensor)
                Node(NodeProto {
                    name = "output"
                    opType = "PRelu"
                    Input("input")
                    Input("slope")
                    Output("output")
                })

                Output(createValueInfoFromTensor(input, "output", false))
            }

            val inputMap = mapOf("input" to input, "slope" to alpha)
            return listOf(OnnxGraphInput(graphToRun, listOf("input", "slope"), listOf("output"), inputMap, inputMap))
        }
        "elu", "leakyrelu" -> {
            val input = Nd4j.scalar(1.0f).castTo(org.nd4j.linalg.api.buffer.DataType.FLOAT)
            val graphToRun = GraphProto {
                Input(createValueInfoFromTensor(input, "input"))
                //Initializer(convertedTensor)
                Node(NodeProto {
                    name = "output"
                    opType = if(opName == "elu") "Elu" else "LeakyRelu"
                    Input("input")
                    Output("output")
                    Attribute(Onnx.AttributeProto.newBuilder()
                        .setType(Onnx.AttributeProto.AttributeType.FLOAT)
                        .setF(1.0f)
                        .setName("alpha").build())
                })

                Output(createValueInfoFromTensor(input, "output"))
            }

            val inputMap = mapOf("input" to input)
            return listOf(OnnxGraphInput(graphToRun, listOf("input"), listOf("output"), inputMap, inputMap))
        }

        "mod" -> {
            val x = Nd4j.scalar(2.0).castTo(org.nd4j.linalg.api.buffer.DataType.DOUBLE)
            val y = Nd4j.scalar(2.0).castTo(org.nd4j.linalg.api.buffer.DataType.DOUBLE)

            val graphToRun = GraphProto {
                Input(createValueInfoFromTensor(x, "x"))
                Input(createValueInfoFromTensor(y, "y"))
                //Initializer(convertedTensor)
                Node(NodeProto {
                    name = "output"
                    opType = "Mod"
                    Input("x")
                    Input("y")
                    Output("output")
                    Attribute(Onnx.AttributeProto.newBuilder()
                        .setType(Onnx.AttributeProto.AttributeType.INT)
                        .setI(1)
                        .setName("fmod").build())
                })

                Output(createValueInfoFromTensor(x, "output"))
            }

            val inputMap = mapOf("x" to x, "y" to y)
            return listOf(OnnxGraphInput(graphToRun, listOf("x", "y"), listOf("output"), inputMap, inputMap))
        }
        else -> {
            throw IllegalArgumentException("Illegal op name $opName")
        }
    }
}

} \ No newline at
end of file diff --git a/contrib/codegen-tools/codegen/src/test/kotlin/org/nd4j/codegen/ir/onnx/TestOnnxRuleDeclarations.kt b/contrib/codegen-tools/codegen/src/test/kotlin/org/nd4j/codegen/ir/onnx/TestOnnxRuleDeclarations.kt deleted file mode 100644 index 8ce422569..000000000 --- a/contrib/codegen-tools/codegen/src/test/kotlin/org/nd4j/codegen/ir/onnx/TestOnnxRuleDeclarations.kt +++ /dev/null @@ -1,478 +0,0 @@ -package org.nd4j.codegen.ir.onnx - -import org.junit.jupiter.api.Test -import org.nd4j.codegen.ir.ArgDescriptor -import org.nd4j.codegen.ir.onnx.attributeScalarToNDArrayInput -import org.nd4j.codegen.ir.onnx.conditionalFieldValueIntIndexArrayRule -import org.nd4j.codegen.ir.onnx.convertNDArrayInputToScalarAttr -import org.nd4j.ir.TensorNamespace -import org.nd4j.shade.protobuf.ByteString -import java.nio.charset.Charset -import kotlin.test.assertEquals -import kotlin.test.assertTrue - -class TestOnnxRuleDeclarations { - - /* @Test - fun testArgConstant() { - val opDef = onnxops.first { it.name == "Dilation2D" } - val intItems = listOf(2,1,1,1) - val valueNodeDef = NodeProto { - opType = "Dilation2D" - name = "inputs" - AttributeProto { - - } - } - - val shape = listOf(1,1).map { it.toLong() } - val valueNodeDef2 = NodeProto { - opType = "Constant" - name = "inputs" - AttributeProto { - - } - Attribute(name = "value",value = AttributeProto { - tensor = TensorProto { - Shape(shape) - DoubleData(listOf(1.0)) - } - }) - } - - - - val graphDef = GraphProto { - Node(valueNodeDef) - Node(valueNodeDef2) - } - - val tfGraph = OnnxIRGraph(graphDef) - val mappingContext = OnnxMappingContext(opDef = opDef,node = valueNodeDef,graph = tfGraph) - val convertNumberListToInputNDArrayRule = org.nd4j.codegen.ir.tensorflow.argDescriptorConstant(listOf(ArgDescriptor { - name = "value" - int32Value = 1 - })) - - val convertNumberListToInputNDArrayResult = convertNumberListToInputNDArrayRule.convertAttributes(mappingContext) - - 
assertEquals(1,convertNumberListToInputNDArrayResult.size) - assertEquals(1,convertNumberListToInputNDArrayResult[0].int32Value) - } - - - - @Test - fun testConvertNDArrayInputToScalarAttr() { - val opDef = onnxops.findOp("Dilation2D") - val intItems = listOf(2,1,1,1) - val valueNodeDef = NodeProto { - op = "Dilation2D" - name = "inputs" - Attribute(name = "strides",value = AttributeProto { - list = ListValue { - IntItems(intItems) - } - }) - } - - val shape = listOf(1,1).map { it.toLong() } - val valueNodeDef2 = NodeProto { - op = "Constant" - name = "inputs" - Attribute(name = "value",value = AttrValue { - tensor = TensorProto { - Shape(shape) - DoubleData(listOf(1.0)) - } - }) - } - - - - val graphDef = GraphProto { - Node(valueNodeDef) - Node(valueNodeDef2) - } - - val tfGraph = OnnxIRGraph(graphDef) - val mappingContext = OnnxMappingContext(opDef = opDef,node = valueNodeDef,graph = tfGraph) - val convertNumberListToInputNDArrayRule = convertNDArrayInputToScalarAttr(mutableMapOf("output" to "inputs ")) - val convertNumberListToInputNDArrayResult = convertNumberListToInputNDArrayRule.convertAttributes(mappingContext) - assertEquals(1,convertNumberListToInputNDArrayResult.size) - assertEquals(2,convertNumberListToInputNDArrayResult[0].int64Value) - } - - @Test - fun testListAttributeValueLookupToIndex() { - val opDef = onnxops.findOp("Dilation2D") - val intItems = listOf(2,1,1,1) - val valueNodeDef = NodeDef { - op = "Dilation2D" - name = "inputs" - Attribute(name = "strides",value = AttrValue { - list = ListValue { - IntItems(intItems) - } - }) - } - - - val graphDef = GraphDef { - Node(valueNodeDef) - } - - val tfGraph = OnnxIRGraph(graphDef, onnxops) - val mappingContext = OnnxMappingContext(opDef = opDef,node = valueNodeDef,graph = tfGraph) - val convertNumberListToInputNDArrayRule = listAttributeValueLookupToIndex(outputAttributeValue = "output", inputAttributeValue = "strides", idx = 0) - val convertNumberListToInputNDArrayResult = 
convertNumberListToInputNDArrayRule.convertAttributes(mappingContext) - assertEquals(1,convertNumberListToInputNDArrayResult.size) - assertEquals(2,convertNumberListToInputNDArrayResult[0].int64Value) - } - - - @Test - fun testConvertNumberListToInputNDArray() { - val opDef = onnxops.findOp("Dilation2D") - val intItems = listOf(1,1,1,1) - val valueNodeDef = NodeProto { - op = "Dilation2D" - name = "inputs" - Attribute(name = "strides",value = AttrValue { - list = ListValue { - IntItems(intItems) - } - }) - } - - - val graphDef = GraphProto { - Node(valueNodeDef) - } - - val tfGraph = OnnxIRGraph(graphDef) - val mappingContext = OnnxMappingContext(opDef = opDef,node = valueNodeDef,graph = tfGraph) - val convertNumberListToInputNDArrayRule = convertNumberListToInputNDArray(outputAttributeValue = "output", inputAttributeValue = "strides") - val convertNumberListToInputNDArrayResult = convertNumberListToInputNDArrayRule.convertAttributes(mappingContext) - assertEquals(1,convertNumberListToInputNDArrayResult.size) - val inputVal = convertNumberListToInputNDArrayResult[0].inputValue - assertEquals(2,inputVal.dimsCount) - val testList = inputVal.int64DataList - testList.forEach { - assertEquals(1,it) - } - } - - @Test - fun testValueMapping() { - val opDef = onnxops.findOp("CudnnRNN") - val valueNodeDef = NodeProto { - op = "CudnnRNN" - name = "inputs" - Attribute(name = "is_training",value = AttrValue { - b = true - }) - Attribute(name = "seed",value = AttrValue { - i = 1 - }) - Attribute(name = "dropout",value = AttrValue { - f = 1.0f - }) - Attribute(name = "direction",value = AttrValue { - s = ByteString.copyFrom("unidirectional".toByteArray(Charset.defaultCharset())) - }) - } - - - val graphDef = GraphProto { - Node(valueNodeDef) - } - - val tfGraph = OnnxIRGraph(graphDef) - val mappingContext = OnnxMappingContext(opDef = opDef,node = valueNodeDef,graph = tfGraph) - val booleanToInt = valueMapping(mapOf("output" to "is_training","output2" to "seed","output3" to 
"dropout","output4" to "direction")) - val booleanToIntResult = booleanToInt.convertAttributes(mappingContext) - assertEquals(4,booleanToIntResult.size) - val boolValue = booleanToIntResult.first { it.name == "output" }.boolValue - val intValue = booleanToIntResult.first {it.name == "output2" }.int64Value - val floatValue = booleanToIntResult.first {it.name == "output3"}.floatValue - val stringVal = booleanToIntResult.first {it.name == "output4" }.stringValue - assertEquals(true,boolValue) - assertEquals(1,intValue) - assertEquals(1.0f,floatValue) - assertEquals("unidirectional",stringVal) - } - - @Test - fun testBooleanToInt() { - val opDef = onnxops.findOp("CudnnRNN") - val valueNodeDef = NodeProto { - op = "CudnnRNN" - name = "inputs" - Attribute(name = "is_training",value = AttrValue { - b = true - }) - } - - - val graphDef = GraphProto { - Node(valueNodeDef) - } - - val tfGraph = OnnxIRGraph(graphDef, onnxops) - val mappingContext = OnnxMappingContext(opDef = opDef,node = valueNodeDef,graph = tfGraph) - val booleanToInt = org.nd4j.codegen.ir.tensorflow.booleanToInt(mapOf("output" to "is_training")) - val booleanToIntResult = booleanToInt.convertAttributes(mappingContext) - assertEquals(1,booleanToIntResult.size) - val boolValue = booleanToIntResult[0].int64Value - assertEquals(1,boolValue) - } - - @Test - fun testAttributeScalarToNDArrayInputRuleDouble() { - val opDef = onnxops.findOp("CudnnRNN") - val valueNodeDef = NodeProto { - op = "CudnnRNN" - name = "inputs" - Attribute(name = "dropout",value = AttrValue { - f = 1.0f - }) - } - - - val graphDef = GraphProto { - Node(valueNodeDef) - } - - val tfGraph = OnnxIRGraph(graphDef) - val mappingContext = OnnxMappingContext(opDef = opDef,node = valueNodeDef,graph = tfGraph) - val ndarrScalarRule = attributeScalarToNDArrayInput(outputAttribute = "output",inputFrameworkAttributeName = "dropout") - val ndarrScalarRuleResult = ndarrScalarRule.convertAttributes(mappingContext) - 
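The commented-out tests around this point (`testValueMapping`, `testBooleanToInt`, `testAttributeScalarToNDArrayInput*`) all exercise one idea: a framework attribute value (boolean, int, float, or string) is converted into a typed arg descriptor under a rename map. A minimal plain-Kotlin model of that conversion, with illustrative names that are not part of the codegen API:

```kotlin
// A minimal model of the attribute-conversion rules exercised by these tests:
// framework attribute values become typed descriptors under a rename map.
// Names here are illustrative sketches, not the real codegen types.
sealed class AttrVal {
    data class B(val value: Boolean) : AttrVal()
    data class I(val value: Long) : AttrVal()
    data class F(val value: Float) : AttrVal()
    data class S(val value: String) : AttrVal()
}

// booleanToInt-style rule: a boolean attribute maps to an int64 descriptor
// (true -> 1, false -> 0); integral attributes pass through unchanged.
fun booleanToInt(attrs: Map<String, AttrVal>, mapping: Map<String, String>): Map<String, Long> =
    mapping.mapValues { (_, sourceName) ->
        when (val v = attrs.getValue(sourceName)) {
            is AttrVal.B -> if (v.value) 1L else 0L
            is AttrVal.I -> v.value
            else -> error("Attribute $sourceName is not boolean or integral")
        }
    }
```

The rename map's keys are the nd4j-side descriptor names and its values are the framework-side attribute names, matching the `mapOf("output" to "is_training")` convention used in the tests above.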
assertEquals(1,ndarrScalarRuleResult.size) - assertTrue {ndarrScalarRuleResult[0].hasInputValue()} - val tensorValue = ndarrScalarRuleResult[0].inputValue - assertEquals(2,tensorValue.dimsCount) - assertEquals(TensorNamespace.DataType.FLOAT.ordinal,tensorValue.dataType) - val floatValue = tensorValue.floatDataList[0] - assertEquals(1.0f,floatValue) - } - - @Test - fun testAttributeScalarToNDArrayInputRuleInt() { - val opDef = onnxops.findOp("CountUpTo") - val valueNodeDef = NodeProto { - op = "CountUpTo" - name = "inputs" - Attribute(name = "limit",value = AttrValue { - i = 1 - }) - } - - - val graphDef = GraphProto { - Node(valueNodeDef) - } - - val tfGraph = OnnxIRGraph(graphDef) - val mappingContext = OnnxMappingContext(opDef = opDef,node = valueNodeDef,graph = tfGraph) - val ndarrScalarRule = attributeScalarToNDArrayInput(outputAttribute = "output",inputFrameworkAttributeName = "limit") - val ndarrScalarRuleResult = ndarrScalarRule.convertAttributes(mappingContext) - assertEquals(1,ndarrScalarRuleResult.size) - assertTrue {ndarrScalarRuleResult[0].hasInputValue()} - val tensorValue = ndarrScalarRuleResult[0].inputValue - assertEquals(2,tensorValue.dimsCount) - assertEquals(TensorNamespace.DataType.INT64.ordinal,tensorValue.dataType) - val intValue = tensorValue.int64DataList[0] - assertEquals(1,intValue) - } - - @Test - fun testStringNotEqualsRule() { - val opDef = onnxops.findOp("Const") - val valueNodeDef = NodeProto { - op = "Const" - name = "inputs" - Attribute(name = "value",value = AttrValue { - s = ByteString.copyFrom("value".toByteArray(Charset.defaultCharset())) - }) - } - - - val graphDef = GraphProto { - Node(valueNodeDef) - } - - val tfGraph = OnnxIRGraph(graphDef) - val mappingContext = OnnxMappingContext(opDef = opDef,node = valueNodeDef,graph = tfGraph) - listOf("value","notValue").zip(listOf(false,true)).forEach { (valueToTest,assertionResult) -> - val stringNotEqualsRule = org.nd4j.codegen.ir.tensorflow.stringNotEqualsRule(outputAttribute = 
"output", inputFrameworkAttributeName = "value", valueToTest = valueToTest) - val stringEqualsResult = stringNotEqualsRule.convertAttributes(mappingCtx = mappingContext) - assertEquals(1,stringEqualsResult.size) - assertEquals(assertionResult,stringEqualsResult[0].boolValue) - - } - - - } - - - @Test - fun testStringContainsRule() { - val opDef = onnxops.findOp("Const") - val valueNodeDef = NodeProto { - op = "Const" - name = "inputs" - Attribute(name = "value",value = AttrValue { - s = ByteString.copyFrom("value".toByteArray(Charset.defaultCharset())) - }) - } - - - val graphDef = GraphProto { - Node(valueNodeDef) - - } - - val tfGraph = OnnxIRGraph(graphDef) - val mappingContext = OnnxMappingContext(opDef = opDef,node = valueNodeDef,graph = tfGraph) - listOf("value","notValue").zip(listOf(true,false)).forEach { (valueToTest,assertionResult) -> - val stringContainsRule = org.nd4j.codegen.ir.tensorflow.stringContainsRule(outputAttribute = "output", inputFrameworkAttributeName = "value", valueToTest = valueToTest) - val stringEqualsResult = stringContainsRule.convertAttributes(mappingCtx = mappingContext) - assertEquals(1,stringEqualsResult.size) - assertEquals(assertionResult,stringEqualsResult[0].boolValue) - - } - - - } - - - @Test - fun testStringEqualsRule() { - val opDef = onnxops.findOp("Const") - val valueNodeDef = NodeDef { - op = "Const" - name = "inputs" - Attribute(name = "value",value = AttrValue { - s = ByteString.copyFrom("value".toByteArray(Charset.defaultCharset())) - }) - } - - - val graphDef = GraphDef { - Node(valueNodeDef) - - } - - val tfGraph = TensorflowIRGraph(graphDef, onnxops) - val mappingContext = TensorflowMappingContext(opDef = opDef,node = valueNodeDef,graph = tfGraph) - listOf("value","notValue").zip(listOf(true,false)).forEach { (valueToTest,assertionResult) -> - val stringEqualsRule = org.nd4j.codegen.ir.tensorflow.stringEqualsRule(outputAttribute = "output", inputFrameworkAttributeName = "value", valueToTest = valueToTest) - val 
stringEqualsResult = stringEqualsRule.convertAttributes(mappingCtx = mappingContext) - assertEquals(1,stringEqualsResult.size) - assertEquals(assertionResult,stringEqualsResult[0].boolValue) - - } - - - } - - - @Test - fun testNDArraySizeAtRule() { - val opDef = onnxops.findOp("AddN") - val nodeDef = NodeDef { - op = "AddN" - Input("inputs") - Input("y") - name = "test" - } - - val shape = listOf(1,2).map { it.toLong() } - - val valueNodeDef = NodeDef { - op = "Constant" - name = "inputs" - Attribute(name = "value",value = AttrValue { - tensor = TensorProto { - Shape(shape) - DoubleData(listOf(1.0,2.0)) - } - }) - } - - - val graphDef = GraphDef { - Node(nodeDef) - Node(valueNodeDef) - - } - - val tfGraph = TensorflowIRGraph(graphDef, onnxops) - val mappingContext = TensorflowMappingContext(opDef = opDef,node = nodeDef,graph = tfGraph) - shape.forEachIndexed { i,value -> - val sizeAtRule = org.nd4j.codegen.ir.tensorflow.sizeAtRule(dimensionIndex = i, outputAttributeName = "output", inputFrameworkAttributeName = "inputs") - val sizeAtRuleResult = sizeAtRule.convertAttributes(mappingCtx = mappingContext) - assertEquals(1,sizeAtRuleResult.size) - assertEquals(value,sizeAtRuleResult[0].int64Value) - - } - - } - - - @Test - fun testConditionalIndex() { - - val opDef = onnxops.findOp("AddN") - val strings = listOf("value","falseValue") - //when item is equal to value return element at index 0 - //when item is not equal to value return element at index 1 - val assertionValue = mapOf("value" to 1,"falseValue" to 0) - val trueIndex = 0 - val falseIndex = 1 - val listOfItemsForTesting = listOf(1,0,2,3) - //true and false case with index 1 - for(string in strings) { - val nodeDef = NodeDef { - op = "AddN" - Input("inputs") - Input("y") - name = "test" - Attribute(name = "N",value = AttrValue { - name = "N" - list = ListValue { - IntItems(listOfItemsForTesting) - } - }) - Attribute(name = "T",value = AttrValue { - name = "T" - s = 
ByteString.copyFrom(string.toByteArray(Charset.defaultCharset())) - }) - } - - val graphDef = GraphDef { - Node(nodeDef) - } - - val tfGraph = TensorflowIRGraph(graphDef, onnxops) - - - val mappingContext = TensorflowMappingContext(opDef = opDef,node = nodeDef,graph = tfGraph) - - val conditionalIndex = conditionalFieldValueIntIndexArrayRule( - outputAttribute = "N", - attributeNameOfListAttribute = "N", - targetValue = "value", trueIndex = trueIndex, falseIndex = falseIndex, - inputFrameworkStringNameToTest = "T") - - val ret = conditionalIndex.convertAttributes(mappingContext) - assertEquals(1,ret.size) - assertEquals((assertionValue[string] ?: - error("No value found with string value $string")).toLong(),ret[0].int64Value) - assertEquals("N",ret[0].name) - - } - - }*/ -} \ No newline at end of file diff --git a/contrib/codegen-tools/codegen/src/test/kotlin/org/nd4j/codegen/ir/tensorflow/TestTensorflowIR.kt b/contrib/codegen-tools/codegen/src/test/kotlin/org/nd4j/codegen/ir/tensorflow/TestTensorflowIR.kt deleted file mode 100644 index 512f04d74..000000000 --- a/contrib/codegen-tools/codegen/src/test/kotlin/org/nd4j/codegen/ir/tensorflow/TestTensorflowIR.kt +++ /dev/null @@ -1,9021 +0,0 @@ -package org.nd4j.codegen.ir.tensorflow - -import junit.framework.Assert.assertEquals -import junit.framework.Assert.assertTrue -import org.apache.commons.io.IOUtils -import org.junit.jupiter.api.Test -import org.nd4j.autodiff.samediff.SameDiff -import org.nd4j.codegen.ir.ImportGraph -import org.nd4j.codegen.ir.registry.OpMappingRegistry -import org.nd4j.codegen.ir.registry.OpRegistryHolder -import org.nd4j.common.io.ClassPathResource -import org.nd4j.ir.OpNamespace -import org.nd4j.linalg.api.ndarray.INDArray -import org.nd4j.linalg.api.ops.DynamicCustomOp -import org.nd4j.linalg.api.ops.impl.transforms.BinCount -import org.nd4j.linalg.api.ops.impl.transforms.floating.RSqrt -import org.nd4j.linalg.factory.Nd4j -import org.nd4j.linalg.profiler.ProfilerConfig -import 
org.nd4j.shade.protobuf.ByteString
import org.nd4j.tensorflow.conversion.graphrunner.GraphRunner
import org.tensorflow.framework.*
import java.lang.IllegalStateException
import java.nio.charset.Charset
import kotlin.math.max

data class GraphInput(val graphDef: GraphDef, val inputNames: List<String>, val outputNames: List<String>,
                      val inputArrays: Map<String, INDArray>, val dynamicArrays: Map<String, INDArray>)

class TestTensorflowIR {
    val declarations = TensorflowOpDeclarations

    @Test
    fun testTensorflowAbs() {
        val opDef = tensorflowOps.findOp("Abs")
        val nodeDef = NodeDef {
            op = "Abs"
            name = "test"
            Input("x")
        }

        val x = NodeDef {
            op = "Const"
            name = "x"
            Attribute(name = "value", value = AttrValue {
                tensor = TensorProto.getDefaultInstance()
            })
        }

        val graphDef = GraphDef {
            Node(nodeDef)
            Node(x)
        }

        val tensorflowNode = TensorflowIRNode(nodeDef, opDef)
        val tfGraph = TensorflowIRGraph(graphDef, tensorflowOps)
        val absMappingProcess = OpRegistryHolder.tensorflow().lookupOpMappingProcess(inputFrameworkOpName = "Abs")

        val mappingContext = TensorflowMappingContext(opDef = opDef, node = nodeDef, graph = tfGraph, dynamicVariables = emptyMap())
        val input = absMappingProcess.applyProcess(mappingContext)
        println(input)
    }


    @Test
    fun loadModelTest() {
        val importGraph = ImportGraph()
        val inputs = listOf("input_0", "input_1")
        val content = IOUtils.toByteArray(ClassPathResource("lenet_frozen.pb").inputStream)
        val graphDef = GraphDef.parseFrom(content)
        val irGraph = TensorflowIRGraph(graphDef, tensorflowOps)
        val importedModel = importGraph.importGraph(irGraph = irGraph, importOverride = null, opFilter = null, opMappingRegistry = OpRegistryHolder.tensorflow())
        println(importedModel)
    }


    @Test
    fun testRegistry() {
        val registry = OpRegistryHolder.tensorflow()
        val mappingProcess = registry.lookupOpMappingProcess("Conv2D")
        println(mappingProcess)
    }


    @Test
    fun testOpOrdering() {
        val tensorflowOpNames =
tensorflowOpRegistry.inputFrameworkOpNames() - val bannedOps = setOf("Assert","RandomUniformInt","ResizeArea","UnsortedSegmentProd","UnsortedSegmentMin","SpaceToBatch", - "ResizeNearestNeighbor","Dilation2D","Bitcast","LinSpace","UnsortedSegmentSum", - "TensorArrayScatter","OneHot","UnsortedSegmentMax","TopKV2","TopK","Range","HistogramFixedWidth","ClipByValue","ResizeBilinear","Bincount","SplitV") - - tensorflowOpNames.forEach { opName -> - if(tensorflowOpRegistry.hasMappingOpProcess(opName)) { - val opDef = tensorflowOps.findOp(opName) - println("Processing op name $opName") - - val nodeBuilder = NodeDef.newBuilder() - nodeBuilder.name = opName - val graphBuilder = GraphDef.newBuilder() - nodeBuilder.op = opName - val attrNames = opDef.attrList.map {attrDef -> attrDef.name } - - //convert to a default case + return graph in new method - opDef.inputArgList.forEach { inputArgDef -> - val inputNumberAttr = inputArgDef.numberAttr - val numAttributeValue = if(!inputNumberAttr.isEmpty()) max(opDef.attrList.find { attrDef -> attrDef.name == inputNumberAttr }!!.minimum,2) else 1 - val typeAttrName = if(inputArgDef.typeAttr.isNotEmpty()) inputArgDef.typeAttr else "T" - val typeAttrValue = if(inputArgDef.typeAttr.isNotEmpty() && attrNames.contains(inputArgDef.typeAttr)) - opDef.attrList.first { attrDef -> attrDef.name == inputArgDef.typeAttr }.defaultValue.type else inputArgDef.type - for(i in 0 until numAttributeValue) { - val listOfFloats = mutableListOf() - val listOfInts = mutableListOf() - val listOfDoubles = mutableListOf() - val listOfBools = mutableListOf() - val listOfLongs = mutableListOf() - //the largest tensors we're likely to touch are 5d - for(i in 0 until (1 * 2 * 3 * 4 * 5 * 6)) { - listOfFloats.add(i.toFloat()) - listOfInts.add(i) - listOfDoubles.add(i.toDouble()) - listOfBools.add(true) - listOfLongs.add(i.toLong()) - } - - val nodeName = if(i <= 0) inputArgDef.name else inputArgDef.name + "$i" - nodeBuilder.addInput(nodeName) - - when(typeAttrValue) { - 
DataType.DT_DOUBLE -> { - //add placeholders for all parameters - val placeHolder = NodeDef { - name = nodeName - op = "Const" - Attribute(name = typeAttrName,value = AttrValue { - type = DataType.DT_DOUBLE - }) - - Attribute(name = "value",value = AttrValue { - tensor = TensorProto { - Shape(listOf(1,2,3,4,5,6)) - DoubleData(listOfDoubles) - DataType(DataType.DT_DOUBLE) - } - }) - - } - - graphBuilder.addNode(placeHolder) - } - - DataType.DT_FLOAT -> { - //add placeholders for all parameters - val placeHolder = NodeDef { - name = nodeName - op = "Const" - Attribute(name = typeAttrName,value = AttrValue { - type = DataType.DT_FLOAT - }) - - Attribute(name = "value",value = AttrValue { - tensor = TensorProto { - Shape(listOf(1,2,3,4,5,6)) - FloatData(listOfFloats) - DataType(DataType.DT_FLOAT) - } - }) - - } - - graphBuilder.addNode(placeHolder) - } - - DataType.DT_BOOL -> { - //add placeholders for all parameters - val placeHolder = NodeDef { - name = nodeName - op = "Const" - Attribute(name = typeAttrName,value = AttrValue { - type = DataType.DT_BOOL - }) - - Attribute(name = "value",value = AttrValue { - tensor = TensorProto { - Shape(listOf(1,2,3,4,5,6)) - BooleanData(listOfBools) - DataType(DataType.DT_BOOL) - } - }) - - } - - graphBuilder.addNode(placeHolder) - } - - DataType.DT_INT16 -> { - //add placeholders for all parameters - val placeHolder = NodeDef { - name = nodeName - op = "Const" - Attribute(name = typeAttrName,value = AttrValue { - type = DataType.DT_INT16 - }) - - Attribute(name = "value",value = AttrValue { - tensor = TensorProto { - Shape(listOf(1,2,3,4,5,6)) - Int32Data(listOfInts) - DataType(DataType.DT_INT16) - } - }) - - } - - graphBuilder.addNode(placeHolder) - } - - DataType.DT_INT32 -> { - //add placeholders for all parameters - val placeHolder = NodeDef { - name = nodeName - op = "Const" - Attribute(name = typeAttrName,value = AttrValue { - type = DataType.DT_INT32 - }) - - Attribute(name = "value",value = AttrValue { - tensor = 
TensorProto { - Shape(listOf(1,2,3,4,5,6)) - Int32Data(listOfInts) - DataType(DataType.DT_INT32) - } - }) - - } - - graphBuilder.addNode(placeHolder) - } - - DataType.DT_INT64 -> { - //add placeholders for all parameters - val placeHolder = NodeDef { - name = nodeName - op = "Const" - Attribute(name = typeAttrName,value = AttrValue { - type = DataType.DT_INT64 - }) - - Attribute(name = "value",value = AttrValue { - tensor = TensorProto { - Shape(listOf(1,2,3,4,5,6)) - Int64Data(listOfLongs) - DataType(DataType.DT_INT64) - } - }) - - } - - graphBuilder.addNode(placeHolder) - } - - DataType.DT_STRING -> { - //add placeholders for all parameters - val placeHolder = NodeDef { - name = nodeName - op = "Const" - Attribute(name = typeAttrName,value = AttrValue { - type = DataType.DT_DOUBLE - }) - - Attribute(name = "value",value = AttrValue { - tensor = TensorProto { - Shape(listOf(1,2,3,4,5,6)) - FloatData(listOfFloats) - DataType(DataType.DT_DOUBLE) - } - }) - - } - - graphBuilder.addNode(placeHolder) - } - else -> { - - //add placeholders for all parameters - val placeHolder = NodeDef { - name = nodeName - op = "Const" - Attribute(name = typeAttrName,value = AttrValue { - type = DataType.DT_DOUBLE - }) - - Attribute(name = "value",value = AttrValue { - tensor = TensorProto { - Shape(listOf(1,2,3,4,5,6)) - DoubleData(listOfDoubles) - DataType(DataType.DT_DOUBLE) - } - }) - - } - - graphBuilder.addNode(placeHolder) - - } - - - } - } - - } - - - opDef.attrList.forEach { attr -> - if(attr.hasMinimum or attr.type.contains("list")) { - //it varies whether lists have minimums or not (some should) - //defaulting to size 4 or the minimum will hit most use cases - val listSize = max(attr.minimum,5) - when(attr.type) { - "list(int)" -> { - val attrList = ArrayList() - for(i in 0 until listSize) { - attrList.add(i) - } - - nodeBuilder.putAttr(attr.name, AttrValue { - ListInts(attrList) - }) - } - "list(float)" -> { - val attrList = ArrayList() - for(i in 0 until listSize) { - 
attrList.add(i.toFloat()) - } - nodeBuilder.putAttr(attr.name, AttrValue { - ListFloats(attrList) - }) - } - else -> { - if(attr.hasMinimum) { - when(attr.type) { - "float" -> { - nodeBuilder.putAttr(attr.name, org.nd4j.codegen.ir.tensorflow.AttrValue { - f = attr.minimum.toFloat() - }) - - } - "int" -> { - nodeBuilder.putAttr(attr.name, org.nd4j.codegen.ir.tensorflow.AttrValue { - i = attr.minimum.toLong() - }) - } - } - } - else - nodeBuilder.putAttr(attr.name,attr.defaultValue) - - } - } - } - else - nodeBuilder.putAttr(attr.name,attr.defaultValue) - } - - - graphBuilder.addNode(nodeBuilder.build()) - val graph = graphBuilder.build() - - val importGraph = ImportGraph() - - - - if(!bannedOps.contains(opName)) { - val mappingProcess = tensorflowOpRegistry.lookupOpMappingProcess(opName) - val irGraph = TensorflowIRGraph(graphDef = graph,opDef = tensorflowOps) - val mappingContext = TensorflowMappingContext(opDef = opDef,node = nodeBuilder.build(),graph = irGraph,dynamicVariables = emptyMap()) - val mapResult = mappingProcess.applyProcess(mappingContext) - val groupedByArgType = mapResult.second.argDescriptorList.groupBy { keySelector -> keySelector.argType } - val sortedGroups = HashMap>() - groupedByArgType.forEach { (argType, argDescriptors) -> - sortedGroups[argType] = argDescriptors.sortedBy { argDescriptor -> argDescriptor.argIndex } - } - - //NOTE: Bitcast is in this list for examination outside of list offsets for assertions. We don't currently support data types for the test nodes. 
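The `testOpOrdering` loop above synthesizes a placeholder `Const` node for every input of every op: constants use a fixed shape of `[1,2,3,4,5,6]` filled with sequential values, and list-typed attributes default to `max(declaredMinimum, 5)` entries. The sizing arithmetic can be sketched in plain Kotlin (helper names are illustrative, not from the codebase):

```kotlin
// Sketch of the placeholder sizing used in testOpOrdering: a constant of
// shape [1,2,3,4,5,6] holds 1*2*3*4*5*6 = 720 sequential values, and list
// attributes get max(declaredMinimum, 5) entries. Illustrative helpers only.
fun numElements(shape: List<Int>): Int = shape.fold(1) { acc, dim -> acc * dim }

// Flattened row-major fill, matching the "for(i in 0 until (1*2*3*4*5*6))" loop.
fun sequentialData(shape: List<Int>): List<Double> =
    (0 until numElements(shape)).map { it.toDouble() }

// List attributes may or may not declare a minimum; defaulting to at least 5
// entries covers most ops without special-casing.
fun listAttrSize(declaredMinimum: Long): Long = maxOf(declaredMinimum, 5L)
```

Filling every placeholder with distinct sequential values makes argument-ordering bugs visible, since swapped inputs no longer produce identical tensors.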
- sortedGroups.values.forEach { list -> run { - val namesEncountered = HashSet() - list.forEachIndexed { index, argDescriptor -> - //don't validate a name encountered more than once, this is probably an array - //note that we skip some ops here due to this assumption breaking for list types, we will test list types separately - if(!namesEncountered.contains(argDescriptor.name) && opName != "BatchToSpace" && !opName.contains("NonMaxSuppression") - && !bannedOps.contains(opName)) { - assertEquals("Arg index $index for arg descriptor name ${argDescriptor.name} for nd4j op ${mappingContext.nd4jOpName()} when arg index was actually ${argDescriptor.argIndex}. Full arg descriptor was ${argDescriptor}. Graph was ${graph}", - argDescriptor.argIndex, index) - namesEncountered.add(argDescriptor.name) - } - } - } - //SameDiff.importFrozenTF(irGraph.graphDef) - val sameDiffResult = importGraph.importGraph(irGraph = irGraph,importOverride = null,opFilter = null,opMappingRegistry = OpRegistryHolder.tensorflow()) - println("Processed op name $opName") - - } - } - - - } - } - } - - @Test - fun testTensorflowConv2dOld() { - val opDef = tensorflowOps.findOp("Conv2D") - val attrValue = AttrValue { - list = ListAttrValue(1,1,1,1) - } - - val dilationsAttr = AttrValue { - list = ListAttrValue(1,1,1,1) - } - - val dataFormatValue = AttrValue { - s = ByteString.copyFrom("NCHW", Charset.defaultCharset()) - } - - val paddingValue = AttrValue { - s = ByteString.copyFrom("SAME", Charset.defaultCharset()) - } - - val nodeDef = NodeDef { - Input("input") - Input("filter") - op = "Conv2D" - name = "input" - Attribute("strides",attrValue) - Attribute("data_format",dataFormatValue) - Attribute("padding",paddingValue) - Attribute("dilations",dilationsAttr) - } - - val tensorValue2 = TensorProto { - tensorShape = TensorShapeProto { - Dim(name = "0", size = 1) - Dim(name = "1", size = 5) - Dim(name = "2", size = 5) - Dim(name = "3", size = 6) - } - } - - val weightsNode = NodeDef { - op = "Const" - 
name = "filter" - Attribute("value", AttrValue { - tensor = tensorValue2 - }) - } - - val graphDef = GraphDef { - Node(nodeDef) - Node(weightsNode) - } - - - val tensorflowIRNode = TensorflowIRNode(nodeDef, opDef) - val conv2dMappingProcess = OpRegistryHolder.lookupOpMappingProcess(inputFrameworkName = "tensorflow",inputFrameworkOpName = "Conv2D") - - val tfGraph = TensorflowIRGraph(graphDef, tensorflowOps) - """ - node { - name: "Lenet/conv1_1/Conv2D" - op: "Conv2D" - input: "Reshape" - input: "Lenet/conv1/weights" - attr { - key: "use_cudnn_on_gpu" - value { - b: true - } - } - attr { - key: "padding" - value { - s: "SAME" - } - } - attr { - key: "T" - value { - type: DT_FLOAT - } - } - attr { - key: "strides" - value { - list { - i: 1 - i: 1 - i: 1 - i: 1 - } - } - } - attr { - key: "data_format" - value { - s: "NHWC" - } - } - } - """.trimIndent() - - """ - node { - name: "Lenet/conv1/weights" - op: "Const" - attr { - key: "value" - value { - tensor { - dtype: DT_FLOAT - tensor_shape { - dim { - size: 5 - } - dim { - size: 5 - } - dim { - size: 1 - } - dim { - size: 6 - } - } - tensor_content: "\265`\023=\207\274\277<\345j\234<\2515\266<\217\001Y<\375\223\346<\375\236Q<\213\016D<\223s\376\272\313\2561<\352\374\303<\357\036{<\r1\272<\271\020\252\265;\232=\b<{a\236;\242j\243<\212\353\330<\227J\023=\026\210\252\271\b\035\263<\264;N<\372\215\227;\351g\226 tensorflowOpRegistry.lookupOpMappingProcess(tensorflowOpName)} - .forEach { - val tensorflowNamesMapped = HashSet() - val nd4jNamesMapped = HashSet() - //we can ignore dtype for now - nd4jNamesMapped.add("dtype") - val opDef = tensorflowOpRegistry.lookupNd4jOpDef(it.opName()) - val tensorflowOpDef = tensorflowOpRegistry.lookupInputFrameworkOpDef(it.inputFrameworkOpName()) - val tensorflowAssertionNames = HashSet() - tensorflowAssertionNames.addAll(tensorflowOpDef.inputArgList.map { arg -> arg.name }) - tensorflowAssertionNames.addAll(tensorflowOpDef.attrList.map { attr -> attr.name }) - val nd4jOpDefAssertions = 
HashSet() - nd4jOpDefAssertions.addAll(opDef.argDescriptorList.map { argDescriptor -> argDescriptor.name }) - val numRequiredInputsTf = tensorflowOpDef.inputArgCount - val nd4jInputs = opDef.argDescriptorList.filter { arg -> arg.argType == OpNamespace.ArgDescriptor.ArgType.INPUT_TENSOR }.count() - /** - * TODO: Grab total collection of mapped nd4j names - * as outputs and mapped tensorflow names as inputs. - * Compare the mapped names to the op definitions - * in nd4j and tensorflow respectively. - */ - it.tensorMappingRules().forEach { mappingRule -> - mappingRule.mappingNamesToPerform().forEach { mappingName -> - tensorflowNamesMapped.add(mappingName.value) - nd4jNamesMapped.add(mappingName.key) - } - } - - it.attributeMappingRules().forEach { mappingRule -> - mappingRule.mappingNamesToPerform().forEach { mappingName -> - tensorflowNamesMapped.add(mappingName.value) - nd4jNamesMapped.add(mappingName.key) - } - - mappingRule.mappingTransformerArgs().forEach {transformerArg -> - run { - transformerArg.value.forEach { argValue -> - nd4jNamesMapped.add(argValue.name) - - } - } - } - - } - - - tensorflowOpDef.inputArgList.map {input -> input.name}.forEach { inputName -> - assertTrue(tensorflowAssertionNames.contains(inputName)) - } - - tensorflowOpDef.attrList.filter { attrDef -> attrDef.type != "type" }.map {attrDef -> attrDef.name }.forEach { attrName -> - assertTrue(tensorflowAssertionNames.contains(attrName)) - } - - - - opDef.argDescriptorList.forEach { argDef -> - //only require it when the - - when(argDef.argType) { - OpNamespace.ArgDescriptor.ArgType.INPUT_TENSOR -> { - /** - * Nd4j typically has many optional inputs that can also double as attributes - * We need to allow for a bit of flexibility in how we handle op definitions. If they're not mapped 1 to 1, - * we just log a warning for unmapped inputs. Otherwise we can do an assertion. 
- */ - if(numRequiredInputsTf == nd4jInputs) - assertTrue("Nd4j op name ${opDef.name} with tensorflow mapping ${tensorflowOpDef.name} has missing mapping ${argDef.name}",nd4jNamesMapped.contains(argDef.name)) - else if(!nd4jNamesMapped.contains(argDef.name)) { - println("Warning: Nd4j op name ${opDef.name} with tensorflow mapping ${tensorflowOpDef.name} has missing mapping ${argDef.name}") - } - } - OpNamespace.ArgDescriptor.ArgType.INT32,OpNamespace.ArgDescriptor.ArgType.INT64 -> { - assertTrue("Nd4j op name ${opDef.name} with tensorflow mapping ${tensorflowOpDef.name} has missing mapping ${argDef.name}",nd4jNamesMapped.contains(argDef.name)) - } - OpNamespace.ArgDescriptor.ArgType.DOUBLE, OpNamespace.ArgDescriptor.ArgType.FLOAT -> { - assertTrue("Nd4j op name ${opDef.name} with tensorflow mapping ${tensorflowOpDef.name} has missing mapping ${argDef.name}",nd4jNamesMapped.contains(argDef.name)) - } - OpNamespace.ArgDescriptor.ArgType.BOOL -> { - assertTrue("Nd4j op name ${opDef.name} with tensorflow mapping ${tensorflowOpDef.name} has missing mapping ${argDef.name}",nd4jNamesMapped.contains(argDef.name)) - } - } - - } - - } - } - - @Test - fun testInputOutputNames() { - val tensorflowOpNames = tensorflowOpRegistry.inputFrameworkOpNames() - val nd4jOpNames = tensorflowOpRegistry.nd4jOpNames() - tensorflowOpRegistry.mappingProcessNames().map { - tensorflowOpRegistry.lookupOpMappingProcess(it) - }.forEach { - println("Beginning processing of op ${it.inputFrameworkOpName()} and nd4j op ${it.opName()}") - assertTrue(tensorflowOpNames.contains(it.inputFrameworkOpName())) - assertTrue(nd4jOpNames.contains(it.opName())) - val nd4jOpDef = tensorflowOpRegistry.lookupNd4jOpDef(it.opName()) - val tensorflowOpDef = tensorflowOpRegistry.lookupInputFrameworkOpDef(it.inputFrameworkOpName()) - val inputNameArgDefs = nd4jOpDef.argDescriptorList.filter { - argDef -> argDef.argType == OpNamespace.ArgDescriptor.ArgType.INPUT_TENSOR - }.map { argDef -> argDef.name } - - val 
inputFrameworkOpDefNames = tensorflowOpDef.inputArgList.map { tfOpDef -> tfOpDef.name} - - val nd4jArgDefNames = nd4jOpDef.argDescriptorList.map { nd4jArgDef -> nd4jArgDef.name } - val tfAttrNames = tensorflowOpDef.attrList.map { tfAttr -> tfAttr.name } - it.tensorMappingRules().forEach { tensorRules -> - println("Running tensor mapping rule ${tensorRules.name()} for op ${it.inputFrameworkOpName()} and nd4j op name ${it.opName()}") - run { - tensorRules.mappingNamesToPerform().forEach { tensorRule -> - run { - println("Testing assertion for nd4j name ${tensorRule.key} and input name ${tensorRule.value}") - assertTrue("Failed on inputArgName ${tensorRule.key}", inputNameArgDefs.contains(tensorRule.key)) - assertTrue("Failed on inputFrameworkOpDefName ${tensorRule.value}", inputFrameworkOpDefNames.contains(tensorRule.value)) - } - - } - } - - } - - println("Running attribute mapping rules for ${it.opName()} and input op name ${it.inputFrameworkOpName()}") - it.attributeMappingRules().forEach { attrRule -> - run { - attrRule.mappingNamesToPerform().forEach { attrMapping -> - run { - println("Testing nd4j name ${attrMapping.key} and input framework name ${attrMapping.value}") - assertTrue(nd4jArgDefNames.contains(attrMapping.key) || inputNameArgDefs.contains(attrMapping.key)) - assertTrue(tfAttrNames.contains(attrMapping.value) || inputFrameworkOpDefNames.contains(attrMapping.value)) - } - - } - } - } - - } - } - - @Test - fun testOpExecution() { - Nd4j.getRandom().setSeed(12345) - val scalarInputs = mapOf( - "abs" to -1.0, - "acos" to 1.0, - "acosh" to 1.0, - "asin" to 1.0, - "asinh" to 1.0, - "atan" to 1.0, - "atanh" to 0.5, - "ceil" to 1.0, - "copy" to 1.0, - "cos" to 1.0, - "cosh" to 1.0, - "erf" to 1.0, - "elu" to 1.0, - "erfc" to 1.0, - "exp" to 1.0, - "expm1" to 1.0, - "floor" to 1.0, - "identity" to 1.0, - "isfinite" to 1.0, - "isinf" to 1.0, - "isnan" to 1.0, - //"identity_n" to 1.0, - "log" to 1.0, - "log1p" to 1.0, - "neg" to 1.0, - "ones_as" to 1.0, 
- "Reciprocal" to 1.0, - "rank" to 1.0, - "relu6" to 1.0, - "rint" to 1.0, - "round" to 1.0, - "rsqrt" to 1.0, - "sigmoid" to 1.0, - "sign" to 1.0, - "size" to 1.0, - "sin" to 1.0, - "sinh" to 1.0, - "square" to 1.0, - "sqrt" to 1.0, - "tan" to 1.0, - "tanh" to 1.0, - "selu" to 1.0, - "softsign" to 1.0, - "softplus" to 1.0, - "zeroslike" to 1.0) - - val singleInputOps = scalarInputs.keys - - val pairWiseInputs = mapOf( - "add" to listOf(1.0,1.0), - "divide" to listOf(1.0,1.0), - "greater" to listOf(1.0,1.0), - "less" to listOf(1.0,1.0), - "less_equal" to listOf(1.0,1.0), - "multiply" to listOf(1.0,1.0), - "floordiv" to listOf(1.0,1.0), - "mod" to listOf(1.0,1.0), - "squaredsubtract" to listOf(1.0,1.0), - "not_equals" to listOf(1.0,1.0), - "realdiv" to listOf(1.0,1.0), - "tf_atan2" to listOf(1.0,1.0), - "maximum" to listOf(0.0,1.0), - "min_pairwise" to listOf(1.0,1.0), - "greater_equal" to listOf(1.0,1.0), - "equals" to listOf(1.0,1.0), - "min_pairwise" to listOf(1.0,1.0), - "divide_no_nan" to listOf(1.0,1.0), - "zeta" to listOf(2.0,3.0) - - - ) - - - - - - /** - * Control flow ops - */ - - /** - * Random distribution ops - */ - - - /** - * Creation ops - * Empty - * CopyHost - * Linspace - * OnesLike - */ - - /** - * Scatter ops: - * scatter_div - * scatter_add - * scatter_sub - * scatter_min - * scatter_mul - * scatter_update - * scatter_nd - * scatter_nd_add - * scatter_nd_sub - * scatter_nd_update - */ - - - - - val pairWiseIntOps = mapOf( - "fmod" to listOf(1,1), - "rshift_bits" to listOf(1,1), - "truncatediv" to listOf(1,1), - "bitwise_and" to listOf(1,1), - "bitwise_or" to listOf(1,1), - "bitwise_xor" to listOf(1,1), - "shift_bits" to listOf(1,1) - ) - - val pairWiseNames = pairWiseInputs.keys - - - val booleanReduceOps = mapOf( - "all" to Nd4j.create(listOf(true,false,true,false).toBooleanArray()).reshape(2,2), - "any" to Nd4j.create(listOf(true,false,true,false).toBooleanArray()).reshape(2,2) - ) - - val singularReduceOps = mapOf( - "reduce_mean" to 
Nd4j.linspace(1,4,4).reshape(2,2), - "reduce_prod" to Nd4j.linspace(1,4,4).reshape(2,2), - "reduce_min" to Nd4j.linspace(1,4,4).reshape(2,2), - "reduce_sum" to Nd4j.linspace(1,4,4).reshape(2,2), - "reduce_max" to Nd4j.linspace(1,4,4).reshape(2,2) - ) - - - - - val mappedOps = setOf( - "Assert", - "gather_nd", - "lstmBlock", - "lstmBlockCell", - "gruCell", - "igamma", - "igammac", - "lgamma", - "reduce_logsumexp", - "check_numerics", - "adjust_hue", - "adjust_saturation", - "reverse_sequence", - "depthwise_conv2d", - "resize_nearest_neighbor", - "scatter_nd", - "resize_area", - "rgb_to_hsv", - "resize_bicubic", - "resize_bilinear", - "listdiff", - "mirror_pad", - "histogram_fixed_width", - "extract_image_patches", - "ClipByValue", - "crop_and_resize", - "broadcast_dynamic_shape", - "broadcastgradientargs", - "lrn", - "batch_to_space_nd", - "space_to_batch_nd", - "draw_bounding_boxes", - "fused_batch_norm", - "conv3dnew", - "avgpool3dnew", - "maxpool3dnew", - "create", - "slice", - "strided_slice", - "select", - "compare_and_bitpack", - "bincount", - "broadcast_to", - "biasadd", - "condition", - "avgpool2d", - "maxpool2d", - "conv2d", - "dilation2d", - "batch_to_space", - "space_to_batch", - "dynamic_partition", - "dynamic_stitch", - "softmax", - "mergesum", - "matrix_set_diag", - "matrix_diag_part", - "identity_n", - "split", - "split_v", - "shapes_of", - "squeeze", - "bitcast", - "merge_sum", - "tile", - "matmul", - "range", - "lin_space", - "gather", - "betainc", - "concat", - "stack", - "unstack", - "merge", - "leakyrelu", - "shape_of", - "roll", - "reverse", - "relu", - "relu6", - "argmin", - "argmax", - "cross", - "cumsum", - "cumprod", - "diag", - "diag_part", - "digamma", - "depth_to_space", - "expand_dims", - "toggle_bits", - "invert_permutation", - //"enter", TODO: deal with frames or maybe ignore? 
- //"exit", - "in_top_k", - "top_k", - "lu", - "matrix_inverse", - "matrix_determinant", - "solve", - "triangular_solve", - "log_matrix_determinant", - "cholesky", - "reshape", - "noop", - "nth_element", - "non_max_suppression_overlaps", - "non_max_suppression", - "non_max_suppression_v3", - "onehot", - "pad", - "pow", - "transpose", - "space_to_depth", - "Where", - "unsorted_segment_max", - "unsorted_segment_min", - "unsorted_segment_prod", - "unsorted_segment_sum", - "unique_with_counts", - "unique", - "boolean_and", - "boolean_not", - "boolean_or", - "segment_mean", - "segment_min", - "segment_max", - "segment_prod", - "segment_sum" - - //"scatter_add", Skipping due to different op validation - //"scatter_sub", Skipping due to different op validation - //"scatter_update", Skipping due to different op validation - //"scatter_nd" Skipping due to different op validation - ) - - - - - //Skipping due to using references rather than tensors - //"scatter_nd_add", - //"scatter_nd_sub", - // "scatter_nd_update" - // //"scatter_min", - // //"scatter_mul",) - - val singularReduceNames = singularReduceOps.keys - val testedOps = HashSet() - //skip testing control flow - val controlFlowOps = setOf("Switch","While","placeholder","next_iteration","enter","exit","loop_cond") - val resourceOps = setOf("stack_list","size_list","scatter_list","read_list","split_list","gather_list") - val refOps = setOf("assign","scatter_add","scatter_sub","scatter_update") - val randomOps = setOf("random_gamma","random_crop","random_normal","random_poisson","random_shuffle","randomuniform") - testedOps.addAll(randomOps) - testedOps.addAll(controlFlowOps) - testedOps.addAll(resourceOps) - testedOps.addAll(refOps) - val importGraph = ImportGraph() - - tensorflowOpRegistry.mappingProcessNames().map { name -> - tensorflowOpRegistry.lookupOpMappingProcess(name) - }.forEach { mappingProcess -> - val nd4jOpDef = tensorflowOpRegistry.lookupNd4jOpDef(mappingProcess.opName()) - val tensorflowOpDef = 
tensorflowOpRegistry.lookupInputFrameworkOpDef(mappingProcess.inputFrameworkOpName()) - - if(singleInputOps.contains(nd4jOpDef.name) && tensorflowOpDef.name != "Variable" && tensorflowOpDef.name != "VariableV2" && tensorflowOpDef.name != "Const") { - val tensorNode = NodeDef { - name = "x" - op = "Placeholder" - Attribute("dtype",AttrValue { - type = DataType.DT_DOUBLE - }) - } - - println("Running test import process for op ${tensorflowOpDef.name}") - val opNode = NodeDef { - Input("x") - op = tensorflowOpDef.name - name = "output" - Attribute("T",AttrValue { - type = DataType.DT_DOUBLE - }) - } - - - val graphDef = GraphDef { - Node(tensorNode) - Node(opNode) - } - val tensorflowGraph = TensorflowIRGraph(graphDef, tensorflowOps) - val mappedGraph = importGraph.importGraph(tensorflowGraph,null,null,emptyMap(),OpRegistryHolder.tensorflow()).enableDebugMode()!! - Nd4j.getExecutioner().setProfilingConfig(ProfilerConfig.builder() - .stackTrace(true).build()) - val xVal = Nd4j.scalar(scalarInputs[mappingProcess.opName()]).castTo(org.nd4j.linalg.api.buffer.DataType.DOUBLE) - val tensorflowRunner = TensorflowIRGraphRunner(irGraph = tensorflowGraph,inputNames = listOf("x"),outputNames = listOf("output")) - val inputs = mapOf("x" to xVal) - if(!mappedGraph.hasVariable("output")) - throw IllegalStateException("No output variable found. Variables include ${mappedGraph.variables}") - val tfResults = tensorflowRunner.run(inputs) - val results = mappedGraph.output(inputs,"output") - val tfOutput = tfResults["output"]!! - assertTrue(tfOutput.isScalar) - val nd4jOutput = results["output"]!! 
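The single-input scalar cases compare the TensorFlow output against the nd4j output with an absolute tolerance of 1e-3. That comparison pattern can be factored into a small helper that reports a failure message rather than throwing, so a caller can collect results across many ops (the function name here is illustrative, not from the test suite):

```kotlin
import kotlin.math.abs

// Compare two scalar op results within an absolute tolerance; return null
// on success, or a descriptive failure message on mismatch.
fun compareScalars(opName: String, expected: Double, actual: Double, eps: Double = 1e-3): String? =
    if (abs(expected - actual) <= eps) null
    else "Function $opName failed: expected $expected, got $actual (eps=$eps)"
```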
- assertTrue(nd4jOutput.isScalar) - assertEquals("Function ${nd4jOpDef.name} failed with input $xVal",nd4jOutput.getDouble(0), tfOutput.getDouble(0),1e-3) - testedOps.add(nd4jOpDef.name) - } - else if(singularReduceNames.contains(nd4jOpDef.name)) { - listOf(listOf(0),listOf(-1),listOf(0,1)).forEach { dimensions -> - listOf(true,false).forEach { keepDim -> - val tensorNode = NodeDef { - name = "x" - op = "Placeholder" - Attribute("dtype",AttrValue { - type = DataType.DT_DOUBLE - }) - } - - val opNode = NodeDef { - Input("x") - Input("dimensions") - op = tensorflowOpDef.name - name = "output" - Attribute("T",AttrValue { - type = DataType.DT_DOUBLE - }) - Attribute("Tidx",AttrValue { - type = DataType.DT_INT32 - }) - Attribute("keep_dims",AttrValue { - b = keepDim - }) - } - - val tensorNode2 = NodeDef { - op = "Const" - name = "dimensions" - Attribute("value",AttrValue { - tensor = TensorProto { - Int32Data(dimensions) - dtype = DataType.DT_INT32 - tensorShape = TensorShapeProto { - Dims(listOf(1,dimensions.size.toLong())) - } - } - }) - Attribute("dtype",AttrValue { - type = DataType.DT_INT32 - }) - } - - val graphDef = GraphDef { - Node(tensorNode) - Node(tensorNode2) - Node(opNode) - } - - val mappingProcess = tensorflowOpRegistry.lookupOpMappingProcess(tensorflowOpDef.name) - val tensorflowGraph = TensorflowIRGraph(graphDef, tensorflowOps) - val mappedGraph = importGraph.importGraph(tensorflowGraph,null,null,emptyMap(),OpRegistryHolder.tensorflow())!! 
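Each reduce op is exercised over the cartesian product of three dimension specs (`[0]`, `[-1]`, `[0,1]`) and both `keep_dims` values, giving six cases per op. Instead of nested `forEach` loops, that test matrix can be built once up front; a stdlib-only sketch:

```kotlin
// Build the (dimensions, keepDim) test matrix used for the reduce-op cases:
// 3 dimension specs x 2 keepDim flags = 6 cases per op.
fun reduceTestCases(): List<Pair<List<Int>, Boolean>> =
    listOf(listOf(0), listOf(-1), listOf(0, 1)).flatMap { dims ->
        listOf(true, false).map { keepDim -> dims to keepDim }
    }
```

Materializing the matrix also makes it easy to log or count exactly which combinations ran.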
- val xVal = singularReduceOps[mappingProcess.opName()]!!.castTo(org.nd4j.linalg.api.buffer.DataType.DOUBLE) - val tensorflowRunner = TensorflowIRGraphRunner(irGraph = tensorflowGraph,inputNames = listOf("x"),outputNames = listOf("output")) - val inputs = mapOf("x" to xVal) - val results = mappedGraph.output(inputs,"output") - val tfResults = tensorflowRunner.run(inputs) - //2 dimensions means sum the whole array, sometimes there are subtle differences in the shape like 1,1 vs a zero length array which is effectively the same thing - if(dimensions.size < 2) - assertEquals("Function ${nd4jOpDef.name} failed with input $xVal and dimension ${dimensions}",tfResults["output"]!!, results["output"]!!) - else - assertEquals("Function ${nd4jOpDef.name} failed with input $xVal and dimension ${dimensions}",tfResults["output"]!!.reshape(1,1), results["output"]!!.reshape(1,1)) - - } - - } - - testedOps.add(nd4jOpDef.name) - - } else if(booleanReduceOps.keys.contains(nd4jOpDef.name)) { - listOf(listOf(0),listOf(-1),listOf(0,1)).forEach { dimensions -> - listOf(true,false).forEach { keepDim -> - val tensorNode = NodeDef { - name = "x" - op = "Placeholder" - Attribute("dtype",AttrValue { - type = DataType.DT_BOOL - }) - } - - val opNode = NodeDef { - Input("x") - Input("dimensions") - op = tensorflowOpDef.name - name = "output" - - Attribute("Tidx",AttrValue { - type = DataType.DT_INT32 - }) - Attribute("keep_dims",AttrValue { - b = keepDim - }) - } - - val tensorNode2 = NodeDef { - op = "Const" - name = "dimensions" - Attribute("value",AttrValue { - tensor = TensorProto { - Int32Data(dimensions) - dtype = DataType.DT_INT32 - tensorShape = TensorShapeProto { - Dims(listOf(1,dimensions.size.toLong())) - } - } - }) - Attribute("dtype",AttrValue { - type = DataType.DT_INT32 - }) - } - - val graphDef = GraphDef { - Node(tensorNode) - Node(tensorNode2) - Node(opNode) - } - - val mappingProcess = tensorflowOpRegistry.lookupOpMappingProcess(tensorflowOpDef.name) - val tensorflowGraph = 
TensorflowIRGraph(graphDef, tensorflowOps) - val mappedGraph = importGraph.importGraph(tensorflowGraph,null,null,emptyMap(),OpRegistryHolder.tensorflow())!! - val xVal = booleanReduceOps[mappingProcess.opName()]!! - val tensorflowRunner = TensorflowIRGraphRunner(irGraph = tensorflowGraph,inputNames = listOf("x"),outputNames = listOf("output")) - val inputs = mapOf("x" to xVal) - val results = mappedGraph.output(inputs,"output") - val tfResults = tensorflowRunner.run(inputs) - //2 dimensions means sum the whole array, sometimes there are subtle differences in the shape like 1,1 vs a zero length array which is effectively the same thing - if(dimensions.size < 2) - assertEquals("Function ${nd4jOpDef.name} failed with input $xVal and dimension ${dimensions}",tfResults["output"]!!, results["output"]!!) - else - assertEquals("Function ${nd4jOpDef.name} failed with input $xVal and dimension ${dimensions}",tfResults["output"]!!.reshape(1,1), results["output"]!!.reshape(1,1)) - - } - - } - - testedOps.add(nd4jOpDef.name) - - } else if(pairWiseNames.contains(nd4jOpDef.name)) { - val tensorNode = NodeDef { - name = "x" - op = "Placeholder" - Attribute("dtype",AttrValue { - type = DataType.DT_DOUBLE - }) - } - - val tensorNode2 = NodeDef { - op = "Placeholder" - name = "y" - Attribute("dtype",AttrValue { - type = DataType.DT_DOUBLE - }) - } - - val opNode = NodeDef { - Input("x") - Input("y") - op = tensorflowOpDef.name - name = "output" - Attribute("T",AttrValue { - type = DataType.DT_DOUBLE - }) - } - - - val graphDef = GraphDef { - Node(tensorNode) - Node(opNode) - Node(tensorNode2) - } - - val mappingProcess = tensorflowOpRegistry.lookupOpMappingProcess(tensorflowOpDef.name) - val tensorflowGraph = TensorflowIRGraph(graphDef, tensorflowOps) - val mappedGraph = importGraph.importGraph(tensorflowGraph,null,null,dynamicVariables = mapOf("y" to TensorProto { - dtype = DataType.DT_DOUBLE - DoubleData(listOf(1.0)) - Shape(listOf(1,1)) - }),OpRegistryHolder.tensorflow())!! 
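The pairwise tables (`pairWiseInputs`, `pairWiseIntOps`) are each expected to supply exactly two operands per op, since the generated graph always wires up inputs `x` and `y`. That invariant can be checked generically before building any graphs; a stdlib-only sketch:

```kotlin
// Validate a pairwise-op input table: every op must list exactly two operands.
// Returns one message per violating entry; empty list means the table is valid.
fun <T> checkPairwiseInputs(inputs: Map<String, List<T>>): List<String> =
    inputs.filterValues { it.size != 2 }
        .keys
        .map { opName -> "op $opName does not have exactly 2 operands" }
```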
- - val xVal = Nd4j.scalar(pairWiseInputs[mappingProcess.opName()]!![0]) - .reshape(1,1) - .castTo(org.nd4j.linalg.api.buffer.DataType.DOUBLE) - val yVal = Nd4j.scalar(pairWiseInputs[mappingProcess.opName()]!![1]) - .reshape(1,1) - .castTo(org.nd4j.linalg.api.buffer.DataType.DOUBLE) - - val tensorflowRunner = TensorflowIRGraphRunner(irGraph = tensorflowGraph,inputNames = listOf("x","y"),outputNames = listOf("output")) - val inputs = mapOf("x" to xVal,"y" to yVal) - val results = mappedGraph.output(inputs,"output") - val tfResults = tensorflowRunner.run(inputs) - assertEquals("Function ${nd4jOpDef.name} failed with input $xVal",tfResults["output"]!!.reshape(1,1), results["output"]!!.reshape(1,1)) - testedOps.add(nd4jOpDef.name) - - } else if(pairWiseIntOps.contains(nd4jOpDef.name)) { - val tensorNode = NodeDef { - name = "x" - op = "Placeholder" - Attribute("dtype",AttrValue { - type = DataType.DT_INT32 - }) - } - - val tensorNode2 = NodeDef { - op = "Placeholder" - name = "y" - Attribute("dtype",AttrValue { - type = DataType.DT_INT32 - }) - } - - val opNode = NodeDef { - Input("x") - Input("y") - op = tensorflowOpDef.name - name = "output" - Attribute("T",AttrValue { - type = DataType.DT_INT32 - }) - } - - - val graphDef = GraphDef { - Node(tensorNode) - Node(opNode) - Node(tensorNode2) - } - - val tensorflowGraph = TensorflowIRGraph(graphDef, tensorflowOps) - val mappedGraph = importGraph.importGraph(tensorflowGraph,null,null,emptyMap(),OpRegistryHolder.tensorflow())!! 
- val xVal = Nd4j.scalar(pairWiseIntOps[mappingProcess.opName()]!![0]) - .reshape(1,1) - .castTo(org.nd4j.linalg.api.buffer.DataType.INT32) - - val yVal = Nd4j.scalar(pairWiseIntOps[mappingProcess.opName()]!![1]) - .reshape(1,1) - .castTo(org.nd4j.linalg.api.buffer.DataType.INT32) - - val tensorflowRunner = TensorflowIRGraphRunner(irGraph = tensorflowGraph,inputNames = listOf("x","y"),outputNames = listOf("output")) - val inputs = mapOf("x" to xVal,"y" to yVal) - val results = mappedGraph.output(inputs,"output") - val tfResults = tensorflowRunner.run(inputs) - assertEquals("Function ${nd4jOpDef.name} failed with input $xVal",tfResults["output"]!!.reshape(1,1), results["output"]!!.reshape(1,1)) - testedOps.add(nd4jOpDef.name) - - } else if(mappedOps.contains(mappingProcess.opName())) { - val graphInputList = graphForOp(nd4jOpName = mappingProcess.opName(),inputFrameworkOpName = mappingProcess.inputFrameworkOpName()) - graphInputList.forEach { graphInput -> - val tensorflowGraph = TensorflowIRGraph(graphInput.graphDef, tensorflowOps) - val dynamicOpsMap = HashMap() - graphInput.inputArrays.forEach { k, v -> - dynamicOpsMap[k] = convertNDArrayToTensorflowTensor(v) - } - - //NOTE: The output name here is different than the output names from samediff because we want every array from tensorflow for assertion purposes. 
- //The outputs from samediff might be slightly different (eg: not have every output tensorflow does or more) - - //tf2 ops don't currently work in nd4j-tensorflow and can't be verified - val tf2Ops = setOf("CheckNumericsV2","FusedBatchNormV3") - //these ops reflect ops that should generally be tested other ways and are usually tested down below - val bannedOps = setOf("noop","unique","unique_with_counts","matrix_determinant","log_matrix_determinant","Assert","split_v","identity_n","dynamic_partition","dynamic_stitch","draw_bounding_boxes","fused_batch_norm") - if(!bannedOps.contains(mappingProcess.opName()) && !tf2Ops.contains(mappingProcess.inputFrameworkOpName())) { - val tensorflowRunner = TensorflowIRGraphRunner(irGraph = tensorflowGraph,inputNames = graphInput.inputNames,outputNames = graphInput.outputNames) - - - val mappedGraph = importGraph.importGraph(tensorflowGraph,null,null,dynamicOpsMap,OpRegistryHolder.tensorflow()) - assertEquals("Input name mismatch with input array elements",graphInput.inputArrays.keys,graphInput.inputNames.toSet()) - - val tfResults = tensorflowRunner.run(graphInput.inputArrays) - val results = mappedGraph!!.output(graphInput.inputArrays,graphInput.outputNames) - if(mappingProcess.opName() == "bincount") { - val inputVal = Nd4j.create(doubleArrayOf(1.0, 2.0, 0.0, 1.0, 2.0, 2.0, 1.0, 2.0)) - .castTo(org.nd4j.linalg.api.buffer.DataType.INT32) - val sizeVal = Nd4j.create(doubleArrayOf(3.0)) - .castTo(org.nd4j.linalg.api.buffer.DataType.INT32) - val weightVal = Nd4j.create(doubleArrayOf(1.0, 1.0, 1.0, 1.0, 1.0, 1.0, 1.0, 1.0)) - .castTo(org.nd4j.linalg.api.buffer.DataType.DOUBLE) - - println(Nd4j.getExecutioner().exec(DynamicCustomOp.builder("bincount").addInputs(inputVal,weightVal).addIntegerArguments(0,3).build())[0]) - println() - } - assertEquals("Function ${nd4jOpDef.name} failed with input ${graphInput.inputNames} " + - "with tfValue of shape ${tfResults.values.first().shapeInfoToString()} and nd4j 
${results.values.first().shapeInfoToString()} and ${graphInput}" - ,tfResults.values.first(), results.values.first()) - } else if(mappingProcess.opName() == "unique_with_counts" || mappingProcess.opName() == "unique") { - //note: this is a separate case since the results are equal, minus dimensions - val tensorflowRunner = TensorflowIRGraphRunner(irGraph = tensorflowGraph,inputNames = graphInput.inputNames,outputNames = graphInput.outputNames) - - - val mappedGraph = importGraph.importGraph(tensorflowGraph,null,null,dynamicOpsMap,OpRegistryHolder.tensorflow()) - assertEquals("Input name mismatch with input array elements",graphInput.inputArrays.keys,graphInput.inputNames.toSet()) - - val tfResults = tensorflowRunner.run(graphInput.inputArrays) - val results = mappedGraph!!.output(graphInput.inputArrays,graphInput.outputNames) - assertEquals("Function ${nd4jOpDef.name} failed with input ${graphInput.inputNames}",tfResults.values.first().ravel(), results.values.first().ravel()) - }//slight difference in scalar result, doesn't matter in practice - else if(mappingProcess.opName() == "matrix_determinant" || mappingProcess.opName() == "log_matrix_determinant") { - //note: this is a separate case since the results are equal, minus dimensions - val tensorflowRunner = TensorflowIRGraphRunner(irGraph = tensorflowGraph,inputNames = graphInput.inputNames,outputNames = graphInput.outputNames) - - - val mappedGraph = importGraph.importGraph(tensorflowGraph,null,null,dynamicOpsMap,OpRegistryHolder.tensorflow()) - assertEquals("Input name mismatch with input array elements",graphInput.inputArrays.keys,graphInput.inputNames.toSet()) - - if(mappingProcess.opName() == "matrix_determinant") { - val tfResults = tensorflowRunner.run(graphInput.inputArrays) - val results = mappedGraph!!.output(graphInput.inputArrays,graphInput.outputNames) - assertEquals("Function ${nd4jOpDef.name} failed with input ${graphInput.inputNames}",tfResults["output"]!!.ravel().getDouble(0), 
results["output"]!!.ravel().getDouble(0),1e-3) - - } - } - else if(mappingProcess.opName() == "split_v" || mappingProcess.opName() == "identity_n" || mappingProcess.opName() == "dynamic_partition"|| mappingProcess.opName() == "dynamic_stitch") { - val tensorflowRunner = TensorflowIRGraphRunner(irGraph = tensorflowGraph,inputNames = graphInput.inputNames,outputNames = graphInput.outputNames) - - - val mappedGraph = importGraph.importGraph(tensorflowGraph,null,null,dynamicOpsMap,OpRegistryHolder.tensorflow()) - assertEquals("Input name mismatch with input array elements",graphInput.inputArrays.keys,graphInput.inputNames.toSet()) - - val tfResults = tensorflowRunner.run(graphInput.inputArrays) - val results = mappedGraph!!.output(graphInput.inputArrays,graphInput.outputNames) - assertEquals("Function ${nd4jOpDef.name} failed with input ${graphInput.inputNames}",tfResults, results) - - } else if(mappingProcess.opName() == "draw_bounding_boxes") { - val tensorflowRunner = TensorflowIRGraphRunner(irGraph = tensorflowGraph,inputNames = graphInput.inputNames,outputNames = graphInput.outputNames) - val mappedGraph = importGraph.importGraph(tensorflowGraph,null,null,dynamicOpsMap,OpRegistryHolder.tensorflow()) - assertEquals("Input name mismatch with input array elements",graphInput.inputArrays.keys,graphInput.inputNames.toSet()) - val tfResults = tensorflowRunner.run(graphInput.inputArrays) - val results = mappedGraph!!.output(graphInput.inputArrays,graphInput.outputNames) - assertEquals("Function ${nd4jOpDef.name} failed with input ${graphInput.inputNames}",tfResults, results) - - } - else if(mappingProcess.opName() == "fused_batch_norm") { - val tensorflowRunner = TensorflowIRGraphRunner(irGraph = tensorflowGraph,inputNames = graphInput.inputNames,outputNames = graphInput.outputNames) - - - val mappedGraph = importGraph.importGraph(tensorflowGraph,null,null,dynamicOpsMap,OpRegistryHolder.tensorflow()) - assertEquals("Input name mismatch with input array 
elements",graphInput.inputArrays.keys,graphInput.inputNames.toSet()) - - val tfResults = tensorflowRunner.run(graphInput.inputArrays) - val results = mappedGraph!!.output(graphInput.inputArrays,graphInput.outputNames) - assertEquals("Function ${nd4jOpDef.name} failed with input ${graphInput.inputNames}",tfResults["y"], results["y"]) - - } - - else if(!bannedOps.contains(mappingProcess.opName()) && !tf2Ops.contains(mappingProcess.inputFrameworkOpName())) { - //note that log outputs 2 results and the 2nd one is the one we need. The first result is a sign. - val tensorflowRunner = TensorflowIRGraphRunner(irGraph = tensorflowGraph,inputNames = graphInput.inputNames,outputNames = graphInput.outputNames) - - - val mappedGraph = importGraph.importGraph(tensorflowGraph,null,null,dynamicOpsMap,OpRegistryHolder.tensorflow()) - assertEquals("Input name mismatch with input array elements",graphInput.inputArrays.keys,graphInput.inputNames.toSet()) - - val tfResults = tensorflowRunner.run(graphInput.inputArrays) - val results = mappedGraph!!.output(graphInput.inputArrays,graphInput.outputNames) - assertEquals("Function ${nd4jOpDef.name} failed with input ${graphInput.inputNames}",tfResults["finalResult"]!!.ravel().getDouble(0), results["finalResult"]!!.ravel().getDouble(0),1e-3) - - } - - } - - testedOps.add(nd4jOpDef.name) - - } - } - - val differenceOfSet = tensorflowOpRegistry.mappedNd4jOpNames() - testedOps - println("Ops left to test is ${differenceOfSet.size} and ops are $differenceOfSet with total ops ran ${testedOps.size}") - println("Note we skipped ${controlFlowOps.size} testing control flow ops named $controlFlowOps") - println("Note we skipped ${resourceOps.size} testing resource ops named $resourceOps due to resources being handled differently than normal tensors") - println("Note we skipped ${refOps.size} testing resource ops named $refOps due to references being handled differently than normal tensors") - println("Note we skipped ${randomOps.size} testing resource 
ops named $randomOps due to random not being consistently testable. This may change in the short term.") - - } - - - - - - fun graphForOp(nd4jOpName: String,inputFrameworkOpName: String): List { - val tensorflowOpDef = tensorflowOpRegistry.lookupInputFrameworkOpDef(inputFrameworkOpName) - when (nd4jOpName) { - "check_numerics" -> { - val tensor = NodeDef { - name = "tensor" - op = "Placeholder" - Attribute("dtype",AttrValue { - type = DataType.DT_FLOAT - }) - } - - - - println("Running test import process for op ${tensorflowOpDef.name}") - val opNode = NodeDef { - Input("tensor") - op = tensorflowOpDef.name - name = "output" - Attribute("T",AttrValue { - type = DataType.DT_FLOAT - }) - Attribute("message",AttrValue { - s = ByteString.copyFrom("test message".toByteArray(Charset.defaultCharset())) - }) - } - - val graphDef = GraphDef { - Node(tensor) - Node(opNode) - } - - - - val xVal = Nd4j.create(floatArrayOf(1.0f,2.0f,3.0f)) - .castTo(org.nd4j.linalg.api.buffer.DataType.FLOAT) - - - val inputs = mapOf("tensor" to xVal) - - - return listOf(GraphInput( - graphDef = graphDef, - inputNames = listOf("tensor"), - outputNames = listOf("output"), - inputArrays = inputs, - dynamicArrays = inputs - )) - } - - "gruCell" -> { - val x = NodeDef { - name = "x" - op = "Placeholder" - Attribute("dtype",AttrValue { - type = DataType.DT_FLOAT - }) - } - - val hPrev = NodeDef { - name = "h_prev" - op = "Placeholder" - Attribute("dtype",AttrValue { - type = DataType.DT_FLOAT - }) - } - - - val wRu = NodeDef { - name = "w_ru" - op = "Placeholder" - Attribute("dtype",AttrValue { - type = DataType.DT_FLOAT - }) - } - - val wC = NodeDef { - name = "w_c" - op = "Placeholder" - Attribute("dtype",AttrValue { - type = DataType.DT_FLOAT - }) - } - - - val bRu = NodeDef { - name = "b_ru" - op = "Placeholder" - Attribute("dtype",AttrValue { - type = DataType.DT_FLOAT - }) - } - - val bc = NodeDef { - name = "b_c" - op = "Placeholder" - Attribute("dtype",AttrValue { - type = DataType.DT_FLOAT - 
}) - } - - - - println("Running test import process for op ${tensorflowOpDef.name}") - val opNode = NodeDef { - Input("x") - Input("h_prev") - Input("w_ru") - Input("w_c") - Input("b_ru") - Input("b_c") - op = tensorflowOpDef.name - name = "output" - Attribute("T",AttrValue { - type = DataType.DT_FLOAT - }) - } - - - val r = NodeDef { - name = "r" - Input("output:0") - op = "Identity" - Attribute("T",AttrValue { - type = DataType.DT_FLOAT - }) - } - - val u = NodeDef { - name = "u" - Input("output:1") - op = "Identity" - Attribute("T",AttrValue { - type = DataType.DT_FLOAT - }) - } - - val c = NodeDef { - name = "c" - Input("output:2") - op = "Identity" - Attribute("T",AttrValue { - type = DataType.DT_FLOAT - }) - } - - val h = NodeDef { - name = "h" - Input("output:3") - op = "Identity" - Attribute("T",AttrValue { - type = DataType.DT_FLOAT - }) - } - - - val graphDef = GraphDef { - Node(x) - Node(hPrev) - Node(wRu) - Node(wC) - Node(bRu) - Node(bc) - Node(opNode) - Node(r) - Node(u) - Node(c) - Node(h) - } - - - - - val xVal = Nd4j.linspace(1,20,20).reshape(2,10) - .castTo(org.nd4j.linalg.api.buffer.DataType.FLOAT) - - val hPrevVal = Nd4j.linspace(1,8,8).reshape(2,4) - .castTo(org.nd4j.linalg.api.buffer.DataType.FLOAT) - - - val wRuVal = Nd4j.linspace(1,112,112).reshape(14,8) - .castTo(org.nd4j.linalg.api.buffer.DataType.FLOAT) - - val wcVal = Nd4j.linspace(1,56,56).reshape(14,4) - .castTo(org.nd4j.linalg.api.buffer.DataType.FLOAT) - - val bRuVal = Nd4j.linspace(1,8,8).reshape(8) - .castTo(org.nd4j.linalg.api.buffer.DataType.FLOAT) - - - val bcVal = Nd4j.linspace(1,4,4).reshape(4) - .castTo(org.nd4j.linalg.api.buffer.DataType.FLOAT) - - - val inputs = mapOf("x" to xVal,"h_prev" to hPrevVal,"w_ru" to wRuVal,"w_c" to wcVal,"b_ru" to bRuVal,"b_c" to bcVal) - - - return listOf(GraphInput( - graphDef = graphDef, - inputNames = listOf("x","h_prev","w_ru","w_c","b_ru","b_c"), - outputNames = listOf("output"), - inputArrays = inputs, - dynamicArrays = inputs - )) - } - - 
"lstmBlockCell" -> { - val x = NodeDef { - name = "x" - op = "Placeholder" - Attribute("dtype",AttrValue { - type = DataType.DT_FLOAT - }) - } - - - val csPrev = NodeDef { - name = "cs_prev" - op = "Placeholder" - Attribute("dtype",AttrValue { - type = DataType.DT_FLOAT - }) - } - - val hPrev = NodeDef { - name = "h_prev" - op = "Placeholder" - Attribute("dtype",AttrValue { - type = DataType.DT_FLOAT - }) - } - - - val w = NodeDef { - name = "w" - op = "Placeholder" - Attribute("dtype",AttrValue { - type = DataType.DT_FLOAT - }) - } - - val wci = NodeDef { - name = "wci" - op = "Placeholder" - Attribute("dtype",AttrValue { - type = DataType.DT_FLOAT - }) - } - - val wcf = NodeDef { - name = "wcf" - op = "Placeholder" - Attribute("dtype",AttrValue { - type = DataType.DT_FLOAT - }) - } - - - val wco = NodeDef { - name = "wco" - op = "Placeholder" - Attribute("dtype",AttrValue { - type = DataType.DT_FLOAT - }) - } - - val bias = NodeDef { - name = "b" - op = "Placeholder" - Attribute("dtype",AttrValue { - type = DataType.DT_FLOAT - }) - } - println("Running test import process for op ${tensorflowOpDef.name}") - val opNode = NodeDef { - Input("x") - Input("cs_prev") - Input("h_prev") - Input("w") - Input("wci") - Input("wcf") - Input("wco") - Input("b") - op = tensorflowOpDef.name - name = "output" - Attribute("T",AttrValue { - type = DataType.DT_FLOAT - }) - Attribute("forget_bias",AttrValue { - f = 2.0f - }) - - Attribute("use_peephole",AttrValue { - b = false - }) - } - - - val i = NodeDef { - name = "i" - Input("output:0") - op = "Identity" - Attribute("T",AttrValue { - type = DataType.DT_FLOAT - }) - } - - val cs = NodeDef { - name = "cs" - Input("output:1") - op = "Identity" - Attribute("T",AttrValue { - type = DataType.DT_FLOAT - }) - } - - val f = NodeDef { - name = "f" - Input("output:2") - op = "Identity" - Attribute("T",AttrValue { - type = DataType.DT_FLOAT - }) - } - - val o = NodeDef { - name = "o" - Input("output:3") - op = "Identity" - 
Attribute("T",AttrValue { - type = DataType.DT_FLOAT - }) - } - - - val ci = NodeDef { - name = "ci" - Input("output:4") - op = "Identity" - Attribute("T",AttrValue { - type = DataType.DT_FLOAT - }) - } - - val h = NodeDef { - name = "h" - Input("output:5") - op = "Identity" - Attribute("T",AttrValue { - type = DataType.DT_FLOAT - }) - } - - val graphDef = GraphDef { - Node(x) - Node(csPrev) - Node(hPrev) - Node(w) - Node(wci) - Node(wcf) - Node(wco) - Node(bias) - Node(opNode) - Node(i) - Node(cs) - Node(f) - Node(o) - Node(ci) - Node(h) - } - - - - - val xVal = Nd4j.linspace(1,5,5).reshape(1,5) - .castTo(org.nd4j.linalg.api.buffer.DataType.FLOAT) - - val csPrevVal = Nd4j.linspace(1,3,3).reshape(1,3) - .castTo(org.nd4j.linalg.api.buffer.DataType.FLOAT) - - val hPrevVal = Nd4j.linspace(1,3,3).reshape(1,3) - .castTo(org.nd4j.linalg.api.buffer.DataType.FLOAT) - - val wVal = Nd4j.linspace(1,96,96).reshape(8,12) - .castTo(org.nd4j.linalg.api.buffer.DataType.FLOAT) - - val wciVal = Nd4j.linspace(1,3,3).reshape(3) - .castTo(org.nd4j.linalg.api.buffer.DataType.FLOAT) - - val wcfVal = Nd4j.linspace(1,3,3).reshape(3) - .castTo(org.nd4j.linalg.api.buffer.DataType.FLOAT) - - - val wcoVal = Nd4j.linspace(1,3,3).reshape(3) - .castTo(org.nd4j.linalg.api.buffer.DataType.FLOAT) - - - val bVal = Nd4j.zeros(12) - .castTo(org.nd4j.linalg.api.buffer.DataType.FLOAT) - - - - - val inputs = mapOf("x" to xVal,"cs_prev" to csPrevVal,"h_prev" to hPrevVal,"w" to wVal,"wci" to wciVal,"wcf" to wcfVal,"wco" to wcoVal,"b" to bVal) - - - return listOf(GraphInput( - graphDef = graphDef, - inputNames = listOf("x","cs_prev","h_prev","w","wci","wcf","wco","b"), - outputNames = listOf("output"), - inputArrays = inputs, - dynamicArrays = inputs - )) - } - - "lstmBlock" -> { - if(inputFrameworkOpName == "BlockLSTM") { - val seqLenMax = NodeDef { - name = "seq_len_max" - op = "Placeholder" - Attribute("dtype",AttrValue { - type = DataType.DT_INT64 - }) - } - - val x = NodeDef { - name = "x" - op = 
"Placeholder"
-                        Attribute("dtype",AttrValue {
-                            type = DataType.DT_FLOAT
-                        })
-                    }
-
-
-                    val csPrev = NodeDef {
-                        name = "cs_prev"
-                        op = "Placeholder"
-                        Attribute("dtype",AttrValue {
-                            type = DataType.DT_FLOAT
-                        })
-                    }
-
-                    val hPrev = NodeDef {
-                        name = "h_prev"
-                        op = "Placeholder"
-                        Attribute("dtype",AttrValue {
-                            type = DataType.DT_FLOAT
-                        })
-                    }
-
-
-                    val w = NodeDef {
-                        name = "w"
-                        op = "Placeholder"
-                        Attribute("dtype",AttrValue {
-                            type = DataType.DT_FLOAT
-                        })
-                    }
-
-                    val wci = NodeDef {
-                        name = "wci"
-                        op = "Placeholder"
-                        Attribute("dtype",AttrValue {
-                            type = DataType.DT_FLOAT
-                        })
-                    }
-
-                    val wcf = NodeDef {
-                        name = "wcf"
-                        op = "Placeholder"
-                        Attribute("dtype",AttrValue {
-                            type = DataType.DT_FLOAT
-                        })
-                    }
-
-
-                    val wco = NodeDef {
-                        name = "wco"
-                        op = "Placeholder"
-                        Attribute("dtype",AttrValue {
-                            type = DataType.DT_FLOAT
-                        })
-                    }
-
-                    val bias = NodeDef {
-                        name = "b"
-                        op = "Placeholder"
-                        Attribute("dtype",AttrValue {
-                            type = DataType.DT_FLOAT
-                        })
-                    }
-                    println("Running test import process for op ${tensorflowOpDef.name}")
-                    val opNode = NodeDef {
-                        Input("seq_len_max")
-                        Input("x")
-                        Input("cs_prev")
-                        Input("h_prev")
-                        Input("w")
-                        Input("wci")
-                        Input("wcf")
-                        Input("wco")
-                        Input("b")
-                        op = tensorflowOpDef.name
-                        name = "output"
-                        Attribute("T",AttrValue {
-                            type = DataType.DT_FLOAT
-                        })
-                        Attribute("forget_bias",AttrValue {
-                            f = 3.0f
-                        })
-                        Attribute("use_peephole",AttrValue {
-                            b = false
-                        })
-                    }
-
-
-                    val i = NodeDef {
-                        name = "i"
-                        Input("output:0")
-                        op = "Identity"
-                        Attribute("T",AttrValue {
-                            type = DataType.DT_FLOAT
-                        })
-                    }
-
-                    val cs = NodeDef {
-                        name = "cs"
-                        Input("output:1")
-                        op = "Identity"
-                        Attribute("T",AttrValue {
-                            type = DataType.DT_FLOAT
-                        })
-                    }
-
-                    val f = NodeDef {
-                        name = "f"
-                        Input("output:2")
-                        op = "Identity"
-                        Attribute("T",AttrValue {
-                            type = DataType.DT_FLOAT
-                        })
-                    }
-
-                    val o = NodeDef {
-                        name = "o"
-                        Input("output:3")
-                        op = "Identity"
-
Attribute("T",AttrValue { - type = DataType.DT_FLOAT - }) - } - - - val ci = NodeDef { - name = "ci" - Input("output:4") - op = "Identity" - Attribute("T",AttrValue { - type = DataType.DT_FLOAT - }) - } - - val h = NodeDef { - name = "h" - Input("output:5") - op = "Identity" - Attribute("T",AttrValue { - type = DataType.DT_FLOAT - }) - } - - val graphDef = GraphDef { - Node(seqLenMax) - Node(x) - Node(csPrev) - Node(hPrev) - Node(w) - Node(wci) - Node(wcf) - Node(wco) - Node(bias) - Node(opNode) - Node(i) - Node(cs) - Node(f) - Node(o) - Node(ci) - Node(h) - } - - - - val seqLenVal = Nd4j.scalar(5.0) - .castTo(org.nd4j.linalg.api.buffer.DataType.INT64) - - val xVal = Nd4j.linspace(1,20,20).reshape(5,1,4) - .castTo(org.nd4j.linalg.api.buffer.DataType.FLOAT) - - val csPrevVal = Nd4j.linspace(1,3,3).reshape(1,3) - .castTo(org.nd4j.linalg.api.buffer.DataType.FLOAT) - - val hPrevVal = Nd4j.linspace(1,3,3).reshape(1,3) - .castTo(org.nd4j.linalg.api.buffer.DataType.FLOAT) - - val wVal = Nd4j.linspace(1,84,84).reshape(7,12) - .castTo(org.nd4j.linalg.api.buffer.DataType.FLOAT) - - val wciVal = Nd4j.linspace(1,3,3).reshape(3) - .castTo(org.nd4j.linalg.api.buffer.DataType.FLOAT) - - val wcfVal = Nd4j.linspace(1,3,3).reshape(3) - .castTo(org.nd4j.linalg.api.buffer.DataType.FLOAT) - - - val wcoVal = Nd4j.linspace(1,3,3).reshape(3) - .castTo(org.nd4j.linalg.api.buffer.DataType.FLOAT) - - - val bVal = Nd4j.zeros(12) - .castTo(org.nd4j.linalg.api.buffer.DataType.FLOAT) - - - - - val inputs = mapOf("seq_len_max" to seqLenVal,"x" to xVal,"cs_prev" to csPrevVal,"h_prev" to hPrevVal,"w" to wVal,"wci" to wciVal,"wcf" to wcfVal,"wco" to wcoVal,"b" to bVal) - - - return listOf(GraphInput( - graphDef = graphDef, - inputNames = listOf("seq_len_max","x","cs_prev","h_prev","w","wci","wcf","wco","b"), - outputNames = listOf("output"), - inputArrays = inputs, - dynamicArrays = inputs - )) - } else { //BlockLSTMV2 - val seqLenMax = NodeDef { - name = "seq_len_max" - op = "Placeholder" - 
Attribute("dtype",AttrValue { - type = DataType.DT_INT64 - }) - } - - val x = NodeDef { - name = "x" - op = "Placeholder" - Attribute("dtype",AttrValue { - type = DataType.DT_FLOAT - }) - } - - - val csPrev = NodeDef { - name = "cs_prev" - op = "Placeholder" - Attribute("dtype",AttrValue { - type = DataType.DT_FLOAT - }) - } - - val hPrev = NodeDef { - name = "h_prev" - op = "Placeholder" - Attribute("dtype",AttrValue { - type = DataType.DT_FLOAT - }) - } - - - val w = NodeDef { - name = "w" - op = "Placeholder" - Attribute("dtype",AttrValue { - type = DataType.DT_FLOAT - }) - } - - val wci = NodeDef { - name = "wci" - op = "Placeholder" - Attribute("dtype",AttrValue { - type = DataType.DT_FLOAT - }) - } - - val wcf = NodeDef { - name = "wcf" - op = "Placeholder" - Attribute("dtype",AttrValue { - type = DataType.DT_FLOAT - }) - } - - - val wco = NodeDef { - name = "wco" - op = "Placeholder" - Attribute("dtype",AttrValue { - type = DataType.DT_FLOAT - }) - } - - val bias = NodeDef { - name = "b" - op = "Placeholder" - Attribute("dtype",AttrValue { - type = DataType.DT_FLOAT - }) - } - println("Running test import process for op ${tensorflowOpDef.name}") - val opNode = NodeDef { - Input("seq_len_max") - Input("x") - Input("cs_prev") - Input("h_prev") - Input("w") - Input("wci") - Input("wcf") - Input("wco") - Input("b") - op = tensorflowOpDef.name - name = "output" - Attribute("T",AttrValue { - type = DataType.DT_FLOAT - }) - - Attribute("use_peephole",AttrValue { - b = false - }) - } - - - val i = NodeDef { - name = "i" - Input("output:0") - op = "Identity" - Attribute("T",AttrValue { - type = DataType.DT_FLOAT - }) - } - - val cs = NodeDef { - name = "cs" - Input("output:1") - op = "Identity" - Attribute("T",AttrValue { - type = DataType.DT_FLOAT - }) - } - - val f = NodeDef { - name = "f" - Input("output:2") - op = "Identity" - Attribute("T",AttrValue { - type = DataType.DT_FLOAT - }) - } - - val o = NodeDef { - name = "o" - Input("output:3") - op = "Identity" - 
Attribute("T",AttrValue { - type = DataType.DT_FLOAT - }) - } - - - val ci = NodeDef { - name = "ci" - Input("output:4") - op = "Identity" - Attribute("T",AttrValue { - type = DataType.DT_FLOAT - }) - } - - val h = NodeDef { - name = "h" - Input("output:5") - op = "Identity" - Attribute("T",AttrValue { - type = DataType.DT_FLOAT - }) - } - - val graphDef = GraphDef { - Node(seqLenMax) - Node(x) - Node(csPrev) - Node(hPrev) - Node(w) - Node(wci) - Node(wcf) - Node(wco) - Node(bias) - Node(opNode) - Node(i) - Node(cs) - Node(f) - Node(o) - Node(ci) - Node(h) - } - - - - val seqLenVal = Nd4j.scalar(5.0) - .castTo(org.nd4j.linalg.api.buffer.DataType.INT64) - - val xVal = Nd4j.linspace(1,20,20).reshape(5,1,4) - .castTo(org.nd4j.linalg.api.buffer.DataType.FLOAT) - - val csPrevVal = Nd4j.linspace(1,3,3).reshape(1,3) - .castTo(org.nd4j.linalg.api.buffer.DataType.FLOAT) - - val hPrevVal = Nd4j.linspace(1,3,3).reshape(1,3) - .castTo(org.nd4j.linalg.api.buffer.DataType.FLOAT) - - val wVal = Nd4j.linspace(1,84,84).reshape(7,12) - .castTo(org.nd4j.linalg.api.buffer.DataType.FLOAT) - - val wciVal = Nd4j.linspace(1,3,3).reshape(3) - .castTo(org.nd4j.linalg.api.buffer.DataType.FLOAT) - - val wcfVal = Nd4j.linspace(1,3,3).reshape(3) - .castTo(org.nd4j.linalg.api.buffer.DataType.FLOAT) - - - val wcoVal = Nd4j.linspace(1,3,3).reshape(3) - .castTo(org.nd4j.linalg.api.buffer.DataType.FLOAT) - - - val bVal = Nd4j.zeros(12) - .castTo(org.nd4j.linalg.api.buffer.DataType.FLOAT) - - - - - val inputs = mapOf("seq_len_max" to seqLenVal,"x" to xVal,"cs_prev" to csPrevVal,"h_prev" to hPrevVal,"w" to wVal,"wci" to wciVal,"wcf" to wcfVal,"wco" to wcoVal,"b" to bVal) - - - return listOf(GraphInput( - graphDef = graphDef, - inputNames = listOf("seq_len_max","x","cs_prev","h_prev","w","wci","wcf","wco","b"), - outputNames = listOf("output"), - inputArrays = inputs, - dynamicArrays = inputs - )) - } - } - - - - "adjust_hue","adjust_saturation" -> { - val input = NodeDef { - name = "input" - op = 
"Placeholder" - Attribute("dtype",AttrValue { - type = DataType.DT_FLOAT - }) - } - - val delta = NodeDef { - name = "delta" - op = "Placeholder" - Attribute("dtype",AttrValue { - type = DataType.DT_FLOAT - }) - } - - println("Running test import process for op ${tensorflowOpDef.name}") - val opNode = NodeDef { - Input("input") - Input("delta") - op = tensorflowOpDef.name - name = "output" - Attribute("T",AttrValue { - type = DataType.DT_FLOAT - }) - } - - val graphDef = GraphDef { - Node(input) - Node(delta) - Node(opNode) - } - - - - val xVal = Nd4j.zeros(2,2,3) - .castTo(org.nd4j.linalg.api.buffer.DataType.FLOAT) - - val deltaVal = Nd4j.scalar(0.5).castTo(org.nd4j.linalg.api.buffer.DataType.FLOAT) - - val inputs = mapOf("input" to xVal,"delta" to deltaVal) - - - return listOf(GraphInput( - graphDef = graphDef, - inputNames = listOf("input","delta"), - outputNames = listOf("output"), - inputArrays = inputs, - dynamicArrays = inputs - )) - } - - - "rgb_to_hsv" -> { - val input = NodeDef { - name = "input" - op = "Placeholder" - Attribute("dtype",AttrValue { - type = DataType.DT_FLOAT - }) - } - - - - println("Running test import process for op ${tensorflowOpDef.name}") - val opNode = NodeDef { - Input("input") - op = tensorflowOpDef.name - name = "output" - Attribute("T",AttrValue { - type = DataType.DT_FLOAT - }) - } - - val graphDef = GraphDef { - Node(input) - Node(opNode) - } - - - - val xVal = Nd4j.zeros(3,3,3) - .castTo(org.nd4j.linalg.api.buffer.DataType.FLOAT) - - - val inputs = mapOf("input" to xVal) - - - return listOf(GraphInput( - graphDef = graphDef, - inputNames = listOf("input"), - outputNames = listOf("output"), - inputArrays = inputs, - dynamicArrays = inputs - )) - } - - "reverse_sequence" -> { - val input = NodeDef { - name = "input" - op = "Placeholder" - Attribute("dtype",AttrValue { - type = DataType.DT_INT32 - }) - } - - val seqLengths = NodeDef { - name = "seq_lengths" - op = "Placeholder" - Attribute("dtype",AttrValue { - type = 
DataType.DT_INT32 - }) - } - - println("Running test import process for op ${tensorflowOpDef.name}") - val opNode = NodeDef { - Input("input") - Input("seq_lengths") - op = tensorflowOpDef.name - name = "output" - Attribute("T",AttrValue { - type = DataType.DT_INT32 - }) - Attribute("Tlen",AttrValue { - type = DataType.DT_INT32 - }) - Attribute("seq_dim",AttrValue { - i = 2 - }) - Attribute("batch_dim",AttrValue { - i = 1 - }) - } - - val graphDef = GraphDef { - Node(input) - Node(seqLengths) - Node(opNode) - } - - - - val xVal = Nd4j.linspace(1,60,60).reshape(3,4,5) - .castTo(org.nd4j.linalg.api.buffer.DataType.INT32) - - val yVal = Nd4j.create(floatArrayOf(4f,4f,4f,4f)) - .reshape(4) - .castTo(org.nd4j.linalg.api.buffer.DataType.INT32) - - - - val inputs = mapOf("input" to xVal,"seq_lengths" to yVal) - - - return listOf(GraphInput( - graphDef = graphDef, - inputNames = listOf("input","seq_lengths"), - outputNames = listOf("output"), - inputArrays = inputs, - dynamicArrays = inputs - )) - } - "resize_nearest_neighbor" -> { - val images = NodeDef { - name = "images" - op = "Placeholder" - Attribute("dtype",AttrValue { - type = DataType.DT_INT32 - }) - } - - val size = NodeDef { - name = "size" - op = "Placeholder" - Attribute("dtype",AttrValue { - type = DataType.DT_INT32 - }) - } - - println("Running test import process for op ${tensorflowOpDef.name}") - val opNode = NodeDef { - Input("images") - Input("size") - op = tensorflowOpDef.name - name = "output" - Attribute("T",AttrValue { - type = DataType.DT_INT32 - }) - } - - val graphDef = GraphDef { - Node(images) - Node(size) - Node(opNode) - } - - - - val xVal = Nd4j.linspace(1,36,36).reshape(1,3,3,4) - .castTo(org.nd4j.linalg.api.buffer.DataType.INT32) - - val yVal = Nd4j.create(floatArrayOf(6f,6f)) - .reshape(2) - .castTo(org.nd4j.linalg.api.buffer.DataType.INT32) - - - - val inputs = mapOf("images" to xVal,"size" to yVal) - - - return listOf(GraphInput( - graphDef = graphDef, - inputNames = 
listOf("images","size"), - outputNames = listOf("output"), - inputArrays = inputs, - dynamicArrays = inputs - )) - } - "resize_bilinear" -> { - val images = NodeDef { - name = "images" - op = "Placeholder" - Attribute("dtype",AttrValue { - type = DataType.DT_INT32 - }) - } - - val size = NodeDef { - name = "size" - op = "Placeholder" - Attribute("dtype",AttrValue { - type = DataType.DT_INT32 - }) - } - - println("Running test import process for op ${tensorflowOpDef.name}") - val opNode = NodeDef { - Input("images") - Input("size") - op = tensorflowOpDef.name - name = "output" - Attribute("T",AttrValue { - type = DataType.DT_INT32 - }) - } - - val graphDef = GraphDef { - Node(images) - Node(size) - Node(opNode) - } - - - - val xVal = Nd4j.linspace(1,36,36).reshape(1,3,3,4) - .castTo(org.nd4j.linalg.api.buffer.DataType.INT32) - - val yVal = Nd4j.create(floatArrayOf(6f,6f)) - .reshape(2) - .castTo(org.nd4j.linalg.api.buffer.DataType.INT32) - - - - val inputs = mapOf("images" to xVal,"size" to yVal) - - - return listOf(GraphInput( - graphDef = graphDef, - inputNames = listOf("images","size"), - outputNames = listOf("output"), - inputArrays = inputs, - dynamicArrays = inputs - )) - } - - "resize_bicubic" -> { - val images = NodeDef { - name = "images" - op = "Placeholder" - Attribute("dtype",AttrValue { - type = DataType.DT_INT32 - }) - } - - val size = NodeDef { - name = "size" - op = "Placeholder" - Attribute("dtype",AttrValue { - type = DataType.DT_INT32 - }) - } - - println("Running test import process for op ${tensorflowOpDef.name}") - val opNode = NodeDef { - Input("images") - Input("size") - op = tensorflowOpDef.name - name = "output" - Attribute("T",AttrValue { - type = DataType.DT_INT32 - }) - } - - val graphDef = GraphDef { - Node(images) - Node(size) - Node(opNode) - } - - - - val xVal = Nd4j.linspace(1,36,36).reshape(1,3,3,4) - .castTo(org.nd4j.linalg.api.buffer.DataType.INT32) - - val yVal = Nd4j.create(floatArrayOf(6f,6f)) - .reshape(2) - 
.castTo(org.nd4j.linalg.api.buffer.DataType.INT32)
-
-
-
-                val inputs = mapOf("images" to xVal,"size" to yVal)
-
-
-                return listOf(GraphInput(
-                    graphDef = graphDef,
-                    inputNames = listOf("images","size"),
-                    outputNames = listOf("output"),
-                    inputArrays = inputs,
-                    dynamicArrays = inputs
-                ))
-            }
-
-            "resize_area" -> {
-                val images = NodeDef {
-                    name = "images"
-                    op = "Placeholder"
-                    Attribute("dtype",AttrValue {
-                        type = DataType.DT_INT32
-                    })
-                }
-
-                val size = NodeDef {
-                    name = "size"
-                    op = "Placeholder"
-                    Attribute("dtype",AttrValue {
-                        type = DataType.DT_INT32
-                    })
-                }
-
-                println("Running test import process for op ${tensorflowOpDef.name}")
-                val opNode = NodeDef {
-                    Input("images")
-                    Input("size")
-                    op = tensorflowOpDef.name
-                    name = "output"
-                    Attribute("T",AttrValue {
-                        type = DataType.DT_INT32
-                    })
-                }
-
-                val graphDef = GraphDef {
-                    Node(images)
-                    Node(size)
-                    Node(opNode)
-                }
-
-
-
-                val xVal = Nd4j.linspace(1,36,36).reshape(1,3,3,4)
-                    .castTo(org.nd4j.linalg.api.buffer.DataType.INT32)
-
-                val yVal = Nd4j.create(floatArrayOf(6f,6f))
-                    .reshape(2)
-                    .castTo(org.nd4j.linalg.api.buffer.DataType.INT32)
-
-
-
-                val inputs = mapOf("images" to xVal,"size" to yVal)
-
-
-                return listOf(GraphInput(
-                    graphDef = graphDef,
-                    inputNames = listOf("images","size"),
-                    outputNames = listOf("output"),
-                    inputArrays = inputs,
-                    dynamicArrays = inputs
-                ))
-            }
-
-
-            "mirror_pad" -> {
-                val mirrorPadRet = ArrayList<GraphInput>()
-                listOf("REFLECT","SYMMETRIC").forEach { mode ->
-                    val input = NodeDef {
-                        name = "input"
-                        op = "Placeholder"
-                        Attribute("dtype",AttrValue {
-                            type = DataType.DT_DOUBLE
-                        })
-                    }
-
-                    val paddings = NodeDef {
-                        name = "paddings"
-                        op = "Placeholder"
-                        Attribute("dtype",AttrValue {
-                            type = DataType.DT_INT32
-                        })
-                    }
-
-                    println("Running test import process for op ${tensorflowOpDef.name}")
-                    val opNode = NodeDef {
-                        Input("input")
-                        Input("paddings")
-                        op = tensorflowOpDef.name
-                        name = "output"
-                        Attribute("mode",AttrValue {
-                            s =
ByteString.copyFrom(mode.toByteArray(Charset.defaultCharset())) - }) - Attribute("Tpaddings",AttrValue { - type = DataType.DT_INT32 - }) - Attribute("T",AttrValue { - type = DataType.DT_DOUBLE - }) - } - - val graphDef = GraphDef { - Node(input) - Node(paddings) - Node(opNode) - } - - - - val xVal = Nd4j.linspace(1,5,5).reshape(5) - .castTo(org.nd4j.linalg.api.buffer.DataType.DOUBLE) - - val yVal = Nd4j.create(floatArrayOf(1f,1f)) - .reshape(1,2) - .castTo(org.nd4j.linalg.api.buffer.DataType.INT32) - - - - val inputs = mapOf("input" to xVal,"paddings" to yVal) - - - mirrorPadRet.add(GraphInput( - graphDef = graphDef, - inputNames = listOf("input","paddings"), - outputNames = listOf("output"), - inputArrays = inputs, - dynamicArrays = inputs - )) - } - - return mirrorPadRet - } - - "listdiff" -> { - val x = NodeDef { - name = "x" - op = "Placeholder" - Attribute("dtype",AttrValue { - type = DataType.DT_INT32 - }) - } - - val y = NodeDef { - name = "y" - op = "Placeholder" - Attribute("dtype",AttrValue { - type = DataType.DT_INT32 - }) - } - - println("Running test import process for op ${tensorflowOpDef.name}") - val opNode = NodeDef { - Input("x") - Input("y") - op = tensorflowOpDef.name - name = "output" - Attribute("T",AttrValue { - type = DataType.DT_INT32 - }) - } - - val graphDef = GraphDef { - Node(x) - Node(y) - Node(opNode) - } - - - - val xVal = Nd4j.linspace(1,4,4).reshape(4) - .castTo(org.nd4j.linalg.api.buffer.DataType.INT32) - - val yVal = Nd4j.create(floatArrayOf(3f,1f)) - .reshape(2) - .castTo(org.nd4j.linalg.api.buffer.DataType.INT32) - - - - val inputs = mapOf("x" to xVal,"y" to yVal) - - - return listOf(GraphInput( - graphDef = graphDef, - inputNames = listOf("x","y"), - outputNames = listOf("output"), - inputArrays = inputs, - dynamicArrays = inputs - )) - } - - - - "histogram_fixed_width" -> { - val values = NodeDef { - name = "values" - op = "Placeholder" - Attribute("dtype",AttrValue { - type = DataType.DT_INT32 - }) - } - - val valueRange = 
NodeDef { - name = "value_range" - op = "Placeholder" - Attribute("dtype",AttrValue { - type = DataType.DT_INT32 - }) - } - - val nBins = NodeDef { - name = "nbins" - op = "Placeholder" - Attribute("dtype",AttrValue { - type = DataType.DT_INT32 - }) - } - - - - println("Running test import process for op ${tensorflowOpDef.name}") - val opNode = NodeDef { - Input("values") - Input("value_range") - Input("nbins") - op = tensorflowOpDef.name - name = "output" - Attribute("T",AttrValue { - type = DataType.DT_INT32 - }) - } - - val graphDef = GraphDef { - Node(values) - Node(valueRange) - Node(nBins) - Node(opNode) - } - - - - val valuesVal = Nd4j.ones(2,3) - .castTo(org.nd4j.linalg.api.buffer.DataType.INT32) - - val valueRangeVal = Nd4j.create(floatArrayOf(0f,5f)) - .reshape(2) - .castTo(org.nd4j.linalg.api.buffer.DataType.INT32) - - val nbinsVal = Nd4j.scalar(5f) - .castTo(org.nd4j.linalg.api.buffer.DataType.INT32) - - - val inputs = mapOf("values" to valuesVal,"value_range" to valueRangeVal,"nbins" to nbinsVal) - - - return listOf(GraphInput( - graphDef = graphDef, - inputNames = listOf("values","value_range","nbins"), - outputNames = listOf("output"), - inputArrays = inputs, - dynamicArrays = inputs - )) - } - - - - "extract_image_patches" -> { - val images = NodeDef { - name = "images" - op = "Placeholder" - Attribute("dtype",AttrValue { - type = DataType.DT_DOUBLE - }) - } - - println("Running test import process for op ${tensorflowOpDef.name}") - // {2, 2, 2, 2, 0, 0, 1, 1, 1, 1, 1} - val opNode = NodeDef { - Input("images") - op = tensorflowOpDef.name - name = "output" - Attribute("T",AttrValue { - type = DataType.DT_DOUBLE - }) - Attribute("ksizes",AttrValue { - ListInts(listOf(1,1,1,1)) - }) - Attribute("strides",AttrValue { - ListInts(listOf(1,1,1,1)) - }) - Attribute("rates",AttrValue { - ListInts(listOf(1,1,1,1)) - }) - Attribute("padding",AttrValue { - s = ByteString.copyFrom("SAME".toByteArray(Charset.defaultCharset())) - }) - } - val graphDef = GraphDef 
{ - Node(images) - Node(opNode) - } - - //1,2,5,4 - - //3,2,2,2 - - - val imagesVal = Nd4j.ones(2,4,3,2) - .castTo(org.nd4j.linalg.api.buffer.DataType.DOUBLE) - - - - val inputs = mapOf("images" to imagesVal) - - - return listOf(GraphInput( - graphDef = graphDef, - inputNames = listOf("images"), - outputNames = listOf("output"), - inputArrays = inputs, - dynamicArrays = inputs - )) - } - - "crop_and_resize" -> { - val images = NodeDef { - name = "images" - op = "Placeholder" - Attribute("dtype",AttrValue { - type = DataType.DT_FLOAT - }) - } - - val boxes = NodeDef { - name = "boxes" - op = "Placeholder" - Attribute("dtype",AttrValue { - type = DataType.DT_FLOAT - }) - } - - val boxesI = NodeDef { - name = "boxesI" - op = "Placeholder" - Attribute("dtype",AttrValue { - type = DataType.DT_INT32 - }) - } - - val cropSize = NodeDef { - name = "cropSize" - op = "Placeholder" - Attribute("dtype",AttrValue { - type = DataType.DT_INT32 - }) - } - - - println("Running test import process for op ${tensorflowOpDef.name}") - val opNode = NodeDef { - Input("images") - Input("boxes") - Input("boxesI") - Input("cropSize") - - op = tensorflowOpDef.name - name = "output" - Attribute("T",AttrValue { - type = DataType.DT_FLOAT - }) - } - - val graphDef = GraphDef { - Node(images) - Node(boxes) - Node(boxesI) - Node(cropSize) - Node(opNode) - } - - - - val imagesVal = Nd4j.create(floatArrayOf(1f,2f,3f,4f)) - .reshape(1,2,2,1) - .castTo(org.nd4j.linalg.api.buffer.DataType.FLOAT) - - val boxesVal = Nd4j.create(floatArrayOf(0f,0f,1f,1f)) - .reshape(1,4) - .castTo(org.nd4j.linalg.api.buffer.DataType.FLOAT) - - val boxesIVal = Nd4j.create(floatArrayOf(0f)) - .reshape(1) - .castTo(org.nd4j.linalg.api.buffer.DataType.INT32) - - val cropSizeVal = Nd4j.create(floatArrayOf(1f,1f)) - .reshape(2) - .castTo(org.nd4j.linalg.api.buffer.DataType.INT32) - - val inputs = mapOf("images" to imagesVal,"boxes" to boxesVal,"boxesI" to boxesIVal,"cropSize" to cropSizeVal) - - - return listOf(GraphInput( - 
graphDef = graphDef, - inputNames = listOf("images","boxes","boxesI","cropSize"), - outputNames = listOf("output"), - inputArrays = inputs, - dynamicArrays = inputs - )) - } - - - "broadcastgradientargs" -> { - val s0 = NodeDef { - name = "s0" - op = "Placeholder" - Attribute("dtype",AttrValue { - type = DataType.DT_INT32 - }) - } - - val s1 = NodeDef { - name = "s1" - op = "Placeholder" - Attribute("dtype",AttrValue { - type = DataType.DT_INT32 - }) - } - - - println("Running test import process for op ${tensorflowOpDef.name}") - val opNode = NodeDef { - Input("s0") - Input("s1") - op = tensorflowOpDef.name - name = "output" - Attribute("T",AttrValue { - type = DataType.DT_INT32 - }) - } - - val graphDef = GraphDef { - Node(s0) - Node(s1) - Node(opNode) - } - - - - val s0Val = Nd4j.create(floatArrayOf(2f,2f,2f)) - .castTo(org.nd4j.linalg.api.buffer.DataType.INT32) - - val s1Val = Nd4j.create(floatArrayOf(2f,1f,2f)) - .castTo(org.nd4j.linalg.api.buffer.DataType.INT32) - - val inputs = mapOf("s0" to s0Val,"s1" to s1Val) - - - return listOf(GraphInput( - graphDef = graphDef, - inputNames = listOf("s0","s1"), - outputNames = listOf("output"), - inputArrays = inputs, - dynamicArrays = inputs - )) - } - - "broadcast_dynamic_shape" -> { - val s0 = NodeDef { - name = "s0" - op = "Placeholder" - Attribute("dtype",AttrValue { - type = DataType.DT_INT32 - }) - } - - val s1 = NodeDef { - name = "s1" - op = "Placeholder" - Attribute("dtype",AttrValue { - type = DataType.DT_INT32 - }) - } - - - println("Running test import process for op ${tensorflowOpDef.name}") - val opNode = NodeDef { - Input("s0") - Input("s1") - op = tensorflowOpDef.name - name = "output" - Attribute("T",AttrValue { - type = DataType.DT_INT32 - }) - } - - val graphDef = GraphDef { - Node(s0) - Node(s1) - Node(opNode) - } - - - - val s0Val = Nd4j.create(floatArrayOf(2f,2f,2f)) - .castTo(org.nd4j.linalg.api.buffer.DataType.INT32) - - val s1Val = Nd4j.create(floatArrayOf(2f,1f,2f)) - 
-        .castTo(org.nd4j.linalg.api.buffer.DataType.INT32)
-
-    val inputs = mapOf("s0" to s0Val,"s1" to s1Val)
-
-
-    return listOf(GraphInput(
-      graphDef = graphDef,
-      inputNames = listOf("s0","s1"),
-      outputNames = listOf("output"),
-      inputArrays = inputs,
-      dynamicArrays = inputs
-    ))
-  }
-
-
-  "lrn" -> {
-    val input = NodeDef {
-      name = "input"
-      op = "Placeholder"
-      Attribute("dtype",AttrValue {
-        type = DataType.DT_FLOAT
-      })
-    }
-
-
-    println("Running test import process for op ${tensorflowOpDef.name}")
-    // {2, 2, 2, 2, 0, 0, 1, 1, 1, 1, 1}
-    val opNode = NodeDef {
-      Input("input")
-      op = tensorflowOpDef.name
-      name = "output"
-      Attribute("T",AttrValue {
-        type = DataType.DT_FLOAT
-      })
-      Attribute("depth_radius",AttrValue {
-        i = 5
-      })
-      Attribute("bias",AttrValue {
-        f = 1f
-      })
-      Attribute("alpha",AttrValue {
-        f = 0.5f
-      })
-      Attribute("beta",AttrValue {
-        f = 0.5f
-      })
-    }
-    val graphDef = GraphDef {
-      Node(input)
-      Node(opNode)
-    }
-
-    //1,2,5,4
-
-    //3,2,2,2
-
-    //1, 1,2,2,1, 1,2,2,1
-
-    val inputVal = Nd4j.linspace(1,16,16).reshape(2,2,2,2)
-        .castTo(org.nd4j.linalg.api.buffer.DataType.FLOAT)
-
-
-    val inputs = mapOf("input" to inputVal)
-
-
-    return listOf(GraphInput(
-      graphDef = graphDef,
-      inputNames = listOf("input"),
-      outputNames = listOf("output"),
-      inputArrays = inputs,
-      dynamicArrays = inputs
-    ))
-  }
-
-
-  "fused_batch_norm" -> {
-    val x = NodeDef {
-      name = "x"
-      op = "Placeholder"
-      Attribute("dtype",AttrValue {
-        type = DataType.DT_FLOAT
-      })
-    }
-
-    val scale = NodeDef {
-      name = "scale"
-      op = "Placeholder"
-      Attribute("dtype",AttrValue {
-        type = DataType.DT_FLOAT
-      })
-    }
-
-    val offset = NodeDef {
-      name = "offset"
-      op = "Placeholder"
-      Attribute("dtype",AttrValue {
-        type = DataType.DT_FLOAT
-      })
-    }
-
-    val mean = NodeDef {
-      name = "mean"
-      op = "Placeholder"
-      Attribute("dtype",AttrValue {
-        type = DataType.DT_FLOAT
-      })
-    }
-
-    val variance = NodeDef {
-      name = "variance"
-      op = "Placeholder"
-      Attribute("dtype",AttrValue {
-        type = DataType.DT_FLOAT
-      })
-    }
-
-
-
-    val epsilon = 0.0001f
-    println("Running test import process for op ${tensorflowOpDef.name}")
-    val opNode = NodeDef {
-      Input("x")
-      Input("scale")
-      Input("offset")
-      Input("mean")
-      Input("variance")
-      op = tensorflowOpDef.name
-      name = "output"
-      Attribute("T",AttrValue {
-        type = DataType.DT_FLOAT
-      })
-      Attribute("U",AttrValue {
-        type = DataType.DT_FLOAT
-      })
-      Attribute("is_training",AttrValue {
-        b = false
-      })
-      Attribute("data_format",AttrValue {
-        s = ByteString.copyFrom("NHWC".toByteArray(Charset.defaultCharset()))
-      })
-      Attribute("epsilon",AttrValue {
-        f = epsilon
-      })
-    }
-
-
-    val y = NodeDef {
-      name = "y"
-      Input("output:0")
-      op = "Identity"
-      Attribute("T",AttrValue {
-        type = DataType.DT_FLOAT
-      })
-    }
-
-    val batchMean = NodeDef {
-      name = "batch_mean"
-      Input("output:1")
-      op = "Identity"
-      Attribute("T",AttrValue {
-        type = DataType.DT_FLOAT
-      })
-    }
-
-    val batchVariance = NodeDef {
-      name = "batch_variance"
-      Input("output:2")
-      op = "Identity"
-      Attribute("T",AttrValue {
-        type = DataType.DT_FLOAT
-      })
-    }
-
-    val graphDef = GraphDef {
-      Node(x)
-      Node(scale)
-      Node(mean)
-      Node(offset)
-      Node(variance)
-      Node(opNode)
-      Node(y)
-      Node(batchMean)
-      Node(batchVariance)
-    }
-
-
-
-    val xVal = Nd4j.ones(2,2,2,2)
-        .castTo(org.nd4j.linalg.api.buffer.DataType.FLOAT)
-
-    val scaleVal = Nd4j.zeros(2).addi(0.5)
-        .castTo(org.nd4j.linalg.api.buffer.DataType.FLOAT)
-    val offsetVal = Nd4j.zeros(2).addi(2).castTo(org.nd4j.linalg.api.buffer.DataType.FLOAT)
-
-    // xAffected *= (*variance + epsilon).transform(transform::RSqrt) * (*scale) + (*offset);
-    val testResult = Nd4j.ones(8,2).muli(Nd4j.exec(RSqrt(Nd4j.scalar(epsilon)))).muli(scaleVal).addi(offsetVal)
-    val meanVal = Nd4j.zeros(2)
-    val varianceVal = Nd4j.zeros(2)
-    val otherResult = xVal.sub(meanVal).div(varianceVal.add(epsilon)).mul(scaleVal).add(offsetVal)
-    // (batch - self.moving_mean) / (self.moving_var + epsilon) * gamma + beta.
-
-    val inputs = mapOf("x" to xVal,"scale" to scaleVal,"mean" to meanVal,"offset" to offsetVal,"variance" to varianceVal)
-
-    return listOf(GraphInput(
-      graphDef = graphDef,
-      inputNames = listOf("x","scale","offset","mean","variance"),
-      outputNames = listOf("y","batch_mean","batch_variance"),
-      inputArrays = inputs,
-      dynamicArrays = inputs
-    ))
-  }
-
-
-
-  "conv3dnew" -> {
-    // int bS=2, iD=3,iH=4,iW=3, iC=4,oC=3, kD=2,kH=3,kW=2, sD=1,sH=1,sW=1, pD=0,pH=0,pW=0, dD=1,dH=1,dW=1;
-    // int paddingMode = 1; // 1-SAME, 0-VALID;
-    //int dataFormat = 1; // 1-NDHWC, 0-NCDHW
-    //2,3,4,3,4
-    //2,3,2,4,3
-    //auto input = NDArrayFactory::create('c', {bS, iD, iH, iW, iC});
-    //auto weights = NDArrayFactory::create('c', {kD, kH, kW, iC, oC});
-    //, {kD,kH,kW, sD,sH,sW, pD,pH,pW, dD,dH,dW, paddingMode, 1, dataFormat}
-    val input = NodeDef {
-      name = "input"
-      op = "Placeholder"
-      Attribute("dtype",AttrValue {
-        type = DataType.DT_DOUBLE
-      })
-    }
-
-    val filter = NodeDef {
-      name = "filter"
-      op = "Placeholder"
-      Attribute("dtype",AttrValue {
-        type = DataType.DT_DOUBLE
-      })
-    }
-
-    println("Running test import process for op ${tensorflowOpDef.name}")
-    // {2, 2, 2, 2, 0, 0, 1, 1, 1, 1, 1}
-    //, {kD,kH,kW, sD,sH,sW, pD,pH,pW, dD,dH,dW, paddingMode, 1, dataFormat}
-
-    val opNode = NodeDef {
-      Input("input")
-      Input("filter")
-      op = tensorflowOpDef.name
-      name = "output"
-      Attribute("T",AttrValue {
-        type = DataType.DT_DOUBLE
-      })
-      Attribute("strides",AttrValue {
-        ListInts(listOf(1,1,1,1,1))
-      })
-      Attribute("padding",AttrValue {
-        s = ByteString.copyFrom("SAME".toByteArray(Charset.defaultCharset()))
-      })
-      Attribute("data_format",AttrValue {
-        s = ByteString.copyFrom("NDHWC".toByteArray(Charset.defaultCharset()))
-      })
-    }
-    val graphDef = GraphDef {
-      Node(input)
-      Node(filter)
-      Node(opNode)
-    }
-
-    //1,2,5,4
-
-    //3,2,2,2
-
-
-    val inputVal = Nd4j.ones(2,3,4,3,4)
-        .castTo(org.nd4j.linalg.api.buffer.DataType.DOUBLE)
-
-    val filterVal = Nd4j.ones(2,3,2,4,3)
-        .castTo(org.nd4j.linalg.api.buffer.DataType.DOUBLE)
-
-    val inputs = mapOf("input" to inputVal,"filter" to filterVal)
-
-
-    return listOf(GraphInput(
-      graphDef = graphDef,
-      inputNames = listOf("input","filter"),
-      outputNames = listOf("output"),
-      inputArrays = inputs,
-      dynamicArrays = inputs
-    ))
-  }
-
-  "avgpool3dnew","maxpool3dnew" -> {
-    val input = NodeDef {
-      name = "input"
-      op = "Placeholder"
-      Attribute("dtype",AttrValue {
-        type = DataType.DT_FLOAT
-      })
-    }
-
-    println("Running test import process for op ${tensorflowOpDef.name}")
-    // {2, 2, 2, 2, 0, 0, 1, 1, 1, 1, 1}
-    val opNode = NodeDef {
-      Input("input")
-      op = tensorflowOpDef.name
-      name = "output"
-      Attribute("T",AttrValue {
-        type = DataType.DT_FLOAT
-      })
-      Attribute("ksize",AttrValue {
-        ListInts(listOf(1,1,1,1,1))
-      })
-      Attribute("strides",AttrValue {
-        ListInts(listOf(1,1,1,1,1))
-      })
-      Attribute("padding",AttrValue {
-        s = ByteString.copyFrom("SAME".toByteArray(Charset.defaultCharset()))
-      })
-      Attribute("data_format",AttrValue {
-        s = ByteString.copyFrom("NDHWC".toByteArray(Charset.defaultCharset()))
-      })
-    }
-
-
-    val graphDef = GraphDef {
-      Node(input)
-      Node(opNode)
-    }
-
-    //2,3,3,4,3
-    val inputVal = Nd4j.ones(2,3,3,4,3)
-        .castTo(org.nd4j.linalg.api.buffer.DataType.FLOAT)
-
-
-    val inputs = mapOf("input" to inputVal)
-
-
-    return listOf(GraphInput(
-      graphDef = graphDef,
-      inputNames = listOf("input"),
-      outputNames = listOf("output"),
-      inputArrays = inputs,
-      dynamicArrays = inputs
-    ))
-  }
-
-  "draw_bounding_boxes" -> {
-    val images = NodeDef {
-      name = "images"
-      op = "Placeholder"
-      Attribute("dtype",AttrValue {
-        type = DataType.DT_FLOAT
-      })
-    }
-
-    val boxes = NodeDef {
-      name = "boxes"
-      op = "Placeholder"
-      Attribute("dtype",AttrValue {
-        type = DataType.DT_FLOAT
-      })
-    }
-
-    val colors = NodeDef {
-      name = "colors"
-      op = "Placeholder"
-      Attribute("dtype",AttrValue {
-        type = DataType.DT_FLOAT
-      })
-    }
-
-
-
-    println("Running test import process for op ${tensorflowOpDef.name}")
-    val opNode = NodeDef {
-      Input("images")
-      Input("boxes")
-      Input("colors")
-      op = tensorflowOpDef.name
-      name = "output"
-      Attribute("T",AttrValue {
-        type = DataType.DT_FLOAT
-      })
-    }
-
-    val graphDef = GraphDef {
-      Node(images)
-      Node(boxes)
-      Node(colors)
-      Node(opNode)
-    }
-
-
-
-    val imagesVal = Nd4j.linspace(1,120,120).reshape(2,4,5,3)
-        .castTo(org.nd4j.linalg.api.buffer.DataType.FLOAT)
-
-    val boxesVal = Nd4j.linspace(1,16,16).reshape(2,2,4)
-        .castTo(org.nd4j.linalg.api.buffer.DataType.FLOAT)
-    val colorVal = Nd4j.create(floatArrayOf(201f, 202f, 203f, 127f, 128f, 129f)).reshape(2,3)
-
-    val inputs = mapOf("images" to imagesVal,"boxes" to boxesVal,"colors" to colorVal)
-
-    return listOf(GraphInput(
-      graphDef = graphDef,
-      inputNames = listOf("images","boxes","colors"),
-      outputNames = listOf("output"),
-      inputArrays = inputs,
-      dynamicArrays = inputs
-    ))
-  }
-
-
-
-  "create" -> {
-    val shape = NodeDef {
-      name = "shape"
-      op = "Placeholder"
-      Attribute("dtype",AttrValue {
-        type = DataType.DT_INT32
-      })
-    }
-
-    println("Running test import process for op ${tensorflowOpDef.name}")
-    val opNode = NodeDef {
-      Input("shape")
-      op = tensorflowOpDef.name
-      name = "output"
-      Attribute("init",AttrValue {
-        b = true
-      })
-      Attribute("dtype",AttrValue {
-        type = DataType.DT_DOUBLE
-      })
-    }
-
-    val graphDef = GraphDef {
-      Node(shape)
-      Node(opNode)
-    }
-
-
-
-    val shapeVal = Nd4j.create(doubleArrayOf(1.0,2.0))
-        .castTo(org.nd4j.linalg.api.buffer.DataType.INT32)
-    val inputs = mapOf("shape" to shapeVal)
-
-
-    return listOf(GraphInput(
-      graphDef = graphDef,
-      inputNames = listOf("shape"),
-      outputNames = listOf("output"),
-      inputArrays = inputs,
-      dynamicArrays = inputs
-    ))
-  }
-
-
-
-  "select" -> {
-    val condition = NodeDef {
-      name = "condition"
-      op = "Placeholder"
-      Attribute("dtype",AttrValue {
-        type = DataType.DT_BOOL
-      })
-    }
-
-    val t = NodeDef {
-      name = "t"
-      op = "Placeholder"
-      Attribute("dtype",AttrValue {
-        type = DataType.DT_DOUBLE
-      })
-    }
-
-
-    val e = NodeDef {
-      name = "e"
-      op = "Placeholder"
-      Attribute("dtype",AttrValue {
-        type = DataType.DT_DOUBLE
-      })
-    }
-
-    println("Running test import process for op ${tensorflowOpDef.name}")
-    val opNode = NodeDef {
-      Input("condition")
-      Input("t")
-      Input("e")
-      op = tensorflowOpDef.name
-      name = "output"
-      Attribute("T",AttrValue {
-        type = DataType.DT_DOUBLE
-      })
-    }
-    val graphDef = GraphDef {
-      Node(condition)
-      Node(t)
-      Node(e)
-      Node(opNode)
-    }
-
-
-
-    val conditionVal = Nd4j.create(booleanArrayOf(true,false,false))
-        .castTo(org.nd4j.linalg.api.buffer.DataType.BOOL)
-
-    val tVal = Nd4j.linspace(1,9,9).reshape(3,3)
-        .castTo(org.nd4j.linalg.api.buffer.DataType.DOUBLE)
-
-    val eVal = Nd4j.create(doubleArrayOf(9.0, 8.0, 7.0, 6.0, 5.0, 4.0, 3.0, 2.0, 1.0))
-        .reshape(3,3)
-        .castTo(org.nd4j.linalg.api.buffer.DataType.DOUBLE)
-
-    val inputs = mapOf("condition" to conditionVal,"t" to tVal,"e" to eVal)
-
-
-    return listOf(GraphInput(
-      graphDef = graphDef,
-      inputNames = listOf("condition","t","e"),
-      outputNames = listOf("output"),
-      inputArrays = inputs,
-      dynamicArrays = inputs
-    ))
-  }
-
-
-
-  "compare_and_bitpack" -> {
-    val input = NodeDef {
-      name = "input"
-      op = "Placeholder"
-      Attribute("dtype",AttrValue {
-        type = DataType.DT_DOUBLE
-      })
-    }
-
-    val threshold = NodeDef {
-      name = "threshold"
-      op = "Placeholder"
-      Attribute("dtype",AttrValue {
-        type = DataType.DT_DOUBLE
-      })
-    }
-
-
-
-    println("Running test import process for op ${tensorflowOpDef.name}")
-    val opNode = NodeDef {
-      Input("input")
-      Input("threshold")
-      op = tensorflowOpDef.name
-      name = "output"
-      Attribute("T",AttrValue {
-        type = DataType.DT_DOUBLE
-      })
-    }
-    val graphDef = GraphDef {
-      Node(input)
-      Node(threshold)
-      Node(opNode)
-    }
-
-
-
-    val inputVal = Nd4j.create(floatArrayOf(-12f, -11f, -10f, -9f, -8f, -7f, -6f, -5f, -4f, -3f, -2f, -1f, 0f, 1f, 2f, 3f, 4f, 5f, 6f, 7f, 8f, 9f, 10f, 11f)).reshape(2,3,4)
-        .castTo(org.nd4j.linalg.api.buffer.DataType.DOUBLE)
-
-    val thresholdVal = Nd4j.scalar(2.0)
-        .castTo(org.nd4j.linalg.api.buffer.DataType.DOUBLE)
-
-    val inputs = mapOf("input" to inputVal,"threshold" to thresholdVal)
-
-
-    return listOf(GraphInput(
-      graphDef = graphDef,
-      inputNames = listOf("input","threshold"),
-      outputNames = listOf("output"),
-      inputArrays = inputs,
-      dynamicArrays = inputs
-    ))
-  }
-
-  "strided_slice" -> {
-    val input = NodeDef {
-      name = "input"
-      op = "Placeholder"
-      Attribute("dtype",AttrValue {
-        type = DataType.DT_DOUBLE
-      })
-    }
-
-    val begin = NodeDef {
-      name = "begin"
-      op = "Placeholder"
-      Attribute("dtype",AttrValue {
-        type = DataType.DT_INT32
-      })
-    }
-
-    val end = NodeDef {
-      name = "end"
-      op = "Placeholder"
-      Attribute("dtype",AttrValue {
-        type = DataType.DT_INT32
-      })
-    }
-
-    val strides = NodeDef {
-      name = "strides"
-      op = "Placeholder"
-      Attribute("dtype",AttrValue {
-        type = DataType.DT_INT32
-      })
-    }
-
-    println("Running test import process for op ${tensorflowOpDef.name}")
-    val opNode = NodeDef {
-      Input("input")
-      Input("begin")
-      Input("end")
-      Input("strides")
-      op = tensorflowOpDef.name
-      name = "output"
-      Attribute("T",AttrValue {
-        type = DataType.DT_DOUBLE
-      })
-      Attribute("Index",AttrValue {
-        type = DataType.DT_INT32
-      })
-      Attribute("shrink_axis_mask",AttrValue {
-        i = 1
-      })
-    }
-    val graphDef = GraphDef {
-      Node(input)
-      Node(begin)
-      Node(end)
-      Node(strides)
-      Node(opNode)
-    }
-
-
-
-    val inputVal = Nd4j.linspace(1,10,10).reshape(5,2)
-        .castTo(org.nd4j.linalg.api.buffer.DataType.DOUBLE)
-
-    val beginVal = Nd4j.create(doubleArrayOf(0.0))
-        .castTo(org.nd4j.linalg.api.buffer.DataType.INT32)
-
-
-    val endVal = Nd4j.create(doubleArrayOf(1.0))
-        .castTo(org.nd4j.linalg.api.buffer.DataType.INT32)
-
-    val strideVal = Nd4j.create(doubleArrayOf(1.0))
-        .castTo(org.nd4j.linalg.api.buffer.DataType.INT32)
-
-    val inputs = mapOf("input" to inputVal,"begin" to beginVal, "end" to endVal,"strides" to strideVal)
-
-
-    return listOf(GraphInput(
-      graphDef = graphDef,
-      inputNames = listOf("input","begin","end","strides"),
-      outputNames = listOf("output"),
-      inputArrays = inputs,
-      dynamicArrays = inputs
-    ))
-  }
-  "bincount" -> {
-    val input = NodeDef {
-      name = "input"
-      op = "Placeholder"
-      Attribute("dtype",AttrValue {
-        type = DataType.DT_INT32
-      })
-    }
-
-    val size = NodeDef {
-      name = "size"
-      op = "Placeholder"
-      Attribute("dtype",AttrValue {
-        type = DataType.DT_INT32
-      })
-    }
-
-    val weights = NodeDef {
-      name = "weights"
-      op = "Placeholder"
-      Attribute("dtype",AttrValue {
-        type = DataType.DT_DOUBLE
-      })
-    }
-
-    println("Running test import process for op ${tensorflowOpDef.name}")
-    val opNode = NodeDef {
-      Input("input")
-      Input("size")
-      Input("weights")
-      op = tensorflowOpDef.name
-      name = "output"
-      Attribute("T",AttrValue {
-        type = DataType.DT_DOUBLE
-      })
-    }
-    val graphDef = GraphDef {
-      Node(input)
-      Node(size)
-      Node(weights)
-      Node(opNode)
-    }
-
-
-
-    val inputVal = Nd4j.create(doubleArrayOf(1.0, 2.0, 0.0, 1.0, 2.0, 2.0, 1.0, 2.0))
-        .castTo(org.nd4j.linalg.api.buffer.DataType.INT32)
-
-    val sizeVal = Nd4j.create(doubleArrayOf(3.0))
-        .castTo(org.nd4j.linalg.api.buffer.DataType.INT32)
-
-
-    val weightVal = Nd4j.create(doubleArrayOf(1.0, 1.0, 1.0, 1.0, 1.0, 1.0, 1.0, 1.0))
-        .castTo(org.nd4j.linalg.api.buffer.DataType.DOUBLE)
-
-    val inputs = mapOf("input" to inputVal,"size" to sizeVal, "weights" to weightVal)
-
-
-    return listOf(GraphInput(
-      graphDef = graphDef,
-      inputNames = listOf("input","size","weights"),
-      outputNames = listOf("output"),
-      inputArrays = inputs,
-      dynamicArrays = inputs
-    ))
-  }
-
-
-  "broadcast_to" -> {
-    val input = NodeDef {
-      name = "input"
-      op = "Placeholder"
-      Attribute("dtype",AttrValue {
-        type = DataType.DT_DOUBLE
-      })
-    }
-
-    val shape = NodeDef {
-      name = "shape"
-      op = "Placeholder"
-      Attribute("dtype",AttrValue {
-        type = DataType.DT_INT64
-      })
-    }
-
-    println("Running test import process for op ${tensorflowOpDef.name}")
-    val opNode = NodeDef {
-      Input("input")
-      Input("shape")
-      op = tensorflowOpDef.name
-      name = "output"
-      Attribute("T",AttrValue {
-        type = DataType.DT_DOUBLE
-      })
-      Attribute("Tidx",AttrValue {
-        type = DataType.DT_INT64
-      })
-    }
-    val graphDef = GraphDef {
-      Node(input)
-      Node(shape)
-      Node(opNode)
-    }
-
-
-
-    val inputVal = Nd4j.create(doubleArrayOf(2.0))
-        .castTo(org.nd4j.linalg.api.buffer.DataType.DOUBLE)
-
-    val shapeVal = Nd4j.zeros(2).addi(4)
-        .castTo(org.nd4j.linalg.api.buffer.DataType.INT64)
-
-    val inputs = mapOf("input" to inputVal,"shape" to shapeVal)
-
-
-    return listOf(GraphInput(
-      graphDef = graphDef,
-      inputNames = listOf("input","shape"),
-      outputNames = listOf("output"),
-      inputArrays = inputs,
-      dynamicArrays = inputs
-    ))
-  }
-
-
-  "condition" -> {
-    val condition = NodeDef {
-      name = "condition"
-      op = "Placeholder"
-      Attribute("dtype",AttrValue {
-        type = DataType.DT_BOOL
-      })
-    }
-
-    val t = NodeDef {
-      name = "t"
-      op = "Placeholder"
-      Attribute("dtype",AttrValue {
-        type = DataType.DT_DOUBLE
-      })
-    }
-
-    val e = NodeDef {
-      name = "e"
-      op = "Placeholder"
-      Attribute("dtype",AttrValue {
-        type = DataType.DT_DOUBLE
-      })
-    }
-
-    println("Running test import process for op ${tensorflowOpDef.name}")
-    val opNode = NodeDef {
-      Input("condition")
-      Input("t")
-      Input("e")
-      op = tensorflowOpDef.name
-      name = "output"
-      Attribute("T",AttrValue {
-        type = DataType.DT_DOUBLE
-      })
-    }
-    val graphDef = GraphDef {
-      Node(condition)
-      Node(t)
-      Node(e)
-      Node(opNode)
-    }
-
-
-
-    val conditionVal = Nd4j.create(booleanArrayOf(true,true,false,false)).reshape(2,2)
-        .castTo(org.nd4j.linalg.api.buffer.DataType.BOOL)
-
-    val tVal = Nd4j.linspace(1,4,4).reshape(2,2)
-        .castTo(org.nd4j.linalg.api.buffer.DataType.DOUBLE)
-
-    val eVal = Nd4j.linspace(1,4,4).reshape(2,2)
-        .castTo(org.nd4j.linalg.api.buffer.DataType.DOUBLE)
-
-    val inputs = mapOf("condition" to conditionVal,"t" to tVal,"e" to eVal)
-
-
-    return listOf(GraphInput(
-      graphDef = graphDef,
-      inputNames = listOf("condition","t","e"),
-      outputNames = listOf("output"),
-      inputArrays = inputs,
-      dynamicArrays = inputs
-    ))
-  }
-
-  "biasadd" -> {
-    val input = NodeDef {
-      name = "input"
-      op = "Placeholder"
-      Attribute("dtype",AttrValue {
-        type = DataType.DT_DOUBLE
-      })
-    }
-
-    val bias = NodeDef {
-      name = "bias"
-      op = "Placeholder"
-      Attribute("dtype",AttrValue {
-        type = DataType.DT_DOUBLE
-      })
-    }
-
-    println("Running test import process for op ${tensorflowOpDef.name}")
-    val opNode = NodeDef {
-      Input("input")
-      Input("bias")
-      op = tensorflowOpDef.name
-      name = "output"
-      Attribute("T",AttrValue {
-        type = DataType.DT_DOUBLE
-      })
-    }
-    val graphDef = GraphDef {
-      Node(input)
-      Node(bias)
-      Node(opNode)
-    }
-
-
-
-    val inputVal = Nd4j.linspace(1,2 * 3 * 3 * 2,2 * 3 * 3 * 2).reshape(2,3,3,2)
-        .castTo(org.nd4j.linalg.api.buffer.DataType.DOUBLE)
-
-    val biasVal = Nd4j.linspace(1,2,2).reshape(2)
-        .castTo(org.nd4j.linalg.api.buffer.DataType.DOUBLE)
-
-    val inputs = mapOf("input" to inputVal,"bias" to biasVal)
-
-
-    return listOf(GraphInput(
-      graphDef = graphDef,
-      inputNames = listOf("input","bias"),
-      outputNames = listOf("output"),
-      inputArrays = inputs,
-      dynamicArrays = inputs
-    ))
-  }
-  "dilation2d" -> {
-    val input = NodeDef {
-      name = "input"
-      op = "Placeholder"
-      Attribute("dtype",AttrValue {
-        type = DataType.DT_DOUBLE
-      })
-    }
-
-    val filter = NodeDef {
-      name = "filter"
-      op = "Placeholder"
-      Attribute("dtype",AttrValue {
-        type = DataType.DT_DOUBLE
-      })
-    }
-
-    println("Running test import process for op ${tensorflowOpDef.name}")
-    // {2, 2, 2, 2, 0, 0, 1, 1, 1, 1, 1}
-    val opNode = NodeDef {
-      Input("input")
-      Input("filter")
-      op = tensorflowOpDef.name
-      name = "output"
-      Attribute("T",AttrValue {
-        type = DataType.DT_DOUBLE
-      })
-      Attribute("strides",AttrValue {
-        ListInts(listOf(1,1,1,1))
-      })
-      Attribute("rates",AttrValue {
-        ListInts(listOf(1,1,1,1))
-      })
-      Attribute("padding",AttrValue {
-        s = ByteString.copyFrom("SAME".toByteArray(Charset.defaultCharset()))
-      })
-    }
-    val graphDef = GraphDef {
-      Node(input)
-      Node(filter)
-      Node(opNode)
-    }
-
-    //1,2,5,4
-
-    //3,2,2,2
-
-    //1, 1,2,2,1, 1,2,2,1
-
-    val inputVal = Nd4j.linspace(1,2 * 6 * 6 * 3,2 * 6 * 6 * 3).reshape(2,6,6,3)
-        .castTo(org.nd4j.linalg.api.buffer.DataType.DOUBLE)
-
-    val filterVal = Nd4j.linspace(1,3 * 2 * 3,3 * 2 * 3).reshape(3,2,3)
-        .castTo(org.nd4j.linalg.api.buffer.DataType.DOUBLE)
-
-    val inputs = mapOf("input" to inputVal,"filter" to filterVal)
-
-
-    return listOf(GraphInput(
-      graphDef = graphDef,
-      inputNames = listOf("input","filter"),
-      outputNames = listOf("output"),
-      inputArrays = inputs,
-      dynamicArrays = inputs
-    ))
-  }
-
-
-  "depthwise_conv2d" -> {
-    val input = NodeDef {
-      name = "input"
-      op = "Placeholder"
-      Attribute("dtype",AttrValue {
-        type = DataType.DT_DOUBLE
-      })
-    }
-
-    val filter = NodeDef {
-      name = "filter"
-      op = "Placeholder"
-      Attribute("dtype",AttrValue {
-        type = DataType.DT_DOUBLE
-      })
-    }
-
-    println("Running test import process for op ${tensorflowOpDef.name}")
-    // {2, 2, 2, 2, 0, 0, 1, 1, 1, 1, 1}
-    val opNode = NodeDef {
-      Input("input")
-      Input("filter")
-      op = tensorflowOpDef.name
-      name = "output"
-      Attribute("T",AttrValue {
-        type = DataType.DT_DOUBLE
-      })
-      Attribute("strides",AttrValue {
-        ListInts(listOf(1,1,1,1))
-      })
-      Attribute("padding",AttrValue {
-        s = ByteString.copyFrom("SAME".toByteArray(Charset.defaultCharset()))
-      })
-      Attribute("data_format",AttrValue {
-        s = ByteString.copyFrom("NHWC".toByteArray(Charset.defaultCharset()))
-      })
-    }
-    val graphDef = GraphDef {
-      Node(input)
-      Node(filter)
-      Node(opNode)
-    }
-
-    //1,2,5,4
-
-    //3,2,2,2
-
-
-    val inputVal = Nd4j.ones(2,4,3,2)
-        .castTo(org.nd4j.linalg.api.buffer.DataType.DOUBLE)
-
-    val filterVal = Nd4j.ones(3,2,2,2)
-        .castTo(org.nd4j.linalg.api.buffer.DataType.DOUBLE)
-
-    val inputs = mapOf("input" to inputVal,"filter" to filterVal)
-
-
-    return listOf(GraphInput(
-      graphDef = graphDef,
-      inputNames = listOf("input","filter"),
-      outputNames = listOf("output"),
-      inputArrays = inputs,
-      dynamicArrays = inputs
-    ))
-  }
-
-
-  "conv2d" -> {
-    val input = NodeDef {
-      name = "input"
-      op = "Placeholder"
-      Attribute("dtype",AttrValue {
-        type = DataType.DT_DOUBLE
-      })
-    }
-
-    val filter = NodeDef {
-      name = "filter"
-      op = "Placeholder"
-      Attribute("dtype",AttrValue {
-        type = DataType.DT_DOUBLE
-      })
-    }
-
-    println("Running test import process for op ${tensorflowOpDef.name}")
-    // {2, 2, 2, 2, 0, 0, 1, 1, 1, 1, 1}
-    val opNode = NodeDef {
-      Input("input")
-      Input("filter")
-      op = tensorflowOpDef.name
-      name = "output"
-      Attribute("T",AttrValue {
-        type = DataType.DT_DOUBLE
-      })
-      Attribute("strides",AttrValue {
-        ListInts(listOf(1,1,1,1))
-      })
-      Attribute("padding",AttrValue {
-        s = ByteString.copyFrom("SAME".toByteArray(Charset.defaultCharset()))
-      })
-      Attribute("data_format",AttrValue {
-        s = ByteString.copyFrom("NHWC".toByteArray(Charset.defaultCharset()))
-      })
-    }
-    val graphDef = GraphDef {
-      Node(input)
-      Node(filter)
-      Node(opNode)
-    }
-
-    //1,2,5,4
-
-    //3,2,2,2
-
-
-    val inputVal = Nd4j.ones(1,4,1,1)
-        .castTo(org.nd4j.linalg.api.buffer.DataType.DOUBLE)
-
-    val filterVal = Nd4j.ones(1,1,1,4)
-        .castTo(org.nd4j.linalg.api.buffer.DataType.DOUBLE)
-
-    val inputs = mapOf("input" to inputVal,"filter" to filterVal)
-
-
-    return listOf(GraphInput(
-      graphDef = graphDef,
-      inputNames = listOf("input","filter"),
-      outputNames = listOf("output"),
-      inputArrays = inputs,
-      dynamicArrays = inputs
-    ))
-  }
-
-
-  "avgpool2d","maxpool2d" -> {
-    if(tensorflowOpDef.name == "AvgPool" || tensorflowOpDef.name == "MaxPool") {
-      val input = NodeDef {
-        name = "input"
-        op = "Placeholder"
-        Attribute("dtype",AttrValue {
-          type = DataType.DT_DOUBLE
-        })
-      }
-
-      println("Running test import process for op ${tensorflowOpDef.name}")
-      // {2, 2, 2, 2, 0, 0, 1, 1, 1, 1, 1}
-      val opNode = NodeDef {
-        Input("input")
-        op = tensorflowOpDef.name
-        name = "output"
-        Attribute("T",AttrValue {
-          type = DataType.DT_DOUBLE
-        })
-        Attribute("ksize",AttrValue {
-          ListInts(listOf(1,1,1,1))
-        })
-        Attribute("strides",AttrValue {
-          ListInts(listOf(1,1,1,1))
-        })
-        Attribute("padding",AttrValue {
-          s = ByteString.copyFrom("SAME".toByteArray(Charset.defaultCharset()))
-        })
-        Attribute("data_format",AttrValue {
-          s = ByteString.copyFrom("NHWC".toByteArray(Charset.defaultCharset()))
-        })
-      }
-
-
-      val graphDef = GraphDef {
-        Node(input)
-        Node(opNode)
-      }
-
-      val inputVal = Nd4j.ones(2,4,4,2)
-          .castTo(org.nd4j.linalg.api.buffer.DataType.DOUBLE)
-
-
-      val inputs = mapOf("input" to inputVal)
-
-
-      return listOf(GraphInput(
-        graphDef = graphDef,
-        inputNames = listOf("input"),
-        outputNames = listOf("output"),
-        inputArrays = inputs,
-        dynamicArrays = inputs
-      ))
-    } else { //MaxPoolV2
-      val input = NodeDef {
-        name = "input"
-        op = "Placeholder"
-        Attribute("dtype",AttrValue {
-          type = DataType.DT_DOUBLE
-        })
-      }
-
-      val ksize = NodeDef {
-        name = "ksize"
-        op = "Placeholder"
-        Attribute("dtype",AttrValue {
-          type = DataType.DT_INT32
-        })
-      }
-
-      val stride = NodeDef {
-        name = "stride"
-        op = "Placeholder"
-        Attribute("dtype",AttrValue {
-          type = DataType.DT_INT32
-        })
-      }
-
-
-      println("Running test import process for op ${tensorflowOpDef.name}")
-      // {2, 2, 2, 2, 0, 0, 1, 1, 1, 1, 1}
-      val opNode = NodeDef {
-        Input("input")
-        Input("ksize")
-        Input("stride")
-        op = tensorflowOpDef.name
-        name = "output"
-        Attribute("T",AttrValue {
-          type = DataType.DT_DOUBLE
-        })
-
-        Attribute("padding",AttrValue {
-          s = ByteString.copyFrom("SAME".toByteArray(Charset.defaultCharset()))
-        })
-        Attribute("data_format",AttrValue {
-          s = ByteString.copyFrom("NHWC".toByteArray(Charset.defaultCharset()))
-        })
-      }
-
-
-      val graphDef = GraphDef {
-        Node(input)
-        Node(ksize)
-        Node(stride)
-        Node(opNode)
-      }
-
-      val inputVal = Nd4j.ones(2,4,4,2)
-          .castTo(org.nd4j.linalg.api.buffer.DataType.DOUBLE)
-      val ksizeVal = Nd4j.create(floatArrayOf(1.0f,2.0f,2.0f,1.0f)).castTo(org.nd4j.linalg.api.buffer.DataType.INT32)
-      val strideVal = Nd4j.create(floatArrayOf(1.0f,2.0f,2.0f,1.0f)).castTo(org.nd4j.linalg.api.buffer.DataType.INT32)
-
-
-      val inputs = mapOf("input" to inputVal,"ksize" to ksizeVal,"stride" to strideVal)
-
-
-      return listOf(GraphInput(
-        graphDef = graphDef,
-        inputNames = listOf("input","ksize","stride"),
-        outputNames = listOf("output"),
-        inputArrays = inputs,
-        dynamicArrays = inputs
-      ))
-    }
-
-  }
-
-
-  "space_to_batch" -> {
-    val input = NodeDef {
-      name = "input"
-      op = "Placeholder"
-      Attribute("dtype",AttrValue {
-        type = DataType.DT_DOUBLE
-      })
-    }
-
-    val paddings = NodeDef {
-      name = "paddings"
-      op = "Placeholder"
-      Attribute("dtype",AttrValue {
-        type = DataType.DT_INT64
-      })
-    }
-
-
-    println("Running test import process for op ${tensorflowOpDef.name}")
-    val opNode = NodeDef {
-      Input("input")
-      Input("paddings")
-      op = tensorflowOpDef.name
-      name = "output"
-      Attribute("T",AttrValue {
-        type = DataType.DT_DOUBLE
-      })
-      Attribute("Tpaddings",AttrValue {
-        type = DataType.DT_INT64
-      })
-      Attribute("block_size",AttrValue {
-        i = 2
-      })
-    }
-
-
-    val graphDef = GraphDef {
-      Node(input)
-      Node(paddings)
-      Node(opNode)
-    }
-
-    val inputVal = Nd4j.linspace(1,12,12).reshape(1,2,2,3)
-        .castTo(org.nd4j.linalg.api.buffer.DataType.DOUBLE)
-
-    val paddingsVal = Nd4j.zeros(2,2).castTo(org.nd4j.linalg.api.buffer.DataType.INT64)
-
-
-    val inputs = mapOf("input" to inputVal,"paddings" to paddingsVal)
-
-
-    return listOf(GraphInput(
-      graphDef = graphDef,
-      inputNames = listOf("input","paddings"),
-      outputNames = listOf("output"),
-      inputArrays = inputs,
-      dynamicArrays = inputs
-    ))
-
-  }
-
-
-  "batch_to_space_nd" -> {
-    val input = NodeDef {
-      name = "input"
-      op = "Placeholder"
-      Attribute("dtype",AttrValue {
-        type = DataType.DT_DOUBLE
-      })
-    }
-
-    val blockShape = NodeDef {
-      name = "block_shape"
-      op = "Placeholder"
-      Attribute("dtype",AttrValue {
-        type = DataType.DT_INT32
-      })
-      Attribute("shape",AttrValue {
-        shape = TensorShapeProto {
-          Dims(listOf(3))
-        }
-      })
-    }
-
-    val crops = NodeDef {
-      name = "crops"
-      op = "Placeholder"
-      Attribute("dtype",AttrValue {
-        type = DataType.DT_INT32
-      })
-    }
-
-
-    println("Running test import process for op ${tensorflowOpDef.name}")
-    val opNode = NodeDef {
-      Input("input")
-      Input("block_shape")
-      Input("crops")
-      op = tensorflowOpDef.name
-      name = "output"
-      Attribute("T",AttrValue {
-        type = DataType.DT_DOUBLE
-      })
-      Attribute("Tblock_shape",AttrValue {
-        type = DataType.DT_INT32
-      })
-      Attribute("Tcrops",AttrValue {
-        type = DataType.DT_INT32
-      })
-
-    }
-
-
-    val graphDef = GraphDef {
-      Node(input)
-      Node(blockShape)
-      Node(crops)
-      Node(opNode)
-    }
-
-    val tVal = Nd4j.linspace(1,24,24).reshape(8,1,1,1,3)
-        .castTo(org.nd4j.linalg.api.buffer.DataType.DOUBLE)
-
-    val blockShapeVal = Nd4j.zeros(3).addi(2).castTo(org.nd4j.linalg.api.buffer.DataType.INT32)
-
-
-    val cropsVal = Nd4j.zeros(3,2).castTo(org.nd4j.linalg.api.buffer.DataType.INT32)
-
-
-    val inputs = mapOf("input" to tVal,"block_shape" to blockShapeVal,"crops" to cropsVal)
-
-
-    return listOf(GraphInput(
-      graphDef = graphDef,
-      inputNames = listOf("input","block_shape","crops"),
-      outputNames = listOf("output"),
-      inputArrays = inputs,
-      dynamicArrays = inputs
-    ))
-
-  }
-
-  "space_to_batch_nd" -> {
-    val input = NodeDef {
-      name = "input"
-      op = "Placeholder"
-      Attribute("dtype",AttrValue {
-        type = DataType.DT_DOUBLE
-      })
-    }
-
-    val blockShape = NodeDef {
-      name = "block_shape"
-      op = "Placeholder"
-      Attribute("dtype",AttrValue {
-        type = DataType.DT_INT32
-      })
-      Attribute("shape",AttrValue {
-        shape = TensorShapeProto {
-          Dims(listOf(3))
-        }
-      })
-    }
-
-    val paddings = NodeDef {
-      name = "paddings"
-      op = "Placeholder"
-      Attribute("dtype",AttrValue {
-        type = DataType.DT_INT32
-      })
-    }
-
-
-    println("Running test import process for op ${tensorflowOpDef.name}")
-    val opNode = NodeDef {
-      Input("input")
-      Input("block_shape")
-      Input("paddings")
-      op = tensorflowOpDef.name
-      name = "output"
-      Attribute("T",AttrValue {
-        type = DataType.DT_DOUBLE
-      })
-      Attribute("Tblock_shape",AttrValue {
-        type = DataType.DT_INT32
-      })
-      Attribute("Tpaddings",AttrValue {
-        type = DataType.DT_INT32
-      })
-
-    }
-
-
-    val graphDef = GraphDef {
-      Node(input)
-      Node(blockShape)
-      Node(paddings)
-      Node(opNode)
-    }
-
-    val tVal = Nd4j.linspace(1,48,48).reshape(2,2,4,3,1)
-        .castTo(org.nd4j.linalg.api.buffer.DataType.DOUBLE)
-
-    val blockShapeVal = Nd4j.create(floatArrayOf(2.0f,2.0f,3f)).castTo(org.nd4j.linalg.api.buffer.DataType.INT32)
-
-
-    val paddingsVal = Nd4j.create(floatArrayOf(0f,0f,0f,2f,2f,1f)).reshape(3,2).castTo(org.nd4j.linalg.api.buffer.DataType.INT32)
-
-
-    val inputs = mapOf("input" to tVal,"block_shape" to blockShapeVal,"paddings" to paddingsVal)
-
-
-    return listOf(GraphInput(
-      graphDef = graphDef,
-      inputNames = listOf("input","block_shape","paddings"),
-      outputNames = listOf("output"),
-      inputArrays = inputs,
-      dynamicArrays = inputs
-    ))
-
-  }
-
-
-  "batch_to_space" -> {
-    val input = NodeDef {
-      name = "input"
-      op = "Placeholder"
-      Attribute("dtype",AttrValue {
-        type = DataType.DT_DOUBLE
-      })
-    }
-
-    val crops = NodeDef {
-      name = "crops"
-      op = "Placeholder"
-      Attribute("dtype",AttrValue {
-        type = DataType.DT_INT64
-      })
-    }
-
-
-    println("Running test import process for op ${tensorflowOpDef.name}")
-    val opNode = NodeDef {
-      Input("input")
-      Input("crops")
-      op = tensorflowOpDef.name
-      name = "output"
-      Attribute("T",AttrValue {
-        type = DataType.DT_DOUBLE
-      })
-      Attribute("Tidx",AttrValue {
-        type = DataType.DT_INT64
-      })
-      Attribute("block_size",AttrValue {
-        i = 2
-      })
-    }
-
-
-    val graphDef = GraphDef {
-      Node(input)
-      Node(crops)
-      Node(opNode)
-    }
-
-    val tVal = Nd4j.linspace(1,12,12).reshape(4,1,1,3)
-        .castTo(org.nd4j.linalg.api.buffer.DataType.DOUBLE)
-
-    val cropsVal = Nd4j.zeros(2,2).castTo(org.nd4j.linalg.api.buffer.DataType.INT64)
-
-
-    val inputs = mapOf("input" to tVal,"crops" to cropsVal)
-
-
-    return listOf(GraphInput(
-      graphDef = graphDef,
-      inputNames = listOf("input","crops"),
-      outputNames = listOf("output"),
-      inputArrays = inputs,
-      dynamicArrays = inputs
-    ))
-
-  }
-
-
-  "slice" -> {
-    val input = NodeDef {
-      name = "input"
-      op = "Placeholder"
-      Attribute("dtype",AttrValue {
-        type = DataType.DT_DOUBLE
-      })
-    }
-
-    val begin = NodeDef {
-      name = "begin"
-      op = "Placeholder"
-      Attribute("dtype",AttrValue {
-        type = DataType.DT_INT64
-      })
-    }
-
-    val size = NodeDef {
-      name = "size"
-      op = "Placeholder"
-      Attribute("dtype",AttrValue {
-        type = DataType.DT_INT64
-      })
-    }
-
-    println("Running test import process for op ${tensorflowOpDef.name}")
-    val opNode = NodeDef {
-      Input("input")
-      Input("begin")
-      Input("size")
-      op = tensorflowOpDef.name
-      name = "output"
-      Attribute("T",AttrValue {
-        type = DataType.DT_DOUBLE
-      })
-      Attribute("Index",AttrValue {
-        type = DataType.DT_INT64
-      })
-    }
-
-
-    val graphDef = GraphDef {
-      Node(input)
-      Node(begin)
-      Node(size)
-      Node(opNode)
-    }
-
-    val tVal = Nd4j.linspace(1,12,12).reshape(3,4)
-        .castTo(org.nd4j.linalg.api.buffer.DataType.DOUBLE)
-
-    val beginVal = Nd4j.create(doubleArrayOf(0.0,1.0)).reshape(2).castTo(org.nd4j.linalg.api.buffer.DataType.INT64)
-    val sizeVal = Nd4j.create(doubleArrayOf(0.0,1.0)).reshape(2).castTo(org.nd4j.linalg.api.buffer.DataType.INT64)
-
-
-    val inputs = mapOf("input" to tVal,"begin" to beginVal,"size" to sizeVal)
-
-
-    return listOf(GraphInput(
-      graphDef = graphDef,
-      inputNames = listOf("input","begin","size"),
-      outputNames = listOf("output"),
-      inputArrays = inputs,
-      dynamicArrays = inputs
-    ))
-
-  }
-
-
-  "ClipByValue" -> {
-    val t = NodeDef {
-      name = "t"
-      op = "Placeholder"
-      Attribute("dtype",AttrValue {
-        type = DataType.DT_DOUBLE
-      })
-    }
-
-    val clipValueMin = NodeDef {
-      name = "clip_value_min"
-      op = "Placeholder"
-      Attribute("dtype",AttrValue {
-        type = DataType.DT_DOUBLE
-      })
-    }
-
-    val clipValueMax = NodeDef {
-      name = "clip_value_max"
-      op = "Placeholder"
-      Attribute("dtype",AttrValue {
-        type = DataType.DT_DOUBLE
-      })
-    }
-
-    println("Running test import process for op ${tensorflowOpDef.name}")
-    val opNode = NodeDef {
-      Input("t")
-      Input("clip_value_min")
-      Input("clip_value_max")
-      op = tensorflowOpDef.name
-      name = "output"
-      Attribute("T",AttrValue {
-        type = DataType.DT_DOUBLE
-      })
-    }
-
-
-    val graphDef = GraphDef {
-      Node(t)
-      Node(clipValueMin)
-      Node(clipValueMax)
-      Node(opNode)
-    }
-
-    val tVal = Nd4j.linspace(1,12,12).reshape(3,4)
-        .castTo(org.nd4j.linalg.api.buffer.DataType.DOUBLE)
-
-    val clipValueMinVal = Nd4j.scalar(0.0).castTo(org.nd4j.linalg.api.buffer.DataType.DOUBLE)
-    val clipValueMaxVal = Nd4j.scalar(1.0).castTo(org.nd4j.linalg.api.buffer.DataType.DOUBLE)
-
-
-    val inputs = mapOf("t" to tVal,"clip_value_min" to clipValueMinVal,"clip_value_max" to clipValueMaxVal)
-
-
-    return listOf(GraphInput(
-      graphDef = graphDef,
-      inputNames = listOf("t","clip_value_min","clip_value_max"),
-      outputNames = listOf("output"),
-      inputArrays = inputs,
-      dynamicArrays = inputs
-    ))
-
-  }
-
-
-
-
-  "squeeze" -> {
-    val value = NodeDef {
-      name = "value"
-      op = "Placeholder"
-      Attribute("dtype",AttrValue {
-        type = DataType.DT_INT64
-      })
-    }
-
-    println("Running test import process for op ${tensorflowOpDef.name}")
-    val opNode = NodeDef {
-      Input("value")
-
-      op = tensorflowOpDef.name
-      name = "output"
-      Attribute("squeeze_dims",AttrValue {
-        ListInts(listOf(2))
-      })
-      Attribute("T",AttrValue {
-        type = DataType.DT_INT64
-      })
-    }
-
-
-    val graphDef = GraphDef {
-      Node(value)
-      Node(opNode)
-    }
-
-    val valuesVal = Nd4j.linspace(1,12,12).reshape(3,4,1)
-        .castTo(org.nd4j.linalg.api.buffer.DataType.INT64)
-
-
-    val inputs = mapOf("value" to valuesVal)
-
-
-    return listOf(GraphInput(
-      graphDef = graphDef,
-      inputNames = listOf("value"),
-      outputNames = listOf("output"),
-      inputArrays = inputs,
-      dynamicArrays = inputs
-    ))
-
-  }
-
-
-  "identity_n" -> {
-    val input = NodeDef {
-      name = "input"
-      op = "Placeholder"
-      Attribute("dtype",AttrValue {
-        type = DataType.DT_INT64
-      })
-    }
-
-    val input2 = NodeDef {
-      name = "input2"
-      op = "Placeholder"
-      Attribute("dtype",AttrValue {
-        type = DataType.DT_INT64
-      })
-    }
-
-    println("Running test import process for op ${tensorflowOpDef.name}")
-    val opNode = NodeDef {
-      Input("input")
-      Input("input2")
-      op = tensorflowOpDef.name
-      name = "output"
-
-      Attribute("T",AttrValue {
-        ListDataType(listOf(DataType.DT_INT64,DataType.DT_INT64))
-      })
-    }
-
-
-    val out0 = NodeDef {
-      name = "out0"
-      Input("output:0")
-      op = "Identity"
-      Attribute("T",AttrValue {
-        type = DataType.DT_INT64
-      })
-    }
-
-    val out1 = NodeDef {
-      name = "out1"
-      Input("output:1")
-      op = "Identity"
-      Attribute("T",AttrValue {
-        type = DataType.DT_INT64
-      })
-    }
-
-
-    val graphDef = GraphDef {
-      Node(input)
-      Node(input2)
-      Node(opNode)
-      Node(out0)
-      Node(out1)
-    }
-
-
-    val inputVal = Nd4j.linspace(1,4,4)
-        .reshape(2,2).castTo(org.nd4j.linalg.api.buffer.DataType.INT64)
-
-
-    val inputs = mapOf("input" to inputVal,"input2" to inputVal.dup())
-
-
-    return listOf(GraphInput(
-      graphDef = graphDef,
-      inputNames = listOf("input","input2"),
-      outputNames = listOf("out0","out1"),
-      inputArrays = inputs,
-      dynamicArrays = inputs
-    ))
-
-  }
-
-
-  "shapes_of" -> {
-    val input1 = NodeDef {
-      name = "input1"
-      op = "Placeholder"
-      Attribute("dtype",AttrValue {
-        type = DataType.DT_INT64
-      })
-    }
-
-
-    val input2 = NodeDef {
-      name = "input2"
-      op = "Placeholder"
-      Attribute("dtype",AttrValue {
-        type = DataType.DT_INT64
-      })
-    }
-
-
-    val opNode = NodeDef {
-      Input("input1")
-      Input("input2")
-
-      op = tensorflowOpDef.name
-      name = "output"
-      Attribute("N",AttrValue {
-        i = 2
-      })
-
-      Attribute("T",AttrValue {
-        type = DataType.DT_INT64
-      })
-      Attribute("out_type",AttrValue {
-        type = DataType.DT_INT64
-      })
-    }
-
-
-    val out0 = NodeDef {
-      name = "out0"
-      Input("output:0")
-      op = "Identity"
-      Attribute("T",AttrValue {
-        type = DataType.DT_INT64
-      })
-    }
-
-    val out1 = NodeDef {
-      name = "out1"
-      Input("output:1")
-      op = "Identity"
-      Attribute("T",AttrValue {
-        type = DataType.DT_INT64
-      })
-    }
-
-
-
-    val graphDef = GraphDef {
-      Node(input1)
-      Node(input2)
-      Node(opNode)
-      Node(out0)
-      Node(out1)
-    }
-
-    val input1Val = Nd4j.linspace(1,4,4).reshape(2,2).castTo(org.nd4j.linalg.api.buffer.DataType.INT64)
-    val input2Val = Nd4j.linspace(1,6,6).reshape(2,3).castTo(org.nd4j.linalg.api.buffer.DataType.INT64)
-
-
-    val inputs = mapOf("input1" to input1Val,"input2" to input2Val)
-
-
-    return listOf(GraphInput(
-      graphDef = graphDef,
-      inputNames = listOf("input1","input2"),
-      outputNames = listOf("out0","out1"),
-      inputArrays = inputs,
-      dynamicArrays = inputs
-    ))
-
-  }
-
-
-
-  "dynamic_stitch" -> {
-    val indices1 = NodeDef {
-      name = "indices"
-      op = "Placeholder"
-      Attribute("dtype",AttrValue {
-        type = DataType.DT_INT32
-      })
-    }
-
-    val indices2 = NodeDef {
-      name = "indices2"
-      op = "Placeholder"
-      Attribute("dtype",AttrValue {
-        type = DataType.DT_INT32
-      })
-    }
-
-
-    val data0 = NodeDef {
-      name = "data0"
-      op = "Placeholder"
-      Attribute("dtype",AttrValue {
-        type = DataType.DT_DOUBLE
-      })
-    }
-
-    val data1 = NodeDef {
-      name = "data1"
-      op = "Placeholder"
-      Attribute("dtype",AttrValue {
-        type = DataType.DT_DOUBLE
-      })
-    }
-
-    println("Running test import process for op ${tensorflowOpDef.name}")
-    val opNode = NodeDef {
-      Input("indices")
-      Input("indices2")
-      Input("data0")
-      Input("data1")
-
-      op = tensorflowOpDef.name
-      name = "output"
-      Attribute("N",AttrValue {
-        i = 2
-      })
-      Attribute("T",AttrValue {
-        type = DataType.DT_DOUBLE
-      })
-    }
-
-
-
-
-
-    val graphDef = GraphDef {
-      Node(indices1)
-      Node(indices2)
-      Node(data0)
-      Node(data1)
-      Node(opNode)
-    }
-
-    val testGraph = GraphRunner.builder().graphBytes(graphDef.toByteArray()).build()
-
-    val indicesVal = Nd4j.create(floatArrayOf(1.0f,3.0f)).castTo(org.nd4j.linalg.api.buffer.DataType.INT32)
-    val indices2Val = Nd4j.create(floatArrayOf(5.0f,0.0f,2.0f,4.0f)).castTo(org.nd4j.linalg.api.buffer.DataType.INT32)
-
-    val dataVal = Nd4j.create(floatArrayOf(-1f,-1f)).castTo(org.nd4j.linalg.api.buffer.DataType.DOUBLE)
-    val data2Val = Nd4j.create(floatArrayOf(0.1f,5.2f,4.3f,7.4f)).castTo(org.nd4j.linalg.api.buffer.DataType.DOUBLE)
-
-    val inputs = mapOf("indices" to indicesVal,"indices2" to indices2Val,"data0" to dataVal,"data1" to data2Val)
-
-
-    return listOf(GraphInput(
-      graphDef = graphDef,
-      inputNames = listOf("indices","indices2","data0","data1"),
-      outputNames = listOf("output"),
-      inputArrays = inputs,
-      dynamicArrays = inputs
-    ))
-
-  }
-
-  "dynamic_partition" -> {
-    val data = NodeDef {
-      name = "data"
-      op = "Placeholder"
-      Attribute("dtype",AttrValue {
-        type = DataType.DT_INT64
-      })
-    }
-
-    val partitions = NodeDef {
-      name = "partitions"
-      op = "Placeholder"
-      Attribute("dtype",AttrValue {
-        type = DataType.DT_INT32
-      })
-    }
-
-
-    println("Running test import process for op ${tensorflowOpDef.name}")
-    val opNode = NodeDef {
-      Input("data")
-      Input("partitions")
-
-      op = tensorflowOpDef.name
-      name = "output"
-      Attribute("num_partitions",AttrValue {
-        i = 2
-      })
-      Attribute("T",AttrValue {
-        type = DataType.DT_INT64
-      })
-    }
-
-
-    val out0 = NodeDef {
-      name = "out0"
-      Input("output:0")
-      op = "Identity"
-      Attribute("T",AttrValue {
-        type = DataType.DT_INT64
-      })
-    }
-
-
-    val out1 = NodeDef {
-      name = "out1"
-      Input("output:1")
-      op = "Identity"
-      Attribute("T",AttrValue {
-        type = DataType.DT_INT64
-      })
-    }
-
-
-
-    val graphDef = GraphDef {
-      Node(data)
-      Node(partitions)
-      Node(opNode)
-      Node(out0)
-      Node(out1)
-    }
-
-    val testGraph = GraphRunner.builder().graphBytes(graphDef.toByteArray()).build()
-
-    val partitionsVal = Nd4j.create(floatArrayOf(0f,0f,1f,1f,0f)).castTo(org.nd4j.linalg.api.buffer.DataType.INT32)
-    val dataVal = Nd4j.create(floatArrayOf(10f, 20f, 30f, 40f, 50f)).castTo(org.nd4j.linalg.api.buffer.DataType.INT64)
-
-    val inputs = mapOf("data" to
dataVal,"partitions" to partitionsVal) - - - return listOf(GraphInput( - graphDef =graphDef, inputNames = listOf("data","partitions"), - outputNames = listOf("out0","out1"), - inputArrays = inputs, - dynamicArrays = inputs - )) - - } - - - "split_v" -> { - val splitDim = NodeDef { - name = "split_dim" - op = "Placeholder" - Attribute("dtype",AttrValue { - type = DataType.DT_INT32 - }) - } - - - - val sizeSplits = NodeDef { - name = "size_splits" - op = "Placeholder" - Attribute("dtype",AttrValue { - type = DataType.DT_INT64 - }) - } - - val value = NodeDef { - name = "value" - op = "Placeholder" - Attribute("dtype",AttrValue { - type = DataType.DT_INT64 - }) - } - - println("Running test import process for op ${tensorflowOpDef.name}") - val opNode = NodeDef { - Input("value") - Input("size_splits") - Input("split_dim") - - op = tensorflowOpDef.name - name = "output" - Attribute("num_split",AttrValue { - i = 2 - }) - Attribute("Tlen",AttrValue { - type = DataType.DT_INT64 - }) - Attribute("T",AttrValue { - type = DataType.DT_INT64 - }) - } - - - val out0 = NodeDef { - name = "out0" - Input("output:0") - op = "Identity" - Attribute("T",AttrValue { - type = DataType.DT_INT64 - }) - } - - val out1 = NodeDef { - name = "out1" - Input("output:1") - op = "Identity" - Attribute("T",AttrValue { - type = DataType.DT_INT64 - }) - } - - - val graphDef = GraphDef { - Node(value) - Node(sizeSplits) - Node(splitDim) - Node(opNode) - Node(out0) - Node(out1) - } - - val splitDimVal = Nd4j.scalar(-2.0).castTo(org.nd4j.linalg.api.buffer.DataType.INT32) - val sizeSplitsVal = Nd4j.create(floatArrayOf(5f,3f)).castTo(org.nd4j.linalg.api.buffer.DataType.INT64) - - val valuesVal = Nd4j.linspace(1,56,56) - .reshape(8,7).castTo(org.nd4j.linalg.api.buffer.DataType.INT64) - - - val inputs = mapOf("split_dim" to splitDimVal,"value" to valuesVal,"size_splits" to sizeSplitsVal) - - - return listOf(GraphInput( - graphDef =graphDef, inputNames = listOf("value","size_splits","split_dim"), - 
outputNames = listOf("out0","out1"), - inputArrays = inputs, - dynamicArrays = inputs - )) - - } - - "split" -> { - val splitDim = NodeDef { - name = "split_dim" - op = "Placeholder" - Attribute("dtype",AttrValue { - type = DataType.DT_INT32 - }) - } - - val value = NodeDef { - name = "value" - op = "Placeholder" - Attribute("dtype",AttrValue { - type = DataType.DT_INT64 - }) - } - - println("Running test import process for op ${tensorflowOpDef.name}") - val opNode = NodeDef { - Input("split_dim") - Input("value") - - op = tensorflowOpDef.name - name = "output" - Attribute("num_split",AttrValue { - i = 2 - }) - Attribute("T",AttrValue { - type = DataType.DT_INT64 - }) - } - - - val graphDef = GraphDef { - Node(splitDim) - Node(value) - Node(opNode) - } - - val concatDimVal = Nd4j.scalar(0.0).castTo(org.nd4j.linalg.api.buffer.DataType.INT32) - val valuesVal = Nd4j.create(floatArrayOf(0f,1f,0f,1f)) - .reshape(2,2).castTo(org.nd4j.linalg.api.buffer.DataType.INT64) - - - val inputs = mapOf("split_dim" to concatDimVal,"value" to valuesVal) - - - return listOf(GraphInput( - graphDef =graphDef, inputNames = listOf("split_dim","value"), - outputNames = listOf("output"), - inputArrays = inputs, - dynamicArrays = inputs - )) - - } - - - "matmul" -> { - val mmulInput = ArrayList() - listOf(false,true).forEach { transA -> - listOf(false,true).forEach { transB -> - val a = NodeDef { - name = "a" - op = "Placeholder" - Attribute("dtype",AttrValue { - type = DataType.DT_DOUBLE - }) - } - - val bNode = NodeDef { - name = "b" - op = "Placeholder" - Attribute("dtype",AttrValue { - type = DataType.DT_DOUBLE - }) - } - - println("Running test import process for op ${tensorflowOpDef.name}") - val opNode = NodeDef { - Input("a") - Input("b") - - op = tensorflowOpDef.name - name = "output" - Attribute("T",AttrValue { - type = DataType.DT_DOUBLE - }) - Attribute("transpose_a",AttrValue { - b = transA - }) - Attribute("transpose_b",AttrValue { - b = transB - }) - } - - - val graphDef = 
GraphDef { - Node(a) - Node(bNode) - Node(opNode) - } - - val aVal = Nd4j.linspace(1,4,4).reshape(2,2) - .castTo(org.nd4j.linalg.api.buffer.DataType.DOUBLE) - val bVal = Nd4j.create(floatArrayOf(0f,1f,0f,1f)) - .reshape(2,2) - .castTo(org.nd4j.linalg.api.buffer.DataType.DOUBLE) - - - val inputs = mapOf("a" to aVal,"b" to bVal) - mmulInput.add(GraphInput( - graphDef =graphDef, - inputNames = listOf("a","b"), - outputNames = listOf("output"), - inputArrays = inputs, - dynamicArrays = inputs - )) - } - } - - - return mmulInput - - - } - - "range" -> { - val start = NodeDef { - name = "start" - op = "Placeholder" - Attribute("dtype", AttrValue { - type = DataType.DT_INT32 - }) - } - - val limit = NodeDef { - name = "limit" - op = "Placeholder" - Attribute("dtype", AttrValue { - type = DataType.DT_INT32 - }) - } - - - val delta = NodeDef { - name = "delta" - op = "Placeholder" - Attribute("dtype", AttrValue { - type = DataType.DT_INT32 - }) - } - - println("Running test import process for op ${tensorflowOpDef.name}") - val opNode = NodeDef { - Input("start") - Input("limit") - Input("delta") - op = tensorflowOpDef.name - name = "output" - Attribute("Tidx", AttrValue { - type = DataType.DT_INT32 - }) - } - - - val graphDef = GraphDef { - Node(start) - Node(limit) - Node(delta) - Node(opNode) - } - - val startVal = Nd4j.scalar(1) - .castTo(org.nd4j.linalg.api.buffer.DataType.INT32) - val limitVal = Nd4j.scalar(1).castTo(org.nd4j.linalg.api.buffer.DataType.INT32) - val deltaVal = Nd4j.scalar(1).castTo(org.nd4j.linalg.api.buffer.DataType.INT32) - - - val inputs = mapOf("start" to startVal, "limit" to limitVal, "delta" to deltaVal) - - - return listOf( - GraphInput( - graphDef = graphDef, inputNames = listOf("start", "limit", "delta"), - outputNames = listOf("output"), - inputArrays = inputs, - dynamicArrays = inputs - ) - ) - - } - - "lin_space" -> { - val start = NodeDef { - name = "start" - op = "Placeholder" - Attribute("dtype", AttrValue { - type = DataType.DT_DOUBLE - 
}) - } - - val stop = NodeDef { - name = "stop" - op = "Placeholder" - Attribute("dtype", AttrValue { - type = DataType.DT_DOUBLE - }) - } - - - val num = NodeDef { - name = "num" - op = "Placeholder" - Attribute("dtype", AttrValue { - type = DataType.DT_INT64 - }) - } - - println("Running test import process for op ${tensorflowOpDef.name}") - val opNode = NodeDef { - Input("start") - Input("stop") - Input("num") - op = tensorflowOpDef.name - name = "output" - Attribute("T", AttrValue { - type = DataType.DT_DOUBLE - }) - Attribute("Tidx", AttrValue { - type = DataType.DT_INT64 - }) - } - - - val graphDef = GraphDef { - Node(start) - Node(stop) - Node(num) - Node(opNode) - } - - val startVal = Nd4j.scalar(1) - .castTo(org.nd4j.linalg.api.buffer.DataType.DOUBLE) - val limitVal = Nd4j.scalar(1).castTo(org.nd4j.linalg.api.buffer.DataType.DOUBLE) - val deltaVal = Nd4j.scalar(1).castTo(org.nd4j.linalg.api.buffer.DataType.INT64) - - - val inputs = mapOf("start" to startVal,"stop" to - limitVal, "num" to deltaVal) - - - return listOf( - GraphInput( - graphDef = graphDef, - inputNames = listOf("start", "stop","num"), - outputNames = listOf("output"), - inputArrays = inputs, - dynamicArrays = mapOf("limit" to limitVal) - ) - ) - - } - - "gather","gather_nd" -> { - val params = NodeDef { - name = "params" - op = "Placeholder" - Attribute("dtype",AttrValue { - type = DataType.DT_INT64 - }) - } - - val indices = NodeDef { - name = "indices" - op = "Placeholder" - Attribute("dtype",AttrValue { - type = DataType.DT_INT64 - }) - } - - println("Running test import process for op ${tensorflowOpDef.name}") - val opNode = NodeDef { - Input("params") - Input("indices") - - op = tensorflowOpDef.name - name = "output" - Attribute("Tparams",AttrValue { - type = DataType.DT_INT64 - }) - Attribute("Tindices",AttrValue { - type = DataType.DT_INT64 - }) - } - - - val graphDef = GraphDef { - Node(params) - Node(indices) - Node(opNode) - } - - val paramsVal = 
Nd4j.linspace(1,4,4).reshape(2,2).castTo(org.nd4j.linalg.api.buffer.DataType.INT64) - val indicesVal = Nd4j.create(floatArrayOf(0f,1f,0f,1f)) - .reshape(2,2).castTo(org.nd4j.linalg.api.buffer.DataType.INT64) - - - val inputs = mapOf("params" to paramsVal,"indices" to indicesVal.dup()) - - - return listOf(GraphInput( - graphDef =graphDef, inputNames = listOf("params","indices"), - outputNames = listOf("output"), - inputArrays = inputs, - dynamicArrays = inputs - )) - - } - "stack" -> { - val concat1 = NodeDef { - name = "input" - op = "Placeholder" - Attribute("dtype",AttrValue { - type = DataType.DT_INT64 - }) - } - - val concat2 = NodeDef { - name = "input2" - op = "Placeholder" - Attribute("dtype",AttrValue { - type = DataType.DT_INT64 - }) - } - - println("Running test import process for op ${tensorflowOpDef.name}") - val opNode = NodeDef { - Input("input") - Input("input2") - - op = tensorflowOpDef.name - name = "output" - Attribute("T",AttrValue { - type = DataType.DT_INT64 - }) - Attribute("N",AttrValue { - i = 2 - }) - Attribute("axis",AttrValue { - i = 0 - }) - } - - - val graphDef = GraphDef { - Node(concat1) - Node(concat2) - Node(opNode) - } - - val inputVal = Nd4j.linspace(1,4,4).reshape(2,2).castTo(org.nd4j.linalg.api.buffer.DataType.INT64) - - - val inputs = mapOf("input" to inputVal,"input2" to inputVal.dup()) - - - return listOf(GraphInput( - graphDef =graphDef, inputNames = listOf("input","input2"), - outputNames = listOf("output"), - inputArrays = inputs, - dynamicArrays = inputs - )) - } - - - "unstack" -> { - val concat1 = NodeDef { - name = "input" - op = "Placeholder" - Attribute("dtype",AttrValue { - type = DataType.DT_INT64 - }) - } - - println("Running test import process for op ${tensorflowOpDef.name}") - val opNode = NodeDef { - Input("input") - - op = tensorflowOpDef.name - name = "output" - Attribute("T",AttrValue { - type = DataType.DT_INT64 - }) - Attribute("num",AttrValue { - i = 2 - }) - Attribute("axis",AttrValue { - i = 0 - }) - } 
- - - val graphDef = GraphDef { - Node(concat1) - Node(opNode) - } - - val inputVal = Nd4j.linspace(1,4,4).reshape(2,2).castTo(org.nd4j.linalg.api.buffer.DataType.INT64) - - - val inputs = mapOf("input" to inputVal) - - - return listOf(GraphInput( - graphDef =graphDef, inputNames = listOf("input"), - outputNames = listOf("output"), - inputArrays = inputs, - dynamicArrays = inputs - )) - } - - - "mergesum" -> { - val concat1 = NodeDef { - name = "input" - op = "Placeholder" - Attribute("dtype",AttrValue { - type = DataType.DT_INT64 - }) - } - - val concat2 = NodeDef { - name = "input2" - op = "Placeholder" - Attribute("dtype",AttrValue { - type = DataType.DT_INT64 - }) - } - - println("Running test import process for op ${tensorflowOpDef.name}") - val opNode = NodeDef { - Input("input") - Input("input2") - - op = tensorflowOpDef.name - name = "output" - Attribute("T",AttrValue { - type = DataType.DT_INT64 - }) - Attribute("N",AttrValue { - i = 2 - }) - } - - - val graphDef = GraphDef { - Node(concat1) - Node(concat2) - Node(opNode) - } - - val inputVal = Nd4j.linspace(1,4,4).reshape(2,2).castTo(org.nd4j.linalg.api.buffer.DataType.INT64) - - - val inputs = mapOf("input" to inputVal,"input2" to inputVal.dup()) - - - return listOf(GraphInput( - graphDef =graphDef, inputNames = listOf("input","input2"), - outputNames = listOf("output"), - inputArrays = inputs, - dynamicArrays = inputs - )) - } - - - "merge" -> { - val concat1 = NodeDef { - name = "input" - op = "Placeholder" - Attribute("dtype",AttrValue { - type = DataType.DT_INT64 - }) - } - - val concat2 = NodeDef { - name = "input2" - op = "Placeholder" - Attribute("dtype",AttrValue { - type = DataType.DT_INT64 - }) - } - - println("Running test import process for op ${tensorflowOpDef.name}") - val opNode = NodeDef { - Input("input") - Input("input2") - - op = tensorflowOpDef.name - name = "output" - Attribute("T",AttrValue { - type = DataType.DT_INT64 - }) - Attribute("N",AttrValue { - i = 2 - }) - } - - - val 
graphDef = GraphDef { - Node(concat1) - Node(concat2) - Node(opNode) - } - - val inputVal = Nd4j.linspace(1,4,4).reshape(2,2).castTo(org.nd4j.linalg.api.buffer.DataType.INT64) - - - val inputs = mapOf("input" to inputVal,"input2" to inputVal.dup()) - - - return listOf(GraphInput( - graphDef =graphDef, inputNames = listOf("input","input2"), - outputNames = listOf("output"), - inputArrays = inputs, - dynamicArrays = inputs - )) - } - "concat" -> { - if(inputFrameworkOpName == "Concat") { - val concatDim = NodeDef { - name = "concat_dim" - op = "Const" - Attribute("dtype",AttrValue { - type = DataType.DT_INT32 - }) - Attribute("value",AttrValue { - tensor = TensorProto { - dtype = DataType.DT_INT32 - Int32Data(listOf(0)) - Shape(listOf()) - } - }) - } - val concat1 = NodeDef { - name = "input" - op = "Placeholder" - Attribute("dtype",AttrValue { - type = DataType.DT_INT64 - }) - } - - val concat2 = NodeDef { - name = "input2" - op = "Placeholder" - Attribute("dtype",AttrValue { - type = DataType.DT_INT64 - }) - } - - println("Running test import process for op ${tensorflowOpDef.name}") - val opNode = NodeDef { - Input("concat_dim") - Input("input") - Input("input2") - - op = tensorflowOpDef.name - name = "output" - Attribute("T",AttrValue { - type = DataType.DT_INT64 - }) - Attribute("N",AttrValue { - i = 2 - }) - } - - - val graphDef = GraphDef { - Node(concatDim) - Node(concat1) - Node(concat2) - Node(opNode) - } - - val inputVal = Nd4j.linspace(1,4,4).reshape(2,2).castTo(org.nd4j.linalg.api.buffer.DataType.INT64) - - - val inputs = mapOf("input" to inputVal,"input2" to inputVal.dup()) - - - return listOf(GraphInput( - graphDef =graphDef, inputNames = listOf("input","input2"), - outputNames = listOf("output"), - inputArrays = inputs, - dynamicArrays = inputs - )) - } else { //ConcatV2 - val concatDim = NodeDef { - name = "concat_dim" - op = "Const" - Attribute("dtype",AttrValue { - type = DataType.DT_INT32 - }) - Attribute("value",AttrValue { - tensor = TensorProto 
{ - dtype = DataType.DT_INT32 - Int32Data(listOf(0)) - Shape(listOf()) - } - }) - } - val concat1 = NodeDef { - name = "input" - op = "Placeholder" - Attribute("dtype",AttrValue { - type = DataType.DT_INT64 - }) - } - - val concat2 = NodeDef { - name = "input2" - op = "Placeholder" - Attribute("dtype",AttrValue { - type = DataType.DT_INT64 - }) - } - - println("Running test import process for op ${tensorflowOpDef.name}") - val opNode = NodeDef { - Input("input") - Input("input2") - Input("concat_dim") - - op = tensorflowOpDef.name - name = "output" - Attribute("T",AttrValue { - type = DataType.DT_INT64 - }) - Attribute("N",AttrValue { - i = 2 - }) - } - - - val graphDef = GraphDef { - Node(concat1) - Node(concat2) - Node(concatDim) - Node(opNode) - } - - val inputVal = Nd4j.linspace(1,4,4).reshape(2,2).castTo(org.nd4j.linalg.api.buffer.DataType.INT64) - - - val inputs = mapOf("input" to inputVal,"input2" to inputVal.dup()) - - - return listOf(GraphInput( - graphDef =graphDef, inputNames = listOf("input","input2"), - outputNames = listOf("output"), - inputArrays = inputs, - dynamicArrays = inputs - )) - } - - } - - "shape_of" -> { - val tensorNode = NodeDef { - name = "input" - op = "Placeholder" - Attribute("dtype",AttrValue { - type = DataType.DT_INT64 - }) - } - - println("Running test import process for op ${tensorflowOpDef.name}") - val opNode = NodeDef { - Input("input") - op = tensorflowOpDef.name - name = "output" - Attribute("T",AttrValue { - type = DataType.DT_INT64 - }) - Attribute("out_type",AttrValue { - type = DataType.DT_INT64 - }) - } - - - val graphDef = GraphDef { - Node(tensorNode) - Node(opNode) - } - - val inputVal = Nd4j.create(floatArrayOf(1.0f,0.0f)).castTo(org.nd4j.linalg.api.buffer.DataType.INT64) - - - val inputs = mapOf("input" to inputVal) - - - return listOf(GraphInput( - graphDef =graphDef, inputNames = listOf("input"), - outputNames = listOf("output"), - inputArrays = inputs, - dynamicArrays = inputs - )) - } - - 
"toggle_bits","invert_permutation" -> { - val tensorNode = NodeDef { - name = "input" - op = "Placeholder" - Attribute("dtype",AttrValue { - type = DataType.DT_INT32 - }) - } - - println("Running test import process for op ${tensorflowOpDef.name}") - val opNode = NodeDef { - Input("input") - op = tensorflowOpDef.name - name = "output" - Attribute("T",AttrValue { - type = DataType.DT_INT32 - }) - } - - - val graphDef = GraphDef { - Node(tensorNode) - Node(opNode) - } - - val inputVal = Nd4j.create(floatArrayOf(1.0f,0.0f)).castTo(org.nd4j.linalg.api.buffer.DataType.INT32) - - - val inputs = mapOf("input" to inputVal) - - - return listOf(GraphInput( - graphDef =graphDef, inputNames = listOf("input"), - outputNames = listOf("output"), - inputArrays = inputs, - dynamicArrays = inputs - )) - } - - - "reverse" -> { - val input = NodeDef { - name = "input" - op = "Placeholder" - Attribute("dtype",AttrValue { - type = DataType.DT_FLOAT - }) - - } - - val axis = NodeDef { - name = "axis" - op = "Placeholder" - Attribute("dtype",AttrValue { - type = DataType.DT_INT32 - }) - - } - - val opNode = NodeDef { - Input("input") - Input("axis") - op = tensorflowOpDef.name - name = "output" - Attribute("T",AttrValue { - type = DataType.DT_FLOAT - }) - Attribute("Tidx",AttrValue { - type = DataType.DT_INT32 - }) - } - - - - val graphDef = GraphDef { - Node(input) - Node(axis) - Node(opNode) - } - - - val inputVal = Nd4j.zeros(2,2).addi(0.5) - .castTo(org.nd4j.linalg.api.buffer.DataType.FLOAT) - - - val axisVal = Nd4j.zeros(1).castTo(org.nd4j.linalg.api.buffer.DataType.INT32) - - - val inputs = mapOf("input" to inputVal,"axis" to axisVal) - - - return listOf(GraphInput( - graphDef =graphDef, inputNames = listOf("input","axis"), - outputNames = listOf("output"), - inputArrays = inputs, - dynamicArrays = inputs - )) - } - - "roll" -> { - val input = NodeDef { - name = "input" - op = "Placeholder" - Attribute("dtype",AttrValue { - type = DataType.DT_FLOAT - }) - - } - - val shift = NodeDef 
{ - name = "shift" - op = "Placeholder" - Attribute("dtype",AttrValue { - type = DataType.DT_INT32 - }) - - } - - val axis = NodeDef { - name = "axis" - op = "Placeholder" - Attribute("dtype",AttrValue { - type = DataType.DT_INT32 - }) - - } - - val opNode = NodeDef { - Input("input") - Input("shift") - Input("axis") - op = tensorflowOpDef.name - name = "output" - Attribute("T",AttrValue { - type = DataType.DT_FLOAT - }) - Attribute("Tshift",AttrValue { - type = DataType.DT_INT32 - }) - Attribute("Taxis",AttrValue { - type = DataType.DT_INT32 - }) - } - - - - val graphDef = GraphDef { - Node(input) - Node(shift) - Node(axis) - Node(opNode) - } - - - val inputVal = Nd4j.zeros(2,2).addi(0.5) - .castTo(org.nd4j.linalg.api.buffer.DataType.FLOAT) - - - val shiftVal = Nd4j.zeros(2).addi(2) - .castTo(org.nd4j.linalg.api.buffer.DataType.INT32) - - val axisVal = Nd4j.zeros(2).castTo(org.nd4j.linalg.api.buffer.DataType.INT32) - - - val inputs = mapOf("input" to inputVal,"shift" to shiftVal,"axis" to axisVal) - - - return listOf(GraphInput( - graphDef =graphDef, inputNames = listOf("input","shift","axis"), - outputNames = listOf("output"), - inputArrays = inputs, - dynamicArrays = inputs - )) - } - - - "tile" -> { - val input = NodeDef { - name = "input" - op = "Placeholder" - Attribute("dtype",AttrValue { - type = DataType.DT_FLOAT - }) - - } - - val multiples = NodeDef { - name = "multiples" - op = "Placeholder" - Attribute("dtype",AttrValue { - type = DataType.DT_INT32 - }) - - } - - val opNode = NodeDef { - Input("input") - Input("multiples") - op = tensorflowOpDef.name - name = "output" - Attribute("T",AttrValue { - type = DataType.DT_FLOAT - }) - Attribute("Tmultiples",AttrValue { - type = DataType.DT_INT32 - }) - } - - - - val graphDef = GraphDef { - Node(input) - Node(multiples) - Node(opNode) - } - - - val inputVal = Nd4j.zeros(2,2).addi(0.5) - .castTo(org.nd4j.linalg.api.buffer.DataType.FLOAT) - - - val multiplesVal = Nd4j.zeros(2).addi(2) - 
.castTo(org.nd4j.linalg.api.buffer.DataType.INT32) - - - val inputs = mapOf("input" to inputVal,"multiples" to multiplesVal) - - - return listOf(GraphInput( - graphDef =graphDef, inputNames = listOf("input","multiples"), - outputNames = listOf("output"), - inputArrays = inputs, - dynamicArrays = inputs - )) - } - - "leakyrelu" -> { - val a = NodeDef { - name = "a" - op = "Placeholder" - Attribute("dtype",AttrValue { - type = DataType.DT_FLOAT - }) - - } - - val opNode = NodeDef { - Input("a") - - op = tensorflowOpDef.name - name = "output" - Attribute("T",AttrValue { - type = DataType.DT_FLOAT - }) - Attribute("alpha",AttrValue { - f = 0.1f - }) - } - - - - val graphDef = GraphDef { - Node(a) - Node(opNode) - } - - - val aVal = Nd4j.zeros(2,2).addi(0.5) - .castTo(org.nd4j.linalg.api.buffer.DataType.FLOAT) - - - - val inputs = mapOf("a" to aVal) - - - return listOf(GraphInput( - graphDef =graphDef, inputNames = listOf("a"), - outputNames = listOf("output"), - inputArrays = inputs, - dynamicArrays = inputs - )) - } - "betainc" -> { - val a = NodeDef { - name = "a" - op = "Placeholder" - Attribute("dtype",AttrValue { - type = DataType.DT_FLOAT - }) - } - - val b = NodeDef { - name = "b" - op = "Placeholder" - Attribute("dtype",AttrValue { - type = DataType.DT_FLOAT - }) - } - - val x = NodeDef { - name = "x" - op = "Placeholder" - Attribute("dtype",AttrValue { - type = DataType.DT_FLOAT - }) - } - - val opNode = NodeDef { - Input("a") - Input("b") - Input("x") - - op = tensorflowOpDef.name - name = "output" - Attribute("T",AttrValue { - type = DataType.DT_FLOAT - }) - } - - - - val graphDef = GraphDef { - Node(a) - Node(b) - Node(x) - Node(opNode) - } - - - val aVal = Nd4j.zeros(2,2).addi(0.5) - .castTo(org.nd4j.linalg.api.buffer.DataType.FLOAT) - - val bVal = Nd4j.zeros(2,2).addi(0.5) - .castTo(org.nd4j.linalg.api.buffer.DataType.FLOAT) - - - val xVal = Nd4j.zeros(2,2).addi(0.5) - .castTo(org.nd4j.linalg.api.buffer.DataType.FLOAT) - - - val inputs = mapOf("a" to 
aVal,"b" to bVal,"x" to xVal) - - - return listOf(GraphInput( - graphDef =graphDef, inputNames = listOf("a","b","x"), - outputNames = listOf("output"), - inputArrays = inputs, - dynamicArrays = inputs - )) - } - "top_k" -> { - if(tensorflowOpDef.name == "TopK") { - val input = NodeDef { - name = "input" - op = "Placeholder" - Attribute("dtype",AttrValue { - type = DataType.DT_FLOAT - }) - } - - - - val opNode = NodeDef { - Input("input") - op = tensorflowOpDef.name - name = "output" - Attribute("T",AttrValue { - type = DataType.DT_FLOAT - }) - Attribute("k",AttrValue { - i = 2 - }) - } - - - - val graphDef = GraphDef { - Node(input) - Node(opNode) - } - - - val xVal = Nd4j.linspace(1, 4, 4) - .reshape(2, 2) - .castTo(org.nd4j.linalg.api.buffer.DataType.FLOAT) - - - - val inputs = mapOf("input" to xVal) - - - return listOf(GraphInput( - graphDef =graphDef, inputNames = listOf("input"), - outputNames = listOf("output"), - inputArrays = inputs, - dynamicArrays = inputs - )) - } else { //TopKV2 - val input = NodeDef { - name = "input" - op = "Placeholder" - Attribute("dtype",AttrValue { - type = DataType.DT_FLOAT - }) - } - - - val k = NodeDef { - name = "k" - op = "Const" - Attribute("dtype",AttrValue { - type = DataType.DT_INT32 - }) - Attribute("value",AttrValue { - tensor = TensorProto { - Int32Data(listOf(2)) - dtype = DataType.DT_INT32 - - } - }) - } - - val opNode = NodeDef { - Input("input") - Input("k") - op = tensorflowOpDef.name - name = "output" - Attribute("T",AttrValue { - type = DataType.DT_FLOAT - }) - - } - - - - val graphDef = GraphDef { - Node(input) - Node(k) - Node(opNode) - } - - - val xVal = Nd4j.linspace(1, 4, 4) - .reshape(2, 2) - .castTo(org.nd4j.linalg.api.buffer.DataType.FLOAT) - - - - val inputs = mapOf("input" to xVal) - - - return listOf(GraphInput( - graphDef =graphDef, inputNames = listOf("input"), - outputNames = listOf("output"), - inputArrays = inputs, - dynamicArrays = inputs - )) - } - - } - "enter" -> { - val input = NodeDef { - 
name = "input" - op = "Placeholder" - Attribute("dtype",AttrValue { - type = DataType.DT_INT32 - }) - } - - - val opNode = NodeDef { - Input("input") - op = tensorflowOpDef.name - name = "output" - Attribute("T",AttrValue { - type = DataType.DT_INT32 - }) - Attribute("is_constant",AttrValue { - b = false - }) - Attribute("frame_name",AttrValue { - s = ByteString.copyFrom("hello".toByteArray(Charset.defaultCharset())) - }) - - } - - val graphDef = GraphDef { - Node(input) - Node(opNode) - } - - - val xVal = Nd4j.linspace(1, 6, 6) - .reshape(2, 3) - .castTo(org.nd4j.linalg.api.buffer.DataType.INT32) - - val inputs = mapOf("input" to xVal) - - return listOf(GraphInput( - graphDef =graphDef, inputNames = listOf("input"), - outputNames = listOf("output"), - inputArrays = inputs, - dynamicArrays = inputs - )) - } - - "Assert" -> { - val condition = NodeDef { - name = "condition" - op = "Placeholder" - Attribute("dtype",AttrValue { - type = DataType.DT_BOOL - }) - } - - val input = NodeDef { - name = "input" - op = "Const" - Attribute("dtype",AttrValue { - type = DataType.DT_FLOAT - }) - Attribute("value",AttrValue { - tensor = TensorProto { - FloatData(listOf(0.0f)) - dtype = DataType.DT_FLOAT - - } - }) - } - - - val opNode = NodeDef { - Input("condition") - Input("input") - op = tensorflowOpDef.name - name = "output" - Attribute("T",AttrValue { - ListDataType(listOf(DataType.DT_FLOAT)) - }) - - } - - val graphDef = GraphDef { - Node(condition) - Node(input) - Node(opNode) - } - - - val xVal = Nd4j.create(listOf(true,true,true,true).toBooleanArray()) - - val inputs = mapOf("condition" to xVal) - - return listOf(GraphInput( - graphDef = graphDef, - inputNames = listOf("condition"), - outputNames = listOf("output"), - inputArrays = inputs, - dynamicArrays = inputs - )) - } - - - - "bitcast" -> { - val input = NodeDef { - name = "input" - op = "Placeholder" - Attribute("dtype",AttrValue { - type = DataType.DT_INT64 - }) - } - - - val opNode = NodeDef { - Input("input") - 
op = tensorflowOpDef.name - name = "output" - Attribute("T",AttrValue { - type = DataType.DT_INT64 - }) - Attribute("type",AttrValue { - type = DataType.DT_INT32 - }) - } - - val graphDef = GraphDef { - Node(input) - Node(opNode) - } - - - val xVal = Nd4j.zeros(2,3) - .castTo(org.nd4j.linalg.api.buffer.DataType.INT64) - - val inputs = mapOf("input" to xVal) - - return listOf(GraphInput( - graphDef =graphDef, inputNames = listOf("input"), - outputNames = listOf("output"), - inputArrays = inputs, - dynamicArrays = inputs - )) - } - - "exit" -> { - val input = NodeDef { - name = "input" - op = "Placeholder" - Attribute("dtype",AttrValue { - type = DataType.DT_INT32 - }) - } - - - val opNode = NodeDef { - Input("input") - op = tensorflowOpDef.name - name = "output" - Attribute("T",AttrValue { - type = DataType.DT_INT32 - }) - - } - - val graphDef = GraphDef { - Node(input) - Node(opNode) - } - - - val xVal = Nd4j.linspace(1, 6, 6) - .reshape(2, 3) - .castTo(org.nd4j.linalg.api.buffer.DataType.INT32) - - val inputs = mapOf("input" to xVal) - - return listOf(GraphInput( - graphDef =graphDef, inputNames = listOf("input"), - outputNames = listOf("output"), - inputArrays = inputs, - dynamicArrays = inputs - )) - } - - "expand_dims" -> { - val input = NodeDef { - name = "input" - op = "Placeholder" - Attribute("dtype",AttrValue { - type = DataType.DT_INT32 - }) - } - - val n = NodeDef { - name = "dimension" - op = "Const" - Attribute("dtype",AttrValue { - type = DataType.DT_INT32 - }) - Attribute("value",AttrValue { - tensor = TensorProto { - Int32Data(listOf(0)) - dtype = DataType.DT_INT32 - - } - }) - } - - val opNode = NodeDef { - Input("input") - Input("dimension") - op = tensorflowOpDef.name - name = "output" - Attribute("T",AttrValue { - type = DataType.DT_INT32 - }) - - } - - val graphDef = GraphDef { - Node(input) - Node(n) - Node(opNode) - } - - - val xVal = Nd4j.linspace(1, 6, 6) - .reshape(2, 3) - .castTo(org.nd4j.linalg.api.buffer.DataType.INT32) - - val inputs = 
mapOf("input" to xVal) - - return listOf(GraphInput( - graphDef =graphDef, inputNames = listOf("input"), - outputNames = listOf("output"), - inputArrays = inputs, - dynamicArrays = inputs - )) - } - - "non_max_suppression","non_max_suppression_v3" -> { - if(inputFrameworkOpName == "NonMaxSuppression") { - val overlaps = NodeDef { - name = "overlaps" - op = "Placeholder" - Attribute("dtype",AttrValue { - type = DataType.DT_FLOAT - }) - } - - val scores = NodeDef { - name = "scores" - op = "Placeholder" - Attribute("dtype",AttrValue { - type = DataType.DT_FLOAT - }) - } - - val maxOutputSize = NodeDef { - name = "maxOutputSize" - op = "Const" - Attribute("dtype",AttrValue { - type = DataType.DT_INT32 - }) - Attribute("value",AttrValue { - tensor = TensorProto { - Int32Data(listOf(1)) - dtype = DataType.DT_INT32 - - } - }) - } - - - - val opNode = NodeDef { - Input("overlaps") - Input("scores") - Input("maxOutputSize") - op = tensorflowOpDef.name - name = "output" - Attribute("iou_threshold",AttrValue { - f = 0.5f - }) - } - - val graphDef = GraphDef { - Node(overlaps) - Node(scores) - Node(maxOutputSize) - Node(opNode) - } - - - - val overlapsVal = Nd4j.create(arrayOf( - floatArrayOf(0f,0f,1f,1f), - floatArrayOf(0f,0.1f,1f,1.1f), - floatArrayOf(0f,-0.1f,1f,0.9f), - floatArrayOf(0f,10f,1f,11f) - )).castTo(org.nd4j.linalg.api.buffer.DataType.FLOAT) - - val scoresVal = Nd4j.create(listOf(0.9f,0.75f,0.6f,0.95f).toFloatArray()) - .castTo(org.nd4j.linalg.api.buffer.DataType.FLOAT) - - val inputs = mapOf("overlaps" to overlapsVal,"scores" to scoresVal) - - return listOf(GraphInput( - graphDef = graphDef, - inputNames = listOf("overlaps","scores"), - outputNames = listOf("output"), - inputArrays = inputs, - dynamicArrays = inputs - )) - } - else if(inputFrameworkOpName == "NonMaxSuppressionV2") { - val overlaps = NodeDef { - name = "overlaps" - op = "Placeholder" - Attribute("dtype",AttrValue { - type = DataType.DT_FLOAT - }) - } - - val scores = NodeDef { - name = "scores" 
- op = "Placeholder" - Attribute("dtype",AttrValue { - type = DataType.DT_FLOAT - }) - } - - val maxOutputSize = NodeDef { - name = "maxOutputSize" - op = "Const" - Attribute("dtype",AttrValue { - type = DataType.DT_INT32 - }) - Attribute("value",AttrValue { - tensor = TensorProto { - Int32Data(listOf(1)) - dtype = DataType.DT_INT32 - - } - }) - } - - val iouThreshold = NodeDef { - name = "iouThreshold" - op = "Const" - Attribute("dtype",AttrValue { - type = DataType.DT_FLOAT - }) - Attribute("value",AttrValue { - tensor = TensorProto { - FloatData(listOf(0.5f)) - dtype = DataType.DT_FLOAT - - } - }) - } - - - - val opNode = NodeDef { - Input("overlaps") - Input("scores") - Input("maxOutputSize") - Input("iouThreshold") - op = tensorflowOpDef.name - name = "output" - - } - - val graphDef = GraphDef { - Node(overlaps) - Node(scores) - Node(iouThreshold) - Node(maxOutputSize) - Node(opNode) - } - - - - val overlapsVal = Nd4j.create(arrayOf( - floatArrayOf(0f,0f,1f,1f), - floatArrayOf(0f,0.1f,1f,1.1f), - floatArrayOf(0f,-0.1f,1f,0.9f), - floatArrayOf(0f,10f,1f,11f) - )).castTo(org.nd4j.linalg.api.buffer.DataType.FLOAT) - - val scoresVal = Nd4j.create(listOf(0.9f,0.75f,0.6f,0.95f).toFloatArray()) - .castTo(org.nd4j.linalg.api.buffer.DataType.FLOAT) - - val inputs = mapOf("overlaps" to overlapsVal,"scores" to scoresVal) - - return listOf(GraphInput( - graphDef = graphDef, - inputNames = listOf("overlaps","scores"), - outputNames = listOf("output"), - inputArrays = inputs, - dynamicArrays = inputs - )) - } else { - //V3 and later - val overlaps = NodeDef { - name = "overlaps" - op = "Placeholder" - Attribute("dtype",AttrValue { - type = DataType.DT_FLOAT - }) - } - - val scores = NodeDef { - name = "scores" - op = "Placeholder" - Attribute("dtype",AttrValue { - type = DataType.DT_FLOAT - }) - } - - val maxOutputSize = NodeDef { - name = "maxOutputSize" - op = "Const" - Attribute("dtype",AttrValue { - type = DataType.DT_INT32 - }) - Attribute("value",AttrValue { - tensor 
= TensorProto { - Int32Data(listOf(1)) - dtype = DataType.DT_INT32 - - } - }) - } - - val overlapThreshold = NodeDef { - name = "iouThreshold" - op = "Const" - Attribute("dtype",AttrValue { - type = DataType.DT_FLOAT - }) - Attribute("value",AttrValue { - tensor = TensorProto { - FloatData(listOf(0.5f)) - dtype = DataType.DT_FLOAT - - } - }) - } - - val scoreThreshold = NodeDef { - name = "scoreThreshold" - op = "Const" - Attribute("dtype",AttrValue { - type = DataType.DT_FLOAT - }) - Attribute("value",AttrValue { - tensor = TensorProto { - FloatData(listOf(0.5f)) - dtype = DataType.DT_FLOAT - - } - }) - } - - val opNode = NodeDef { - Input("overlaps") - Input("scores") - Input("maxOutputSize") - Input("iouThreshold") - Input("scoreThreshold") - op = tensorflowOpDef.name - name = "output" - - } - - val graphDef = GraphDef { - Node(overlaps) - Node(scores) - Node(scoreThreshold) - Node(overlapThreshold) - Node(maxOutputSize) - Node(opNode) - } - - - - val overlapsVal = Nd4j.create(arrayOf( - floatArrayOf(0f,0f,1f,1f), - floatArrayOf(0f,0.1f,1f,1.1f), - floatArrayOf(0f,-0.1f,1f,0.9f), - floatArrayOf(0f,10f,1f,11f) - )).castTo(org.nd4j.linalg.api.buffer.DataType.FLOAT) - - val scoresVal = Nd4j.create(listOf(0.9f,0.75f,0.6f,0.95f).toFloatArray()) - .castTo(org.nd4j.linalg.api.buffer.DataType.FLOAT) - - val inputs = mapOf("overlaps" to overlapsVal,"scores" to scoresVal) - - return listOf(GraphInput( - graphDef = graphDef, - inputNames = listOf("overlaps","scores"), - outputNames = listOf("output"), - inputArrays = inputs, - dynamicArrays = inputs - )) - } - } - - "non_max_suppression_overlaps" -> { - val overlaps = NodeDef { - name = "overlaps" - op = "Placeholder" - Attribute("dtype",AttrValue { - type = DataType.DT_FLOAT - }) - } - - val scores = NodeDef { - name = "scores" - op = "Placeholder" - Attribute("dtype",AttrValue { - type = DataType.DT_FLOAT - }) - } - - val maxOutputSize = NodeDef { - name = "maxOutputSize" - op = "Const" - Attribute("dtype",AttrValue { - 
type = DataType.DT_INT32 - }) - Attribute("value",AttrValue { - tensor = TensorProto { - Int32Data(listOf(1)) - dtype = DataType.DT_INT32 - - } - }) - } - - val overlapThreshold = NodeDef { - name = "overlapThreshold" - op = "Const" - Attribute("dtype",AttrValue { - type = DataType.DT_FLOAT - }) - Attribute("value",AttrValue { - tensor = TensorProto { - FloatData(listOf(2.0f)) - dtype = DataType.DT_FLOAT - - } - }) - } - - val scoreThreshold = NodeDef { - name = "scoreThreshold" - op = "Const" - Attribute("dtype",AttrValue { - type = DataType.DT_FLOAT - }) - Attribute("value",AttrValue { - tensor = TensorProto { - FloatData(listOf(0.5f)) - dtype = DataType.DT_FLOAT - - } - }) - } - - val opNode = NodeDef { - Input("overlaps") - Input("scores") - Input("maxOutputSize") - Input("overlapThreshold") - Input("scoreThreshold") - op = tensorflowOpDef.name - name = "output" - - } - - val graphDef = GraphDef { - Node(overlaps) - Node(scores) - Node(scoreThreshold) - Node(overlapThreshold) - Node(maxOutputSize) - Node(opNode) - } - - - - val overlapsVal = Nd4j.create(arrayOf( - floatArrayOf(0f,0f,1f,1f), - floatArrayOf(0f,0.1f,1f,1.1f), - floatArrayOf(0f,-0.1f,1f,0.9f), - floatArrayOf(0f,10f,1f,11f) - )).castTo(org.nd4j.linalg.api.buffer.DataType.FLOAT) - - val scoresVal = Nd4j.create(listOf(0.9f,0.75f,0.6f,0.95f).toFloatArray()) - .castTo(org.nd4j.linalg.api.buffer.DataType.FLOAT) - - val inputs = mapOf("overlaps" to overlapsVal,"scores" to scoresVal) - - return listOf(GraphInput( - graphDef = graphDef, - inputNames = listOf("overlaps","scores"), - outputNames = listOf("output"), - inputArrays = inputs, - dynamicArrays = inputs - - )) - } - - "nth_element" -> { - val ret = ArrayList<GraphInput>() - listOf(true,false).forEach { reverse -> - val input = NodeDef { - name = "input" - op = "Placeholder" - Attribute("dtype",AttrValue { - type = DataType.DT_INT32 - }) - } - - val n = NodeDef { - name = "n" - op = "Const" - Attribute("dtype",AttrValue { - type = DataType.DT_INT32 - }) - 
Attribute("value",AttrValue { - tensor = TensorProto { - Int32Data(listOf(2)) - dtype = DataType.DT_INT32 - - } - }) - } - - val opNode = NodeDef { - Input("input") - Input("n") - op = tensorflowOpDef.name - name = "output" - Attribute("T",AttrValue { - type = DataType.DT_INT32 - }) - - Attribute("reverse",AttrValue { - type = DataType.DT_BOOL - b = reverse - }) - - } - - val graphDef = GraphDef { - Node(input) - Node(n) - Node(opNode) - } - - - val xVal = Nd4j.linspace(1, 6, 6) - .reshape(2, 3) - .castTo(org.nd4j.linalg.api.buffer.DataType.INT32) - - val inputs = mapOf("input" to xVal) - - ret.add(GraphInput( - graphDef =graphDef, inputNames = listOf("input"), - outputNames = listOf("output"), - inputArrays = inputs, - dynamicArrays = inputs - )) - } - - return ret - } - - - "cholesky" -> { - val tensorNode = NodeDef { - name = "x" - op = "Placeholder" - Attribute("dtype",AttrValue { - type = DataType.DT_DOUBLE - }) - } - - - val opNode = NodeDef { - Input("x") - op = tensorflowOpDef.name - name = "output" - Attribute("T",AttrValue { - type = DataType.DT_DOUBLE - }) - - } - - val graphDef = GraphDef { - Node(tensorNode) - Node(opNode) - } - - - val xVal = Nd4j.create(floatArrayOf(4f,12f,-16f, 12f ,37f,-43f, -16f, -43f, 98f)) - .reshape(3,3) - .castTo(org.nd4j.linalg.api.buffer.DataType.DOUBLE) - - val inputs = mapOf("x" to xVal) - - - return listOf(GraphInput( - graphDef =graphDef, inputNames = listOf("x"), - outputNames = listOf("output"), - inputArrays = inputs, - dynamicArrays = inputs - )) - } - - - "matrix_diag_part" -> { - val retSolve = ArrayList<GraphInput>() - val input = NodeDef { - name = "input" - op = "Placeholder" - Attribute("dtype",AttrValue { - type = DataType.DT_DOUBLE - }) - } - - - - - val opNode = NodeDef { - Input("input") - op = tensorflowOpDef.name - name = "output" - Attribute("T",AttrValue { - type = DataType.DT_DOUBLE - }) - - - } - - val graphDef = GraphDef { - Node(input) - Node(opNode) - } - - - val inputVal = Nd4j.linspace(1, 4, 4) - .reshape(2, 
2) - .castTo(org.nd4j.linalg.api.buffer.DataType.DOUBLE) - - - val inputs = mapOf("input" to inputVal) - - - retSolve.add(GraphInput( - graphDef = graphDef, inputNames = listOf("input"), - outputNames = listOf("output"), - inputArrays = inputs, - dynamicArrays = inputs - )) - - - return retSolve - - } - - - "matrix_set_diag" -> { - val retSolve = ArrayList<GraphInput>() - val input = NodeDef { - name = "input" - op = "Placeholder" - Attribute("dtype",AttrValue { - type = DataType.DT_DOUBLE - }) - } - - val diagonal = NodeDef { - name = "diagonal" - op = "Placeholder" - Attribute("dtype",AttrValue { - type = DataType.DT_DOUBLE - }) - } - - - - val opNode = NodeDef { - Input("input") - Input("diagonal") - op = tensorflowOpDef.name - name = "output" - Attribute("T",AttrValue { - type = DataType.DT_DOUBLE - }) - - - } - - val graphDef = GraphDef { - Node(input) - Node(diagonal) - Node(opNode) - } - - - val inputVal = Nd4j.linspace(1, 4, 4) - .reshape(2, 2) - .castTo(org.nd4j.linalg.api.buffer.DataType.DOUBLE) - - val diagonalVal = Nd4j.zeros(2).addi(1) - .castTo(org.nd4j.linalg.api.buffer.DataType.DOUBLE) - - val inputs = mapOf("input" to inputVal,"diagonal" to diagonalVal) - - - retSolve.add(GraphInput( - graphDef = graphDef, inputNames = listOf("input","diagonal"), - outputNames = listOf("output"), - inputArrays = inputs, - dynamicArrays = inputs - )) - - - return retSolve - - } - - "solve","triangular_solve" -> { - val retSolve = ArrayList<GraphInput>() - listOf(false,true).forEach { useAdjoint -> - val a = NodeDef { - name = "a" - op = "Placeholder" - Attribute("dtype",AttrValue { - type = DataType.DT_DOUBLE - }) - } - - val bNode = NodeDef { - name = "b" - op = "Placeholder" - Attribute("dtype",AttrValue { - type = DataType.DT_DOUBLE - }) - } - - - - val opNode = NodeDef { - Input("a") - Input("b") - op = tensorflowOpDef.name - name = "output" - Attribute("T",AttrValue { - type = DataType.DT_DOUBLE - }) - Attribute("adjoint",AttrValue { - b = useAdjoint - }) - - } - - 
val graphDef = GraphDef { - Node(a) - Node(bNode) - Node(opNode) - } - - - val aVal = Nd4j.linspace(1, 4, 4) - .reshape(2, 2) - .castTo(org.nd4j.linalg.api.buffer.DataType.DOUBLE) - - val bVal = Nd4j.linspace(1, 4, 4) - .reshape(2, 2) - .castTo(org.nd4j.linalg.api.buffer.DataType.DOUBLE) - - val inputs = mapOf("a" to aVal,"b" to bVal) - - - retSolve.add(GraphInput( - graphDef = graphDef, inputNames = listOf("a","b"), - outputNames = listOf("output"), - inputArrays = inputs, - dynamicArrays = inputs - )) - } - - return retSolve - - } - - "matrix_determinant","log_matrix_determinant" -> { - val tensorNode = NodeDef { - name = "x" - op = "Placeholder" - Attribute("dtype",AttrValue { - type = DataType.DT_DOUBLE - }) - } - - - val opNode = NodeDef { - Input("x") - op = tensorflowOpDef.name - name = "output" - Attribute("T",AttrValue { - type = DataType.DT_DOUBLE - }) - - } - - val finalResult = NodeDef { - Input("output:1") - op = "Identity" - name = "finalResult" - Attribute("T",AttrValue { - type = DataType.DT_DOUBLE - }) - - } - - if(nd4jOpName == "log_matrix_determinant") { - val graphDef = GraphDef { - Node(tensorNode) - Node(opNode) - Node(finalResult) - } - - - val xVal = Nd4j.linspace(1, 4, 4) - .reshape(2, 2) - .castTo(org.nd4j.linalg.api.buffer.DataType.DOUBLE) - - val inputs = mapOf("x" to xVal) - - - return listOf(GraphInput( - graphDef = graphDef, inputNames = listOf("x"), - outputNames = listOf("finalResult"), - inputArrays = inputs, - dynamicArrays = inputs - )) - - } else { - val graphDef = GraphDef { - Node(tensorNode) - Node(opNode) - } - - - val xVal = Nd4j.linspace(1, 4, 4) - .reshape(2, 2) - .castTo(org.nd4j.linalg.api.buffer.DataType.DOUBLE) - - val inputs = mapOf("x" to xVal) - - - return listOf(GraphInput( - graphDef =graphDef, inputNames = listOf("x"), - outputNames = listOf("output"), - inputArrays = inputs, - dynamicArrays = inputs - )) - } - } - - - "lu" -> { - val tensorNode = NodeDef { - name = "x" - op = "Placeholder" - 
Attribute("dtype",AttrValue { - type = DataType.DT_DOUBLE - }) - } - - - val opNode = NodeDef { - Input("x") - op = tensorflowOpDef.name - name = "output" - Attribute("T",AttrValue { - type = DataType.DT_DOUBLE - }) - - } - - val graphDef = GraphDef { - Node(tensorNode) - Node(opNode) - } - - - val xVal = Nd4j.linspace(1, 4, 4) - .reshape(2, 2) - .castTo(org.nd4j.linalg.api.buffer.DataType.DOUBLE) - - val inputs = mapOf("x" to xVal) - - - return listOf(GraphInput( - graphDef =graphDef, inputNames = listOf("x"), - outputNames = listOf("output"), - inputArrays = inputs, - dynamicArrays = inputs - )) - } - - "matrix_inverse" -> { - val tensorNode = NodeDef { - name = "x" - op = "Placeholder" - Attribute("dtype",AttrValue { - type = DataType.DT_DOUBLE - }) - } - - - val opNode = NodeDef { - Input("x") - op = tensorflowOpDef.name - name = "output" - Attribute("T",AttrValue { - type = DataType.DT_DOUBLE - }) - - } - - val graphDef = GraphDef { - Node(tensorNode) - Node(opNode) - } - - - val xVal = Nd4j.linspace(1, 4, 4) - .reshape(2, 2) - .castTo(org.nd4j.linalg.api.buffer.DataType.DOUBLE) - - val inputs = mapOf("x" to xVal) - - - return listOf(GraphInput( - graphDef =graphDef, inputNames = listOf("x"), - outputNames = listOf("output"), - inputArrays = inputs, - dynamicArrays = inputs - )) - } - - "in_top_k" -> { - if(tensorflowOpDef.name == "InTopK") { - val tensorNode = NodeDef { - name = "x" - op = "Placeholder" - Attribute("dtype",AttrValue { - type = DataType.DT_FLOAT - }) - } - - val predictions = NodeDef { - name = "predictions" - op = "Placeholder" - Attribute("dtype",AttrValue { - type = DataType.DT_INT32 - }) - } - - val opNode = NodeDef { - Input("x") - Input("predictions") - op = tensorflowOpDef.name - name = "output" - Attribute("T",AttrValue { - type = DataType.DT_INT32 - }) - Attribute("k",AttrValue { - i = 2 - }) - } - - - - val graphDef = GraphDef { - Node(tensorNode) - Node(predictions) - Node(opNode) - } - - - val xVal = Nd4j.linspace(1, 4, 4) - 
.reshape(2, 2) - .castTo(org.nd4j.linalg.api.buffer.DataType.FLOAT) - - val predictionsArr = Nd4j.linspace(1, 2, 2) - .reshape(2) - .castTo(org.nd4j.linalg.api.buffer.DataType.INT32) - - - val inputs = mapOf("x" to xVal,"predictions" to predictionsArr) - - - return listOf(GraphInput( - graphDef =graphDef, inputNames = listOf("x","predictions"), - outputNames = listOf("output"), - inputArrays = inputs, - dynamicArrays = inputs - )) - } else { - val tensorNode = NodeDef { - name = "x" - op = "Placeholder" - Attribute("dtype",AttrValue { - type = DataType.DT_FLOAT - }) - } - - val predictions = NodeDef { - name = "predictions" - op = "Placeholder" - Attribute("dtype",AttrValue { - type = DataType.DT_INT32 - }) - } - - val k = NodeDef { - name = "k" - op = "Const" - Attribute("dtype",AttrValue { - type = DataType.DT_INT32 - }) - Attribute("value",AttrValue { - tensor = TensorProto { - Int32Data(listOf(2)) - dtype = DataType.DT_INT32 - - } - }) - } - - val opNode = NodeDef { - Input("x") - Input("predictions") - Input("k") - op = tensorflowOpDef.name - name = "output" - Attribute("T",AttrValue { - type = DataType.DT_INT32 - }) - } - - - - val graphDef = GraphDef { - Node(tensorNode) - Node(predictions) - Node(k) - Node(opNode) - } - - - val xVal = Nd4j.linspace(1, 4, 4) - .reshape(2, 2) - .castTo(org.nd4j.linalg.api.buffer.DataType.FLOAT) - - val predictionsArr = Nd4j.linspace(1, 2, 2) - .reshape(2) - .castTo(org.nd4j.linalg.api.buffer.DataType.INT32) - - - val inputs = mapOf("x" to xVal,"predictions" to predictionsArr) - - - return listOf(GraphInput( - graphDef =graphDef, inputNames = listOf("x","predictions"), - outputNames = listOf("output"), - inputArrays = inputs, - dynamicArrays = inputs - )) - } - - - - } - - - "onehot" -> { - val indices = NodeDef { - name = "indices" - op = "Placeholder" - Attribute("dtype",AttrValue { - type = DataType.DT_INT64 - }) - } - - val depth = NodeDef { - name = "depth" - op = "Const" - Attribute("dtype",AttrValue { - type = 
DataType.DT_INT32 - }) - Attribute("value",AttrValue { - tensor = TensorProto { - dtype = DataType.DT_INT32 - Int32Data(listOf(1)) - - } - }) - } - - val onValue = NodeDef { - name = "on" - op = "Const" - Attribute("dtype",AttrValue { - type = DataType.DT_INT64 - }) - Attribute("value",AttrValue { - tensor = TensorProto { - dtype = DataType.DT_INT64 - Int64Data(listOf(1)) - - } - }) - } - - - val offValue = NodeDef { - name = "off" - op = "Const" - Attribute("dtype",AttrValue { - type = DataType.DT_INT64 - }) - Attribute("value",AttrValue { - tensor = TensorProto { - dtype = DataType.DT_INT64 - Int64Data(listOf(0)) - - } - }) - } - - - val opNode = NodeDef { - Input("indices") - Input("depth") - Input("on") - Input("off") - op = tensorflowOpDef.name - name = "output" - Attribute("TI",AttrValue { - type = DataType.DT_INT64 - }) - Attribute("T",AttrValue { - type = DataType.DT_INT64 - }) - - Attribute("axis",AttrValue { - i = 0 - }) - } - - - - val graphDef = GraphDef { - Node(indices) - Node(depth) - Node(onValue) - Node(offValue) - Node(opNode) - } - - - val indicesVal = Nd4j.linspace(1, 4, 4) - .castTo(org.nd4j.linalg.api.buffer.DataType.INT64) - val inputs = mapOf("indices" to indicesVal) - - - return listOf(GraphInput( - graphDef =graphDef, inputNames = listOf("indices"), - outputNames = listOf("output"), - inputArrays = inputs, - dynamicArrays = inputs - )) - } - - "cross" -> { - val a = NodeDef { - name = "a" - op = "Placeholder" - Attribute("dtype",AttrValue { - type = DataType.DT_FLOAT - }) - } - - val b = NodeDef { - name = "b" - op = "Placeholder" - Attribute("dtype",AttrValue { - type = DataType.DT_FLOAT - }) - } - - val opNode = NodeDef { - Input("a") - Input("b") - op = tensorflowOpDef.name - name = "output" - Attribute("T",AttrValue { - type = DataType.DT_FLOAT - }) - - } - - - - val graphDef = GraphDef { - Node(a) - Node(b) - Node(opNode) - } - - - val aVal = Nd4j.linspace(1, 27, 27) - .reshape(3,3,3) - 
.castTo(org.nd4j.linalg.api.buffer.DataType.FLOAT) - - val bVal = Nd4j.linspace(1, 27, 27) - .reshape(3,3,3) - .castTo(org.nd4j.linalg.api.buffer.DataType.FLOAT) - - - val inputs = mapOf("a" to aVal,"b" to bVal) - - - return listOf(GraphInput( - graphDef =graphDef, inputNames = listOf("a","b"), - outputNames = listOf("output"), - inputArrays = inputs, - dynamicArrays = inputs - )) - } - - "transpose" -> { - val tensorNode = NodeDef { - name = "x" - op = "Placeholder" - Attribute("dtype",AttrValue { - type = DataType.DT_DOUBLE - }) - } - - val tensorNode2 = NodeDef { - op = "Const" - name = "perm" - Attribute("value",AttrValue { - tensor = TensorProto { - Int32Data(listOf(0,1)) - Shape(listOf(2)) - dtype = DataType.DT_INT32 - } - }) - Attribute("dtype",AttrValue { - type = DataType.DT_INT32 - }) - } - - val opNode = NodeDef { - Input("x") - Input("perm") - op = tensorflowOpDef.name - name = "output" - Attribute("T",AttrValue { - type = DataType.DT_DOUBLE - }) - Attribute("Tperm",AttrValue { - type = DataType.DT_INT32 - }) - } - - - - val graphDef = GraphDef { - Node(tensorNode) - Node(tensorNode2) - Node(opNode) - } - - - val xVal = Nd4j.linspace(1, 4, 4) - .reshape(2, 2) - .castTo(org.nd4j.linalg.api.buffer.DataType.DOUBLE) - - - - val inputs = mapOf("x" to xVal) - - - return listOf(GraphInput( - graphDef =graphDef, inputNames = listOf("x"), - outputNames = listOf("output"), - inputArrays = inputs, - dynamicArrays = inputs - )) - - } - "relu", "relu6" -> { - val tensorNode = NodeDef { - name = "x" - op = "Placeholder" - Attribute("dtype", AttrValue { - type = DataType.DT_DOUBLE - }) - } - - - val opNode = NodeDef { - Input("x") - op = tensorflowOpDef.name - name = "output" - Attribute("T", AttrValue { - type = DataType.DT_DOUBLE - }) - } - - val xVal = Nd4j.linspace(1, 4, 4) - .reshape(2, 2) - .castTo(org.nd4j.linalg.api.buffer.DataType.DOUBLE) - - val inputs = mapOf("x" to xVal) - - return listOf(GraphInput( - graphDef = GraphDef { - Node(tensorNode) - 
Node(opNode) - }, inputNames = listOf("x"), - outputNames = listOf("output"), - inputArrays = inputs, - dynamicArrays = inputs - )) - } - - "depth_to_space","space_to_depth" -> { - val tensorNode = NodeDef { - name = "x" - op = "Placeholder" - Attribute("dtype", AttrValue { - type = DataType.DT_DOUBLE - }) - } - - - val opNode = NodeDef { - Input("x") - op = tensorflowOpDef.name - name = "output" - Attribute("T", AttrValue { - type = DataType.DT_DOUBLE - }) - Attribute("data_format", AttrValue { - s = ByteString.copyFrom("NHWC".toByteArray(Charset.defaultCharset())) - }) - Attribute("block_size", AttrValue { - i = 2 - }) - } - - val xVal = Nd4j.linspace(1, 256, 256) - .reshape(4, 4,4,4) - .castTo(org.nd4j.linalg.api.buffer.DataType.DOUBLE) - - val inputs = mapOf("x" to xVal) - - return listOf(GraphInput( - graphDef = GraphDef { - Node(tensorNode) - Node(opNode) - }, inputNames = listOf("x"), - outputNames = listOf("output"), - inputArrays = inputs, - dynamicArrays = inputs - )) - } - - "softmax","digamma","diag","diag_part","lgamma" -> { - val tensorNode = NodeDef { - name = "x" - op = "Placeholder" - Attribute("dtype", AttrValue { - type = DataType.DT_DOUBLE - }) - } - - - val opNode = NodeDef { - Input("x") - op = tensorflowOpDef.name - name = "output" - Attribute("T", AttrValue { - type = DataType.DT_DOUBLE - }) - } - - val xVal = Nd4j.linspace(1, 4, 4) - .reshape(2, 2) - .castTo(org.nd4j.linalg.api.buffer.DataType.DOUBLE) - - val inputs = mapOf("x" to xVal) - - return listOf(GraphInput( - graphDef = GraphDef { - Node(tensorNode) - Node(opNode) - }, inputNames = listOf("x"), - outputNames = listOf("output"), - inputArrays = inputs, - dynamicArrays = inputs - )) - } - - "cumsum","cumprod" -> { - val ret = ArrayList<GraphInput>() - listOf(false,true).forEach { reverse -> - listOf(false,true).forEach { exclusive -> - val inputNames = listOf("x") - val tensorNode = NodeDef { - name = "x" - op = "Placeholder" - Attribute("dtype",AttrValue { - type = DataType.DT_DOUBLE - }) - } - 
- val dimensions = listOf(1) - val tensorNode2 = NodeDef { - op = "Const" - name = "dimensions" - Attribute("value",AttrValue { - tensor = TensorProto { - Int32Data(dimensions) - dtype = DataType.DT_INT32 - tensorShape = TensorShapeProto { - Dims(listOf()) - } - } - }) - Attribute("dtype",AttrValue { - type = DataType.DT_INT32 - }) - } - - val opNode = NodeDef { - Input("x") - Input("dimensions") - op = tensorflowOpDef.name - name = "output" - Attribute("T",AttrValue { - type = DataType.DT_DOUBLE - }) - Attribute("exclusive",AttrValue { - b = exclusive - }) - - Attribute("reverse",AttrValue { - b = reverse - }) - } - - - - val graphDef = GraphDef { - Node(tensorNode) - Node(tensorNode2) - Node(opNode) - } - - val xVal = Nd4j.linspace(1, 4, 4) - .reshape(2, 2) - .castTo(org.nd4j.linalg.api.buffer.DataType.DOUBLE) - - - val inputs = mapOf("x" to xVal) - ret.add(GraphInput( - graphDef =graphDef, inputNames = inputNames, - outputNames = listOf("output"), - inputArrays = inputs, - dynamicArrays = inputs - )) - } - } - - return ret - - } - - "Assert" -> { - val tensorNode = NodeDef { - name = "condition" - op = "Placeholder" - Attribute("dtype",AttrValue { - type = DataType.DT_BOOL - }) - } - - val tensorNode2 = NodeDef { - op = "Placeholder" - name = "data" - Attribute("dtype",AttrValue { - type = DataType.DT_DOUBLE - }) - } - - println("Running op def for op ${tensorflowOpDef.name}") - val opNode = NodeDef { - Input("condition") - Input("data") - op = tensorflowOpDef.name - name = "output" - Attribute("T",AttrValue { - ListDataType(listOf(DataType.DT_DOUBLE)) - }) - } - - - - val graphDef = GraphDef { - Node(tensorNode) - Node(opNode) - Node(tensorNode2) - } - - val inputs = mapOf("data" to Nd4j.linspace(1,4,4).castTo( - org.nd4j.linalg.api.buffer.DataType.DOUBLE - ),"condition" to Nd4j.ones(2).addi(1).castTo(org.nd4j.linalg.api.buffer.DataType.BOOL)) - return listOf(GraphInput(graphDef = graphDef, - inputNames = listOf("condition","data"), - outputNames = 
listOf("output"), - inputArrays = inputs, - dynamicArrays = inputs)) - } - - - "Where" -> { - val tensorNode = NodeDef { - name = "x" - op = "Placeholder" - Attribute("dtype",AttrValue { - type = DataType.DT_DOUBLE - }) - } - - - println("Running op def for op ${tensorflowOpDef.name}") - val opNode = NodeDef { - Input("x") - op = tensorflowOpDef.name - name = "output" - Attribute("T",AttrValue { - type = DataType.DT_DOUBLE - }) - } - - - - val graphDef = GraphDef { - Node(tensorNode) - Node(opNode) - } - - val inputs = mapOf("x" to Nd4j.linspace(1,4,4).castTo( - org.nd4j.linalg.api.buffer.DataType.DOUBLE - )) - return listOf(GraphInput(graphDef = graphDef,inputNames = listOf("x"), - outputNames = listOf("output"), - inputArrays = inputs, - dynamicArrays = inputs)) - } - - - - "boolean_or" -> { - println("Running op def for op ${tensorflowOpDef.name}") - val inputNode = NodeDef { - name = "x" - op = "Placeholder" - Attribute("dtype",AttrValue { - type = DataType.DT_BOOL - }) - } - - - val secondNode = NodeDef { - name = "y" - op = "Placeholder" - Attribute("dtype",AttrValue { - type = DataType.DT_BOOL - }) - } - - val opNode = NodeDef { - Input("x") - Input("y") - name = "and" - op = tensorflowOpDef.name - } - - - val inputs = mapOf("x" to Nd4j.ones(2,2).castTo( - org.nd4j.linalg.api.buffer.DataType.BOOL - ), "y" to Nd4j.zeros(2,2).castTo( - org.nd4j.linalg.api.buffer.DataType.BOOL - )) - - - val graphDef = GraphDef { - Node(inputNode) - Node(secondNode) - Node(opNode) - } - - return listOf(GraphInput(graphDef = graphDef,inputNames = listOf("x","y"), - outputNames = listOf("and"), - inputArrays = inputs, - dynamicArrays = inputs)) - } - - - "boolean_and" -> { - println("Running op def for op ${tensorflowOpDef.name}") - val inputNode = NodeDef { - name = "x" - op = "Placeholder" - Attribute("dtype",AttrValue { - type = DataType.DT_BOOL - }) - } - - - val secondNode = NodeDef { - name = "y" - op = "Placeholder" - Attribute("dtype",AttrValue { - type = DataType.DT_BOOL 
- }) - } - - val opNode = NodeDef { - Input("x") - Input("y") - name = "and" - op = tensorflowOpDef.name - } - - - val inputs = mapOf("x" to Nd4j.ones(2,2).castTo( - org.nd4j.linalg.api.buffer.DataType.BOOL - ), "y" to Nd4j.zeros(2,2).castTo( - org.nd4j.linalg.api.buffer.DataType.BOOL - )) - - - val graphDef = GraphDef { - Node(inputNode) - Node(secondNode) - Node(opNode) - } - - return listOf(GraphInput(graphDef = graphDef,inputNames = listOf("x","y"), - outputNames = listOf("and"), - inputArrays = inputs, - dynamicArrays = inputs)) - } - - - "igamma","igammac" -> { - println("Running op def for op ${tensorflowOpDef.name}") - val a = NodeDef { - name = "a" - op = "Placeholder" - Attribute("dtype",AttrValue { - type = DataType.DT_FLOAT - }) - } - - val x = NodeDef { - name = "x" - op = "Placeholder" - Attribute("dtype",AttrValue { - type = DataType.DT_FLOAT - }) - } - - - - val opNode = NodeDef { - Input("a") - Input("x") - name = "igamma" - op = tensorflowOpDef.name - Attribute("T",AttrValue { - type = DataType.DT_FLOAT - }) - } - - - val inputs = mapOf("a" to Nd4j.ones(2,2).castTo( - org.nd4j.linalg.api.buffer.DataType.FLOAT - ),"x" to Nd4j.ones(2,2).castTo( - org.nd4j.linalg.api.buffer.DataType.FLOAT - )) - - val graphDef = GraphDef { - Node(a) - Node(x) - Node(opNode) - } - - return listOf(GraphInput(graphDef = graphDef,inputNames = listOf("a","x"), - outputNames = listOf("igamma"), - inputArrays = inputs, - dynamicArrays = inputs)) - } - - "boolean_not" -> { - println("Running op def for op ${tensorflowOpDef.name}") - val inputNode = NodeDef { - name = "x" - op = "Placeholder" - Attribute("dtype",AttrValue { - type = DataType.DT_BOOL - }) - } - - - - - val opNode = NodeDef { - Input("x") - name = "not" - op = tensorflowOpDef.name - } - - - val inputs = mapOf("x" to Nd4j.ones(2,2).castTo( - org.nd4j.linalg.api.buffer.DataType.BOOL - )) - - val graphDef = GraphDef { - Node(inputNode) - Node(opNode) - } - - return listOf(GraphInput(graphDef = graphDef,inputNames 
= listOf("x"), - outputNames = listOf("not"), - inputArrays = inputs, - dynamicArrays = inputs)) - } - - - "noop" -> { - println("Running op def for op ${tensorflowOpDef.name}") - val opNode = NodeDef { - name = "noop" - op = tensorflowOpDef.name - } - - - - val graphDef = GraphDef { - Node(opNode) - } - - return listOf(GraphInput(graphDef = graphDef, - inputNames = listOf(), - outputNames = listOf(), - inputArrays = emptyMap(), - dynamicArrays = emptyMap())) - } - - "While" -> { - println("Running op def for op ${tensorflowOpDef.name}") - val tensorNode = NodeDef { - name = "x" - op = "Placeholder" - Attribute("dtype",AttrValue { - ListDataType(listOf(DataType.DT_DOUBLE)) - }) - } - - - val opNode = NodeDef { - Input("x") - name = "while" - op = tensorflowOpDef.name - } - - - - val graphDef = GraphDef { - Node(tensorNode) - Node(opNode) - } - - val inputs = mapOf("x" to Nd4j.scalar(1.0)) - - return listOf(GraphInput(graphDef = graphDef,inputNames = listOf("x"), - outputNames = listOf("output"), - inputArrays = inputs, - dynamicArrays = inputs)) - } - - "unique_with_counts","unique" -> { - println("Running op def for op ${tensorflowOpDef.name}") - val tensorNode = NodeDef { - name = "x" - op = "Placeholder" - Attribute("dtype",AttrValue { - type = DataType.DT_DOUBLE - }) - } - - if(tensorflowOpDef.name == "UniqueWithCountsV2" || tensorflowOpDef.name == "UniqueV2") { - val axis = NodeDef { - name = "axis" - op = "Placeholder" - Attribute("dtype",AttrValue { - type = DataType.DT_INT64 - }) - } - - - val opNode = NodeDef { - Input("x") - Input("axis") - name = "output" - op = tensorflowOpDef.name - Attribute("T",AttrValue { - type = DataType.DT_DOUBLE - }) - } - - - - val graphDef = GraphDef { - Node(tensorNode) - Node(axis) - Node(opNode) - } - - val inputs = mapOf("x" to Nd4j.linspace(1,4,4).reshape(2,2).castTo(org.nd4j.linalg.api.buffer.DataType.DOUBLE), - "axis" to Nd4j.scalar(1).reshape(1).castTo(org.nd4j.linalg.api.buffer.DataType.INT64)) - - return 
listOf(GraphInput(graphDef = graphDef,inputNames = listOf("x","axis"), - outputNames = listOf("output"), - inputArrays = inputs, - dynamicArrays = inputs)) - } - else { - val opNode = NodeDef { - Input("x") - name = "output" - op = tensorflowOpDef.name - Attribute("T",AttrValue { - type = DataType.DT_DOUBLE - }) - } - - - - val graphDef = GraphDef { - Node(tensorNode) - Node(opNode) - } - - val inputs = mapOf("x" to Nd4j.linspace(1,4,4).castTo(org.nd4j.linalg.api.buffer.DataType.DOUBLE)) - - return listOf(GraphInput(graphDef = graphDef,inputNames = listOf("x"), - outputNames = listOf("output"), - inputArrays = inputs, - dynamicArrays = inputs)) - } - - } - - - "pad" -> { - if(tensorflowOpDef.name == "Pad") { - val tensorNode = NodeDef { - name = "x" - op = "Placeholder" - Attribute("dtype",AttrValue { - type = DataType.DT_DOUBLE - }) - } - - val tensorNode2 = NodeDef { - op = "Placeholder" - name = "paddings" - Attribute("dtype",AttrValue { - type = DataType.DT_INT32 - }) - } - - val opNode = NodeDef { - Input("x") - Input("paddings") - op = tensorflowOpDef.name - name = "output" - Attribute("T",AttrValue { - type = DataType.DT_DOUBLE - }) - Attribute("Tpaddings",AttrValue { - type = DataType.DT_INT32 - }) - } - - - - val graphDef = GraphDef { - Node(tensorNode) - Node(opNode) - Node(tensorNode2) - } - - val inputs = mapOf("x" to Nd4j.linspace(1,4,4).castTo( - org.nd4j.linalg.api.buffer.DataType.DOUBLE - ),"paddings" to Nd4j.ones(1,2).addi(1).castTo(org.nd4j.linalg.api.buffer.DataType.INT32)) - return listOf(GraphInput(graphDef = graphDef,inputNames = listOf("x","paddings"),outputNames = listOf("output"), - inputArrays = inputs, - dynamicArrays = inputs)) - } else if(tensorflowOpDef.name == "PadV2"){ - val tensorNode = NodeDef { - name = "x" - op = "Placeholder" - Attribute("dtype",AttrValue { - type = DataType.DT_DOUBLE - }) - } - - val tensorNode2 = NodeDef { - op = "Placeholder" - name = "paddings" - Attribute("dtype",AttrValue { - type = DataType.DT_INT32 - }) 
- } - - val constantValues = NodeDef { - op = "Const" - name = "constant_values" - Attribute("value",AttrValue { - tensor = TensorProto { - DoubleData(listOf(1.0)) - dtype = DataType.DT_DOUBLE - tensorShape = TensorShapeProto { - Dims(listOf()) - } - } - }) - Attribute("dtype",AttrValue { - type = DataType.DT_DOUBLE - }) - } - - val opNode = NodeDef { - Input("x") - Input("paddings") - Input("constant_values") - op = tensorflowOpDef.name - name = "output" - Attribute("T",AttrValue { - type = DataType.DT_DOUBLE - }) - Attribute("Tpaddings",AttrValue { - type = DataType.DT_INT32 - }) - } - - - - val graphDef = GraphDef { - Node(tensorNode) - Node(opNode) - Node(constantValues) - Node(tensorNode2) - - } - - val inputs = mapOf("x" to Nd4j.linspace(1,4,4).castTo( - org.nd4j.linalg.api.buffer.DataType.DOUBLE - ),"paddings" to Nd4j.ones(1,2).addi(1).castTo(org.nd4j.linalg.api.buffer.DataType.INT32)) - return listOf(GraphInput(graphDef = graphDef,inputNames = listOf("x","paddings"), - outputNames = listOf("output"), - inputArrays = inputs, - dynamicArrays = inputs)) - } else { - throw IllegalArgumentException("Illegal mapping for padding op ${tensorflowOpDef.name}") - } - - } - - - "reshape" -> { - val tensorNode = NodeDef { - name = "x" - op = "Placeholder" - Attribute("dtype",AttrValue { - type = DataType.DT_DOUBLE - }) - } - - val tensorNode2 = NodeDef { - op = "Placeholder" - name = "shape" - Attribute("dtype",AttrValue { - type = DataType.DT_INT32 - }) - } - - val opNode = NodeDef { - Input("x") - Input("shape") - op = tensorflowOpDef.name - name = "output" - Attribute("T",AttrValue { - type = DataType.DT_DOUBLE - }) - } - - - - val graphDef = GraphDef { - Node(tensorNode) - Node(opNode) - Node(tensorNode2) - } - - val inputs = mapOf("x" to Nd4j.linspace(1,4,4).castTo( - org.nd4j.linalg.api.buffer.DataType.DOUBLE - ),"shape" to Nd4j.ones(2).addi(1).castTo(org.nd4j.linalg.api.buffer.DataType.INT32)) - return listOf(GraphInput(graphDef = graphDef,inputNames = 
listOf("x","shape"),outputNames = listOf("output"), - inputArrays = inputs, - dynamicArrays = inputs)) - } - - "reduce_logsumexp" -> { - val tensorNode = NodeDef { - name = "x" - op = "Placeholder" - Attribute("dtype",AttrValue { - type = DataType.DT_DOUBLE - }) - } - - val opNode = NodeDef { - Input("x") - Input("dimensions") - op = tensorflowOpDef.name - name = "output" - Attribute("T",AttrValue { - type = DataType.DT_DOUBLE - }) - Attribute("Tidx",AttrValue { - type = DataType.DT_INT32 - }) - Attribute("exclusive",AttrValue { - b = false - }) - } - - val dimensions = listOf(0) - val tensorNode2 = NodeDef { - op = "Const" - name = "dimensions" - Attribute("value",AttrValue { - tensor = TensorProto { - Int32Data(dimensions) - dtype = DataType.DT_INT32 - tensorShape = TensorShapeProto { - Dims(listOf()) - } - } - }) - Attribute("dtype",AttrValue { - type = DataType.DT_INT32 - }) - } - - val graphDef = GraphDef { - Node(tensorNode) - Node(tensorNode2) - Node(opNode) - } - - - val xVal = Nd4j.linspace(1, 4, 4) - .reshape(2, 2) - .castTo(org.nd4j.linalg.api.buffer.DataType.DOUBLE) - - val inputs = mapOf("x" to xVal) - - return listOf(GraphInput( - graphDef =graphDef, inputNames = listOf("x"), - outputNames = listOf("output"), - inputArrays = inputs, - dynamicArrays = inputs - )) - } - - "argmin", "argmax" -> { - val ret = ArrayList<GraphInput>() - listOf(true, false).forEach { keepDim -> - val tensorNode = NodeDef { - name = "x" - op = "Placeholder" - Attribute("dtype",AttrValue { - type = DataType.DT_DOUBLE - }) - } - - val opNode = NodeDef { - Input("x") - Input("dimensions") - op = tensorflowOpDef.name - name = "output" - Attribute("T",AttrValue { - type = DataType.DT_DOUBLE - }) - Attribute("Tidx",AttrValue { - type = DataType.DT_INT32 - }) - } - - val dimensions = listOf(0) - val tensorNode2 = NodeDef { - op = "Const" - name = "dimensions" - Attribute("value",AttrValue { - tensor = TensorProto { - Int32Data(dimensions) - dtype = DataType.DT_INT32 - tensorShape = 
TensorShapeProto { - Dims(listOf()) - } - } - }) - Attribute("dtype",AttrValue { - type = DataType.DT_INT32 - }) - } - - val graphDef = GraphDef { - Node(tensorNode) - Node(tensorNode2) - Node(opNode) - } - - - val xVal = Nd4j.linspace(1, 4, 4) - .reshape(2, 2) - .castTo(org.nd4j.linalg.api.buffer.DataType.DOUBLE) - - val inputs = mapOf("x" to xVal) - - ret.add(GraphInput( - graphDef =graphDef, inputNames = listOf("x"), - outputNames = listOf("output"), - inputArrays = inputs, - dynamicArrays = inputs - )) - } - - return ret - } - - "pow" -> { - val ret = ArrayList<GraphInput>() - val tensorNode = NodeDef { - name = "x" - op = "Placeholder" - Attribute("dtype",AttrValue { - type = DataType.DT_DOUBLE - }) - } - - val tensorNode2 = NodeDef { - op = "Const" - name = "y" - Attribute("value",AttrValue { - tensor = TensorProto { - DoubleData(listOf(1.0)) - dtype = DataType.DT_DOUBLE - tensorShape = TensorShapeProto { - Dims(listOf()) - } - } - }) - Attribute("dtype",AttrValue { - type = DataType.DT_DOUBLE - }) - } - - val opNode = NodeDef { - Input("x") - Input("y") - op = tensorflowOpDef.name - name = "output" - Attribute("T",AttrValue { - type = DataType.DT_DOUBLE - }) - } - - - - val graphDef = GraphDef { - Node(tensorNode) - Node(tensorNode2) - Node(opNode) - } - - - val xVal = Nd4j.linspace(1, 4, 4) - .reshape(2, 2) - .castTo(org.nd4j.linalg.api.buffer.DataType.DOUBLE) - - val inputs = mapOf("x" to xVal) - - ret.add(GraphInput( - graphDef =graphDef, inputNames = listOf("x"), - outputNames = listOf("output"), - inputArrays = inputs, - dynamicArrays = inputs - )) - - return ret - } - - - - //scatter_div - //TODO: Revisit. TF op validation seems to be different than ours. 
- "scatter_add","scatter_sub","scatter_min","scatter_mul","scatter_update","scatter_nd","scatter_nd_add","scatter_nd_sub","scatter_nd_update" -> { - val ret = ArrayList<GraphInput>() - listOf(true,false).forEach { lock -> - val xRef = NodeDef { - name = "shape" - op = "Placeholder" - Attribute("dtype",AttrValue { - type = DataType.DT_INT32 - }) - } - - - val tensorNode2 = NodeDef { - op = "Placeholder" - name = "indices" - Attribute("dtype",AttrValue { - type = DataType.DT_INT32 - }) - } - - - val updates2 = NodeDef { - op = "Placeholder" - name = "updates" - Attribute("dtype",AttrValue { - type = DataType.DT_INT32 - }) - } - - val opNode = NodeDef { - Input("indices") - Input("updates") - Input("shape") - op = tensorflowOpDef.name - name = "output" - Attribute("T",AttrValue { - type = DataType.DT_INT32 - }) - Attribute("Tindices",AttrValue { - type = DataType.DT_INT32 - }) - } - - - val graphDef = GraphDef { - Node(xRef) - Node(tensorNode2) - Node(updates2) - Node(opNode) - } - - - //from testScatterOpGradients. 
- val shape = Nd4j.scalar(8).reshape(1).castTo(org.nd4j.linalg.api.buffer.DataType.INT32) - val indices = Nd4j.create(floatArrayOf(4f,3f,1f,7f)).reshape(4,1) - .castTo(org.nd4j.linalg.api.buffer.DataType.INT32) - - val updates = Nd4j.linspace(1,4,4).reshape(4).castTo(org.nd4j.linalg.api.buffer.DataType.INT32) - - - val inputs = mapOf("shape" to shape,"updates" to updates,"indices" to indices) - - ret.add(GraphInput( - graphDef =graphDef, inputNames = listOf("indices","updates","shape"), - outputNames = listOf("output"), - inputArrays = inputs, - dynamicArrays = inputs - )) - } - - - - - - - return ret - } - - - - - "segment_mean", "segment_min","segment_max","segment_prod","segment_sum" -> { - val ret = ArrayList<GraphInput>() - val tensorNode = NodeDef { - name = "data" - op = "Placeholder" - Attribute("dtype",AttrValue { - type = DataType.DT_DOUBLE - }) - } - - val segmentIds = NodeDef { - op = "Placeholder" - name = "segment_ids" - Attribute("dtype",AttrValue { - type = DataType.DT_INT32 - }) - } - - - val opNode = NodeDef { - Input("data") - Input("segment_ids") - op = tensorflowOpDef.name - name = "output" - Attribute("T",AttrValue { - type = DataType.DT_DOUBLE - }) - Attribute("Tindices",AttrValue { - type = DataType.DT_INT32 - }) - - } - - - - val graphDef = GraphDef { - Node(tensorNode) - Node(segmentIds) - Node(opNode) - } - - - val xVal = Nd4j.linspace(1, 12, 12) - .reshape(3, 4) - .castTo(org.nd4j.linalg.api.buffer.DataType.DOUBLE) - - - val indices = Nd4j.create(floatArrayOf(1.0f,2.0f,3.0f)).castTo(org.nd4j.linalg.api.buffer.DataType.INT32) - val inputs = mapOf("data" to xVal,"segment_ids" to indices) - - ret.add(GraphInput( - graphDef =graphDef, inputNames = listOf("data","segment_ids"), - outputNames = listOf("output"), - inputArrays = inputs, - dynamicArrays = inputs - )) - - - return ret - } - - - "unsorted_segment_sum", "unsorted_segment_prod","unsorted_segment_min","unsorted_segment_max" -> { - val ret = ArrayList<GraphInput>() - val tensorNode = NodeDef { - name = 
"data" - op = "Placeholder" - Attribute("dtype",AttrValue { - type = DataType.DT_DOUBLE - }) - } - - val segmentIds = NodeDef { - op = "Placeholder" - name = "segment_ids" - Attribute("dtype",AttrValue { - type = DataType.DT_INT32 - }) - } - - val numSegmentsNode = NodeDef { - op = "Const" - name = "num_segments" - Attribute("dtype",AttrValue { - type = DataType.DT_INT32 - }) - - Attribute(name = "value",value = AttrValue { - tensor = TensorProto { - Shape(listOf()) - Int32Data(listOf(2)) - DataType(DataType.DT_INT32) - } - }) - } - - val opNode = NodeDef { - Input("data") - Input("segment_ids") - Input("num_segments") - op = tensorflowOpDef.name - name = "output" - Attribute("T",AttrValue { - type = DataType.DT_DOUBLE - }) - Attribute("Tindices",AttrValue { - type = DataType.DT_INT32 - }) - Attribute("Tnumsegments",AttrValue { - type = DataType.DT_INT32 - }) - } - - - - val graphDef = GraphDef { - Node(tensorNode) - Node(segmentIds) - Node(numSegmentsNode) - Node(opNode) - } - - - val xVal = Nd4j.linspace(1, 12, 12) - .reshape(3, 4) - .castTo(org.nd4j.linalg.api.buffer.DataType.DOUBLE) - - - val indices = Nd4j.create(floatArrayOf(0.0f,1.0f,0.0f)).castTo(org.nd4j.linalg.api.buffer.DataType.INT32) - val numSegments = Nd4j.scalar(2).castTo(org.nd4j.linalg.api.buffer.DataType.INT32) - val inputs = mapOf("data" to xVal,"segment_ids" to indices,"num_segments" to numSegments) - - ret.add(GraphInput( - graphDef =graphDef, inputNames = listOf("data","segment_ids","num_segments"), - outputNames = listOf("output"), - inputArrays = inputs, - dynamicArrays = inputs - )) - - - return ret - } - - - else -> { - throw IllegalArgumentException("Illegal op name $inputFrameworkOpName") - } - } - } - -} - diff --git a/contrib/codegen-tools/codegen/src/test/kotlin/org/nd4j/codegen/ir/tensorflow/TestTensorflowRuleDeclarations.kt b/contrib/codegen-tools/codegen/src/test/kotlin/org/nd4j/codegen/ir/tensorflow/TestTensorflowRuleDeclarations.kt deleted file mode 100644 index 
ed653e161..000000000 --- a/contrib/codegen-tools/codegen/src/test/kotlin/org/nd4j/codegen/ir/tensorflow/TestTensorflowRuleDeclarations.kt +++ /dev/null @@ -1,478 +0,0 @@ -package org.nd4j.codegen.ir.tensorflow - -import org.junit.jupiter.api.Test -import org.nd4j.codegen.ir.ArgDescriptor -import org.nd4j.ir.TensorNamespace -import org.nd4j.shade.protobuf.ByteString -import java.nio.charset.Charset -import kotlin.test.assertEquals -import kotlin.test.assertTrue - -class TestTensorflowRuleDeclarations { - - @Test - fun testArgConstant() { - val opDef = tensorflowOps.findOp("Dilation2D") - val intItems = listOf(2,1,1,1) - val valueNodeDef = NodeDef { - op = "Dilation2D" - name = "inputs" - Attribute(name = "strides",value = AttrValue { - list = ListValue { - IntItems(intItems) - } - }) - } - - val shape = listOf(1,1).map { it.toLong() } - val valueNodeDef2 = NodeDef { - op = "Constant" - name = "inputs" - Attribute(name = "value",value = AttrValue { - tensor = TensorProto { - Shape(shape) - DoubleData(listOf(1.0)) - } - }) - } - - - - val graphDef = GraphDef { - Node(valueNodeDef) - Node(valueNodeDef2) - } - - val tfGraph = TensorflowIRGraph(graphDef, tensorflowOps) - val mappingContext = TensorflowMappingContext(opDef = opDef,node = valueNodeDef,graph = tfGraph,dynamicVariables = emptyMap()) - val convertNumberListToInputNDArrayRule = argDescriptorConstant(listOf(ArgDescriptor { - name = "value" - int32Value = 1 - })) - - val convertNumberListToInputNDArrayResult = convertNumberListToInputNDArrayRule.convertAttributes(mappingContext) - - assertEquals(1,convertNumberListToInputNDArrayResult.size) - assertEquals(1,convertNumberListToInputNDArrayResult[0].int32Value) - } - - - - @Test - fun testConvertNDArrayInputToScalarAttr() { - val opDef = tensorflowOps.findOp("Dilation2D") - val intItems = listOf(2,1,1,1) - val valueNodeDef = NodeDef { - op = "Dilation2D" - name = "inputs" - Attribute(name = "strides",value = AttrValue { - list = ListValue { - IntItems(intItems) - 
} - }) - } - - val shape = listOf(1,1).map { it.toLong() } - val valueNodeDef2 = NodeDef { - op = "Constant" - name = "inputs" - Attribute(name = "value",value = AttrValue { - tensor = TensorProto { - Shape(shape) - DoubleData(listOf(1.0)) - } - }) - } - - - - val graphDef = GraphDef { - Node(valueNodeDef) - Node(valueNodeDef2) - } - - val tfGraph = TensorflowIRGraph(graphDef, tensorflowOps) - val mappingContext = TensorflowMappingContext(opDef = opDef,node = valueNodeDef,graph = tfGraph,dynamicVariables = emptyMap()) - val convertNumberListToInputNDArrayRule = convertNDArrayInputToNumericalAttr(mutableMapOf("output" to "inputs ")) - val convertNumberListToInputNDArrayResult = convertNumberListToInputNDArrayRule.convertAttributes(mappingContext) - assertEquals(1,convertNumberListToInputNDArrayResult.size) - assertEquals(2,convertNumberListToInputNDArrayResult[0].int64Value) - } - - @Test - fun testListAttributeValueLookupToIndex() { - val opDef = tensorflowOps.findOp("Dilation2D") - val intItems = listOf(2,1,1,1) - val valueNodeDef = NodeDef { - op = "Dilation2D" - name = "inputs" - Attribute(name = "strides",value = AttrValue { - list = ListValue { - IntItems(intItems) - } - }) - } - - - val graphDef = GraphDef { - Node(valueNodeDef) - } - - val tfGraph = TensorflowIRGraph(graphDef, tensorflowOps) - val mappingContext = TensorflowMappingContext(opDef = opDef,node = valueNodeDef,graph = tfGraph,dynamicVariables = emptyMap()) - val convertNumberListToInputNDArrayRule = listAttributeValueLookupToIndex(outputAttributeValue = "output", inputAttributeValue = "strides", idx = 0,argumentIndex = 0) - val convertNumberListToInputNDArrayResult = convertNumberListToInputNDArrayRule.convertAttributes(mappingContext) - assertEquals(1,convertNumberListToInputNDArrayResult.size) - assertEquals(2,convertNumberListToInputNDArrayResult[0].int64Value) - } - - - @Test - fun testConvertNumberListToInputNDArray() { - val opDef = tensorflowOps.findOp("Dilation2D") - val intItems = 
listOf(1,1,1,1) - val valueNodeDef = NodeDef { - op = "Dilation2D" - name = "inputs" - Attribute(name = "strides",value = AttrValue { - list = ListValue { - IntItems(intItems) - } - }) - } - - - val graphDef = GraphDef { - Node(valueNodeDef) - } - - val tfGraph = TensorflowIRGraph(graphDef, tensorflowOps) - val mappingContext = TensorflowMappingContext(opDef = opDef,node = valueNodeDef,graph = tfGraph,dynamicVariables = emptyMap()) - val convertNumberListToInputNDArrayRule = convertNumberListToInputNDArray(outputAttributeValue = "output",inputAttributeValue = "strides") - val convertNumberListToInputNDArrayResult = convertNumberListToInputNDArrayRule.convertAttributes(mappingContext) - assertEquals(1,convertNumberListToInputNDArrayResult.size) - val inputVal = convertNumberListToInputNDArrayResult[0].inputValue - assertEquals(2,inputVal.dimsCount) - val testList = inputVal.int64DataList - testList.forEach { - assertEquals(1,it) - } - } - - @Test - fun testValueMapping() { - val opDef = tensorflowOps.findOp("CudnnRNN") - val valueNodeDef = NodeDef { - op = "CudnnRNN" - name = "inputs" - Attribute(name = "is_training",value = AttrValue { - b = true - }) - Attribute(name = "seed",value = AttrValue { - i = 1 - }) - Attribute(name = "dropout",value = AttrValue { - f = 1.0f - }) - Attribute(name = "direction",value = AttrValue { - s = ByteString.copyFrom("unidirectional".toByteArray(Charset.defaultCharset())) - }) - } - - - val graphDef = GraphDef { - Node(valueNodeDef) - } - - val tfGraph = TensorflowIRGraph(graphDef, tensorflowOps) - val mappingContext = TensorflowMappingContext(opDef = opDef,node = valueNodeDef,graph = tfGraph,dynamicVariables = emptyMap()) - val booleanToInt = valueMapping(mapOf("output" to "is_training","output2" to "seed","output3" to "dropout","output4" to "direction")) - val booleanToIntResult = booleanToInt.convertAttributes(mappingContext) - assertEquals(4,booleanToIntResult.size) - val boolValue = booleanToIntResult.first { it.name == "output" 
}.boolValue - val intValue = booleanToIntResult.first {it.name == "output2" }.int64Value - val floatValue = booleanToIntResult.first {it.name == "output3"}.floatValue - val stringVal = booleanToIntResult.first {it.name == "output4" }.stringValue - assertEquals(true,boolValue) - assertEquals(1,intValue) - assertEquals(1.0f,floatValue) - assertEquals("unidirectional",stringVal) - } - - @Test - fun testBooleanToInt() { - val opDef = tensorflowOps.findOp("CudnnRNN") - val valueNodeDef = NodeDef { - op = "CudnnRNN" - name = "inputs" - Attribute(name = "is_training",value = AttrValue { - b = true - }) - } - - - val graphDef = GraphDef { - Node(valueNodeDef) - } - - val tfGraph = TensorflowIRGraph(graphDef, tensorflowOps) - val mappingContext = TensorflowMappingContext(opDef = opDef,node = valueNodeDef,graph = tfGraph,dynamicVariables = emptyMap()) - val booleanToInt = invertBooleanNumber(mapOf("output" to "is_training")) - val booleanToIntResult = booleanToInt.convertAttributes(mappingContext) - assertEquals(1,booleanToIntResult.size) - val boolValue = booleanToIntResult[0].int64Value - assertEquals(1,boolValue) - } - - @Test - fun testAttributeScalarToNDArrayInputRuleDouble() { - val opDef = tensorflowOps.findOp("CudnnRNN") - val valueNodeDef = NodeDef { - op = "CudnnRNN" - name = "inputs" - Attribute(name = "dropout",value = AttrValue { - f = 1.0f - }) - } - - - val graphDef = GraphDef { - Node(valueNodeDef) - } - - val tfGraph = TensorflowIRGraph(graphDef, tensorflowOps) - val mappingContext = TensorflowMappingContext(opDef = opDef,node = valueNodeDef,graph = tfGraph,dynamicVariables = emptyMap()) - val ndarrScalarRule = attributeScalarToNDArrayInput(outputAttribute = "output",inputFrameworkAttributeName = "dropout") - val ndarrScalarRuleResult = ndarrScalarRule.convertAttributes(mappingContext) - assertEquals(1,ndarrScalarRuleResult.size) - assertTrue {ndarrScalarRuleResult[0].hasInputValue()} - val tensorValue = ndarrScalarRuleResult[0].inputValue - 
assertEquals(2,tensorValue.dimsCount) - assertEquals(TensorNamespace.DataType.FLOAT.ordinal,tensorValue.dataType) - val floatValue = tensorValue.floatDataList[0] - assertEquals(1.0f,floatValue) - } - - @Test - fun testAttributeScalarToNDArrayInputRuleInt() { - val opDef = tensorflowOps.findOp("CountUpTo") - val valueNodeDef = NodeDef { - op = "CountUpTo" - name = "inputs" - Attribute(name = "limit",value = AttrValue { - i = 1 - }) - } - - - val graphDef = GraphDef { - Node(valueNodeDef) - } - - val tfGraph = TensorflowIRGraph(graphDef, tensorflowOps) - val mappingContext = TensorflowMappingContext(opDef = opDef,node = valueNodeDef,graph = tfGraph,dynamicVariables = emptyMap()) - val ndarrScalarRule = attributeScalarToNDArrayInput(outputAttribute = "output",inputFrameworkAttributeName = "limit") - val ndarrScalarRuleResult = ndarrScalarRule.convertAttributes(mappingContext) - assertEquals(1,ndarrScalarRuleResult.size) - assertTrue {ndarrScalarRuleResult[0].hasInputValue()} - val tensorValue = ndarrScalarRuleResult[0].inputValue - assertEquals(2,tensorValue.dimsCount) - assertEquals(TensorNamespace.DataType.INT64.ordinal,tensorValue.dataType) - val intValue = tensorValue.int64DataList[0] - assertEquals(1,intValue) - } - - @Test - fun testStringNotEqualsRule() { - val opDef = tensorflowOps.findOp("Const") - val valueNodeDef = NodeDef { - op = "Const" - name = "inputs" - Attribute(name = "value",value = AttrValue { - s = ByteString.copyFrom("value".toByteArray(Charset.defaultCharset())) - }) - } - - - val graphDef = GraphDef { - Node(valueNodeDef) - - } - - val tfGraph = TensorflowIRGraph(graphDef, tensorflowOps) - val mappingContext = TensorflowMappingContext(opDef = opDef,node = valueNodeDef,graph = tfGraph,dynamicVariables = emptyMap()) - listOf("value","notValue").zip(listOf(false,true)).forEach { (valueToTest,assertionResult) -> - val stringNotEqualsRule = stringNotEqualsRule(outputAttribute = "output",inputFrameworkAttributeName = "value",valueToTest = 
valueToTest,argumentIndex = 0) - val stringEqualsResult = stringNotEqualsRule.convertAttributes(mappingCtx = mappingContext) - assertEquals(1,stringEqualsResult.size) - assertEquals(assertionResult,stringEqualsResult[0].boolValue) - - } - - - } - - - @Test - fun testStringContainsRule() { - val opDef = tensorflowOps.findOp("Const") - val valueNodeDef = NodeDef { - op = "Const" - name = "inputs" - Attribute(name = "value",value = AttrValue { - s = ByteString.copyFrom("value".toByteArray(Charset.defaultCharset())) - }) - } - - - val graphDef = GraphDef { - Node(valueNodeDef) - - } - - val tfGraph = TensorflowIRGraph(graphDef, tensorflowOps) - val mappingContext = TensorflowMappingContext(opDef = opDef,node = valueNodeDef,graph = tfGraph,dynamicVariables = emptyMap()) - listOf("value","notValue").zip(listOf(true,false)).forEach { (valueToTest,assertionResult) -> - val stringContainsRule = stringContainsRule(outputAttribute = "output",inputFrameworkAttributeName = "value",valueToTest = valueToTest) - val stringEqualsResult = stringContainsRule.convertAttributes(mappingCtx = mappingContext) - assertEquals(1,stringEqualsResult.size) - assertEquals(assertionResult,stringEqualsResult[0].boolValue) - - } - - - } - - - @Test - fun testStringEqualsRule() { - val opDef = tensorflowOps.findOp("Const") - val valueNodeDef = NodeDef { - op = "Const" - name = "inputs" - Attribute(name = "value",value = AttrValue { - s = ByteString.copyFrom("value".toByteArray(Charset.defaultCharset())) - }) - } - - - val graphDef = GraphDef { - Node(valueNodeDef) - - } - - val tfGraph = TensorflowIRGraph(graphDef, tensorflowOps) - val mappingContext = TensorflowMappingContext(opDef = opDef,node = valueNodeDef,graph = tfGraph,dynamicVariables = emptyMap()) - listOf("value","notValue").zip(listOf(true,false)).forEach { (valueToTest,assertionResult) -> - val stringEqualsRule = stringEqualsRule(outputAttribute = "output",inputFrameworkAttributeName = "value",valueToTest = valueToTest,argumentIndex = 0) 
- val stringEqualsResult = stringEqualsRule.convertAttributes(mappingCtx = mappingContext) - assertEquals(1,stringEqualsResult.size) - assertEquals(assertionResult,stringEqualsResult[0].boolValue) - - } - - - } - - - @Test - fun testNDArraySizeAtRule() { - val opDef = tensorflowOps.findOp("AddN") - val nodeDef = NodeDef { - op = "AddN" - Input("inputs") - Input("y") - name = "test" - } - - val shape = listOf(1,2).map { it.toLong() } - - val valueNodeDef = NodeDef { - op = "Constant" - name = "inputs" - Attribute(name = "value",value = AttrValue { - tensor = TensorProto { - Shape(shape) - DoubleData(listOf(1.0,2.0)) - } - }) - } - - - val graphDef = GraphDef { - Node(nodeDef) - Node(valueNodeDef) - - } - - val tfGraph = TensorflowIRGraph(graphDef, tensorflowOps) - val mappingContext = TensorflowMappingContext(opDef = opDef,node = nodeDef,graph = tfGraph,dynamicVariables = emptyMap()) - shape.forEachIndexed { i,value -> - val sizeAtRule = sizeAtRule(dimensionIndex = i,outputAttributeName = "output",inputFrameworkAttributeName = "inputs",argumentIndex = 0) - val sizeAtRuleResult = sizeAtRule.convertAttributes(mappingCtx = mappingContext) - assertEquals(1,sizeAtRuleResult.size) - assertEquals(value,sizeAtRuleResult[0].int64Value) - - } - - } - - - @Test - fun testConditionalIndex() { - - val opDef = tensorflowOps.findOp("AddN") - val strings = listOf("value","falseValue") - //when item is equal to value return element at index 0 - //when item is not equal to value return element at index 1 - val assertionValue = mapOf("value" to 1,"falseValue" to 0) - val trueIndex = 0 - val falseIndex = 1 - val listOfItemsForTesting = listOf(1,0,2,3) - //true and false case with index 1 - for(string in strings) { - val nodeDef = NodeDef { - op = "AddN" - Input("inputs") - Input("y") - name = "test" - Attribute(name = "N",value = AttrValue { - name = "N" - list = ListValue { - IntItems(listOfItemsForTesting) - } - }) - Attribute(name = "T",value = AttrValue { - name = "T" - s = 
ByteString.copyFrom(string.toByteArray(Charset.defaultCharset())) - }) - } - - val graphDef = GraphDef { - Node(nodeDef) - } - - val tfGraph = TensorflowIRGraph(graphDef, tensorflowOps) - - - val mappingContext = TensorflowMappingContext(opDef = opDef,node = nodeDef,graph = tfGraph,dynamicVariables = emptyMap()) - - val conditionalIndex = conditionalFieldValueIntIndexArrayRule( - outputAttribute = "N", - attributeNameOfListAttribute = "N", - targetValue = "value", trueIndex = trueIndex, falseIndex = falseIndex, - inputFrameworkStringNameToTest = "T",argumentIndex = 0) - - val ret = conditionalIndex.convertAttributes(mappingContext) - assertEquals(1,ret.size) - assertEquals((assertionValue[string] ?: - error("No value found with string value $string")).toLong(),ret[0].int64Value) - assertEquals("N",ret[0].name) - - } - - } -} - - - diff --git a/contrib/codegen-tools/codegen/src/test/kotlin/org/nd4j/codegen/ops/ConstructionTest.kt b/contrib/codegen-tools/codegen/src/test/kotlin/org/nd4j/codegen/ops/ConstructionTest.kt index 44a43af28..4ab5562c4 100644 --- a/contrib/codegen-tools/codegen/src/test/kotlin/org/nd4j/codegen/ops/ConstructionTest.kt +++ b/contrib/codegen-tools/codegen/src/test/kotlin/org/nd4j/codegen/ops/ConstructionTest.kt @@ -1,3 +1,23 @@ +/* + * ****************************************************************************** + * * + * * + * * This program and the accompanying materials are made available under the + * * terms of the Apache License, Version 2.0 which is available at + * * https://www.apache.org/licenses/LICENSE-2.0. + * * + * * See the NOTICE file distributed with this work for additional + * * information regarding copyright ownership. + * * Unless required by applicable law or agreed to in writing, software + * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT + * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. 
See the + * * License for the specific language governing permissions and limitations + * * under the License. + * * + * * SPDX-License-Identifier: Apache-2.0 + * ***************************************************************************** + */ + package org.nd4j.codegen.ops import org.junit.jupiter.api.Test diff --git a/contrib/codegen-tools/libnd4j-gen/src/main/java/org/nd4j/descriptor/OpDeclarationDescriptor.java b/contrib/codegen-tools/libnd4j-gen/src/main/java/org/nd4j/descriptor/OpDeclarationDescriptor.java index b669d65ac..c4ee1c0bf 100644 --- a/contrib/codegen-tools/libnd4j-gen/src/main/java/org/nd4j/descriptor/OpDeclarationDescriptor.java +++ b/contrib/codegen-tools/libnd4j-gen/src/main/java/org/nd4j/descriptor/OpDeclarationDescriptor.java @@ -1,18 +1,23 @@ -/******************************************************************************* - * Copyright (c) 2020 Konduit KK. - * - * This program and the accompanying materials are made available under the - * terms of the Apache License, Version 2.0 which is available at - * https://www.apache.org/licenses/LICENSE-2.0. - * - * Unless required by applicable law or agreed to in writing, software - * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * License for the specific language governing permissions and limitations - * under the License. - * - * SPDX-License-Identifier: Apache-2.0 - ******************************************************************************/ +/* + * ****************************************************************************** + * * + * * + * * This program and the accompanying materials are made available under the + * * terms of the Apache License, Version 2.0 which is available at + * * https://www.apache.org/licenses/LICENSE-2.0. + * * + * * See the NOTICE file distributed with this work for additional + * * information regarding copyright ownership. 
+ * * Unless required by applicable law or agreed to in writing, software + * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT + * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the + * * License for the specific language governing permissions and limitations + * * under the License. + * * + * * SPDX-License-Identifier: Apache-2.0 + * ***************************************************************************** + */ + package org.nd4j.descriptor; import lombok.Builder; diff --git a/contrib/codegen-tools/libnd4j-gen/src/main/java/org/nd4j/descriptor/ParseOpFile.java b/contrib/codegen-tools/libnd4j-gen/src/main/java/org/nd4j/descriptor/ParseOpFile.java index 0af468f19..61e518201 100644 --- a/contrib/codegen-tools/libnd4j-gen/src/main/java/org/nd4j/descriptor/ParseOpFile.java +++ b/contrib/codegen-tools/libnd4j-gen/src/main/java/org/nd4j/descriptor/ParseOpFile.java @@ -1,18 +1,23 @@ -/******************************************************************************* - * Copyright (c) 2020 Konduit KK. - * - * This program and the accompanying materials are made available under the - * terms of the Apache License, Version 2.0 which is available at - * https://www.apache.org/licenses/LICENSE-2.0. - * - * Unless required by applicable law or agreed to in writing, software - * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * License for the specific language governing permissions and limitations - * under the License. 
- * - * SPDX-License-Identifier: Apache-2.0 - ******************************************************************************/ +/* + * ****************************************************************************** + * * + * * + * * This program and the accompanying materials are made available under the + * * terms of the Apache License, Version 2.0 which is available at + * * https://www.apache.org/licenses/LICENSE-2.0. + * * + * * See the NOTICE file distributed with this work for additional + * * information regarding copyright ownership. + * * Unless required by applicable law or agreed to in writing, software + * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT + * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the + * * License for the specific language governing permissions and limitations + * * under the License. + * * + * * SPDX-License-Identifier: Apache-2.0 + * ***************************************************************************** + */ + package org.nd4j.descriptor; import org.apache.commons.io.FileUtils; diff --git a/contrib/codegen-tools/libnd4j-gen/src/main/java/org/nd4j/descriptor/proposal/ArgDescriptorProposal.java b/contrib/codegen-tools/libnd4j-gen/src/main/java/org/nd4j/descriptor/proposal/ArgDescriptorProposal.java index e9b961d81..dce986568 100644 --- a/contrib/codegen-tools/libnd4j-gen/src/main/java/org/nd4j/descriptor/proposal/ArgDescriptorProposal.java +++ b/contrib/codegen-tools/libnd4j-gen/src/main/java/org/nd4j/descriptor/proposal/ArgDescriptorProposal.java @@ -1,3 +1,23 @@ +/* + * ****************************************************************************** + * * + * * + * * This program and the accompanying materials are made available under the + * * terms of the Apache License, Version 2.0 which is available at + * * https://www.apache.org/licenses/LICENSE-2.0. 
+ * * + * * See the NOTICE file distributed with this work for additional + * * information regarding copyright ownership. + * * Unless required by applicable law or agreed to in writing, software + * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT + * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the + * * License for the specific language governing permissions and limitations + * * under the License. + * * + * * SPDX-License-Identifier: Apache-2.0 + * ***************************************************************************** + */ + package org.nd4j.descriptor.proposal; import lombok.Builder; diff --git a/contrib/codegen-tools/libnd4j-gen/src/main/java/org/nd4j/descriptor/proposal/ArgDescriptorSource.java b/contrib/codegen-tools/libnd4j-gen/src/main/java/org/nd4j/descriptor/proposal/ArgDescriptorSource.java index 628b8537e..9b281c4df 100644 --- a/contrib/codegen-tools/libnd4j-gen/src/main/java/org/nd4j/descriptor/proposal/ArgDescriptorSource.java +++ b/contrib/codegen-tools/libnd4j-gen/src/main/java/org/nd4j/descriptor/proposal/ArgDescriptorSource.java @@ -1,3 +1,23 @@ +/* + * ****************************************************************************** + * * + * * + * * This program and the accompanying materials are made available under the + * * terms of the Apache License, Version 2.0 which is available at + * * https://www.apache.org/licenses/LICENSE-2.0. + * * + * * See the NOTICE file distributed with this work for additional + * * information regarding copyright ownership. + * * Unless required by applicable law or agreed to in writing, software + * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT + * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the + * * License for the specific language governing permissions and limitations + * * under the License. 
+ * * + * * SPDX-License-Identifier: Apache-2.0 + * ***************************************************************************** + */ + package org.nd4j.descriptor.proposal; import org.nd4j.ir.OpNamespace; diff --git a/contrib/codegen-tools/libnd4j-gen/src/main/java/org/nd4j/descriptor/proposal/impl/ArgDescriptorParserUtils.java b/contrib/codegen-tools/libnd4j-gen/src/main/java/org/nd4j/descriptor/proposal/impl/ArgDescriptorParserUtils.java index 9ff2b4d06..497549e02 100644 --- a/contrib/codegen-tools/libnd4j-gen/src/main/java/org/nd4j/descriptor/proposal/impl/ArgDescriptorParserUtils.java +++ b/contrib/codegen-tools/libnd4j-gen/src/main/java/org/nd4j/descriptor/proposal/impl/ArgDescriptorParserUtils.java @@ -1,3 +1,23 @@ +/* + * ****************************************************************************** + * * + * * + * * This program and the accompanying materials are made available under the + * * terms of the Apache License, Version 2.0 which is available at + * * https://www.apache.org/licenses/LICENSE-2.0. + * * + * * See the NOTICE file distributed with this work for additional + * * information regarding copyright ownership. + * * Unless required by applicable law or agreed to in writing, software + * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT + * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the + * * License for the specific language governing permissions and limitations + * * under the License. 
+ * * + * * SPDX-License-Identifier: Apache-2.0 + * ***************************************************************************** + */ + package org.nd4j.descriptor.proposal.impl; import com.github.javaparser.ast.expr.MethodCallExpr; diff --git a/contrib/codegen-tools/libnd4j-gen/src/main/java/org/nd4j/descriptor/proposal/impl/JavaSourceArgDescriptorSource.java b/contrib/codegen-tools/libnd4j-gen/src/main/java/org/nd4j/descriptor/proposal/impl/JavaSourceArgDescriptorSource.java index 1875ab70f..cd63578d2 100644 --- a/contrib/codegen-tools/libnd4j-gen/src/main/java/org/nd4j/descriptor/proposal/impl/JavaSourceArgDescriptorSource.java +++ b/contrib/codegen-tools/libnd4j-gen/src/main/java/org/nd4j/descriptor/proposal/impl/JavaSourceArgDescriptorSource.java @@ -1,3 +1,23 @@ +/* + * ****************************************************************************** + * * + * * + * * This program and the accompanying materials are made available under the + * * terms of the Apache License, Version 2.0 which is available at + * * https://www.apache.org/licenses/LICENSE-2.0. + * * + * * See the NOTICE file distributed with this work for additional + * * information regarding copyright ownership. + * * Unless required by applicable law or agreed to in writing, software + * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT + * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the + * * License for the specific language governing permissions and limitations + * * under the License. 
+ * * + * * SPDX-License-Identifier: Apache-2.0 + * ***************************************************************************** + */ + package org.nd4j.descriptor.proposal.impl; import com.github.javaparser.ParserConfiguration; diff --git a/contrib/codegen-tools/libnd4j-gen/src/main/java/org/nd4j/descriptor/proposal/impl/Libnd4jArgDescriptorSource.java b/contrib/codegen-tools/libnd4j-gen/src/main/java/org/nd4j/descriptor/proposal/impl/Libnd4jArgDescriptorSource.java index c20153def..b5b3f1060 100644 --- a/contrib/codegen-tools/libnd4j-gen/src/main/java/org/nd4j/descriptor/proposal/impl/Libnd4jArgDescriptorSource.java +++ b/contrib/codegen-tools/libnd4j-gen/src/main/java/org/nd4j/descriptor/proposal/impl/Libnd4jArgDescriptorSource.java @@ -1,3 +1,23 @@ +/* + * ****************************************************************************** + * * + * * + * * This program and the accompanying materials are made available under the + * * terms of the Apache License, Version 2.0 which is available at + * * https://www.apache.org/licenses/LICENSE-2.0. + * * + * * See the NOTICE file distributed with this work for additional + * * information regarding copyright ownership. + * * Unless required by applicable law or agreed to in writing, software + * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT + * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the + * * License for the specific language governing permissions and limitations + * * under the License. 
+ * * + * * SPDX-License-Identifier: Apache-2.0 + * ***************************************************************************** + */ + package org.nd4j.descriptor.proposal.impl; import lombok.Builder; diff --git a/contrib/deeplearning4j-nlp-uima/pom.xml b/contrib/deeplearning4j-nlp-uima/pom.xml deleted file mode 100644 index 3e3cc8f5f..000000000 --- a/contrib/deeplearning4j-nlp-uima/pom.xml +++ /dev/null @@ -1,103 +0,0 @@ - - - - - - 4.0.0 - - - org.deeplearning4j - deeplearning4j-nlp-parent - 1.0.0-SNAPSHOT - - - deeplearning4j-nlp-uima - - deeplearning4j-nlp-uima - - - 1.8 - 1.8 - - - - - org.cleartk - cleartk-snowball - ${cleartk.version} - - - org.cleartk - cleartk-opennlp-tools - ${cleartk.version} - - - org.deeplearning4j - deeplearning4j-nlp - ${project.version} - - - junit - junit - - - org.mockito - mockito-core - ${mockito.version} - test - - - ch.qos.logback - logback-classic - test - - - org.deeplearning4j - deeplearning4j-ui - ${project.version} - test - - - org.deeplearning4j - deeplearning4j-common-tests - ${project.version} - test - - - - org.springframework - spring-core - - - - - - - - test-nd4j-native - - - test-nd4j-cuda-11.0 - - - diff --git a/contrib/deeplearning4j-nlp-uima/src/main/java/org/deeplearning4j/nlp/uima/annotator/PoStagger.java b/contrib/deeplearning4j-nlp-uima/src/main/java/org/deeplearning4j/nlp/uima/annotator/PoStagger.java deleted file mode 100644 index 67143fda5..000000000 --- a/contrib/deeplearning4j-nlp-uima/src/main/java/org/deeplearning4j/nlp/uima/annotator/PoStagger.java +++ /dev/null @@ -1,238 +0,0 @@ -/* - * ****************************************************************************** - * * Copyright (c) 2021 Deeplearning4j Contributors - * * - * * This program and the accompanying materials are made available under the - * * terms of the Apache License, Version 2.0 which is available at - * * https://www.apache.org/licenses/LICENSE-2.0. 
- * * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * * License for the specific language governing permissions and limitations - * * under the License. - * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ - -package org.deeplearning4j.nlp.uima.annotator; - -import opennlp.tools.postag.POSModel; -import opennlp.tools.postag.POSTaggerME; -import opennlp.uima.postag.POSModelResource; -import opennlp.uima.postag.POSModelResourceImpl; -import opennlp.uima.util.AnnotationComboIterator; -import opennlp.uima.util.AnnotationIteratorPair; -import opennlp.uima.util.AnnotatorUtil; -import opennlp.uima.util.UimaUtil; -import org.apache.uima.UimaContext; -import org.apache.uima.analysis_engine.AnalysisEngineDescription; -import org.apache.uima.analysis_engine.AnalysisEngineProcessException; -import org.apache.uima.cas.CAS; -import org.apache.uima.cas.Feature; -import org.apache.uima.cas.Type; -import org.apache.uima.cas.TypeSystem; -import org.apache.uima.cas.text.AnnotationFS; -import org.apache.uima.fit.component.CasAnnotator_ImplBase; -import org.apache.uima.fit.factory.AnalysisEngineFactory; -import org.apache.uima.fit.factory.ExternalResourceFactory; -import org.apache.uima.resource.ResourceAccessException; -import org.apache.uima.resource.ResourceInitializationException; -import org.apache.uima.util.Level; -import org.apache.uima.util.Logger; -import org.cleartk.token.type.Sentence; -import org.cleartk.token.type.Token; -import org.deeplearning4j.text.movingwindow.Util; - -import java.util.Iterator; -import java.util.LinkedList; -import java.util.List; - - - -public class PoStagger extends CasAnnotator_ImplBase { - - static { - //UIMA logging - Util.disableLogging(); - } - - private POSTaggerME posTagger; 
- - private Type sentenceType; - - private Type tokenType; - - private Feature posFeature; - - private Feature probabilityFeature; - - private UimaContext context; - - private Logger logger; - - /** - * Initializes a new instance. - * - * Note: Use {@link #initialize(UimaContext) } to initialize this instance. Not use the - * constructor. - */ - public PoStagger() { - // must not be implemented ! - } - - /** - * Initializes the current instance with the given context. - * - * Note: Do all initialization in this method, do not use the constructor. - */ - @Override - public void initialize(UimaContext context) throws ResourceInitializationException { - - super.initialize(context); - - this.context = context; - - this.logger = context.getLogger(); - - if (this.logger.isLoggable(Level.INFO)) { - this.logger.log(Level.INFO, "Initializing the OpenNLP " + "Part of Speech annotator."); - } - - POSModel model; - - try { - POSModelResource modelResource = (POSModelResource) context.getResourceObject(UimaUtil.MODEL_PARAMETER); - - model = modelResource.getModel(); - } catch (ResourceAccessException e) { - throw new ResourceInitializationException(e); - } - - Integer beamSize = AnnotatorUtil.getOptionalIntegerParameter(context, UimaUtil.BEAM_SIZE_PARAMETER); - - if (beamSize == null) - beamSize = POSTaggerME.DEFAULT_BEAM_SIZE; - - this.posTagger = new POSTaggerME(model, beamSize, 0); - } - - /** - * Initializes the type system. 
*/ - @Override - public void typeSystemInit(TypeSystem typeSystem) throws AnalysisEngineProcessException { - - // sentence type - this.sentenceType = AnnotatorUtil.getRequiredTypeParameter(this.context, typeSystem, - UimaUtil.SENTENCE_TYPE_PARAMETER); - - // token type - this.tokenType = AnnotatorUtil.getRequiredTypeParameter(this.context, typeSystem, - UimaUtil.TOKEN_TYPE_PARAMETER); - - // pos feature - this.posFeature = AnnotatorUtil.getRequiredFeatureParameter(this.context, this.tokenType, - UimaUtil.POS_FEATURE_PARAMETER, CAS.TYPE_NAME_STRING); - - this.probabilityFeature = AnnotatorUtil.getOptionalFeatureParameter(this.context, this.tokenType, - UimaUtil.PROBABILITY_FEATURE_PARAMETER, CAS.TYPE_NAME_DOUBLE); - } - - /** - * Performs pos-tagging on the given tcas object. - */ - @Override - public synchronized void process(CAS tcas) { - - final AnnotationComboIterator comboIterator = - new AnnotationComboIterator(tcas, this.sentenceType, this.tokenType); - - for (AnnotationIteratorPair annotationIteratorPair : comboIterator) { - - final List<AnnotationFS> sentenceTokenAnnotationList = new LinkedList<>(); - - final List<String> sentenceTokenList = new LinkedList<>(); - - for (AnnotationFS tokenAnnotation : annotationIteratorPair.getSubIterator()) { - - sentenceTokenAnnotationList.add(tokenAnnotation); - - sentenceTokenList.add(tokenAnnotation.getCoveredText()); - } - - final List<String> posTags = this.posTagger.tag(sentenceTokenList); - - double posProbabilities[] = null; - - if (this.probabilityFeature != null) { - posProbabilities = this.posTagger.probs(); - } - - final Iterator<String> posTagIterator = posTags.iterator(); - final Iterator<AnnotationFS> sentenceTokenIterator = sentenceTokenAnnotationList.iterator(); - - int index = 0; - while (posTagIterator.hasNext() && sentenceTokenIterator.hasNext()) { - final String posTag = posTagIterator.next(); - final AnnotationFS tokenAnnotation = sentenceTokenIterator.next(); - - tokenAnnotation.setStringValue(this.posFeature, posTag); - - if (posProbabilities != null) { - tokenAnnotation.setDoubleValue(this.posFeature, posProbabilities[index]); - } - - index++; - } - - // log tokens with pos - if (this.logger.isLoggable(Level.FINER)) { - - final StringBuilder sentenceWithPos = new StringBuilder(); - - sentenceWithPos.append("\""); - - for (final Iterator<AnnotationFS> it = sentenceTokenAnnotationList.iterator(); it.hasNext();) { - final AnnotationFS token = it.next(); - sentenceWithPos.append(token.getCoveredText()); - sentenceWithPos.append('\\'); - sentenceWithPos.append(token.getStringValue(this.posFeature)); - sentenceWithPos.append(' '); - } - - // delete last whitespace - if (sentenceWithPos.length() > 1) // not 0 because it contains already the " char - sentenceWithPos.setLength(sentenceWithPos.length() - 1); - - sentenceWithPos.append("\""); - - this.logger.log(Level.FINER, sentenceWithPos.toString()); - } - } - } - - /** - * Releases allocated resources. - */ - @Override - public void destroy() { - this.posTagger = null; - } - - - public static AnalysisEngineDescription getDescription(String languageCode) throws ResourceInitializationException { - String modelPath = String.format("/models/%s-pos-maxent.bin", languageCode); - return AnalysisEngineFactory.createEngineDescription(PoStagger.class, - opennlp.uima.util.UimaUtil.MODEL_PARAMETER, - ExternalResourceFactory.createExternalResourceDescription(POSModelResourceImpl.class, - PoStagger.class.getResource(modelPath).toString()), - opennlp.uima.util.UimaUtil.SENTENCE_TYPE_PARAMETER, Sentence.class.getName(), - opennlp.uima.util.UimaUtil.TOKEN_TYPE_PARAMETER, Token.class.getName(), - opennlp.uima.util.UimaUtil.POS_FEATURE_PARAMETER, "pos"); - } - - - -} diff --git a/contrib/deeplearning4j-nlp-uima/src/main/java/org/deeplearning4j/nlp/uima/annotator/SentenceAnnotator.java b/contrib/deeplearning4j-nlp-uima/src/main/java/org/deeplearning4j/nlp/uima/annotator/SentenceAnnotator.java deleted file mode 100644 index 9922470e1..000000000 ---
a/contrib/deeplearning4j-nlp-uima/src/main/java/org/deeplearning4j/nlp/uima/annotator/SentenceAnnotator.java +++ /dev/null @@ -1,50 +0,0 @@ -/* - * ****************************************************************************** - * * Copyright (c) 2021 Deeplearning4j Contributors - * * - * * This program and the accompanying materials are made available under the - * * terms of the Apache License, Version 2.0 which is available at - * * https://www.apache.org/licenses/LICENSE-2.0. - * * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * * License for the specific language governing permissions and limitations - * * under the License. - * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ - -package org.deeplearning4j.nlp.uima.annotator; - -import org.apache.uima.analysis_engine.AnalysisEngineDescription; -import org.apache.uima.analysis_engine.AnalysisEngineProcessException; -import org.apache.uima.fit.factory.AnalysisEngineFactory; -import org.apache.uima.jcas.JCas; -import org.apache.uima.resource.ResourceInitializationException; -import org.cleartk.util.ParamUtil; -import org.deeplearning4j.text.movingwindow.Util; - -public class SentenceAnnotator extends org.cleartk.opennlp.tools.SentenceAnnotator { - - static { - //UIMA logging - Util.disableLogging(); - } - - public static AnalysisEngineDescription getDescription() throws ResourceInitializationException { - return AnalysisEngineFactory.createPrimitiveDescription(SentenceAnnotator.class, PARAM_SENTENCE_MODEL_PATH, - ParamUtil.getParameterValue(PARAM_SENTENCE_MODEL_PATH, "/models/en-sent.bin"), - PARAM_WINDOW_CLASS_NAMES, ParamUtil.getParameterValue(PARAM_WINDOW_CLASS_NAMES, null)); - } - - - @Override - public synchronized void process(JCas jCas) 
throws AnalysisEngineProcessException { - super.process(jCas); - } - - - -} diff --git a/contrib/deeplearning4j-nlp-uima/src/main/java/org/deeplearning4j/nlp/uima/annotator/StemmerAnnotator.java b/contrib/deeplearning4j-nlp-uima/src/main/java/org/deeplearning4j/nlp/uima/annotator/StemmerAnnotator.java deleted file mode 100644 index ea5654339..000000000 --- a/contrib/deeplearning4j-nlp-uima/src/main/java/org/deeplearning4j/nlp/uima/annotator/StemmerAnnotator.java +++ /dev/null @@ -1,56 +0,0 @@ -/* - * ****************************************************************************** - * * Copyright (c) 2021 Deeplearning4j Contributors - * * - * * This program and the accompanying materials are made available under the - * * terms of the Apache License, Version 2.0 which is available at - * * https://www.apache.org/licenses/LICENSE-2.0. - * * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * * License for the specific language governing permissions and limitations - * * under the License. 
- * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ - -package org.deeplearning4j.nlp.uima.annotator; - -import org.apache.uima.analysis_engine.AnalysisEngineDescription; -import org.apache.uima.analysis_engine.AnalysisEngineProcessException; -import org.apache.uima.fit.factory.AnalysisEngineFactory; -import org.apache.uima.jcas.JCas; -import org.apache.uima.resource.ResourceInitializationException; -import org.cleartk.snowball.SnowballStemmer; -import org.cleartk.token.type.Token; - - -public class StemmerAnnotator extends SnowballStemmer { - - public static AnalysisEngineDescription getDescription() throws ResourceInitializationException { - return getDescription("English"); - } - - - public static AnalysisEngineDescription getDescription(String language) throws ResourceInitializationException { - return AnalysisEngineFactory.createPrimitiveDescription(StemmerAnnotator.class, - SnowballStemmer.PARAM_STEMMER_NAME, language); - } - - - @SuppressWarnings("unchecked") - @Override - public synchronized void process(JCas jCas) throws AnalysisEngineProcessException { - super.process(jCas); - } - - - - @Override - public void setStem(Token token, String stem) { - token.setStem(stem); - } - -} diff --git a/contrib/deeplearning4j-nlp-uima/src/main/java/org/deeplearning4j/nlp/uima/annotator/TokenizerAnnotator.java b/contrib/deeplearning4j-nlp-uima/src/main/java/org/deeplearning4j/nlp/uima/annotator/TokenizerAnnotator.java deleted file mode 100644 index 49a83b6be..000000000 --- a/contrib/deeplearning4j-nlp-uima/src/main/java/org/deeplearning4j/nlp/uima/annotator/TokenizerAnnotator.java +++ /dev/null @@ -1,67 +0,0 @@ -/* - * ****************************************************************************** - * * Copyright (c) 2021 Deeplearning4j Contributors - * * - * * This program and the accompanying materials are made available under the - * * terms of the Apache License, Version 2.0 which is 
available at - * * https://www.apache.org/licenses/LICENSE-2.0. - * * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * * License for the specific language governing permissions and limitations - * * under the License. - * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ - -package org.deeplearning4j.nlp.uima.annotator; - -import opennlp.uima.tokenize.TokenizerModelResourceImpl; -import org.apache.uima.analysis_engine.AnalysisEngineDescription; -import org.apache.uima.fit.factory.AnalysisEngineFactory; -import org.apache.uima.fit.factory.ExternalResourceFactory; -import org.apache.uima.resource.ResourceInitializationException; -import org.cleartk.opennlp.tools.Tokenizer; -import org.cleartk.token.type.Sentence; -import org.cleartk.token.type.Token; -import org.deeplearning4j.nlp.uima.tokenization.tokenizer.ConcurrentTokenizer; -import org.deeplearning4j.text.movingwindow.Util; - -import static org.apache.uima.fit.factory.AnalysisEngineFactory.createEngineDescription; - - -/** - * Overrides OpenNLP tokenizer to be thread safe - */ -public class TokenizerAnnotator extends Tokenizer { - - static { - //UIMA logging - Util.disableLogging(); - } - - public static AnalysisEngineDescription getDescription(String languageCode) throws ResourceInitializationException { - String modelPath = String.format("/models/%s-token.bin", languageCode); - return AnalysisEngineFactory.createEngineDescription(ConcurrentTokenizer.class, opennlp.uima.util.UimaUtil.MODEL_PARAMETER, - ExternalResourceFactory.createExternalResourceDescription(TokenizerModelResourceImpl.class, - ConcurrentTokenizer.class.getResource(modelPath).toString()), - opennlp.uima.util.UimaUtil.SENTENCE_TYPE_PARAMETER, Sentence.class.getName(), - 
opennlp.uima.util.UimaUtil.TOKEN_TYPE_PARAMETER, Token.class.getName()); - } - - - - public static AnalysisEngineDescription getDescription() throws ResourceInitializationException { - String modelPath = String.format("/models/%s-token.bin", "en"); - return createEngineDescription(ConcurrentTokenizer.class, opennlp.uima.util.UimaUtil.MODEL_PARAMETER, - ExternalResourceFactory.createExternalResourceDescription(TokenizerModelResourceImpl.class, - ConcurrentTokenizer.class.getResource(modelPath).toString()), - opennlp.uima.util.UimaUtil.SENTENCE_TYPE_PARAMETER, Sentence.class.getName(), - opennlp.uima.util.UimaUtil.TOKEN_TYPE_PARAMETER, Token.class.getName()); - } - - - -} diff --git a/contrib/deeplearning4j-nlp-uima/src/main/java/org/deeplearning4j/nlp/uima/corpora/sentiwordnet/SWN3.java b/contrib/deeplearning4j-nlp-uima/src/main/java/org/deeplearning4j/nlp/uima/corpora/sentiwordnet/SWN3.java deleted file mode 100644 index 2c41fbde5..000000000 --- a/contrib/deeplearning4j-nlp-uima/src/main/java/org/deeplearning4j/nlp/uima/corpora/sentiwordnet/SWN3.java +++ /dev/null @@ -1,250 +0,0 @@ -/* - * ****************************************************************************** - * * Copyright (c) 2021 Deeplearning4j Contributors - * * - * * This program and the accompanying materials are made available under the - * * terms of the Apache License, Version 2.0 which is available at - * * https://www.apache.org/licenses/LICENSE-2.0. - * * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * * License for the specific language governing permissions and limitations - * * under the License. 
- * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ - -package org.deeplearning4j.nlp.uima.corpora.sentiwordnet; - -import lombok.extern.slf4j.Slf4j; -import org.deeplearning4j.nlp.uima.tokenization.tokenizerfactory.UimaTokenizerFactory; -import org.nd4j.shade.guava.collect.Sets; -import org.apache.uima.analysis_engine.AnalysisEngine; -import org.apache.uima.cas.CAS; -import org.apache.uima.cas.CASException; -import org.apache.uima.fit.util.JCasUtil; -import org.cleartk.token.type.Sentence; -import org.cleartk.token.type.Token; -import org.nd4j.common.io.ClassPathResource; - -import java.io.BufferedReader; -import java.io.IOException; -import java.io.InputStreamReader; -import java.io.Serializable; -import java.util.*; - -/** - * Based on SentiWordnet - * @author Adam Gibson - * - */ -@Slf4j -public class SWN3 implements Serializable { - /** - * - */ - private static final long serialVersionUID = -2614454572930777658L; - private HashMap<String, Double> _dict; - private Set<String> negationWords = Sets.newHashSet("could", "would", "should", "not", "isn't", "aren't", "wasn't", - "weren't", "haven't", "doesn't", "didn't", "don't"); - private AnalysisEngine analysisEngine; - - public SWN3() throws Exception { - this(UimaTokenizerFactory.defaultAnalysisEngine()); - } - - public SWN3(AnalysisEngine analysisEngine) { - this("/sentiment/sentiwordnet.txt"); - this.analysisEngine = analysisEngine; - } - - public SWN3(String sentiWordNetPath) { - - _dict = new HashMap<>(); - HashMap<String, List<Double>> _temp = new HashMap<>(); - - ClassPathResource resource = new ClassPathResource(sentiWordNetPath); - BufferedReader csv = null; - try { - csv = new BufferedReader(new InputStreamReader(resource.getInputStream())); - String line = ""; - while ((line = csv.readLine()) != null) { - if (line.isEmpty()) - continue; - String[] data = line.split("\t"); - - if (data[2].isEmpty() || data[3].isEmpty()) - continue; - Double score = Double.parseDouble(data[2]) - Double.parseDouble(data[3]); - String[] words = data[4].split(" "); - for (String w : words) { - if (w.isEmpty()) - continue; - - String[] w_n = w.split("#"); - w_n[0] += "#" + data[0]; - int index = Integer.parseInt(w_n[1]) - 1; - if (_temp.containsKey(w_n[0])) { - List<Double> l = _temp.get(w_n[0]); - if (index > l.size()) - for (int i = l.size(); i < index; i++) - l.add(0.0); - l.add(index, score); - _temp.put(w_n[0], l); - } else { - List<Double> l = new ArrayList<>(); - for (int i = 0; i < index; i++) - l.add(0.0); - l.add(index, score); - _temp.put(w_n[0], l); - } - } - } - - - Set<String> temp = _temp.keySet(); - for (Iterator<String> iterator = temp.iterator(); iterator.hasNext();) { - String word = iterator.next(); - List<Double> l = _temp.get(word); - double score = 0.0; - double sum = 0.0; - for (int i = 0; i < l.size(); i++) - score += ((double) 1 / (double) (i + 1)) * l.get(i); - for (int i = 1; i <= l.size(); i++) - sum += (double) 1 / (double) i; - score /= sum; - _dict.put(word, score); - } - } catch (Exception e) { - throw new RuntimeException(e); - } finally { - if (csv != null) { - try { - csv.close(); - } catch (IOException e) { - log.error("",e); - } - } - } - } - - - /** - * Classifies the given text - * @param text the text to classify - * @return the classification for the text - * @throws Exception - */ - public String classify(String text) throws Exception { - return this.classForScore(score(text)); - } - - /** - * Scores the text - * @param words the text to score - * @return the score (polarity) for the text - * @throws Exception - */ - public double score(String words) throws Exception { - CAS cas = analysisEngine.newCAS(); - cas.setDocumentText(words); - analysisEngine.process(cas); - return score(cas); - } - - - public String classForScore(Double score) { - String sent = "neutral"; - if (score >= 0.75) - sent = "strong_positive"; - else if (score > 0.25 && score <= 0.5) - sent = "positive"; - else if (score > 0 && score >= 0.25) - sent = "weak_positive"; - else if (score < 0 && score >= -0.25) - sent = "weak_negative"; - else if (score < -0.25 && score >= -0.5) - sent = "negative"; - else if (score <= -0.75) - sent = "strong_negative"; - return sent; - } - - - public String classify(CAS cas) throws CASException { - return classForScore(score(cas)); - } - - - - public double scoreTokens(List<Token> tokens) { - double totalScore = 0.0; - Set<String> negativeWords = new HashSet<>(); - double scoreForSentence = 0.0; - for (Token token : tokens) { - scoreForSentence += extract(token.getCoveredText().toLowerCase()); - if (negationWords.contains(token.getCoveredText())) { - negativeWords.add(token.getCoveredText()); - } - } - //flip for context - if (!negativeWords.isEmpty()) { - scoreForSentence *= -1.0; - } - - totalScore += scoreForSentence; - return totalScore; - } - - - - public double score(CAS cas) throws CASException { - double totalScore = 0.0; - for (Sentence sentence : JCasUtil.select(cas.getJCas(), Sentence.class)) { - totalScore += scoreTokens(JCasUtil.selectCovered(Token.class, sentence)); - } - - return totalScore; - } - - - public String classify(Sentence sentence) { - double totalScore = 0.0; - for (Token token : JCasUtil.selectCovered(Token.class, sentence)) { - totalScore += extract(token.getCoveredText().toLowerCase()); - } - return classForScore(totalScore); - } - - - public double score(Sentence sentence) { - double totalScore = 0.0; - for (Token token : JCasUtil.selectCovered(Token.class, sentence)) { - totalScore += extract(token.getCoveredText().toLowerCase()); - } - return totalScore; - } - - - public Double extract(String word) { - double total = 0.0; - if (_dict.get(word + "#n") != null) - total = _dict.get(word + "#n") + total; - if (_dict.get(word + "#a") != null) - total = _dict.get(word + "#a") + total; - if (_dict.get(word + "#r") != null) - total = _dict.get(word + "#r") + total; - if (_dict.get(word + "#v") != null) - total = _dict.get(word + "#v") + total; - return total; - } - -
public static void main(String[] args) { - SWN3 swn = new SWN3("/sentiment/sentiwordnet.txt"); - System.out.println(swn.classForScore(swn.extract("sad"))); - - } -} diff --git a/contrib/deeplearning4j-nlp-uima/src/main/java/org/deeplearning4j/nlp/uima/corpora/treeparser/BinarizeTreeTransformer.java b/contrib/deeplearning4j-nlp-uima/src/main/java/org/deeplearning4j/nlp/uima/corpora/treeparser/BinarizeTreeTransformer.java deleted file mode 100644 index d848909a7..000000000 --- a/contrib/deeplearning4j-nlp-uima/src/main/java/org/deeplearning4j/nlp/uima/corpora/treeparser/BinarizeTreeTransformer.java +++ /dev/null @@ -1,150 +0,0 @@ -/* - * ****************************************************************************** - * * Copyright (c) 2021 Deeplearning4j Contributors - * * - * * This program and the accompanying materials are made available under the - * * terms of the Apache License, Version 2.0 which is available at - * * https://www.apache.org/licenses/LICENSE-2.0. - * * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * * License for the specific language governing permissions and limitations - * * under the License. - * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ - -package org.deeplearning4j.nlp.uima.corpora.treeparser; - -import org.apache.commons.lang3.StringUtils; -import org.deeplearning4j.nlp.uima.corpora.treeparser.transformer.TreeTransformer; -import org.deeplearning4j.nn.layers.feedforward.autoencoder.recursive.Tree; -import org.nd4j.common.primitives.Pair; -import org.slf4j.Logger; -import org.slf4j.LoggerFactory; - -import java.util.*; - -/** - * Binarizes trees. - * Based on the work by Manning et. 
al in stanford corenlp - * - * @author Adam Gibson - */ -public class BinarizeTreeTransformer implements TreeTransformer { - - private String factor = "left"; - private int horizontonalMarkov = 999; - - - - private static final Logger log = LoggerFactory.getLogger(BinarizeTreeTransformer.class); - - @Override - public Tree transform(Tree t) { - if (t == null) - return null; - Deque> stack = new ArrayDeque<>(); - stack.add(new Pair<>(t, t.label())); - String originalLabel = t.label(); - while (!stack.isEmpty()) { - Pair curr = stack.pop(); - Tree node = curr.getFirst(); - - for (Tree child : node.children()) - stack.add(new Pair<>(child, curr.getSecond())); - - - if (node.children().size() > 2) { - - List children = new ArrayList<>(); - for (int i = 0; i < node.children().size(); i++) - children.add(node.children().get(i).label()); - - Tree copy = node.clone(); - //clear out children - node.children().clear(); - - Tree currNode = node; - - for (int i = 1; i < children.size() - 1; i++) { - if (factor.equals("right")) { - Tree newNode = new Tree(currNode); - - List subChildren = - children.subList(i, Math.min(i + horizontonalMarkov, children.size())); - - newNode.setLabel(originalLabel + "-" + "(" + StringUtils.join(subChildren, "-")); - - newNode.setParent(currNode); - - currNode.children().add(copy.children().remove(0)); - - currNode.firstChild().setParent(currNode); - - currNode.children().add(newNode); - - currNode = newNode; - - } else { - Tree newNode = new Tree(currNode); - - newNode.setParent(copy.firstChild()); - - List childLabels = - children.subList(Math.max(children.size() - i - horizontonalMarkov, 0), i); - - Collections.reverse(childLabels); - newNode.setLabel(originalLabel + "-" + "(" + StringUtils.join(childLabels, "-")); - - currNode.children().add(newNode); - - currNode.firstChild().setParent(currNode); - - currNode.children().add(copy.children().remove(copy.children().size() - 1)); - currNode.lastChild().setParent(currNode); - - currNode = newNode; 
- } - } - - currNode.children().addAll(new ArrayList<>(copy.children())); - } - } - - addPreTerminal(t); - return t; - } - - private void addPreTerminal(Tree t) { - if (t.isLeaf()) { - Tree newLeaf = new Tree(t); - newLeaf.setLabel(t.value()); - t.children().add(newLeaf); - newLeaf.setParent(t); - } else { - for (Tree child : t.children()) - addPreTerminal(child); - } - } - - - private void checkState(Tree tree, Set nonBinarized) { - for (Tree t : tree.children()) { - checkState(t, nonBinarized); - } - - if (tree.children().size() > 2) { - Tree parent = tree.parent(); - if (parent == null) - return; - nonBinarized.add(tree); - - } - } - - - -} diff --git a/contrib/deeplearning4j-nlp-uima/src/main/java/org/deeplearning4j/nlp/uima/corpora/treeparser/CollapseUnaries.java b/contrib/deeplearning4j-nlp-uima/src/main/java/org/deeplearning4j/nlp/uima/corpora/treeparser/CollapseUnaries.java deleted file mode 100644 index 8f1aaf830..000000000 --- a/contrib/deeplearning4j-nlp-uima/src/main/java/org/deeplearning4j/nlp/uima/corpora/treeparser/CollapseUnaries.java +++ /dev/null @@ -1,59 +0,0 @@ -/* - * ****************************************************************************** - * * Copyright (c) 2021 Deeplearning4j Contributors - * * - * * This program and the accompanying materials are made available under the - * * terms of the Apache License, Version 2.0 which is available at - * * https://www.apache.org/licenses/LICENSE-2.0. - * * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * * License for the specific language governing permissions and limitations - * * under the License. 
- * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ - -package org.deeplearning4j.nlp.uima.corpora.treeparser; - -import org.deeplearning4j.nlp.uima.corpora.treeparser.transformer.TreeTransformer; -import org.deeplearning4j.nn.layers.feedforward.autoencoder.recursive.Tree; - -import java.util.ArrayList; -import java.util.List; - -/** - * Collapse unaries such that the - * tree is only made of preterminals and leaves. - * - * @author Adam Gibson - */ -public class CollapseUnaries implements TreeTransformer { - - - @Override - public Tree transform(Tree tree) { - if (tree.isPreTerminal() || tree.isLeaf()) { - return tree; - } - - List<Tree> children = tree.children(); - while (children.size() == 1 && !children.get(0).isLeaf()) { - children = children.get(0).children(); - } - - List<Tree> processed = new ArrayList<>(); - for (Tree child : children) - processed.add(transform(child)); - - Tree ret = new Tree(tree); - ret.connect(processed); - - return ret; - } - - - -} diff --git a/contrib/deeplearning4j-nlp-uima/src/main/java/org/deeplearning4j/nlp/uima/corpora/treeparser/HeadWordFinder.java b/contrib/deeplearning4j-nlp-uima/src/main/java/org/deeplearning4j/nlp/uima/corpora/treeparser/HeadWordFinder.java deleted file mode 100644 index 01ba8feaf..000000000 --- a/contrib/deeplearning4j-nlp-uima/src/main/java/org/deeplearning4j/nlp/uima/corpora/treeparser/HeadWordFinder.java +++ /dev/null @@ -1,160 +0,0 @@ -/* - * ****************************************************************************** - * * Copyright (c) 2021 Deeplearning4j Contributors - * * - * * This program and the accompanying materials are made available under the - * * terms of the Apache License, Version 2.0 which is available at - * * https://www.apache.org/licenses/LICENSE-2.0.
- * * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * * License for the specific language governing permissions and limitations - * * under the License. - * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ - -package org.deeplearning4j.nlp.uima.corpora.treeparser; - -import org.deeplearning4j.nn.layers.feedforward.autoencoder.recursive.Tree; - -import java.util.*; - -public class HeadWordFinder { - - static final String[] head1 = {"ADJP JJ", "ADJP JJR", "ADJP JJS", "ADVP RB", "ADVP RBB", "LST LS", "NAC NNS", - "NAC NN", "NAC PRP", "NAC NNPS", "NAC NNP", "NX NNS", "NX NN", "NX PRP", "NX NNPS", "NX NNP", - "NP NNS", "NP NN", "NP PRP", "NP NNPS", "NP NNP", "NP POS", "NP $", "PP IN", "PP TO", "PP RP", - "PRT RP", "S VP", "S1 S", "SBAR IN", "SBAR WHNP", "SBARQ SQ", "SBARQ VP", "SINV VP", "SQ MD", - "SQ AUX", "VP VB", "VP VBZ", "VP VBP", "VP VBG", "VP VBN", "VP VBD", "VP AUX", "VP AUXG", "VP TO", - "VP MD", "WHADJP WRB", "WHADVP WRB", "WHNP WP", "WHNP WDT", "WHNP WP$", "WHPP IN", "WHPP TO"}; - - static final String[] head2 = {"ADJP VBN", "ADJP RB", "NAC NP", "NAC CD", "NAC FW", "NAC ADJP", "NAC JJ", "NX NP", - "NX CD", "NX FW", "NX ADJP", "NX JJ", "NP CD", "NP ADJP", "NP JJ", "S SINV", "S SBARQ", "S X", - "PRT RB", "PRT IN", "SBAR WHADJP", "SBAR WHADVP", "SBAR WHPP", "SBARQ S", "SBARQ SINV", "SBARQ X", - "SINV SBAR", "SQ VP"}; - - static final String[] term = {"AUX", "AUXG", "CC", "CD", "DT", "EX", "FW", "IN", "JJ", "JJR", "JJS", "LS", "MD", - "NN", "NNS", "NNP", "NNPS", "PDT", "POS", "PRP", "PRP$", "RB", "RBR", "RBS", "RP", "SYM", "TO", - "UH", "VB", "VBD", "VBG", "VBN", "VBP", "VBZ", "WDT", "WP", "WP$", "WRB", "#", "$", ".", ",", ":", - "-RRB-", "-LRB-", "``", "''", "EOS"}; - - static final String[] 
punc = {"#", "$", ".", ",", ":", "-RRB-", "-LRB-", "``", "''"}; - - static Set<String> headRules1; - - static Set<String> headRules2; - - static Set<String> terminals; - - static Set<String> punctuations; - - static Map<String, Integer> cache; - - static Boolean setsInitialized = false; - - static void buildSets() { - synchronized (setsInitialized) { - if (setsInitialized) - return; - HeadWordFinder.headRules1 = new HashSet<>(Arrays.asList(HeadWordFinder.head1)); - HeadWordFinder.headRules2 = new HashSet<>(Arrays.asList(HeadWordFinder.head2)); - HeadWordFinder.terminals = new HashSet<>(Arrays.asList(HeadWordFinder.term)); - HeadWordFinder.punctuations = new HashSet<>(Arrays.asList(HeadWordFinder.punc)); - HeadWordFinder.cache = new HashMap<>(); - setsInitialized = true; - } - } - - - boolean includePPHead; - - public HeadWordFinder(boolean includePPHead) { - this.includePPHead = includePPHead; - HeadWordFinder.buildSets(); - } - - public HeadWordFinder() { - this(false); - } - - - /** - * Finds the bottom most head - * @param parentNode the bottom most head - * @return the bottom most head (no children) for the given parent - */ - public Tree findHead(Tree parentNode) { - Tree cursor = parentNode.getType().equals("TOP") ?
parentNode.firstChild() : parentNode; - - while (cursor.children() != null && !cursor.children().isEmpty()) - cursor = findHead2(cursor); - - return cursor; - } - - public Tree findHead2(Tree parentNode) { - List<Tree> childNodes = parentNode.children(); - List<String> childTypes = new ArrayList<>(childNodes.size()); - - String parentType = parentNode.getType(); - - for (Tree childNode : childNodes) - childTypes.add(childNode.getType()); - - int headIndex = findHead3(parentType, childTypes); - - return childNodes.get(headIndex); - } - - int findHead3(String lhs, List<String> rhss) { - StringBuilder keyBuffer = new StringBuilder(lhs + " ->"); - for (String rhs : rhss) - keyBuffer.append(" " + rhs); - String key = keyBuffer.toString(); - - synchronized (HeadWordFinder.cache) { - if (cache.containsKey(key)) { - return cache.get(key); - } - } - - int currentBestGuess = -1; - int currentGuessUncertainty = 10; - - for (int current = 0; current < rhss.size(); current++) { - String rhs = rhss.get(current); - String rule = lhs + " " + rhs; - - if (currentGuessUncertainty >= 1 && headRules1.contains(rule)) { - currentBestGuess = current; - currentGuessUncertainty = 1; - } else if (currentGuessUncertainty > 2 && lhs != null && lhs.equals(rhs)) { - currentBestGuess = current; - currentGuessUncertainty = 2; - } else if (currentGuessUncertainty >= 3 && headRules2.contains(rule)) { - currentBestGuess = current; - currentGuessUncertainty = 3; - } else if (currentGuessUncertainty >= 5 && !terminals.contains(rhs) && rhs != null && !rhs.equals("PP")) { - currentBestGuess = current; - currentGuessUncertainty = 5; - } else if (currentGuessUncertainty >= 6 && !terminals.contains(rhs)) { - currentBestGuess = current; - currentGuessUncertainty = 6; - } else if (currentGuessUncertainty >= 7) { - currentBestGuess = current; - currentGuessUncertainty = 7; - } - } - - synchronized (HeadWordFinder.cache) { - cache.put(key, currentBestGuess); - } - - return currentBestGuess; - } - - -} diff --git
a/contrib/deeplearning4j-nlp-uima/src/main/java/org/deeplearning4j/nlp/uima/corpora/treeparser/TreeFactory.java b/contrib/deeplearning4j-nlp-uima/src/main/java/org/deeplearning4j/nlp/uima/corpora/treeparser/TreeFactory.java deleted file mode 100644 index 0d5c6e4c8..000000000 --- a/contrib/deeplearning4j-nlp-uima/src/main/java/org/deeplearning4j/nlp/uima/corpora/treeparser/TreeFactory.java +++ /dev/null @@ -1,187 +0,0 @@ -/* - * ****************************************************************************** - * * Copyright (c) 2021 Deeplearning4j Contributors - * * - * * This program and the accompanying materials are made available under the - * * terms of the Apache License, Version 2.0 which is available at - * * https://www.apache.org/licenses/LICENSE-2.0. - * * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * * License for the specific language governing permissions and limitations - * * under the License. 
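`HeadWordFinder` above picks a head child by scanning prioritized rule tables (`head1`, then an exact category match, then `head2`, with non-terminal fallbacks). A reduced sketch of that priority scheme with a toy rule set — it keeps the first match per priority level, a simplification of the original's rightmost-wins scan, and `PRIMARY`/`SECONDARY` are illustrative subsets:

```java
import java.util.Arrays;
import java.util.HashSet;
import java.util.List;
import java.util.Set;

public class HeadRuleDemo {
    // Toy subset of the "parent child" head rules; the real class uses much larger tables.
    static final Set<String> PRIMARY = new HashSet<>(Arrays.asList("NP NN", "VP VB", "PP IN"));
    static final Set<String> SECONDARY = new HashSet<>(Arrays.asList("NP JJ", "NP CD"));

    // Index of the most plausible head child: primary rules beat an exact
    // category match, which beats secondary rules; otherwise the last child.
    static int findHead(String parent, List<String> children) {
        int best = children.size() - 1; // fallback: last child
        int certainty = 10;             // lower is more certain
        for (int i = 0; i < children.size(); i++) {
            String rule = parent + " " + children.get(i);
            if (certainty > 1 && PRIMARY.contains(rule)) {
                best = i;
                certainty = 1;
            } else if (certainty > 2 && parent.equals(children.get(i))) {
                best = i;
                certainty = 2;
            } else if (certainty > 3 && SECONDARY.contains(rule)) {
                best = i;
                certainty = 3;
            }
        }
        return best;
    }

    public static void main(String[] args) {
        // "NP NN" is a primary rule, so the NN child wins over DT and JJ.
        System.out.println(findHead("NP", Arrays.asList("DT", "JJ", "NN"))); // prints 2
    }
}
```

When no rule fires at all, the last child is returned, mirroring the original's catch-all lowest-priority branch.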
- * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ - -package org.deeplearning4j.nlp.uima.corpora.treeparser; - - -import org.apache.uima.fit.util.FSCollectionFactory; -import org.apache.uima.fit.util.JCasUtil; -import org.apache.uima.jcas.tcas.Annotation; -import org.cleartk.syntax.constituent.type.TreebankNode; -import org.cleartk.syntax.constituent.type.TreebankNodeUtil; -import org.cleartk.token.type.Token; -import org.deeplearning4j.nn.layers.feedforward.autoencoder.recursive.Tree; -import org.nd4j.common.collection.MultiDimensionalMap; -import org.nd4j.common.primitives.Pair; - -import java.util.ArrayList; -import java.util.Arrays; -import java.util.List; - -/** - * - * Static movingwindow class handling the conversion of - * treebank nodes to Trees useful - * for recursive neural tensor networks - * - * @author Adam Gibson - */ -public class TreeFactory { - - - private TreeFactory() {} - - /** - * Builds a tree recursively - * adding the children as necessary - * @param node the node to build the tree based on - * @param labels the labels to assign for each span - * @return the compiled tree with all of its children - * and childrens' children recursively - * @throws Exception - */ - public static Tree buildTree(TreebankNode node, Pair<String, MultiDimensionalMap<Integer, Integer, String>> labels, - List<String> possibleLabels) throws Exception { - if (node.getLeaf()) - return toTree(node); - else { - List<TreebankNode> preChildren = children(node); - List<Tree> children = new ArrayList<>(); - Tree t = toTree(node); - for (Pair<Integer, Integer> interval : labels.getSecond().keySet()) { - if (inRange(interval.getFirst(), interval.getSecond(), t)) { - t.setGoldLabel(possibleLabels - .indexOf(labels.getSecond().get(interval.getFirst(), interval.getSecond()))); - break; - } - } - - for (int i = 0; i < preChildren.size(); i++) { - children.add(buildTree(preChildren.get(i))); - } - - t.connect(children); - return t; - - } - } - - /** - * Converts a treebank node to a tree - *
@param node the node to convert - * @param labels the labels to assign for each span - * @return the tree with the same tokens and type as - * the given tree bank node - * @throws Exception - */ - public static Tree toTree(TreebankNode node, Pair<String, MultiDimensionalMap<Integer, Integer, String>> labels) - throws Exception { - List<String> tokens = tokens(node); - Tree ret = new Tree(tokens); - ret.setValue(node.getNodeValue()); - ret.setLabel(node.getNodeType()); - ret.setType(node.getNodeType()); - ret.setBegin(node.getBegin()); - ret.setEnd(node.getEnd()); - ret.setParse(TreebankNodeUtil.toTreebankString(node)); - if (node.getNodeTags() != null) - ret.setTags(tags(node)); - else - ret.setTags(Arrays.asList(node.getNodeType())); - return ret; - } - - - - /** - * Builds a tree recursively - * adding the children as necessary - * @param node the node to build the tree based on - * @return the compiled tree with all of its children - * and childrens' children recursively - * @throws Exception - */ - public static Tree buildTree(TreebankNode node) throws Exception { - if (node.getLeaf()) - return toTree(node); - else { - List<TreebankNode> preChildren = children(node); - List<Tree> children = new ArrayList<>(); - Tree t = toTree(node); - for (int i = 0; i < preChildren.size(); i++) { - children.add(buildTree(preChildren.get(i))); - } - - t.connect(children); - return t; - - } - - - - } - - /** - * Converts a treebank node to a tree - * @param node the node to convert - * @return the tree with the same tokens and type as - * the given tree bank node - * @throws Exception - */ - public static Tree toTree(TreebankNode node) throws Exception { - List<String> tokens = tokens(node); - Tree ret = new Tree(tokens); - ret.setValue(node.getNodeValue()); - ret.setLabel(node.getNodeType()); - ret.setType(node.getNodeType()); - ret.setBegin(node.getBegin()); - ret.setEnd(node.getEnd()); - ret.setParse(TreebankNodeUtil.toTreebankString(node)); - if (node.getNodeTags() != null) - ret.setTags(tags(node)); - else - ret.setTags(Arrays.asList(node.getNodeType())); -
return ret; - } - - - private static List<String> tags(TreebankNode node) { - List<String> ret = new ArrayList<>(); - for (int i = 0; i < node.getNodeTags().size(); i++) - ret.add(node.getNodeTags(i)); - return ret; - } - - - private static List<TreebankNode> children(TreebankNode node) { - return new ArrayList<>(FSCollectionFactory.create(node.getChildren(), TreebankNode.class)); - } - - private static List<String> tokens(Annotation ann) throws Exception { - List<String> ret = new ArrayList<>(); - for (Token t : JCasUtil.select(ann.getCAS().getJCas(), Token.class)) { - ret.add(t.getCoveredText()); - } - return ret; - } - - private static boolean inRange(int begin, int end, Tree tree) { - return tree.getBegin() >= begin && tree.getEnd() <= end; - } - -} diff --git a/contrib/deeplearning4j-nlp-uima/src/main/java/org/deeplearning4j/nlp/uima/corpora/treeparser/TreeIterator.java b/contrib/deeplearning4j-nlp-uima/src/main/java/org/deeplearning4j/nlp/uima/corpora/treeparser/TreeIterator.java deleted file mode 100644 index da9577421..000000000 --- a/contrib/deeplearning4j-nlp-uima/src/main/java/org/deeplearning4j/nlp/uima/corpora/treeparser/TreeIterator.java +++ /dev/null @@ -1,116 +0,0 @@ -/* - * ****************************************************************************** - * * Copyright (c) 2021 Deeplearning4j Contributors - * * - * * This program and the accompanying materials are made available under the - * * terms of the Apache License, Version 2.0 which is available at - * * https://www.apache.org/licenses/LICENSE-2.0. - * * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * * License for the specific language governing permissions and limitations - * * under the License.
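`TreeFactory.buildTree` above assigns a gold label when a subtree's character span lies inside a labeled interval (`inRange`). The same containment logic in isolation — `LabeledSpan` and `labelFor` are illustrative names for this sketch, not DL4J API:

```java
import java.util.Arrays;
import java.util.List;

public class SpanLabelDemo {
    // A labeled character interval, analogous to the (begin, end) -> label map
    // that TreeFactory consults when assigning gold labels.
    static class LabeledSpan {
        final int begin, end;
        final String label;
        LabeledSpan(int begin, int end, String label) {
            this.begin = begin;
            this.end = end;
            this.label = label;
        }
    }

    // Containment test matching TreeFactory.inRange: the subtree span
    // [treeBegin, treeEnd] must lie entirely inside the labeled interval.
    static boolean inRange(int begin, int end, int treeBegin, int treeEnd) {
        return treeBegin >= begin && treeEnd <= end;
    }

    // First labeled interval containing the subtree span, or "NONE".
    static String labelFor(int treeBegin, int treeEnd, List<LabeledSpan> spans) {
        for (LabeledSpan s : spans)
            if (inRange(s.begin, s.end, treeBegin, treeEnd))
                return s.label;
        return "NONE";
    }

    public static void main(String[] args) {
        List<LabeledSpan> spans = Arrays.asList(
                new LabeledSpan(0, 10, "PERSON"),
                new LabeledSpan(11, 20, "ORG"));
        System.out.println(labelFor(2, 7, spans));   // prints "PERSON"
        System.out.println(labelFor(12, 25, spans)); // prints "NONE" (straddles the boundary)
    }
}
```

Note the check is strict containment, so a span straddling an interval boundary gets no label — the same behavior that makes `buildTree` fall through to the default label.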
- * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ - -package org.deeplearning4j.nlp.uima.corpora.treeparser; - -import org.deeplearning4j.nn.layers.feedforward.autoencoder.recursive.Tree; -import org.deeplearning4j.text.sentenceiterator.labelaware.LabelAwareSentenceIterator; - -import java.util.ArrayList; -import java.util.Iterator; -import java.util.List; - -/** - * Tree iterator: iterate over sentences - * returning trees with labels and everything already - * preset - * - * @author Adam Gibson - */ -public class TreeIterator implements Iterator<List<Tree>> { - - private LabelAwareSentenceIterator sentenceIterator; - private List<String> labels; - private TreeVectorizer treeVectorizer; - private int batchSize = 3; - - - - public TreeIterator(LabelAwareSentenceIterator sentenceIterator, List<String> labels, TreeVectorizer treeVectorizer, - int batchSize) { - this.sentenceIterator = sentenceIterator; - this.labels = labels; - this.treeVectorizer = treeVectorizer; - this.batchSize = batchSize; - } - - public TreeIterator(LabelAwareSentenceIterator sentenceIterator, List<String> labels, - TreeVectorizer treeVectorizer) { - this.sentenceIterator = sentenceIterator; - this.labels = labels; - this.treeVectorizer = treeVectorizer; - batchSize = labels.size(); - } - - public TreeIterator(LabelAwareSentenceIterator sentenceIterator, List<String> labels) throws Exception { - this(sentenceIterator, labels, new TreeVectorizer()); - } - - - /** - * Returns {@code true} if the iteration has more elements. - * (In other words, returns {@code true} if {@link #next} would - * return an element rather than throwing an exception.) - * - * @return {@code true} if the iteration has more elements - */ - @Override - public boolean hasNext() { - return sentenceIterator.hasNext(); - } - - /** - * Returns the next element in the iteration.
- * - * @return the next element in the iteration - */ - @Override - public List<Tree> next() { - List<Tree> ret = new ArrayList<>(); - try { - for (int i = 0; i < batchSize; i++) - if (hasNext()) - ret.addAll(treeVectorizer.getTreesWithLabels(sentenceIterator.nextSentence(), - sentenceIterator.currentLabel(), labels)); - } catch (Exception e) { - throw new RuntimeException(e); - } - - return ret; - } - - /** - * Removes from the underlying collection the last element returned - * by this iterator (optional operation). This method can be called - * only once per call to {@link #next}. The behavior of an iterator - * is unspecified if the underlying collection is modified while the - * iteration is in progress in any way other than by calling this - * method. - * - * @throws UnsupportedOperationException if the {@code remove} - * operation is not supported by this iterator - * @throws IllegalStateException if the {@code next} method has not - * yet been called, or the {@code remove} method has already - * been called after the last call to the {@code next} - * method - */ - @Override - public void remove() { - throw new UnsupportedOperationException(); - } -} diff --git a/contrib/deeplearning4j-nlp-uima/src/main/java/org/deeplearning4j/nlp/uima/corpora/treeparser/TreeParser.java b/contrib/deeplearning4j-nlp-uima/src/main/java/org/deeplearning4j/nlp/uima/corpora/treeparser/TreeParser.java deleted file mode 100644 index 39fd86860..000000000 --- a/contrib/deeplearning4j-nlp-uima/src/main/java/org/deeplearning4j/nlp/uima/corpora/treeparser/TreeParser.java +++ /dev/null @@ -1,422 +0,0 @@ -/* - * ****************************************************************************** - * * Copyright (c) 2021 Deeplearning4j Contributors - * * - * * This program and the accompanying materials are made available under the - * * terms of the Apache License, Version 2.0 which is available at - * * https://www.apache.org/licenses/LICENSE-2.0.
- * * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * * License for the specific language governing permissions and limitations - * * under the License. - * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ - -package org.deeplearning4j.nlp.uima.corpora.treeparser; - -import org.apache.uima.analysis_engine.AnalysisEngine; -import org.apache.uima.cas.CAS; -import org.apache.uima.fit.util.JCasUtil; -import org.apache.uima.util.CasPool; -import org.cleartk.opennlp.tools.ParserAnnotator; -import org.cleartk.opennlp.tools.parser.DefaultOutputTypesHelper; -import org.cleartk.syntax.constituent.type.TopTreebankNode; -import org.cleartk.syntax.constituent.type.TreebankNode; -import org.cleartk.token.type.Sentence; -import org.cleartk.token.type.Token; -import org.cleartk.util.ParamUtil; -import org.deeplearning4j.nn.layers.feedforward.autoencoder.recursive.Tree; -import org.deeplearning4j.nlp.uima.annotator.PoStagger; -import org.deeplearning4j.nlp.uima.annotator.SentenceAnnotator; -import org.deeplearning4j.nlp.uima.annotator.StemmerAnnotator; -import org.deeplearning4j.nlp.uima.annotator.TokenizerAnnotator; -import org.deeplearning4j.text.movingwindow.ContextLabelRetriever; -import org.deeplearning4j.text.sentenceiterator.SentencePreProcessor; -import org.deeplearning4j.text.tokenization.tokenizerfactory.TokenizerFactory; -import org.deeplearning4j.nlp.uima.tokenization.tokenizerfactory.UimaTokenizerFactory; -import org.nd4j.common.collection.MultiDimensionalMap; -import org.nd4j.common.primitives.Pair; -import org.nd4j.common.util.SetUtils; -import org.slf4j.Logger; -import org.slf4j.LoggerFactory; - -import java.util.ArrayList; -import java.util.Collection; -import java.util.List; -import java.util.Set; 
- -import static org.apache.uima.fit.factory.AnalysisEngineFactory.createEngine; -import static org.apache.uima.fit.factory.AnalysisEngineFactory.createEngineDescription; - -/** - * Tree parser for constituency parsing - * - * @author Adam Gibson - */ -public class TreeParser { - - private AnalysisEngine parser; - private AnalysisEngine tokenizer; - private CasPool pool; - private static final Logger log = LoggerFactory.getLogger(TreeParser.class); - private TokenizerFactory tf; - - - public TreeParser(AnalysisEngine parser, AnalysisEngine tokenizer, CasPool pool) { - this.parser = parser; - this.tokenizer = tokenizer; - this.pool = pool; - tf = new UimaTokenizerFactory(tokenizer, true); - } - - - public TreeParser() throws Exception { - if (parser == null) { - parser = getParser(); - } - if (tokenizer == null) - tokenizer = getTokenizer(); - if (pool == null) - pool = new CasPool(Runtime.getRuntime().availableProcessors(), parser); - tf = new UimaTokenizerFactory(tokenizer, true); - - } - - /** - * Gets trees from text. - * First a sentence segmenter is used to segment the training examples in to sentences. - * Sentences are then turned in to trees and returned. 
- @param text the text to process - * @param preProcessor the pre processor to use for pre processing sentences - * @return the list of trees - * @throws Exception - */ - public List<Tree> getTrees(String text, SentencePreProcessor preProcessor) throws Exception { - if (text.isEmpty()) - return new ArrayList<>(); - - CAS c = pool.getCas(); - if (preProcessor != null) - text = preProcessor.preProcess(text); - - - c.setDocumentText(text); - tokenizer.process(c); - List<Tree> ret = new ArrayList<>(); - CAS c2 = pool.getCas(); - List<Pair<String, MultiDimensionalMap<Integer, Integer, String>>> list = new ArrayList<>(); - for (Sentence sentence : JCasUtil.select(c.getJCas(), Sentence.class)) { - List<String> tokens = new ArrayList<>(); - for (Token t : JCasUtil.selectCovered(Token.class, sentence)) - tokens.add(t.getCoveredText()); - - Pair<String, MultiDimensionalMap<Integer, Integer, String>> p = - ContextLabelRetriever.stringWithLabels(sentence.getCoveredText(), tf); - c2.setDocumentText(p.getFirst()); - list.add(p); - tokenizer.process(c2); - parser.process(c2); - - //build the tree based on this - TopTreebankNode node = JCasUtil.selectSingle(c.getJCas(), TopTreebankNode.class); - ret.add(TreeFactory.buildTree(node)); - - - } - - pool.releaseCas(c2); - - - for (Tree t : ret) { - addPreTerminal(t); - } - - - return ret; - - - } - - - private void addPreTerminal(Tree t) { - if (t.isLeaf()) { - Tree newLeaf = new Tree(t); - newLeaf.setLabel(t.value()); - t.children().add(newLeaf); - newLeaf.setParent(t); - } else { - for (Tree child : t.children()) - addPreTerminal(child); - } - } - - /** - * Gets trees from text. - * First a sentence segmenter is used to segment the training examples in to sentences. - * Sentences are then turned in to trees and returned.
- @param text the text to process - * @return the list of trees - * @throws Exception - */ - public List<TreebankNode> getTreebankTrees(String text) throws Exception { - if (text.isEmpty()) - return new ArrayList<>(); - - CAS c = pool.getCas(); - c.setDocumentText(text); - tokenizer.process(c); - List<TreebankNode> ret = new ArrayList<>(); - for (Sentence sentence : JCasUtil.select(c.getJCas(), Sentence.class)) { - List<String> tokens = new ArrayList<>(); - CAS c2 = tokenizer.newCAS(); - - for (Token t : JCasUtil.selectCovered(Token.class, sentence)) - tokens.add(t.getCoveredText()); - - - c2.setDocumentText(sentence.getCoveredText()); - tokenizer.process(c2); - parser.process(c2); - - //build the tree based on this - TopTreebankNode node = JCasUtil.selectSingle(c2.getJCas(), TopTreebankNode.class); - - - ret.add(node); - - - } - - pool.releaseCas(c); - - return ret; - - - } - - /** - * Gets trees from text. - * First a sentence segmenter is used to segment the training examples in to sentences. - * Sentences are then turned in to trees and returned. - * - * This will also process sentences with the following label format: - * <YOURLABEL> some text </YOURLABEL> - * - * This will allow you to iterate on and label sentences and label spans yourself.
- * - * @param text the text to process - * @param label the label for the whole sentence - * @param labels the possible labels for the sentence - * @return the list of trees - * @throws Exception - */ - public List<Tree> getTreesWithLabels(String text, String label, List<String> labels) throws Exception { - if (text.isEmpty()) - return new ArrayList<>(); - CAS c = pool.getCas(); - c.setDocumentText("<" + label + "> " + text + " </" + label + ">"); - tokenizer.process(c); - List<String> lowerCaseLabels = new ArrayList<>(); - for (String s : labels) - lowerCaseLabels.add(s.toLowerCase()); - labels = lowerCaseLabels; - - List<Tree> ret = new ArrayList<>(); - CAS c2 = pool.getCas(); - for (Sentence sentence : JCasUtil.select(c.getJCas(), Sentence.class)) { - if (sentence.getCoveredText().isEmpty()) - continue; - - List<String> tokens = new ArrayList<>(); - for (Token t : JCasUtil.selectCovered(Token.class, sentence)) - tokens.add(t.getCoveredText()); - - try { - Pair<String, MultiDimensionalMap<Integer, Integer, String>> stringsWithLabels = - ContextLabelRetriever.stringWithLabels(sentence.getCoveredText(), tf); - c2.setDocumentText(stringsWithLabels.getFirst()); - tokenizer.process(c2); - parser.process(c2); - - //build the tree based on this - List<TopTreebankNode> nodes = new ArrayList<>(JCasUtil.select(c2.getJCas(), TopTreebankNode.class)); - if (nodes.size() > 1) { - log.warn("More than one top level node for a treebank parse. Only accepting first input node."); - } - - else if (nodes.isEmpty()) { - c2.reset(); - continue; - } - - - - TopTreebankNode node = nodes.get(0); - ret.add(TreeFactory.buildTree(node, stringsWithLabels, labels)); - c2.reset(); - - } catch (Exception e) { - log.warn("Unable to parse " + sentence.getCoveredText()); - c2.reset(); - continue; - } - - - - } - - pool.releaseCas(c); - pool.releaseCas(c2); - - return ret; - - - } - - - /** - * Gets trees from text. - * First a sentence segmenter is used to segment the training examples in to sentences. - * Sentences are then turned in to trees and returned.
- * - * This will also process sentences with the following label format: - * <YOURLABEL> some text </YOURLABEL> - * - * This will allow you to iterate on and label sentences and label spans yourself. - * - * @param text the text to process - * @param labels - * @return the list of trees - * @throws Exception - */ - public List<Tree> getTreesWithLabels(String text, List<String> labels) throws Exception { - CAS c = pool.getCas(); - c.setDocumentText(text); - tokenizer.process(c); - List<String> lowerCaseLabels = new ArrayList<>(); - for (String s : labels) - lowerCaseLabels.add(s.toLowerCase()); - labels = lowerCaseLabels; - - List<Tree> ret = new ArrayList<>(); - CAS c2 = pool.getCas(); - for (Sentence sentence : JCasUtil.select(c.getJCas(), Sentence.class)) { - List<String> tokens = new ArrayList<>(); - for (Token t : JCasUtil.selectCovered(Token.class, sentence)) - tokens.add(t.getCoveredText()); - - Pair<String, MultiDimensionalMap<Integer, Integer, String>> stringsWithLabels = - ContextLabelRetriever.stringWithLabels(sentence.getCoveredText(), tf); - c2.setDocumentText(stringsWithLabels.getFirst()); - - - - tokenizer.process(c2); - parser.process(c2); - - //build the tree based on this - //damn it - List<TopTreebankNode> nodes = new ArrayList<>(JCasUtil.select(c2.getJCas(), TopTreebankNode.class)); - if (nodes.size() > 1) { - log.warn("More than one top level node for a treebank parse. Only accepting first input node."); - } - - else if (nodes.isEmpty()) { - c2.reset(); - continue; - } - - - Collection<String> labels2 = stringsWithLabels.getSecond().values(); - Set<String> diff = SetUtils.difference(labels2, labels); - if (!diff.isEmpty()) { - log.warn("Found invalid sentence. Skipping"); - c2.reset(); - continue; - - } - - TopTreebankNode node = nodes.get(0); - ret.add(TreeFactory.buildTree(node, stringsWithLabels, labels)); - c2.reset(); - - } - - pool.releaseCas(c); - pool.releaseCas(c2); - - return ret; - - - } - - /** - * Gets trees from text. - * First a sentence segmenter is used to segment the training examples in to sentences. - * Sentences are then turned in to trees and returned.
- @param text the text to process - * @return the list of trees - * @throws Exception - */ - public List<Tree> getTrees(String text) throws Exception { - CAS c = pool.getCas(); - c.setDocumentText(text); - tokenizer.process(c); - List<Tree> ret = new ArrayList<>(); - CAS c2 = pool.getCas(); - for (Sentence sentence : JCasUtil.select(c.getJCas(), Sentence.class)) { - List<String> tokens = new ArrayList<>(); - for (Token t : JCasUtil.selectCovered(Token.class, sentence)) - tokens.add(t.getCoveredText()); - - - c2.setDocumentText(sentence.getCoveredText()); - tokenizer.process(c2); - parser.process(c2); - - //build the tree based on this - TopTreebankNode node = JCasUtil.selectSingle(c2.getJCas(), TopTreebankNode.class); - log.info("Tree bank parse " + node.getTreebankParse()); - for (TreebankNode node2 : JCasUtil.select(c2.getJCas(), TreebankNode.class)) { - log.info("Node val " + node2.getNodeValue() + " and label " + node2.getNodeType() + " and tags was " - + node2.getNodeTags()); - } - - ret.add(TreeFactory.buildTree(node)); - c2.reset(); - - } - - pool.releaseCas(c); - pool.releaseCas(c2); - - return ret; - - - } - - - public static AnalysisEngine getTokenizer() throws Exception { - return createEngine( - createEngineDescription(SentenceAnnotator.getDescription(), TokenizerAnnotator.getDescription(), - PoStagger.getDescription("en"), StemmerAnnotator.getDescription("English") - - )); - } - - public static AnalysisEngine getParser() throws Exception { - return createEngine(createEngineDescription(createEngineDescription(ParserAnnotator.class, - ParserAnnotator.PARAM_USE_TAGS_FROM_CAS, true, ParserAnnotator.PARAM_PARSER_MODEL_PATH, - ParamUtil.getParameterValue(ParserAnnotator.PARAM_PARSER_MODEL_PATH, - "/models/en-parser-chunking.bin"), - ParserAnnotator.PARAM_OUTPUT_TYPES_HELPER_CLASS_NAME, - DefaultOutputTypesHelper.class.getName()))); - - - } - -} diff --git a/contrib/deeplearning4j-nlp-uima/src/main/java/org/deeplearning4j/nlp/uima/corpora/treeparser/TreeVectorizer.java
b/contrib/deeplearning4j-nlp-uima/src/main/java/org/deeplearning4j/nlp/uima/corpora/treeparser/TreeVectorizer.java deleted file mode 100644 index e33b4027a..000000000 --- a/contrib/deeplearning4j-nlp-uima/src/main/java/org/deeplearning4j/nlp/uima/corpora/treeparser/TreeVectorizer.java +++ /dev/null @@ -1,131 +0,0 @@ -/* - * ****************************************************************************** - * * Copyright (c) 2021 Deeplearning4j Contributors - * * - * * This program and the accompanying materials are made available under the - * * terms of the Apache License, Version 2.0 which is available at - * * https://www.apache.org/licenses/LICENSE-2.0. - * * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * * License for the specific language governing permissions and limitations - * * under the License. - * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ - -package org.deeplearning4j.nlp.uima.corpora.treeparser; - -import org.deeplearning4j.nlp.uima.corpora.treeparser.transformer.TreeTransformer; -import org.deeplearning4j.nn.layers.feedforward.autoencoder.recursive.Tree; - -import java.util.ArrayList; -import java.util.List; - -/** - * Tree vectorization pipeline. 
Takes a word vector model (as a lookup table) - * and a parser and handles vectorization of strings appropriate for an RNTN - * - * @author Adam Gibson - */ -public class TreeVectorizer { - - private TreeParser parser; - private TreeTransformer treeTransformer = new BinarizeTreeTransformer(); - private TreeTransformer cnfTransformer = new CollapseUnaries(); - - /** - * Uses the given parser and model - * for vectorization of strings - * @param parser the parser to use for converting - * strings to trees - */ - public TreeVectorizer(TreeParser parser) { - this.parser = parser; - } - - /** - * Uses word vectors from the passed in word2vec model - * @throws Exception - */ - public TreeVectorizer() throws Exception { - this(new TreeParser()); - } - - /** - * Vectorizes the passed in sentences - * @param sentences the sentences to convert to trees - * @return a list of trees pre converted with CNF and - * binarized and word vectors at the leaves of the trees - * - * @throws Exception - */ - public List<Tree> getTrees(String sentences) throws Exception { - List<Tree> ret = new ArrayList<>(); - List<Tree> baseTrees = parser.getTrees(sentences); - for (Tree t : baseTrees) { - Tree binarized = treeTransformer.transform(t); - binarized = cnfTransformer.transform(binarized); - ret.add(binarized); - } - - return ret; - - } - - - /** - * Vectorizes the passed in sentences - * @param sentences the sentences to convert to trees - * @param label the label for the sentence - * @param labels all of the possible labels for the trees - * @return a list of trees pre converted with CNF and - * binarized and word vectors at the leaves of the trees - * - * @throws Exception - */ - public List<Tree> getTreesWithLabels(String sentences, String label, List<String> labels) throws Exception { - List<Tree> ret = new ArrayList<>(); - List<Tree> baseTrees = parser.getTreesWithLabels(sentences, label, labels); - for (Tree t : baseTrees) { - Tree binarized = treeTransformer.transform(t); - binarized = cnfTransformer.transform(binarized); -
ret.add(binarized); - } - - return ret; - - } - - - - /** - * Vectorizes the passed in sentences - * @param sentences the sentences to convert to trees - * @param labels all of the possible labels for the trees - * @return a list of trees pre converted with CNF and - * binarized and word vectors at the leaves of the trees - * - * @throws Exception - */ - public List getTreesWithLabels(String sentences, List labels) throws Exception { - List realLabels = new ArrayList<>(labels); - if (!realLabels.contains("NONE")) - realLabels.add("NONE"); - List ret = new ArrayList<>(); - List baseTrees = parser.getTreesWithLabels(sentences, realLabels); - for (Tree t : baseTrees) { - Tree binarized = treeTransformer.transform(t); - binarized = cnfTransformer.transform(binarized); - ret.add(binarized); - } - - return ret; - - } - - - -} diff --git a/contrib/deeplearning4j-nlp-uima/src/main/java/org/deeplearning4j/nlp/uima/corpora/treeparser/transformer/TreeTransformer.java b/contrib/deeplearning4j-nlp-uima/src/main/java/org/deeplearning4j/nlp/uima/corpora/treeparser/transformer/TreeTransformer.java deleted file mode 100644 index 6bfe51aef..000000000 --- a/contrib/deeplearning4j-nlp-uima/src/main/java/org/deeplearning4j/nlp/uima/corpora/treeparser/transformer/TreeTransformer.java +++ /dev/null @@ -1,37 +0,0 @@ -/* - * ****************************************************************************** - * * Copyright (c) 2021 Deeplearning4j Contributors - * * - * * This program and the accompanying materials are made available under the - * * terms of the Apache License, Version 2.0 which is available at - * * https://www.apache.org/licenses/LICENSE-2.0. - * * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. 
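The deleted TreeVectorizer above chains two TreeTransformer passes (binarize, then collapse unaries) over every parsed tree. A minimal self-contained sketch of that chaining pattern follows; the Tree and TreeTransformer types here are stand-ins, not the real DL4J classes, which carry full parse structure:

```java
import java.util.ArrayList;
import java.util.List;

// Sketch of the binarize-then-collapse chaining used by the deleted
// TreeVectorizer; Tree and TreeTransformer here are simplified stand-ins.
public class TransformChainSketch {

    static class Tree {
        final String label;
        Tree(String label) { this.label = label; }
    }

    interface TreeTransformer {
        Tree transform(Tree t);
    }

    // Apply both transformers in order to every tree and collect the
    // results, the same loop shape as TreeVectorizer.getTrees.
    static List<Tree> transformAll(List<Tree> baseTrees,
                                   TreeTransformer binarizer,
                                   TreeTransformer collapser) {
        List<Tree> ret = new ArrayList<>();
        for (Tree t : baseTrees) {
            Tree binarized = binarizer.transform(t);
            binarized = collapser.transform(binarized);
            ret.add(binarized);
        }
        return ret;
    }

    public static void main(String[] args) {
        // Toy transformers that tag the label so the ordering is visible.
        TreeTransformer binarize = t -> new Tree("bin(" + t.label + ")");
        TreeTransformer collapse = t -> new Tree("cnf(" + t.label + ")");
        List<Tree> out = transformAll(List.of(new Tree("S")), binarize, collapse);
        System.out.println(out.get(0).label); // cnf(bin(S))
    }
}
```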
See the - * * License for the specific language governing permissions and limitations - * * under the License. - * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ - -package org.deeplearning4j.nlp.uima.corpora.treeparser.transformer; - -import org.deeplearning4j.nn.layers.feedforward.autoencoder.recursive.Tree; - -/** - * Tree transformer - * @author Adam Gibson - */ -public interface TreeTransformer { - - /** - * Applies a transformation to a tree - * @param t the tree to transform - * @return the transformed tree - */ - Tree transform(Tree t); - - -} diff --git a/contrib/deeplearning4j-nlp-uima/src/main/java/org/deeplearning4j/nlp/uima/sentenceiterator/UimaResultSetIterator.java b/contrib/deeplearning4j-nlp-uima/src/main/java/org/deeplearning4j/nlp/uima/sentenceiterator/UimaResultSetIterator.java deleted file mode 100644 index f6c3fd9c2..000000000 --- a/contrib/deeplearning4j-nlp-uima/src/main/java/org/deeplearning4j/nlp/uima/sentenceiterator/UimaResultSetIterator.java +++ /dev/null @@ -1,146 +0,0 @@ -/* - * ****************************************************************************** - * * Copyright (c) 2021 Deeplearning4j Contributors - * * - * * This program and the accompanying materials are made available under the - * * terms of the Apache License, Version 2.0 which is available at - * * https://www.apache.org/licenses/LICENSE-2.0. - * * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * * License for the specific language governing permissions and limitations - * * under the License.
- * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ - -package org.deeplearning4j.nlp.uima.sentenceiterator; - -import org.apache.uima.cas.CAS; -import org.apache.uima.fit.factory.AnalysisEngineFactory; -import org.apache.uima.fit.util.JCasUtil; -import org.apache.uima.resource.ResourceInitializationException; -import org.cleartk.token.type.Sentence; -import org.deeplearning4j.nlp.uima.annotator.SentenceAnnotator; -import org.deeplearning4j.nlp.uima.annotator.TokenizerAnnotator; -import org.deeplearning4j.text.sentenceiterator.BasicResultSetIterator; -import org.deeplearning4j.nlp.uima.uima.UimaResource; -import org.slf4j.Logger; -import org.slf4j.LoggerFactory; - -import java.sql.ResultSet; -import java.util.ArrayList; -import java.util.Iterator; -import java.util.List; - -/** - * Iterates over and returns sentences - * based on the passed in analysis engine - * - * Database version of UimaSentenceIterator, based on Adam Gibson's UimaSentenceIterator, but extends BasicResultSetIterator - * - * Please note: for reset functionality, the underlying JDBC ResultSet must not be of TYPE_FORWARD_ONLY. - * To achieve this using Postgres you can create your query using: - * connection.prepareStatement(sql,ResultSet.TYPE_SCROLL_INSENSITIVE,ResultSet.CONCUR_READ_ONLY); - * - * @author Brad Heap nzv8fan@gmail.com - */ -public class UimaResultSetIterator extends BasicResultSetIterator { - - private UimaResource resource; - protected volatile Iterator<String> sentences; - private static final Logger log = LoggerFactory.getLogger(UimaResultSetIterator.class); - - /** - * Constructor which builds a new UimaResource object - * @param rs the database result set object to iterate over - * @param columnName the name of the column containing text - * @throws ResourceInitializationException - */ - public UimaResultSetIterator(ResultSet rs, String columnName) throws ResourceInitializationException { - this(rs,
columnName, - new UimaResource(AnalysisEngineFactory.createEngine(AnalysisEngineFactory - .createEngineDescription(TokenizerAnnotator.getDescription(), - SentenceAnnotator.getDescription())))); - } - - /** - * Constructor which takes an existing UimaResource object - * @param rs the database result set object to iterate over - * @param columnName the name of the column containing text - * @param resource - */ - public UimaResultSetIterator(ResultSet rs, String columnName, UimaResource resource) { - super(rs, columnName); - this.resource = resource; - } - - @Override - public synchronized String nextSentence() { - - if (sentences == null || !sentences.hasNext()) { - // if we have no sentence get the next row from the database - try { - String text = super.nextSentence(); - - if (text == null) - return ""; - - CAS cas = resource.retrieve(); - cas.setDocumentText(text); - // log.info("Document text: " + text); - - resource.getAnalysisEngine().process(cas); - - List list = new ArrayList<>(); - for (Sentence sentence : JCasUtil.select(cas.getJCas(), Sentence.class)) { - list.add(sentence.getCoveredText()); - } - - sentences = list.iterator(); - - String ret = sentences.next(); - if (this.getPreProcessor() != null) - ret = this.getPreProcessor().preProcess(ret); - // log.info("Sentence text: " + ret); - return ret; - - } catch (Exception e) { - throw new RuntimeException(e); - } - - } else { - String ret = sentences.next(); - if (this.getPreProcessor() != null) - ret = this.getPreProcessor().preProcess(ret); - // log.info("Sentence text: " + ret); - return ret; - } - } - - @Override - public synchronized boolean hasNext() { - try { - if (sentences != null && sentences.hasNext()) - return true; - return super.hasNext(); - } catch (Exception e) { - throw new RuntimeException(e); - } - } - - @Override - public void reset() { - sentences = null; - super.reset(); - } - - @Override - public void finish() { - sentences = null; - super.finish(); - } - -} diff --git 
a/contrib/deeplearning4j-nlp-uima/src/main/java/org/deeplearning4j/nlp/uima/sentenceiterator/UimaSentenceIterator.java b/contrib/deeplearning4j-nlp-uima/src/main/java/org/deeplearning4j/nlp/uima/sentenceiterator/UimaSentenceIterator.java deleted file mode 100644 index 3c363fe89..000000000 --- a/contrib/deeplearning4j-nlp-uima/src/main/java/org/deeplearning4j/nlp/uima/sentenceiterator/UimaSentenceIterator.java +++ /dev/null @@ -1,227 +0,0 @@ -/* - * ****************************************************************************** - * * Copyright (c) 2021 Deeplearning4j Contributors - * * - * * This program and the accompanying materials are made available under the - * * terms of the Apache License, Version 2.0 which is available at - * * https://www.apache.org/licenses/LICENSE-2.0. - * * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * * License for the specific language governing permissions and limitations - * * under the License. 
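The UimaResultSetIterator javadoc above requires a scrollable ResultSet for reset() to work. A hedged sketch of that JDBC setup (the SQL text is a placeholder and no database is opened; only the java.sql constants are real):

```java
import java.sql.Connection;
import java.sql.PreparedStatement;
import java.sql.ResultSet;
import java.sql.SQLException;

// Sketch of the scrollable-statement setup the UimaResultSetIterator
// javadoc asks for; the SQL is a placeholder, no connection is made here.
public class ScrollableQuerySketch {

    // reset() re-reads the ResultSet from the top, which only works when the
    // statement was created with a scrollable (non-forward-only) type.
    static PreparedStatement scrollableStatement(Connection connection, String sql)
            throws SQLException {
        return connection.prepareStatement(sql,
                ResultSet.TYPE_SCROLL_INSENSITIVE,
                ResultSet.CONCUR_READ_ONLY);
    }

    public static void main(String[] args) {
        // Without a live connection, just confirm the two JDBC constants
        // really differ, which is the property reset() depends on.
        System.out.println(ResultSet.TYPE_SCROLL_INSENSITIVE != ResultSet.TYPE_FORWARD_ONLY);
    }
}
```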
- * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ - -package org.deeplearning4j.nlp.uima.sentenceiterator; - -import org.apache.uima.analysis_engine.AnalysisEngine; -import org.apache.uima.cas.CAS; -import org.apache.uima.collection.CollectionReader; -import org.apache.uima.fit.factory.AnalysisEngineFactory; -import org.apache.uima.fit.util.JCasUtil; -import org.apache.uima.resource.ResourceInitializationException; -import org.cleartk.token.type.Sentence; -import org.cleartk.util.cr.FilesCollectionReader; -import org.deeplearning4j.nlp.uima.annotator.SentenceAnnotator; -import org.deeplearning4j.nlp.uima.annotator.TokenizerAnnotator; -import org.deeplearning4j.text.sentenceiterator.BaseSentenceIterator; -import org.deeplearning4j.text.sentenceiterator.SentenceIterator; -import org.deeplearning4j.text.sentenceiterator.SentencePreProcessor; -import org.deeplearning4j.nlp.uima.uima.UimaResource; -import org.slf4j.Logger; -import org.slf4j.LoggerFactory; - -import java.io.File; -import java.util.ArrayList; -import java.util.Iterator; -import java.util.List; - -/** - * Iterates over and returns sentences - * based on the passed in analysis engine - * @author Adam Gibson - * - */ -public class UimaSentenceIterator extends BaseSentenceIterator { - - protected volatile CollectionReader reader; - protected volatile Iterator<String> sentences; - protected String path; - private static final Logger log = LoggerFactory.getLogger(UimaSentenceIterator.class); - private static AnalysisEngine defaultAnalysisEngine; - private UimaResource resource; - - - public UimaSentenceIterator(SentencePreProcessor preProcessor, String path, UimaResource resource) { - super(preProcessor); - this.path = path; - File f = new File(path); - if (f.isFile()) { - - // if the file is more than a kilobyte, break it up (only do this for files) - - - try { - - this.reader = FilesCollectionReader.getCollectionReader(path); - } catch
(Exception e) { - throw new RuntimeException(e); - } - - - - } else { - try { - this.reader = FilesCollectionReader.getCollectionReader(path); - } catch (ResourceInitializationException e) { - throw new RuntimeException(e); - } - } - - this.resource = resource; - } - - - public UimaSentenceIterator(SentencePreProcessor preProcessor, CollectionReader cr, UimaResource resource) { - super(preProcessor); - this.reader = cr; - this.resource = resource; - } - - - public UimaSentenceIterator(String path, UimaResource resource) { - this(null, path, resource); - } - - @Override - public synchronized String nextSentence() { - if (sentences == null || !sentences.hasNext()) { - try { - if (getReader().hasNext()) { - CAS cas = resource.retrieve(); - - try { - getReader().getNext(cas); - } catch (Exception e) { - log.warn("Done iterating returning an empty string"); - return ""; - } - - - resource.getAnalysisEngine().process(cas); - - - - List list = new ArrayList<>(); - for (Sentence sentence : JCasUtil.select(cas.getJCas(), Sentence.class)) { - list.add(sentence.getCoveredText()); - } - - - sentences = list.iterator(); - //needs to be next cas - while (!sentences.hasNext()) { - //sentence is empty; go to another cas - if (reader.hasNext()) { - cas.reset(); - getReader().getNext(cas); - resource.getAnalysisEngine().process(cas); - for (Sentence sentence : JCasUtil.select(cas.getJCas(), Sentence.class)) { - list.add(sentence.getCoveredText()); - } - sentences = list.iterator(); - } else - return null; - } - - - String ret = sentences.next(); - if (this.getPreProcessor() != null) - ret = this.getPreProcessor().preProcess(ret); - return ret; - } - - return null; - - } catch (Exception e) { - throw new RuntimeException(e); - } - - } else { - String ret = sentences.next(); - if (this.getPreProcessor() != null) - ret = this.getPreProcessor().preProcess(ret); - return ret; - } - - - - } - - public UimaResource getResource() { - return resource; - } - - /** - * Creates a uima sentence 
iterator with the given path - * @param path the path to the root directory or file to read from - * @return the uima sentence iterator for the given root dir or file - * @throws Exception - */ - public static SentenceIterator createWithPath(String path) throws Exception { - return new UimaSentenceIterator(path, - new UimaResource(AnalysisEngineFactory.createEngine(AnalysisEngineFactory - .createEngineDescription(TokenizerAnnotator.getDescription(), - SentenceAnnotator.getDescription())))); - } - - - @Override - public synchronized boolean hasNext() { - try { - return getReader().hasNext() || sentences != null && sentences.hasNext(); - } catch (Exception e) { - throw new RuntimeException(e); - } - } - - - private synchronized CollectionReader getReader() { - return reader; - } - - - @Override - public void reset() { - try { - this.reader = FilesCollectionReader.getCollectionReader(path); - } catch (ResourceInitializationException e) { - throw new RuntimeException(e); - } - } - - - /** - * Return a sentence segmenter - * @return a sentence segmenter - */ - public static AnalysisEngine segmenter() { - try { - if (defaultAnalysisEngine == null) - - defaultAnalysisEngine = AnalysisEngineFactory.createEngine( - AnalysisEngineFactory.createEngineDescription(SentenceAnnotator.getDescription())); - - return defaultAnalysisEngine; - } catch (Exception e) { - throw new RuntimeException(e); - } - } - - - -} diff --git a/contrib/deeplearning4j-nlp-uima/src/main/java/org/deeplearning4j/nlp/uima/sentenceiterator/labelaware/LabelAwareUimaSentenceIterator.java b/contrib/deeplearning4j-nlp-uima/src/main/java/org/deeplearning4j/nlp/uima/sentenceiterator/labelaware/LabelAwareUimaSentenceIterator.java deleted file mode 100644 index bf64978c1..000000000 --- a/contrib/deeplearning4j-nlp-uima/src/main/java/org/deeplearning4j/nlp/uima/sentenceiterator/labelaware/LabelAwareUimaSentenceIterator.java +++ /dev/null @@ -1,98 +0,0 @@ -/* - * 
****************************************************************************** - * * Copyright (c) 2021 Deeplearning4j Contributors - * * - * * This program and the accompanying materials are made available under the - * * terms of the Apache License, Version 2.0 which is available at - * * https://www.apache.org/licenses/LICENSE-2.0. - * * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * * License for the specific language governing permissions and limitations - * * under the License. - * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ - -package org.deeplearning4j.nlp.uima.sentenceiterator.labelaware; - -import org.apache.uima.analysis_engine.AnalysisEngine; -import org.apache.uima.fit.factory.AnalysisEngineFactory; -import org.apache.uima.resource.ResourceInitializationException; -import org.deeplearning4j.nlp.uima.annotator.SentenceAnnotator; -import org.deeplearning4j.nlp.uima.annotator.TokenizerAnnotator; -import org.deeplearning4j.text.sentenceiterator.SentencePreProcessor; -import org.deeplearning4j.nlp.uima.sentenceiterator.UimaSentenceIterator; -import org.deeplearning4j.text.sentenceiterator.labelaware.LabelAwareSentenceIterator; -import org.deeplearning4j.nlp.uima.uima.UimaResource; - -import java.io.File; -import java.lang.reflect.Field; -import java.util.Arrays; -import java.util.List; - -/** - * - * Uima sentence iterator that is aware of the current file - * @author Adam Gibson - */ -public class LabelAwareUimaSentenceIterator extends UimaSentenceIterator implements LabelAwareSentenceIterator { - - public LabelAwareUimaSentenceIterator(SentencePreProcessor preProcessor, String path, UimaResource resource) { - super(preProcessor, path, resource); - } - - public 
LabelAwareUimaSentenceIterator(String path, AnalysisEngine engine) throws ResourceInitializationException { - super(path, new UimaResource(engine)); - } - - - /** - * Returns the current label for nextSentence() - * - * @return the label for nextSentence() - */ - @Override - public String currentLabel() { - - try { - /* - * A little bit hacky, but the most concise way to do it: - * get the parent collection reader's current file. - * The collection reader is basically a wrapper for a file iterator. - * We can use this to get the current file for the iterator. - */ - Field f = reader.getClass().getDeclaredField("currentFile"); - f.setAccessible(true); - File file = (File) f.get(reader); - return file.getParentFile().getName(); - } - - catch (NullPointerException e1) { - return "NONE"; - } catch (Exception e) { - throw new RuntimeException(e); - } - - } - - /** - * Creates a uima sentence iterator with the given path - * @param path the path to the root directory or file to read from - * @return the uima sentence iterator for the given root dir or file - * @throws Exception - */ - public static LabelAwareSentenceIterator createWithPath(String path) throws Exception { - return new LabelAwareUimaSentenceIterator(null, path, - new UimaResource(AnalysisEngineFactory.createEngine(AnalysisEngineFactory - .createEngineDescription(TokenizerAnnotator.getDescription(), - SentenceAnnotator.getDescription())))); - } - - @Override - public List<String> currentLabels() { - return Arrays.asList(currentLabel()); - } -} diff --git a/contrib/deeplearning4j-nlp-uima/src/main/java/org/deeplearning4j/nlp/uima/tokenization/tokenizer/ConcurrentTokenizer.java b/contrib/deeplearning4j-nlp-uima/src/main/java/org/deeplearning4j/nlp/uima/tokenization/tokenizer/ConcurrentTokenizer.java deleted file mode 100644 index 39fe23282..000000000 --- a/contrib/deeplearning4j-nlp-uima/src/main/java/org/deeplearning4j/nlp/uima/tokenization/tokenizer/ConcurrentTokenizer.java +++ /dev/null @@ -1,142 +0,0 @@ -/* - * 
****************************************************************************** - * * Copyright (c) 2021 Deeplearning4j Contributors - * * - * * This program and the accompanying materials are made available under the - * * terms of the Apache License, Version 2.0 which is available at - * * https://www.apache.org/licenses/LICENSE-2.0. - * * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * * License for the specific language governing permissions and limitations - * * under the License. - * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ - -package org.deeplearning4j.nlp.uima.tokenization.tokenizer; - -import opennlp.tools.tokenize.TokenizerME; -import opennlp.tools.tokenize.TokenizerModel; -import opennlp.tools.util.Span; -import opennlp.uima.tokenize.AbstractTokenizer; -import opennlp.uima.tokenize.TokenizerModelResource; -import opennlp.uima.util.AnnotatorUtil; -import opennlp.uima.util.UimaUtil; -import org.apache.uima.UimaContext; -import org.apache.uima.analysis_engine.AnalysisEngineProcessException; -import org.apache.uima.cas.CAS; -import org.apache.uima.cas.Feature; -import org.apache.uima.cas.TypeSystem; -import org.apache.uima.cas.text.AnnotationFS; -import org.apache.uima.resource.ResourceAccessException; -import org.apache.uima.resource.ResourceInitializationException; - -/** - * OpenNLP Tokenizer annotator. - *

- * Mandatory parameters:
- * String opennlp.uima.ModelName - the name of the model file
- * String opennlp.uima.SentenceType - the full name of the sentence type
- * String opennlp.uima.TokenType - the full name of the token type
- *
- * Optional parameters:
- * String opennlp.uima.ProbabilityFeature - the name of the double probability feature (not set by default)
- * @see {@link TokenizerME} - */ -public class ConcurrentTokenizer extends AbstractTokenizer { - - /** - * The OpenNLP tokenizer. - */ - private TokenizerME tokenizer; - - private Feature probabilityFeature; - - @Override - public synchronized void process(CAS cas) throws AnalysisEngineProcessException { - super.process(cas); - } - - /** - * Initializes a new instance. - * - * Note: Use {@link #initialize(UimaContext) } to initialize - * this instance. Not use the constructor. - */ - public ConcurrentTokenizer() { - super("OpenNLP Tokenizer"); - - // must not be implemented ! - } - - /** - * Initializes the current instance with the given context. - * - * Note: Do all initialization in this method, do not use the constructor. - */ - public void initialize(UimaContext context) throws ResourceInitializationException { - - super.initialize(context); - - TokenizerModel model; - - try { - TokenizerModelResource modelResource = - (TokenizerModelResource) context.getResourceObject(UimaUtil.MODEL_PARAMETER); - - model = modelResource.getModel(); - } catch (ResourceAccessException e) { - throw new ResourceInitializationException(e); - } - - tokenizer = new TokenizerME(model); - } - - /** - * Initializes the type system. 
- */ - public void typeSystemInit(TypeSystem typeSystem) throws AnalysisEngineProcessException { - - super.typeSystemInit(typeSystem); - - probabilityFeature = AnnotatorUtil.getOptionalFeatureParameter(context, tokenType, - UimaUtil.PROBABILITY_FEATURE_PARAMETER, CAS.TYPE_NAME_DOUBLE); - } - - - @Override - protected Span[] tokenize(CAS cas, AnnotationFS sentence) { - return tokenizer.tokenizePos(sentence.getCoveredText()); - } - - @Override - protected void postProcessAnnotations(Span[] tokens, AnnotationFS[] tokenAnnotations) { - // if interest - if (probabilityFeature != null) { - double tokenProbabilties[] = tokenizer.getTokenProbabilities(); - - for (int i = 0; i < tokenAnnotations.length; i++) { - tokenAnnotations[i].setDoubleValue(probabilityFeature, tokenProbabilties[i]); - } - } - } - - /** - * Releases allocated resources. - */ - public void destroy() { - // dereference model to allow garbage collection - tokenizer = null; - } -} - diff --git a/contrib/deeplearning4j-nlp-uima/src/main/java/org/deeplearning4j/nlp/uima/tokenization/tokenizer/PosUimaTokenizer.java b/contrib/deeplearning4j-nlp-uima/src/main/java/org/deeplearning4j/nlp/uima/tokenization/tokenizer/PosUimaTokenizer.java deleted file mode 100644 index f7a4dd0d6..000000000 --- a/contrib/deeplearning4j-nlp-uima/src/main/java/org/deeplearning4j/nlp/uima/tokenization/tokenizer/PosUimaTokenizer.java +++ /dev/null @@ -1,153 +0,0 @@ -/* - * ****************************************************************************** - * * Copyright (c) 2021 Deeplearning4j Contributors - * * - * * This program and the accompanying materials are made available under the - * * terms of the Apache License, Version 2.0 which is available at - * * https://www.apache.org/licenses/LICENSE-2.0. - * * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. 
See the - * * License for the specific language governing permissions and limitations - * * under the License. - * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ - -package org.deeplearning4j.nlp.uima.tokenization.tokenizer; - -import lombok.NonNull; -import org.apache.uima.analysis_engine.AnalysisEngine; -import org.apache.uima.cas.CAS; -import org.apache.uima.fit.factory.AnalysisEngineFactory; -import org.apache.uima.fit.util.JCasUtil; -import org.cleartk.token.type.Sentence; -import org.cleartk.token.type.Token; -import org.deeplearning4j.nlp.uima.annotator.PoStagger; -import org.deeplearning4j.nlp.uima.annotator.SentenceAnnotator; -import org.deeplearning4j.nlp.uima.annotator.StemmerAnnotator; -import org.deeplearning4j.nlp.uima.annotator.TokenizerAnnotator; -import org.deeplearning4j.text.tokenization.tokenizer.TokenPreProcess; -import org.deeplearning4j.text.tokenization.tokenizer.Tokenizer; - -import java.util.ArrayList; -import java.util.Collection; -import java.util.List; - -/** - * Filter by part of speech tag. 
Any tokens whose part of speech tag is not in the - * allowed set become NONE - * @author Adam Gibson - * - */ -public class PosUimaTokenizer implements Tokenizer { - - private static AnalysisEngine engine; - private List<String> tokens; - private Collection<String> allowedPosTags; - private int index; - private static CAS cas; - private TokenPreProcess preProcessor; - private boolean stripNones = false; - - public PosUimaTokenizer(String tokens, AnalysisEngine engine, Collection<String> allowedPosTags) { - this(tokens, engine, allowedPosTags, false); - } - - public PosUimaTokenizer(String tokens, AnalysisEngine engine, Collection<String> allowedPosTags, - boolean stripNones) { - if (PosUimaTokenizer.engine == null) - PosUimaTokenizer.engine = engine; - this.allowedPosTags = allowedPosTags; - this.tokens = new ArrayList<>(); - this.stripNones = stripNones; - try { - if (cas == null) - cas = engine.newCAS(); - - cas.reset(); - cas.setDocumentText(tokens); - PosUimaTokenizer.engine.process(cas); - for (Sentence s : JCasUtil.select(cas.getJCas(), Sentence.class)) { - for (Token t : JCasUtil.selectCovered(Token.class, s)) { - //add NONE for each invalid token - if (valid(t)) - if (t.getLemma() != null) - this.tokens.add(t.getLemma()); - else if (t.getStem() != null) - this.tokens.add(t.getStem()); - else - this.tokens.add(t.getCoveredText()); - else - this.tokens.add("NONE"); - } - } - - - - } catch (Exception e) { - throw new RuntimeException(e); - } - - } - - private boolean valid(Token token) { - String check = token.getCoveredText(); - if (check.matches("<[A-Z]+>") || check.matches("</[A-Z]+>") - || (token.getPos() != null && !this.allowedPosTags.contains(token.getPos()))) - return false; - return true; - } - - - - @Override - public boolean hasMoreTokens() { - return index < tokens.size(); - } - - @Override - public int countTokens() { - return tokens.size(); - } - - @Override - public String nextToken() { - String ret = tokens.get(index); // preProcessor != null ? preProcessor.preProcess(tokens.get(index)) : tokens.get(index); - index++; - return ret; - } - - @Override - public List<String> getTokens() { - List<String> tokens = new ArrayList<>(); - while (hasMoreTokens()) { - String nextT = nextToken(); - if (stripNones && nextT.equals("NONE")) - continue; - tokens.add(preProcessor != null ? preProcessor.preProcess(nextT) : nextT); - } - return tokens; - } - - public static AnalysisEngine defaultAnalysisEngine() { - try { - return AnalysisEngineFactory.createEngine(AnalysisEngineFactory.createEngineDescription( - SentenceAnnotator.getDescription(), TokenizerAnnotator.getDescription(), - PoStagger.getDescription("en"), StemmerAnnotator.getDescription("English"))); - } catch (Exception e) { - throw new RuntimeException(e); - } - } - - @Override - public void setTokenPreProcessor(@NonNull TokenPreProcess tokenPreProcessor) { - this.preProcessor = tokenPreProcessor; - } - - - -} diff --git a/contrib/deeplearning4j-nlp-uima/src/main/java/org/deeplearning4j/nlp/uima/tokenization/tokenizer/UimaTokenizer.java b/contrib/deeplearning4j-nlp-uima/src/main/java/org/deeplearning4j/nlp/uima/tokenization/tokenizer/UimaTokenizer.java deleted file mode 100644 index 11dced9fd..000000000 --- a/contrib/deeplearning4j-nlp-uima/src/main/java/org/deeplearning4j/nlp/uima/tokenization/tokenizer/UimaTokenizer.java +++ /dev/null @@ -1,122 +0,0 @@ -/* - * ****************************************************************************** - * * Copyright (c) 2021 Deeplearning4j Contributors - * * - * * This program and the accompanying materials are made available under the - * * terms of the Apache License, Version 2.0 which is available at - * * https://www.apache.org/licenses/LICENSE-2.0. - * * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
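PosUimaTokenizer's valid() check boils down to a pair of markup regexes plus a POS whitelist. A self-contained sketch of that filtering, with an illustrative tag set; the String[] {text, pos} pairs stand in for ClearTK Token annotations:

```java
import java.util.ArrayList;
import java.util.List;
import java.util.Set;

// Stand-alone version of the PosUimaTokenizer filtering rule: markup-like
// tokens and tokens with a disallowed POS tag are replaced by "NONE".
public class PosFilterSketch {

    static boolean valid(String coveredText, String pos, Set<String> allowedPosTags) {
        if (coveredText.matches("<[A-Z]+>") || coveredText.matches("</[A-Z]+>"))
            return false; // markup-like token, e.g. <S> or </S>
        return pos == null || allowedPosTags.contains(pos);
    }

    static List<String> filter(List<String[]> tokensWithPos, Set<String> allowed) {
        List<String> out = new ArrayList<>();
        for (String[] tp : tokensWithPos)
            out.add(valid(tp[0], tp[1], allowed) ? tp[0] : "NONE");
        return out;
    }

    public static void main(String[] args) {
        Set<String> allowed = Set.of("NN", "VB"); // illustrative tag whitelist
        List<String[]> toks = List.of(
                new String[] {"dog", "NN"},
                new String[] {"quickly", "RB"},
                new String[] {"<S>", null});
        System.out.println(filter(toks, allowed)); // [dog, NONE, NONE]
    }
}
```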
See the - * * License for the specific language governing permissions and limitations - * * under the License. - * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ - -package org.deeplearning4j.nlp.uima.tokenization.tokenizer; - -import lombok.extern.slf4j.Slf4j; -import org.apache.uima.cas.CAS; -import org.apache.uima.fit.util.JCasUtil; -import org.cleartk.token.type.Token; -import org.deeplearning4j.text.tokenization.tokenizer.TokenPreProcess; -import org.deeplearning4j.text.tokenization.tokenizer.Tokenizer; -import org.deeplearning4j.nlp.uima.uima.UimaResource; - -import java.util.ArrayList; -import java.util.Collection; -import java.util.List; - -/** - * Tokenizer based on the passed in analysis engine - * @author Adam Gibson - * - */ -@Slf4j -public class UimaTokenizer implements Tokenizer { - - private List<String> tokens; - private int index; - private boolean checkForLabel; - private TokenPreProcess preProcess; - - - public UimaTokenizer(String tokens, UimaResource resource, boolean checkForLabel) { - - this.checkForLabel = checkForLabel; - this.tokens = new ArrayList<>(); - try { - CAS cas = resource.process(tokens); - - Collection<Token> tokenList = JCasUtil.select(cas.getJCas(), Token.class); - - for (Token t : tokenList) { - - if (!checkForLabel || valid(t.getCoveredText())) - if (t.getLemma() != null) - this.tokens.add(t.getLemma()); - else if (t.getStem() != null) - this.tokens.add(t.getStem()); - else - this.tokens.add(t.getCoveredText()); - } - - - resource.release(cas); - - - } catch (Exception e) { - log.error("",e); - throw new RuntimeException(e); - } - - } - - private boolean valid(String check) { - return !(check.matches("<[A-Z]+>") || check.matches("</[A-Z]+>")); - } - - - - @Override - public boolean hasMoreTokens() { - return index < tokens.size(); - } - - @Override - public int countTokens() { - return tokens.size(); - } - - @Override - public String nextToken() { - String ret = tokens.get(index); - index++; - if (preProcess != null) - ret = preProcess.preProcess(ret); - return ret; - } - - @Override - public List<String> getTokens() { - List<String> tokens = new ArrayList<>(); - while (hasMoreTokens()) { - tokens.add(nextToken()); - } - return tokens; - } - - @Override - public void setTokenPreProcessor(TokenPreProcess tokenPreProcessor) { - this.preProcess = tokenPreProcessor; - } - - - -} diff --git a/contrib/deeplearning4j-nlp-uima/src/main/java/org/deeplearning4j/nlp/uima/tokenization/tokenizer/preprocessor/CustomStemmingPreprocessor.java b/contrib/deeplearning4j-nlp-uima/src/main/java/org/deeplearning4j/nlp/uima/tokenization/tokenizer/preprocessor/CustomStemmingPreprocessor.java deleted file mode 100644 index 38a5afdde..000000000 --- a/contrib/deeplearning4j-nlp-uima/src/main/java/org/deeplearning4j/nlp/uima/tokenization/tokenizer/preprocessor/CustomStemmingPreprocessor.java +++ /dev/null @@ -1,49 +0,0 @@ -/* - * ****************************************************************************** - * * Copyright (c) 2021 Deeplearning4j Contributors - * * - * * This program and the accompanying materials are made available under the - * * terms of the Apache License, Version 2.0 which is available at - * * https://www.apache.org/licenses/LICENSE-2.0. - * * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * * License for the specific language governing permissions and limitations - * * under the License.
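Both deleted tokenizers prefer a token's lemma, then its stem, then the raw covered text. That fallback in isolation, with a hypothetical Token stand-in for the ClearTK annotation type:

```java
// Isolated sketch of the lemma -> stem -> covered-text preference applied
// by UimaTokenizer and PosUimaTokenizer; Token is a simplified stand-in.
public class LemmaFallbackSketch {

    static class Token {
        final String lemma, stem, coveredText;
        Token(String lemma, String stem, String coveredText) {
            this.lemma = lemma;
            this.stem = stem;
            this.coveredText = coveredText;
        }
    }

    // Prefer the most normalized form the pipeline produced.
    static String bestForm(Token t) {
        if (t.lemma != null) return t.lemma;
        if (t.stem != null) return t.stem;
        return t.coveredText;
    }

    public static void main(String[] args) {
        System.out.println(bestForm(new Token("run", "runn", "running")));  // run
        System.out.println(bestForm(new Token(null, "runn", "running")));   // runn
        System.out.println(bestForm(new Token(null, null, "running")));     // running
    }
}
```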
 - * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ - -package org.deeplearning4j.nlp.uima.tokenization.tokenizer.preprocessor; - -import lombok.NonNull; -import org.deeplearning4j.text.tokenization.tokenizer.preprocessor.CommonPreprocessor; -import org.tartarus.snowball.SnowballProgram; - -/** - * This is a StemmingPreprocessor compatible with the different stemmers defined as lucene/tartarus SnowballProgram subclasses, - * such as RussianStemmer, DutchStemmer, FrenchStemmer, etc. - *
- Note that CommonPreprocessor#preProcess(String) is applied first (i.e. punctuation marks are removed and the token is lower-cased), and only then is the stemmer applied. -
- * This preprocessor is synchronized, thus thread-safe. - * - * @author raver119@gmail.com - */ -public class CustomStemmingPreprocessor extends CommonPreprocessor { - private SnowballProgram stemmer; - - public CustomStemmingPreprocessor(@NonNull SnowballProgram stemmer) { - this.stemmer = stemmer; - } - - @Override - public synchronized String preProcess(String token) { - String prep = super.preProcess(token); - stemmer.setCurrent(prep); - stemmer.stem(); - return stemmer.getCurrent(); - } -} diff --git a/contrib/deeplearning4j-nlp-uima/src/main/java/org/deeplearning4j/nlp/uima/tokenization/tokenizer/preprocessor/EmbeddedStemmingPreprocessor.java b/contrib/deeplearning4j-nlp-uima/src/main/java/org/deeplearning4j/nlp/uima/tokenization/tokenizer/preprocessor/EmbeddedStemmingPreprocessor.java deleted file mode 100644 index a855a8f40..000000000 --- a/contrib/deeplearning4j-nlp-uima/src/main/java/org/deeplearning4j/nlp/uima/tokenization/tokenizer/preprocessor/EmbeddedStemmingPreprocessor.java +++ /dev/null @@ -1,49 +0,0 @@ -/* - * ****************************************************************************** - * * Copyright (c) 2021 Deeplearning4j Contributors - * * - * * This program and the accompanying materials are made available under the - * * terms of the Apache License, Version 2.0 which is available at - * * https://www.apache.org/licenses/LICENSE-2.0. - * * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * * License for the specific language governing permissions and limitations - * * under the License. 
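The `CustomStemmingPreprocessor` just deleted composes two steps: `CommonPreprocessor`-style cleaning (lower-casing, punctuation removal) followed by a shared, stateful Snowball stemmer, which is why its `preProcess` is synchronized. A library-free sketch of that composition, using a toy suffix-stripping stemmer in place of `SnowballProgram` (all names and the toy stemming rule are illustrative):

```java
import java.util.Locale;

// Toy stand-in for a stateful tartarus SnowballProgram stemmer:
// setCurrent/stem/getCurrent mutate shared state, hence the locking below.
class ToySuffixStemmer {
    private String current;

    void setCurrent(String s) { current = s; }

    void stem() {
        // Crude illustration only: strip a trailing "s".
        if (current.endsWith("s") && current.length() > 1) {
            current = current.substring(0, current.length() - 1);
        }
    }

    String getCurrent() { return current; }
}

// Clean-then-stem composition mirroring CustomStemmingPreprocessor:
// the stemmer instance is shared, so access to it is synchronized.
class CleaningStemmingPreprocessor {
    private final ToySuffixStemmer stemmer;

    CleaningStemmingPreprocessor(ToySuffixStemmer stemmer) {
        this.stemmer = stemmer;
    }

    public synchronized String preProcess(String token) {
        // Stand-in for CommonPreprocessor: lower-case and drop punctuation.
        String prep = token.toLowerCase(Locale.ROOT).replaceAll("[\\p{Punct}]", "");
        stemmer.setCurrent(prep);
        stemmer.stem();
        return stemmer.getCurrent();
    }
}
```

The synchronization is the price of reusing one stemmer object across calls; the alternative (a fresh stemmer per call) appears in `StemmingPreprocessor` below in this same diff.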
 - * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ - -package org.deeplearning4j.nlp.uima.tokenization.tokenizer.preprocessor; - -import lombok.NoArgsConstructor; -import lombok.NonNull; -import org.deeplearning4j.text.tokenization.tokenizer.TokenPreProcess; -import org.tartarus.snowball.ext.PorterStemmer; - -/** - * This tokenizer preprocessor applies the given preprocessor first, then English Porter stemming on top of its output - * - * @author raver119@gmail.com - */ -@NoArgsConstructor -public class EmbeddedStemmingPreprocessor implements TokenPreProcess { - private TokenPreProcess preProcessor; - - public EmbeddedStemmingPreprocessor(@NonNull TokenPreProcess preProcess) { - this.preProcessor = preProcess; - } - - @Override - public String preProcess(String token) { - String prep = preProcessor == null ? token : preProcessor.preProcess(token); - PorterStemmer stemmer = new PorterStemmer(); - stemmer.setCurrent(prep); - stemmer.stem(); - - return stemmer.getCurrent(); - } -} diff --git a/contrib/deeplearning4j-nlp-uima/src/main/java/org/deeplearning4j/nlp/uima/tokenization/tokenizer/preprocessor/StemmingPreprocessor.java b/contrib/deeplearning4j-nlp-uima/src/main/java/org/deeplearning4j/nlp/uima/tokenization/tokenizer/preprocessor/StemmingPreprocessor.java deleted file mode 100644 index d577eda90..000000000 --- a/contrib/deeplearning4j-nlp-uima/src/main/java/org/deeplearning4j/nlp/uima/tokenization/tokenizer/preprocessor/StemmingPreprocessor.java +++ /dev/null @@ -1,41 +0,0 @@ -/* - * ****************************************************************************** - * * Copyright (c) 2021 Deeplearning4j Contributors - * * - * * This program and the accompanying materials are made available under the - * * terms of the Apache License, Version 2.0 which is available at - * * https://www.apache.org/licenses/LICENSE-2.0. 
 - * * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * * License for the specific language governing permissions and limitations - * * under the License. - * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ - -package org.deeplearning4j.nlp.uima.tokenization.tokenizer.preprocessor; - -import org.deeplearning4j.text.tokenization.tokenizer.preprocessor.CommonPreprocessor; -import org.tartarus.snowball.ext.PorterStemmer; - -/** - * This tokenizer preprocessor implements basic cleaning inherited from CommonPreprocessor and applies English Porter stemming to tokens - * - * PLEASE NOTE: This preprocessor is thread-safe because a new PorterStemmer instance is created for each call, rather than via a synchronized method - * - * @author raver119@gmail.com - */ -public class StemmingPreprocessor extends CommonPreprocessor { - @Override - public String preProcess(String token) { - String prep = super.preProcess(token); - PorterStemmer stemmer = new PorterStemmer(); - stemmer.setCurrent(prep); - stemmer.stem(); - - return stemmer.getCurrent(); - } -} diff --git a/contrib/deeplearning4j-nlp-uima/src/main/java/org/deeplearning4j/nlp/uima/tokenization/tokenizerfactory/PosUimaTokenizerFactory.java b/contrib/deeplearning4j-nlp-uima/src/main/java/org/deeplearning4j/nlp/uima/tokenization/tokenizerfactory/PosUimaTokenizerFactory.java deleted file mode 100644 index c2238b2d0..000000000 --- a/contrib/deeplearning4j-nlp-uima/src/main/java/org/deeplearning4j/nlp/uima/tokenization/tokenizerfactory/PosUimaTokenizerFactory.java +++ /dev/null @@ -1,107 +0,0 @@ -/* - * ****************************************************************************** - * * Copyright (c) 2021 Deeplearning4j Contributors - * * - * * This program and the accompanying materials are made available under the - * * terms of the 
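The `StemmingPreprocessor` above takes the other route to thread safety: instead of synchronizing around one shared stemmer, it constructs a fresh `PorterStemmer` per call, so there is no shared mutable state at all. That tradeoff can be sketched without any Lucene dependency (the toy stemmer and its "strip a trailing `ing`" rule are purely illustrative):

```java
import java.util.List;
import java.util.Locale;
import java.util.concurrent.CopyOnWriteArrayList;
import java.util.stream.Collectors;

// Stateful toy stemmer (stand-in for PorterStemmer): not safe to share.
class StatefulToyStemmer {
    private String current;

    void setCurrent(String s) { current = s; }

    void stem() {
        // Illustration only: strip a trailing "ing".
        if (current.endsWith("ing") && current.length() > 3) {
            current = current.substring(0, current.length() - 3);
        }
    }

    String getCurrent() { return current; }
}

// Per-call instantiation: each invocation gets its own stemmer, so the
// method needs no synchronization - the StemmingPreprocessor approach.
class PerCallStemmingPreprocessor {
    public String preProcess(String token) {
        String prep = token.toLowerCase(Locale.ROOT);
        StatefulToyStemmer stemmer = new StatefulToyStemmer(); // fresh state per call
        stemmer.setCurrent(prep);
        stemmer.stem();
        return stemmer.getCurrent();
    }

    // Demonstration: many threads can call preProcess concurrently.
    public List<String> stemConcurrently(List<String> tokens) {
        List<String> out = new CopyOnWriteArrayList<>();
        List<Thread> threads = tokens.stream()
                .map(t -> new Thread(() -> out.add(preProcess(t))))
                .collect(Collectors.toList());
        threads.forEach(Thread::start);
        for (Thread t : threads) {
            try {
                t.join();
            } catch (InterruptedException e) {
                Thread.currentThread().interrupt();
            }
        }
        return out;
    }
}
```

Per-call allocation costs an object per token but removes all lock contention; a synchronized shared stemmer is the mirror-image tradeoff.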
Apache License, Version 2.0 which is available at - * * https://www.apache.org/licenses/LICENSE-2.0. - * * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * * License for the specific language governing permissions and limitations - * * under the License. - * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ - -package org.deeplearning4j.nlp.uima.tokenization.tokenizerfactory; - - -import org.apache.uima.analysis_engine.AnalysisEngine; -import org.deeplearning4j.nlp.uima.annotator.StemmerAnnotator; -import org.deeplearning4j.nlp.uima.annotator.TokenizerAnnotator; -import org.deeplearning4j.nlp.uima.tokenization.tokenizer.PosUimaTokenizer; -import org.deeplearning4j.nlp.uima.annotator.PoStagger; -import org.deeplearning4j.nlp.uima.annotator.SentenceAnnotator; -import org.deeplearning4j.text.tokenization.tokenizer.TokenPreProcess; -import org.deeplearning4j.text.tokenization.tokenizer.Tokenizer; -import org.deeplearning4j.text.tokenization.tokenizerfactory.TokenizerFactory; - -import java.io.InputStream; -import java.util.Collection; - -import static org.apache.uima.fit.factory.AnalysisEngineFactory.createEngine; -import static org.apache.uima.fit.factory.AnalysisEngineFactory.createEngineDescription; - -/** - * Creates a tokenizer that filters by - * part of speech tags - * @see org.deeplearning4j.nlp.uima.tokenization.tokenizer.PosUimaTokenizer - * @author Adam Gibson - * - */ -public class PosUimaTokenizerFactory implements TokenizerFactory { - - private AnalysisEngine tokenizer; - private Collection<String> allowedPoSTags; - private TokenPreProcess tokenPreProcess; - private boolean stripNones = false; - - public PosUimaTokenizerFactory(Collection<String> allowedPoSTags, boolean stripNones) { - 
this(defaultAnalysisEngine(), allowedPoSTags); - this.stripNones = stripNones; - } - - public PosUimaTokenizerFactory(Collection<String> allowedPoSTags) { - this(allowedPoSTags, false); - } - - public PosUimaTokenizerFactory(AnalysisEngine tokenizer, Collection<String> allowedPosTags) { - this.tokenizer = tokenizer; - this.allowedPoSTags = allowedPosTags; - } - - - public static AnalysisEngine defaultAnalysisEngine() { - try { - return createEngine(createEngineDescription(SentenceAnnotator.getDescription(), - TokenizerAnnotator.getDescription(), PoStagger.getDescription("en"), - StemmerAnnotator.getDescription("English"))); - } catch (Exception e) { - throw new RuntimeException(e); - } - } - - - @Override - public Tokenizer create(String toTokenize) { - PosUimaTokenizer t = new PosUimaTokenizer(toTokenize, tokenizer, allowedPoSTags, stripNones); - if (tokenPreProcess != null) - t.setTokenPreProcessor(tokenPreProcess); - return t; - } - - @Override - public Tokenizer create(InputStream toTokenize) { - throw new UnsupportedOperationException(); - } - - @Override - public void setTokenPreProcessor(TokenPreProcess preProcessor) { - this.tokenPreProcess = preProcessor; - } - - /** - * Returns TokenPreProcessor set for this TokenizerFactory instance - * - * @return TokenPreProcessor instance, or null if no preprocessor was defined - */ - @Override - public TokenPreProcess getTokenPreProcessor() { - return tokenPreProcess; - } - - -} diff --git a/contrib/deeplearning4j-nlp-uima/src/main/java/org/deeplearning4j/nlp/uima/tokenization/tokenizerfactory/UimaTokenizerFactory.java b/contrib/deeplearning4j-nlp-uima/src/main/java/org/deeplearning4j/nlp/uima/tokenization/tokenizerfactory/UimaTokenizerFactory.java deleted file mode 100644 index 2fe9d8d79..000000000 --- a/contrib/deeplearning4j-nlp-uima/src/main/java/org/deeplearning4j/nlp/uima/tokenization/tokenizerfactory/UimaTokenizerFactory.java +++ /dev/null @@ -1,134 +0,0 @@ -/* - * 
****************************************************************************** - * * Copyright (c) 2021 Deeplearning4j Contributors - * * - * * This program and the accompanying materials are made available under the - * * terms of the Apache License, Version 2.0 which is available at - * * https://www.apache.org/licenses/LICENSE-2.0. - * * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * * License for the specific language governing permissions and limitations - * * under the License. - * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ - -package org.deeplearning4j.nlp.uima.tokenization.tokenizerfactory; - -import org.apache.uima.analysis_engine.AnalysisEngine; -import org.apache.uima.fit.factory.AnalysisEngineFactory; -import org.apache.uima.resource.ResourceInitializationException; -import org.deeplearning4j.nlp.uima.annotator.TokenizerAnnotator; -import org.deeplearning4j.nlp.uima.tokenization.tokenizer.UimaTokenizer; -import org.deeplearning4j.nlp.uima.annotator.SentenceAnnotator; -import org.deeplearning4j.text.tokenization.tokenizer.TokenPreProcess; -import org.deeplearning4j.text.tokenization.tokenizer.Tokenizer; -import org.deeplearning4j.text.tokenization.tokenizerfactory.TokenizerFactory; -import org.deeplearning4j.nlp.uima.uima.UimaResource; - -import java.io.InputStream; - - -/** - * Uses a uima {@link AnalysisEngine} to - * tokenize text. 
 - * - * - * @author Adam Gibson - * - */ -public class UimaTokenizerFactory implements TokenizerFactory { - - private UimaResource uimaResource; - private boolean checkForLabel; - private static AnalysisEngine defaultAnalysisEngine; - private TokenPreProcess preProcess; - - public UimaTokenizerFactory() throws ResourceInitializationException { - this(defaultAnalysisEngine(), true); - } - - public UimaTokenizerFactory(UimaResource resource) { - this(resource, true); - } - - public UimaTokenizerFactory(AnalysisEngine tokenizer) { - this(tokenizer, true); - } - - public UimaTokenizerFactory(UimaResource resource, boolean checkForLabel) { - this.uimaResource = resource; - this.checkForLabel = checkForLabel; - } - - public UimaTokenizerFactory(boolean checkForLabel) throws ResourceInitializationException { - this(defaultAnalysisEngine(), checkForLabel); - } - - public UimaTokenizerFactory(AnalysisEngine tokenizer, boolean checkForLabel) { - super(); - this.checkForLabel = checkForLabel; - try { - this.uimaResource = new UimaResource(tokenizer); - } catch (Exception e) { - throw new RuntimeException(e); - } - } - - @Override - public Tokenizer create(String toTokenize) { - if (toTokenize == null) - throw new IllegalArgumentException("Unable to proceed; no sentence to tokenize"); - Tokenizer ret = new UimaTokenizer(toTokenize, uimaResource, checkForLabel); - ret.setTokenPreProcessor(preProcess); - return ret; - } - - public UimaResource getUimaResource() { - return uimaResource; - } - - - /** - * Creates a tokenization/stemming pipeline - * @return a tokenization/stemming pipeline - */ - public static AnalysisEngine defaultAnalysisEngine() { - try { - if (defaultAnalysisEngine == null) - defaultAnalysisEngine = AnalysisEngineFactory.createEngine( - AnalysisEngineFactory.createEngineDescription(SentenceAnnotator.getDescription(), - TokenizerAnnotator.getDescription())); - - return defaultAnalysisEngine; - } catch (Exception e) { - throw new RuntimeException(e); - } - } - 
- - @Override - public Tokenizer create(InputStream toTokenize) { - throw new UnsupportedOperationException(); - } - - @Override - public void setTokenPreProcessor(TokenPreProcess preProcessor) { - this.preProcess = preProcessor; - } - - /** - * Returns TokenPreProcessor set for this TokenizerFactory instance - * - * @return TokenPreProcessor instance, or null if no preprocessor was defined - */ - @Override - public TokenPreProcess getTokenPreProcessor() { - return preProcess; - } - - -} diff --git a/contrib/deeplearning4j-nlp-uima/src/main/java/org/deeplearning4j/nlp/uima/uima/UimaResource.java b/contrib/deeplearning4j-nlp-uima/src/main/java/org/deeplearning4j/nlp/uima/uima/UimaResource.java deleted file mode 100644 index c5a5cf9f1..000000000 --- a/contrib/deeplearning4j-nlp-uima/src/main/java/org/deeplearning4j/nlp/uima/uima/UimaResource.java +++ /dev/null @@ -1,113 +0,0 @@ -/* - * ****************************************************************************** - * * Copyright (c) 2021 Deeplearning4j Contributors - * * - * * This program and the accompanying materials are made available under the - * * terms of the Apache License, Version 2.0 which is available at - * * https://www.apache.org/licenses/LICENSE-2.0. - * * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * * License for the specific language governing permissions and limitations - * * under the License. 
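A side note on `defaultAnalysisEngine()` in the `UimaTokenizerFactory` above: the unsynchronized `if (defaultAnalysisEngine == null)` check can race if two threads request the engine at once, potentially building it twice. Where single construction matters, Java's initialization-on-demand holder idiom gives thread-safe lazy caching with no explicit locking (the `ExpensiveEngine` type below is a hypothetical stand-in for an `AnalysisEngine`):

```java
import java.util.concurrent.atomic.AtomicInteger;

// Hypothetical stand-in for an expensive-to-build resource such as a
// UIMA AnalysisEngine.
class ExpensiveEngine {
    static final AtomicInteger BUILD_COUNT = new AtomicInteger();

    ExpensiveEngine() {
        BUILD_COUNT.incrementAndGet(); // count constructions to show it happens once
    }
}

class EngineCache {
    // Holder idiom: the JVM guarantees the nested class is initialized
    // exactly once, on first access, under the class-loading lock.
    private static final class Holder {
        static final ExpensiveEngine INSTANCE = new ExpensiveEngine();
    }

    static ExpensiveEngine defaultEngine() {
        return Holder.INSTANCE;
    }
}
```

Unlike an eager `static final` field on the outer class, the holder defers construction until the first `defaultEngine()` call, matching the lazy intent of the original code without its race.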
 - * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ - -package org.deeplearning4j.nlp.uima.uima; - -import org.apache.uima.analysis_engine.AnalysisEngine; -import org.apache.uima.analysis_engine.AnalysisEngineProcessException; -import org.apache.uima.cas.CAS; -import org.apache.uima.resource.ResourceInitializationException; -import org.apache.uima.util.CasPool; -import org.slf4j.Logger; -import org.slf4j.LoggerFactory; - -/** - * Resource holder for uima - * @author Adam Gibson - * - */ -public class UimaResource { - - private AnalysisEngine analysisEngine; - private CasPool casPool; - private static final Logger log = LoggerFactory.getLogger(UimaResource.class); - - public UimaResource(AnalysisEngine analysisEngine) throws ResourceInitializationException { - this.analysisEngine = analysisEngine; - this.casPool = new CasPool(Runtime.getRuntime().availableProcessors(), analysisEngine); - - } - - public UimaResource(AnalysisEngine analysisEngine, CasPool casPool) { - this.analysisEngine = analysisEngine; - this.casPool = casPool; - - } - - - public AnalysisEngine getAnalysisEngine() { - return analysisEngine; - } - - - public void setAnalysisEngine(AnalysisEngine analysisEngine) { - this.analysisEngine = analysisEngine; - } - - - public CasPool getCasPool() { - return casPool; - } - - - public void setCasPool(CasPool casPool) { - this.casPool = casPool; - } - - - /** - * Use the given analysis engine to process the given text - * You must release the returned CAS yourself - * @param text the text to process - * @return the processed cas - */ - public CAS process(String text) { - CAS cas = retrieve(); - if (cas == null) - return null; - - cas.setDocumentText(text); - try { - analysisEngine.process(cas); - } catch (AnalysisEngineProcessException e) { - log.warn("Unable to process text " + text, e); - } - - return cas; - - - } - - - public CAS retrieve() { - CAS ret = casPool.getCas(); - 
try { - return ret == null ? analysisEngine.newCAS() : ret; - } catch (ResourceInitializationException e) { - throw new RuntimeException(e); - } - } - - - public void release(CAS cas) { - casPool.releaseCas(cas); - } - - - -} diff --git a/contrib/deeplearning4j-nlp-uima/src/test/java/org/deeplearning4j/AssertTestsExtendBaseClass.java b/contrib/deeplearning4j-nlp-uima/src/test/java/org/deeplearning4j/AssertTestsExtendBaseClass.java deleted file mode 100644 index 18eece77c..000000000 --- a/contrib/deeplearning4j-nlp-uima/src/test/java/org/deeplearning4j/AssertTestsExtendBaseClass.java +++ /dev/null @@ -1,51 +0,0 @@ - -/* - * ****************************************************************************** - * * Copyright (c) 2021 Deeplearning4j Contributors - * * - * * This program and the accompanying materials are made available under the - * * terms of the Apache License, Version 2.0 which is available at - * * https://www.apache.org/licenses/LICENSE-2.0. - * * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * * License for the specific language governing permissions and limitations - * * under the License. - * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ -package org.deeplearning4j; - -import lombok.extern.slf4j.Slf4j; -import java.util.*; -import org.nd4j.common.tests.AbstractAssertTestsClass; - -/** - * This class checks that all test classes (i.e., anything with one or more methods annotated with @Test) - * extend BaseDL4JTest - either directly or indirectly. 
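`UimaResource.retrieve()` above implements a common pooling pattern: try to borrow from a fixed-size pool, and fall back to constructing a fresh instance when the pool is exhausted rather than blocking the caller; `release()` then returns borrowed objects. A generic, dependency-free sketch of the same pattern (names are illustrative, not UIMA's `CasPool` API):

```java
import java.util.concurrent.ArrayBlockingQueue;
import java.util.concurrent.BlockingQueue;
import java.util.function.Supplier;

// Borrow-or-create pool mirroring UimaResource.retrieve()/release():
// poll() returns null when the pool is empty, in which case we build a
// fresh instance instead of blocking the caller.
class FallbackPool<T> {
    private final BlockingQueue<T> pool;
    private final Supplier<T> factory;

    FallbackPool(int capacity, Supplier<T> factory) {
        this.pool = new ArrayBlockingQueue<>(capacity);
        this.factory = factory;
        for (int i = 0; i < capacity; i++) {
            pool.add(factory.get());
        }
    }

    T retrieve() {
        T ret = pool.poll();
        return ret == null ? factory.get() : ret;
    }

    void release(T item) {
        pool.offer(item); // silently dropped if the pool is already full
    }
}
```

The `offer()` in `release()` is what makes the overflow instances created under contention safe to discard: they simply fail to re-enter the full pool and become garbage.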
 - * Other than a small set of exceptions, all tests must extend this - * - * @author Alex Black - * @author Alexander Stoyakin - */ -@Slf4j -public class AssertTestsExtendBaseClass extends AbstractAssertTestsClass { - - @Override - protected Set<Class<?>> getExclusions() { - Set<Class<?>> exclusions = new HashSet<>(); - return exclusions; - } - - @Override - protected String getPackageName() { - return "org.deeplearning4j"; - } - - @Override - protected Class<?> getBaseClass() { - return BaseDL4JTest.class; - } -} diff --git a/contrib/deeplearning4j-nlp-uima/src/test/java/org/deeplearning4j/models/UITest.java b/contrib/deeplearning4j-nlp-uima/src/test/java/org/deeplearning4j/models/UITest.java deleted file mode 100644 index 877c38d80..000000000 --- a/contrib/deeplearning4j-nlp-uima/src/test/java/org/deeplearning4j/models/UITest.java +++ /dev/null @@ -1,83 +0,0 @@ -/* - * ****************************************************************************** - * * Copyright (c) 2021 Deeplearning4j Contributors - * * - * * This program and the accompanying materials are made available under the - * * terms of the Apache License, Version 2.0 which is available at - * * https://www.apache.org/licenses/LICENSE-2.0. - * * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * * License for the specific language governing permissions and limitations - * * under the License. 
- * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ - -package org.deeplearning4j.models; - -import org.deeplearning4j.BaseDL4JTest; -import org.deeplearning4j.models.embeddings.loader.WordVectorSerializer; -import org.deeplearning4j.models.embeddings.wordvectors.WordVectors; -import org.deeplearning4j.models.word2vec.Word2Vec; -import org.deeplearning4j.plot.BarnesHutTsne; -import org.deeplearning4j.text.sentenceiterator.SentenceIterator; -import org.deeplearning4j.nlp.uima.sentenceiterator.UimaSentenceIterator; -import org.deeplearning4j.text.tokenization.tokenizer.preprocessor.CommonPreprocessor; -import org.deeplearning4j.text.tokenization.tokenizerfactory.DefaultTokenizerFactory; -import org.deeplearning4j.text.tokenization.tokenizerfactory.TokenizerFactory; -import org.deeplearning4j.core.ui.UiConnectionInfo; -import org.deeplearning4j.ui.api.UIServer; -import org.junit.Ignore; -import org.junit.Test; -import org.nd4j.common.io.ClassPathResource; - -import java.io.File; -import java.util.ArrayList; - -/** - * Created by Alex on 10/01/2017. 
- */ -@Ignore -public class UITest extends BaseDL4JTest { - - @Test - public void testPosting() throws Exception { - - // File inputFile = Resources.asFile("big/raw_sentences.txt"); - File inputFile = new ClassPathResource("/basic/word2vec_advance.txt").getFile(); - SentenceIterator iter = UimaSentenceIterator.createWithPath(inputFile.getAbsolutePath()); - // Split on white spaces in the line to get words - TokenizerFactory t = new DefaultTokenizerFactory(); - t.setTokenPreProcessor(new CommonPreprocessor()); - - Word2Vec vec = new Word2Vec.Builder().minWordFrequency(1).epochs(1).layerSize(20) - .stopWords(new ArrayList()).useAdaGrad(false).negativeSample(5).seed(42).windowSize(5) - .iterate(iter).tokenizerFactory(t).build(); - - vec.fit(); - - File tempFile = File.createTempFile("temp", "w2v"); - tempFile.deleteOnExit(); - - WordVectorSerializer.writeWordVectors(vec, tempFile); - - WordVectors vectors = WordVectorSerializer.loadTxtVectors(tempFile); - - UIServer.getInstance(); //Initialize - - UiConnectionInfo uiConnectionInfo = - new UiConnectionInfo.Builder().setAddress("localhost").setPort(9000).build(); - - BarnesHutTsne tsne = new BarnesHutTsne.Builder().normalize(false).setFinalMomentum(0.8f).numDimension(2) - .setMaxIter(10).build(); - - vectors.lookupTable().plotVocab(tsne, vectors.lookupTable().getVocabCache().numWords(), uiConnectionInfo); - - - Thread.sleep(100000); - } - -} diff --git a/contrib/deeplearning4j-nlp-uima/src/test/java/org/deeplearning4j/models/WordVectorSerializerTest.java b/contrib/deeplearning4j-nlp-uima/src/test/java/org/deeplearning4j/models/WordVectorSerializerTest.java deleted file mode 100644 index 4a12e8811..000000000 --- a/contrib/deeplearning4j-nlp-uima/src/test/java/org/deeplearning4j/models/WordVectorSerializerTest.java +++ /dev/null @@ -1,914 +0,0 @@ -/* - * ****************************************************************************** - * * Copyright (c) 2021 Deeplearning4j Contributors - * * - * * This program and the 
accompanying materials are made available under the - * * terms of the Apache License, Version 2.0 which is available at - * * https://www.apache.org/licenses/LICENSE-2.0. - * * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * * License for the specific language governing permissions and limitations - * * under the License. - * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ - -package org.deeplearning4j.models; - -import lombok.val; -import org.apache.commons.io.FileUtils; -import org.apache.commons.lang.ArrayUtils; -import org.apache.commons.lang3.RandomUtils; -import org.deeplearning4j.BaseDL4JTest; -import org.deeplearning4j.models.embeddings.WeightLookupTable; -import org.deeplearning4j.models.embeddings.inmemory.InMemoryLookupTable; -import org.deeplearning4j.models.embeddings.loader.VectorsConfiguration; -import org.deeplearning4j.models.embeddings.loader.WordVectorSerializer; -import org.deeplearning4j.models.embeddings.wordvectors.WordVectors; -import org.deeplearning4j.models.paragraphvectors.ParagraphVectors; -import org.deeplearning4j.models.sequencevectors.SequenceVectors; -import org.deeplearning4j.models.sequencevectors.serialization.VocabWordFactory; -import org.deeplearning4j.models.word2vec.VocabWord; -import org.deeplearning4j.models.word2vec.Word2Vec; -import org.deeplearning4j.models.word2vec.wordstore.VocabCache; -import org.deeplearning4j.models.word2vec.wordstore.inmemory.AbstractCache; -import org.deeplearning4j.models.word2vec.wordstore.inmemory.InMemoryLookupCache; -import org.deeplearning4j.text.sentenceiterator.BasicLineIterator; -import org.deeplearning4j.text.sentenceiterator.SentenceIterator; -import org.deeplearning4j.nlp.uima.sentenceiterator.UimaSentenceIterator; 
-import org.deeplearning4j.text.tokenization.tokenizer.preprocessor.CommonPreprocessor; -import org.deeplearning4j.text.tokenization.tokenizerfactory.DefaultTokenizerFactory; -import org.deeplearning4j.text.tokenization.tokenizerfactory.TokenizerFactory; -import org.junit.Before; -import org.junit.Ignore; -import org.junit.Rule; -import org.junit.Test; -import org.junit.rules.TemporaryFolder; -import org.junit.rules.Timeout; -import org.nd4j.linalg.api.ndarray.INDArray; -import org.nd4j.linalg.factory.Nd4j; -import org.nd4j.common.io.ClassPathResource; -import org.nd4j.linalg.ops.transforms.Transforms; -import org.nd4j.common.resources.Resources; -import org.nd4j.shade.guava.primitives.Doubles; -import org.slf4j.Logger; -import org.slf4j.LoggerFactory; - -import java.io.File; -import java.io.FileInputStream; -import java.io.FileOutputStream; -import java.io.IOException; -import java.util.ArrayList; -import java.util.Arrays; -import java.util.Collection; -import java.util.List; - -import static org.junit.Assert.*; - -/** - * @author jeffreytang - * @author raver119@gmail.com - */ -public class WordVectorSerializerTest extends BaseDL4JTest { - - @Rule - public TemporaryFolder testDir = new TemporaryFolder(); - - @Rule - public Timeout timeout = Timeout.seconds(300); - - private File textFile, binaryFile, textFile2; - private File fastTextRaw, fastTextZip, fastTextGzip; - String pathToWriteto; - - private Logger logger = LoggerFactory.getLogger(WordVectorSerializerTest.class); - - @Before - public void before() throws Exception { - if (textFile == null) { - textFile = new ClassPathResource("word2vecserialization/google_news_30.txt").getFile(); - } - if (binaryFile == null) { - File dir = testDir.newFolder(); - binaryFile = new File(dir, "google_news_30.bin.gz"); - FileUtils.copyFile(new ClassPathResource("word2vecserialization/google_news_30.bin.gz").getFile(), binaryFile); - } - pathToWriteto = new 
ClassPathResource("word2vecserialization/testing_word2vec_serialization.txt").getFile() - .getAbsolutePath(); - if (fastTextRaw == null) { - fastTextRaw = new ClassPathResource("word2vecserialization/fast_text.vec").getFile(); - } - if (fastTextZip == null) { - File dir = testDir.newFolder(); - fastTextZip = new File(dir, "fast_text.vec.zip"); - FileUtils.copyFile(new ClassPathResource("word2vecserialization/fast_text.vec.zip").getFile(), fastTextZip); - } - if (fastTextGzip == null) { - File dir = testDir.newFolder(); - fastTextGzip = new File(dir, "fast_text.vec.gz"); - FileUtils.copyFile(new ClassPathResource("word2vecserialization/fast_text.vec.gz").getFile(), fastTextGzip); - } - FileUtils.deleteDirectory(new File("word2vec-index")); - } - - @Test - @Ignore - public void testLoaderTextSmall() throws Exception { - INDArray vec = Nd4j.create(new double[] {0.002001, 0.002210, -0.001915, -0.001639, 0.000683, 0.001511, 0.000470, - 0.000106, -0.001802, 0.001109, -0.002178, 0.000625, -0.000376, -0.000479, -0.001658, -0.000941, - 0.001290, 0.001513, 0.001485, 0.000799, 0.000772, -0.001901, -0.002048, 0.002485, 0.001901, - 0.001545, -0.000302, 0.002008, -0.000247, 0.000367, -0.000075, -0.001492, 0.000656, -0.000669, - -0.001913, 0.002377, 0.002190, -0.000548, -0.000113, 0.000255, -0.001819, -0.002004, 0.002277, - 0.000032, -0.001291, -0.001521, -0.001538, 0.000848, 0.000101, 0.000666, -0.002107, -0.001904, - -0.000065, 0.000572, 0.001275, -0.001585, 0.002040, 0.000463, 0.000560, -0.000304, 0.001493, - -0.001144, -0.001049, 0.001079, -0.000377, 0.000515, 0.000902, -0.002044, -0.000992, 0.001457, - 0.002116, 0.001966, -0.001523, -0.001054, -0.000455, 0.001001, -0.001894, 0.001499, 0.001394, - -0.000799, -0.000776, -0.001119, 0.002114, 0.001956, -0.000590, 0.002107, 0.002410, 0.000908, - 0.002491, -0.001556, -0.000766, -0.001054, -0.001454, 0.001407, 0.000790, 0.000212, -0.001097, - 0.000762, 0.001530, 0.000097, 0.001140, -0.002476, 0.002157, 0.000240, -0.000916, 
-0.001042, - -0.000374, -0.001468, -0.002185, -0.001419, 0.002139, -0.000885, -0.001340, 0.001159, -0.000852, - 0.002378, -0.000802, -0.002294, 0.001358, -0.000037, -0.001744, 0.000488, 0.000721, -0.000241, - 0.000912, -0.001979, 0.000441, 0.000908, -0.001505, 0.000071, -0.000030, -0.001200, -0.001416, - -0.002347, 0.000011, 0.000076, 0.000005, -0.001967, -0.002481, -0.002373, -0.002163, -0.000274, - 0.000696, 0.000592, -0.001591, 0.002499, -0.001006, -0.000637, -0.000702, 0.002366, -0.001882, - 0.000581, -0.000668, 0.001594, 0.000020, 0.002135, -0.001410, -0.001303, -0.002096, -0.001833, - -0.001600, -0.001557, 0.001222, -0.000933, 0.001340, 0.001845, 0.000678, 0.001475, 0.001238, - 0.001170, -0.001775, -0.001717, -0.001828, -0.000066, 0.002065, -0.001368, -0.001530, -0.002098, - 0.001653, -0.002089, -0.000290, 0.001089, -0.002309, -0.002239, 0.000721, 0.001762, 0.002132, - 0.001073, 0.001581, -0.001564, -0.001820, 0.001987, -0.001382, 0.000877, 0.000287, 0.000895, - -0.000591, 0.000099, -0.000843, -0.000563}); - String w1 = "database"; - String w2 = "DBMS"; - WordVectors vecModel = WordVectorSerializer.readWord2VecModel(new ClassPathResource("word2vec/googleload/sample_vec.txt").getFile()); - WordVectors vectorsBinary = WordVectorSerializer.readWord2VecModel(new ClassPathResource("word2vec/googleload/sample_vec.bin").getFile()); - INDArray textWeights = vecModel.lookupTable().getWeights(); - INDArray binaryWeights = vectorsBinary.lookupTable().getWeights(); - Collection nearest = vecModel.wordsNearest("database", 10); - Collection nearestBinary = vectorsBinary.wordsNearest("database", 10); - System.out.println(nearestBinary); - assertEquals(vecModel.similarity("DBMS", "DBMS's"), vectorsBinary.similarity("DBMS", "DBMS's"), 1e-1); - - } - - @Test - public void testLoaderText() throws IOException { - WordVectors vec = WordVectorSerializer.readWord2VecModel(textFile); - assertEquals(vec.vocab().numWords(), 30); - assertTrue(vec.vocab().hasToken("Morgan_Freeman")); - 
assertTrue(vec.vocab().hasToken("JA_Montalbano")); - } - - @Test - public void testLoaderStream() throws IOException { - WordVectors vec = WordVectorSerializer.readWord2VecModel(textFile); - - assertEquals(vec.vocab().numWords(), 30); - assertTrue(vec.vocab().hasToken("Morgan_Freeman")); - assertTrue(vec.vocab().hasToken("JA_Montalbano")); - } - - @Test - public void testLoaderBinary() throws IOException { - WordVectors vec = WordVectorSerializer.readWord2VecModel(binaryFile); - assertEquals(vec.vocab().numWords(), 30); - assertTrue(vec.vocab().hasToken("Morgan_Freeman")); - assertTrue(vec.vocab().hasToken("JA_Montalbano")); - double[] wordVector1 = vec.getWordVector("Morgan_Freeman"); - double[] wordVector2 = vec.getWordVector("JA_Montalbano"); - assertTrue(wordVector1.length == 300); - assertTrue(wordVector2.length == 300); - assertEquals(Doubles.asList(wordVector1).get(0), 0.044423, 1e-3); - assertEquals(Doubles.asList(wordVector2).get(0), 0.051964, 1e-3); - } - - @Test - @Ignore - public void testWriteWordVectors() throws IOException { - WordVectors vec = WordVectorSerializer.readWord2VecModel(binaryFile); - InMemoryLookupTable lookupTable = (InMemoryLookupTable) vec.lookupTable(); - InMemoryLookupCache lookupCache = (InMemoryLookupCache) vec.vocab(); - WordVectorSerializer.writeWordVectors(lookupTable, lookupCache, pathToWriteto); - - WordVectors wordVectors = WordVectorSerializer.loadTxtVectors(new File(pathToWriteto)); - double[] wordVector1 = wordVectors.getWordVector("Morgan_Freeman"); - double[] wordVector2 = wordVectors.getWordVector("JA_Montalbano"); - assertTrue(wordVector1.length == 300); - assertTrue(wordVector2.length == 300); - assertEquals(Doubles.asList(wordVector1).get(0), 0.044423, 1e-3); - assertEquals(Doubles.asList(wordVector2).get(0), 0.051964, 1e-3); - } - - @Test - @Ignore - public void testWriteWordVectorsFromWord2Vec() throws IOException { - WordVectors vec = WordVectorSerializer.readWord2VecModel(binaryFile, true); - 
WordVectorSerializer.writeWordVectors((Word2Vec) vec, pathToWriteto); - - WordVectors wordVectors = WordVectorSerializer.loadTxtVectors(new File(pathToWriteto)); - INDArray wordVector1 = wordVectors.getWordVectorMatrix("Morgan_Freeman"); - INDArray wordVector2 = wordVectors.getWordVectorMatrix("JA_Montalbano"); - assertEquals(vec.getWordVectorMatrix("Morgan_Freeman"), wordVector1); - assertEquals(vec.getWordVectorMatrix("JA_Montalbano"), wordVector2); - assertTrue(wordVector1.length() == 300); - assertTrue(wordVector2.length() == 300); - assertEquals(wordVector1.getDouble(0), 0.044423, 1e-3); - assertEquals(wordVector2.getDouble(0), 0.051964, 1e-3); - } - - @Test - @Ignore - public void testFromTableAndVocab() throws IOException { - - WordVectors vec = WordVectorSerializer.readWord2VecModel(textFile); - InMemoryLookupTable lookupTable = (InMemoryLookupTable) vec.lookupTable(); - InMemoryLookupCache lookupCache = (InMemoryLookupCache) vec.vocab(); - - WordVectors wordVectors = WordVectorSerializer.fromTableAndVocab(lookupTable, lookupCache); - double[] wordVector1 = wordVectors.getWordVector("Morgan_Freeman"); - double[] wordVector2 = wordVectors.getWordVector("JA_Montalbano"); - assertTrue(wordVector1.length == 300); - assertTrue(wordVector2.length == 300); - assertEquals(Doubles.asList(wordVector1).get(0), 0.044423, 1e-3); - assertEquals(Doubles.asList(wordVector2).get(0), 0.051964, 1e-3); - } - - @Test - @Ignore("AB 2019/06/24 - Failing: Ignored to get to all passing baseline to prevent regressions via CI - see issue #7912") - public void testIndexPersistence() throws Exception { - File inputFile = Resources.asFile("big/raw_sentences.txt"); - SentenceIterator iter = UimaSentenceIterator.createWithPath(inputFile.getAbsolutePath()); - // Split on white spaces in the line to get words - TokenizerFactory t = new DefaultTokenizerFactory(); - t.setTokenPreProcessor(new CommonPreprocessor()); - - Word2Vec vec = new 
Word2Vec.Builder().minWordFrequency(5).iterations(1).epochs(1).layerSize(100) - .stopWords(new ArrayList()).useAdaGrad(false).negativeSample(5).seed(42).windowSize(5) - .iterate(iter).tokenizerFactory(t).build(); - - vec.fit(); - - VocabCache orig = vec.getVocab(); - - File tempFile = File.createTempFile("temp", "w2v"); - tempFile.deleteOnExit(); - - WordVectorSerializer.writeWordVectors(vec, tempFile); - - WordVectors vec2 = WordVectorSerializer.loadTxtVectors(tempFile); - - VocabCache rest = vec2.vocab(); - - assertEquals(orig.totalNumberOfDocs(), rest.totalNumberOfDocs()); - - for (VocabWord word : vec.getVocab().vocabWords()) { - INDArray array1 = vec.getWordVectorMatrix(word.getLabel()); - INDArray array2 = vec2.getWordVectorMatrix(word.getLabel()); - - assertEquals(array1, array2); - } - } - - @Test - public void testFullModelSerialization() throws Exception { - String backend = Nd4j.getExecutioner().getEnvironmentInformation().getProperty("backend"); - if(!isIntegrationTests() && "CUDA".equalsIgnoreCase(backend)) { - skipUnlessIntegrationTests(); //AB 2020/02/06 Skip CUDA except for integration tests due to very slow test speed - > 5 minutes on Titan X - } - - File inputFile = Resources.asFile("big/raw_sentences.txt"); - - - SentenceIterator iter = UimaSentenceIterator.createWithPath(inputFile.getAbsolutePath()); - // Split on white spaces in the line to get words - TokenizerFactory t = new DefaultTokenizerFactory(); - t.setTokenPreProcessor(new CommonPreprocessor()); - - InMemoryLookupCache cache = new InMemoryLookupCache(false); - WeightLookupTable table = new InMemoryLookupTable.Builder().vectorLength(100).useAdaGrad(false).negative(5.0) - .cache(cache).lr(0.025f).build(); - - Word2Vec vec = new Word2Vec.Builder().minWordFrequency(5).iterations(1).epochs(1).layerSize(100) - .lookupTable(table).stopWords(new ArrayList()).useAdaGrad(false).negativeSample(5) - .vocabCache(cache).seed(42) - // .workers(6) - 
.windowSize(5).iterate(iter).tokenizerFactory(t).build(); - - assertEquals(new ArrayList(), vec.getStopWords()); - vec.fit(); - - //logger.info("Original word 0: " + cache.wordFor(cache.wordAtIndex(0))); - - //logger.info("Closest Words:"); - Collection lst = vec.wordsNearest("day", 10); - System.out.println(lst); - - WordVectorSerializer.writeFullModel(vec, "tempModel.txt"); - - File modelFile = new File("tempModel.txt"); - modelFile.deleteOnExit(); - - assertTrue(modelFile.exists()); - assertTrue(modelFile.length() > 0); - - - Word2Vec vec2 = WordVectorSerializer.loadFullModel("tempModel.txt"); - - assertNotEquals(null, vec2); - - assertEquals(vec.getConfiguration(), vec2.getConfiguration()); - - //logger.info("Source ExpTable: " + ArrayUtils.toString(((InMemoryLookupTable) table).getExpTable())); - //logger.info("Dest ExpTable: " + ArrayUtils.toString(((InMemoryLookupTable) vec2.getLookupTable()).getExpTable())); - assertTrue(ArrayUtils.isEquals(((InMemoryLookupTable) table).getExpTable(), - ((InMemoryLookupTable) vec2.getLookupTable()).getExpTable())); - - - InMemoryLookupTable restoredTable = (InMemoryLookupTable) vec2.lookupTable(); - - /* - logger.info("Restored word 1: " + restoredTable.getVocab().wordFor(restoredTable.getVocab().wordAtIndex(1))); - logger.info("Restored word 'it': " + restoredTable.getVocab().wordFor("it")); - logger.info("Original word 1: " + cache.wordFor(cache.wordAtIndex(1))); - logger.info("Original word 'i': " + cache.wordFor("i")); - logger.info("Original word 0: " + cache.wordFor(cache.wordAtIndex(0))); - logger.info("Restored word 0: " + restoredTable.getVocab().wordFor(restoredTable.getVocab().wordAtIndex(0))); - */ - assertEquals(cache.wordAtIndex(1), restoredTable.getVocab().wordAtIndex(1)); - assertEquals(cache.wordAtIndex(7), restoredTable.getVocab().wordAtIndex(7)); - assertEquals(cache.wordAtIndex(15), restoredTable.getVocab().wordAtIndex(15)); - - /* - these tests needed only to make sure INDArray equality is working 
properly - */ - double[] array1 = new double[] {0.323232325, 0.65756575, 0.12315, 0.12312315, 0.1232135, 0.12312315, - 0.4343423425, 0.15}; - double[] array2 = new double[] {0.423232325, 0.25756575, 0.12375, 0.12311315, 0.1232035, 0.12318315, - 0.4343493425, 0.25}; - assertNotEquals(Nd4j.create(array1), Nd4j.create(array2)); - assertEquals(Nd4j.create(array1), Nd4j.create(array1)); - - - INDArray rSyn0_1 = restoredTable.getSyn0().slice(1); - INDArray oSyn0_1 = ((InMemoryLookupTable) table).getSyn0().slice(1); - - //logger.info("Restored syn0: " + rSyn0_1); - //logger.info("Original syn0: " + oSyn0_1); - - assertEquals(oSyn0_1, rSyn0_1); - - // just checking $^###! syn0/syn1 order - int cnt = 0; - for (VocabWord word : cache.vocabWords()) { - INDArray rSyn0 = restoredTable.getSyn0().slice(word.getIndex()); - INDArray oSyn0 = ((InMemoryLookupTable) table).getSyn0().slice(word.getIndex()); - - assertEquals(rSyn0, oSyn0); - assertEquals(1.0, arraysSimilarity(rSyn0, oSyn0), 0.001); - - INDArray rSyn1 = restoredTable.getSyn1().slice(word.getIndex()); - INDArray oSyn1 = ((InMemoryLookupTable) table).getSyn1().slice(word.getIndex()); - - assertEquals(rSyn1, oSyn1); - if (arraysSimilarity(rSyn1, oSyn1) < 0.98) { - // logger.info("Restored syn1: " + rSyn1); - // logger.info("Original syn1: " + oSyn1); - } - // we exclude word 222 since it has syn1 full of zeroes - if (cnt != 222) - assertEquals(1.0, arraysSimilarity(rSyn1, oSyn1), 0.001); - - - - if (((InMemoryLookupTable) table).getSyn1Neg() != null) { - INDArray rSyn1Neg = restoredTable.getSyn1Neg().slice(word.getIndex()); - INDArray oSyn1Neg = ((InMemoryLookupTable) table).getSyn1Neg().slice(word.getIndex()); - - assertEquals(rSyn1Neg, oSyn1Neg); - // assertEquals(1.0, arraysSimilarity(rSyn1Neg, oSyn1Neg), 0.001); - } - assertEquals(word.getHistoricalGradient(), - restoredTable.getVocab().wordFor(word.getWord()).getHistoricalGradient()); - - cnt++; - } - - // at this moment we can assume that whole model is transferred, 
and we can call fit over new model - // iter.reset(); - - iter = UimaSentenceIterator.createWithPath(inputFile.getAbsolutePath()); - - vec2.setTokenizerFactory(t); - vec2.setSentenceIterator(iter); - - vec2.fit(); - - INDArray day1 = vec.getWordVectorMatrix("day"); - INDArray day2 = vec2.getWordVectorMatrix("day"); - - INDArray night1 = vec.getWordVectorMatrix("night"); - INDArray night2 = vec2.getWordVectorMatrix("night"); - - double simD = arraysSimilarity(day1, day2); - double simN = arraysSimilarity(night1, night2); - -// logger.info("Vec1 day: " + day1); -// logger.info("Vec2 day: " + day2); - -// logger.info("Vec1 night: " + night1); -// logger.info("Vec2 night: " + night2); - - logger.info("Day/day cross-model similarity: " + simD); - logger.info("Night/night cross-model similarity: " + simN); - - - - logger.info("Vec1 day/night similarity: " + vec.similarity("day", "night")); - logger.info("Vec2 day/night similarity: " + vec2.similarity("day", "night")); - - // check that cross-model values are not identical - assertNotEquals(1.0, simD, 0.001); - assertNotEquals(1.0, simN, 0.001); - - // check that cross-model values are still close to each other - assertTrue(simD > 0.70); - assertTrue(simN > 0.70); - - modelFile.delete(); - } - - @Test - @Ignore - public void testLoader() throws Exception { - WordVectors vec = WordVectorSerializer.loadTxtVectors(new File("/home/raver119/Downloads/_vectors.txt")); - - logger.info("Rewinding: " + Arrays.toString(vec.getWordVector("rewinding"))); - } - - - @Test - @Ignore("AB 2019/06/24 - Failing: Ignored to get to all passing baseline to prevent regressions via CI - see issue #7912") - public void testOutputStream() throws Exception { - File file = File.createTempFile("tmp_ser", "ssa"); - file.deleteOnExit(); - - File inputFile = Resources.asFile("big/raw_sentences.txt"); - SentenceIterator iter = new BasicLineIterator(inputFile); - // Split on white spaces in the line to get words - TokenizerFactory t = new 
DefaultTokenizerFactory(); - t.setTokenPreProcessor(new CommonPreprocessor()); - - InMemoryLookupCache cache = new InMemoryLookupCache(false); - WeightLookupTable table = new InMemoryLookupTable.Builder().vectorLength(100).useAdaGrad(false).negative(5.0) - .cache(cache).lr(0.025f).build(); - - Word2Vec vec = new Word2Vec.Builder().minWordFrequency(5).iterations(1).epochs(1).layerSize(100) - .lookupTable(table).stopWords(new ArrayList()).useAdaGrad(false).negativeSample(5) - .vocabCache(cache).seed(42) - // .workers(6) - .windowSize(5).iterate(iter).tokenizerFactory(t).build(); - - assertEquals(new ArrayList(), vec.getStopWords()); - vec.fit(); - - INDArray day1 = vec.getWordVectorMatrix("day"); - - WordVectorSerializer.writeWordVectors(vec, new FileOutputStream(file)); - - WordVectors vec2 = WordVectorSerializer.loadTxtVectors(file); - - INDArray day2 = vec2.getWordVectorMatrix("day"); - - assertEquals(day1, day2); - - File tempFile = File.createTempFile("tetsts", "Fdfs"); - tempFile.deleteOnExit(); - - WordVectorSerializer.writeWord2VecModel(vec, tempFile); - - Word2Vec vec3 = WordVectorSerializer.readWord2VecModel(tempFile); - } - - @Test - @Ignore("AB 2019/06/24 - Failing: Ignored to get to all passing baseline to prevent regressions via CI - see issue #7912") - public void testParaVecSerialization1() throws Exception { - VectorsConfiguration configuration = new VectorsConfiguration(); - configuration.setIterations(14123); - configuration.setLayersSize(156); - - INDArray syn0 = Nd4j.rand(100, configuration.getLayersSize()); - INDArray syn1 = Nd4j.rand(100, configuration.getLayersSize()); - - AbstractCache cache = new AbstractCache.Builder().build(); - - for (int i = 0; i < 100; i++) { - VocabWord word = new VocabWord((float) i, "word_" + i); - List points = new ArrayList<>(); - List codes = new ArrayList<>(); - int num = RandomUtils.nextInt(1, 20); - for (int x = 0; x < num; x++) { - points.add(RandomUtils.nextInt(1, 100000)); - 
codes.add(RandomUtils.nextBytes(10)[0]); - } - if (RandomUtils.nextInt(0, 10) < 3) { - word.markAsLabel(true); - } - word.setIndex(i); - word.setPoints(points); - word.setCodes(codes); - cache.addToken(word); - cache.addWordToIndex(i, word.getLabel()); - } - - InMemoryLookupTable lookupTable = - (InMemoryLookupTable) new InMemoryLookupTable.Builder() - .vectorLength(configuration.getLayersSize()).cache(cache).build(); - - lookupTable.setSyn0(syn0); - lookupTable.setSyn1(syn1); - - ParagraphVectors originalVectors = - new ParagraphVectors.Builder(configuration).vocabCache(cache).lookupTable(lookupTable).build(); - - File tempFile = File.createTempFile("paravec", "tests"); - tempFile.deleteOnExit(); - - WordVectorSerializer.writeParagraphVectors(originalVectors, tempFile); - - ParagraphVectors restoredVectors = WordVectorSerializer.readParagraphVectors(tempFile); - - InMemoryLookupTable restoredLookupTable = - (InMemoryLookupTable) restoredVectors.getLookupTable(); - AbstractCache restoredVocab = (AbstractCache) restoredVectors.getVocab(); - - assertEquals(restoredLookupTable.getSyn0(), lookupTable.getSyn0()); - assertEquals(restoredLookupTable.getSyn1(), lookupTable.getSyn1()); - - for (int i = 0; i < cache.numWords(); i++) { - assertEquals(cache.elementAtIndex(i).isLabel(), restoredVocab.elementAtIndex(i).isLabel()); - assertEquals(cache.wordAtIndex(i), restoredVocab.wordAtIndex(i)); - assertEquals(cache.elementAtIndex(i).getElementFrequency(), - restoredVocab.elementAtIndex(i).getElementFrequency(), 0.1f); - List originalPoints = cache.elementAtIndex(i).getPoints(); - List restoredPoints = restoredVocab.elementAtIndex(i).getPoints(); - assertEquals(originalPoints.size(), restoredPoints.size()); - for (int x = 0; x < originalPoints.size(); x++) { - assertEquals(originalPoints.get(x), restoredPoints.get(x)); - } - - List originalCodes = cache.elementAtIndex(i).getCodes(); - List restoredCodes = restoredVocab.elementAtIndex(i).getCodes(); - 
assertEquals(originalCodes.size(), restoredCodes.size()); - for (int x = 0; x < originalCodes.size(); x++) { - assertEquals(originalCodes.get(x), restoredCodes.get(x)); - } - } - } - - private double arraysSimilarity(INDArray array1, INDArray array2) { - if (array1.equals(array2)) - return 1.0; - - INDArray vector = Transforms.unitVec(array1); - INDArray vector2 = Transforms.unitVec(array2); - if (vector == null || vector2 == null) - return -1; - return Nd4j.getBlasWrapper().dot(vector, vector2); - - } - - - /** - * This method exists only to test the real Google model, a few gigabytes' worth - * Keep it ignored, since it requires the full Google model to be present on the system, which is 1.6 GB compressed - * - * @throws Exception - */ - @Test - @Ignore - public void testStaticLoaderGoogleModel() throws Exception { - logger.info("Executor name: {}", Nd4j.getExecutioner().getClass().getSimpleName()); - - long time1 = System.currentTimeMillis(); - WordVectors vectors = WordVectorSerializer - .loadStaticModel(new File("C:\\Users\\raver\\develop\\GoogleNews-vectors-negative300.bin.gz")); - long time2 = System.currentTimeMillis(); - - logger.info("Loading time: {} ms", (time2 - time1)); - } - - /** - * This method tests binary file loading as static model - * - * @throws Exception - */ - @Test - @Ignore("AB 2019/06/24 - Failing: Ignored to get to all passing baseline to prevent regressions via CI - see issue #7912") - public void testStaticLoaderBinary() throws Exception { - - logger.info("Executor name: {}", Nd4j.getExecutioner().getClass().getSimpleName()); - - WordVectors vectorsLive = WordVectorSerializer.readWord2VecModel(binaryFile); - WordVectors vectorsStatic = WordVectorSerializer.loadStaticModel(binaryFile); - - INDArray arrayLive = vectorsLive.getWordVectorMatrix("Morgan_Freeman"); - INDArray arrayStatic = vectorsStatic.getWordVectorMatrix("Morgan_Freeman"); - - assertNotEquals(null, arrayLive); - assertEquals(arrayLive, arrayStatic); - } - - @Test - @Ignore("AB 2019/06/24 
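The `arraysSimilarity()` helper in the deleted test computes cosine similarity by unit-normalizing both vectors (`Transforms.unitVec`) and taking their dot product. A minimal sketch of the same check using plain `double[]` arrays, with no ND4J dependency — class and method names here are illustrative, not part of the original test:

```java
// Sketch of the cosine-similarity check behind arraysSimilarity():
// normalize implicitly by dividing the dot product by both norms.
// Assumes non-zero input vectors of equal length.
public class CosineSimilarity {
    static double similarity(double[] a, double[] b) {
        double dot = 0, normA = 0, normB = 0;
        for (int i = 0; i < a.length; i++) {
            dot += a[i] * b[i];      // accumulate dot product
            normA += a[i] * a[i];    // accumulate squared norms
            normB += b[i] * b[i];
        }
        return dot / (Math.sqrt(normA) * Math.sqrt(normB));
    }

    public static void main(String[] args) {
        double[] v = {0.3, 0.6, 0.1};
        double[] w = {-0.6, 0.3, 0.0};
        // identical vectors have similarity 1.0; orthogonal vectors 0.0
        System.out.println(similarity(v, v));
        System.out.println(similarity(v, w));
    }
}
```

This mirrors why the test asserts `arraysSimilarity(rSyn0, oSyn0)` is within `0.001` of `1.0` for restored rows: identical weight rows normalize to the same unit vector, whose self-dot-product is exactly 1.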
- Failing: Ignored to get to all passing baseline to prevent regressions via CI - see issue #7912") - public void testStaticLoaderFromStream() throws Exception { - - logger.info("Executor name: {}", Nd4j.getExecutioner().getClass().getSimpleName()); - - WordVectors vectorsLive = WordVectorSerializer.readWord2VecModel(binaryFile); - WordVectors vectorsStatic = WordVectorSerializer.loadStaticModel(new FileInputStream(binaryFile)); - - INDArray arrayLive = vectorsLive.getWordVectorMatrix("Morgan_Freeman"); - INDArray arrayStatic = vectorsStatic.getWordVectorMatrix("Morgan_Freeman"); - - assertNotEquals(null, arrayLive); - assertEquals(arrayLive, arrayStatic); - } - - /** - * This method tests CSV file loading as static model - * - * @throws Exception - */ - @Test - @Ignore("AB 2019/06/24 - Failing: Ignored to get to all passing baseline to prevent regressions via CI - see issue #7912") - public void testStaticLoaderText() throws Exception { - logger.info("Executor name: {}", Nd4j.getExecutioner().getClass().getSimpleName()); - - WordVectors vectorsLive = WordVectorSerializer.loadTxtVectors(textFile); - WordVectors vectorsStatic = WordVectorSerializer.loadStaticModel(textFile); - - INDArray arrayLive = vectorsLive.getWordVectorMatrix("Morgan_Freeman"); - INDArray arrayStatic = vectorsStatic.getWordVectorMatrix("Morgan_Freeman"); - - assertNotEquals(null, arrayLive); - assertEquals(arrayLive, arrayStatic); - } - - /** - * This method tests ZIP file loading as static model - * - * @throws Exception - */ - @Test - @Ignore("AB 2019/06/24 - Failing: Ignored to get to all passing baseline to prevent regressions via CI - see issue #7912") - public void testStaticLoaderArchive() throws Exception { - logger.info("Executor name: {}", Nd4j.getExecutioner().getClass().getSimpleName()); - - File w2v = new ClassPathResource("word2vec.dl4j/file.w2v").getFile(); - - WordVectors vectorsLive = WordVectorSerializer.readWord2Vec(w2v); - WordVectors vectorsStatic = 
WordVectorSerializer.loadStaticModel(w2v); - - INDArray arrayLive = vectorsLive.getWordVectorMatrix("night"); - INDArray arrayStatic = vectorsStatic.getWordVectorMatrix("night"); - - assertNotEquals(null, arrayLive); - assertEquals(arrayLive, arrayStatic); - } - - @Test - public void testUnifiedLoaderArchive1() throws Exception { - logger.info("Executor name: {}", Nd4j.getExecutioner().getClass().getSimpleName()); - - File w2v = new ClassPathResource("word2vec.dl4j/file.w2v").getFile(); - - WordVectors vectorsLive = WordVectorSerializer.readWord2Vec(w2v); - WordVectors vectorsUnified = WordVectorSerializer.readWord2VecModel(w2v, false); - - INDArray arrayLive = vectorsLive.getWordVectorMatrix("night"); - INDArray arrayStatic = vectorsUnified.getWordVectorMatrix("night"); - - assertNotEquals(null, arrayLive); - assertEquals(arrayLive, arrayStatic); - - assertEquals(null, ((InMemoryLookupTable) vectorsUnified.lookupTable()).getSyn1()); - assertEquals(null, ((InMemoryLookupTable) vectorsUnified.lookupTable()).getSyn1Neg()); - } - - @Test - public void testUnifiedLoaderArchive2() throws Exception { - logger.info("Executor name: {}", Nd4j.getExecutioner().getClass().getSimpleName()); - - File w2v = new ClassPathResource("word2vec.dl4j/file.w2v").getFile(); - - WordVectors vectorsLive = WordVectorSerializer.readWord2Vec(w2v); - WordVectors vectorsUnified = WordVectorSerializer.readWord2VecModel(w2v, true); - - INDArray arrayLive = vectorsLive.getWordVectorMatrix("night"); - INDArray arrayStatic = vectorsUnified.getWordVectorMatrix("night"); - - assertNotEquals(null, arrayLive); - assertEquals(arrayLive, arrayStatic); - - assertNotEquals(null, ((InMemoryLookupTable) vectorsUnified.lookupTable()).getSyn1()); - } - - /** - * This method tests CSV file loading via unified loader - * - * @throws Exception - */ - @Test - public void testUnifiedLoaderText() throws Exception { - logger.info("Executor name: {}", Nd4j.getExecutioner().getClass().getSimpleName()); - - WordVectors 
vectorsLive = WordVectorSerializer.loadTxtVectors(textFile); - WordVectors vectorsUnified = WordVectorSerializer.readWord2VecModel(textFile, true); - - INDArray arrayLive = vectorsLive.getWordVectorMatrix("Morgan_Freeman"); - INDArray arrayStatic = vectorsUnified.getWordVectorMatrix("Morgan_Freeman"); - - assertNotEquals(null, arrayLive); - assertEquals(arrayLive, arrayStatic); - - // we're trying EXTENDED model, but file doesn't have syn1/huffman info, so it should be silently degraded to simplified model - assertEquals(null, ((InMemoryLookupTable) vectorsUnified.lookupTable()).getSyn1()); - } - - /** - * This method tests binary file loading via unified loader - * - * @throws Exception - */ - @Test - public void testUnifiedLoaderBinary() throws Exception { - - logger.info("Executor name: {}", Nd4j.getExecutioner().getClass().getSimpleName()); - - WordVectors vectorsLive = WordVectorSerializer.readWord2VecModel(binaryFile); - WordVectors vectorsStatic = WordVectorSerializer.readWord2VecModel(binaryFile, false); - - INDArray arrayLive = vectorsLive.getWordVectorMatrix("Morgan_Freeman"); - INDArray arrayStatic = vectorsStatic.getWordVectorMatrix("Morgan_Freeman"); - - assertNotEquals(null, arrayLive); - assertEquals(arrayLive, arrayStatic); - } - - @Ignore - @Test - public void testBiggerParavecLoader() throws Exception { - ParagraphVectors vectors = - WordVectorSerializer.readParagraphVectors("C:\\Users\\raver\\Downloads\\10kNews.zip"); - } - - @Test - public void testVocabPeristence() throws Exception { - val vocabA = new AbstractCache.Builder().build(); - - vocabA.addToken(new VocabWord(3.0, "alpha")); - vocabA.addWordToIndex(1, "alpha"); - - vocabA.addToken(new VocabWord(4.0, "beta")); - vocabA.addWordToIndex(0, "beta"); - - val tmpFile = File.createTempFile("sdsds","sfdsfdsgsdf"); - tmpFile.deleteOnExit(); - - vocabA.setTotalWordOccurences(200); - vocabA.incrementTotalDocCount(100); - - assertEquals(100, vocabA.totalNumberOfDocs()); - assertEquals(200, 
vocabA.totalWordOccurrences()); - - WordVectorSerializer.writeVocabCache(vocabA, tmpFile); - - val vocabB = WordVectorSerializer.readVocabCache(tmpFile); - - assertEquals(vocabA.wordAtIndex(0), vocabB.wordAtIndex(0)); - assertEquals(vocabA.wordAtIndex(1), vocabB.wordAtIndex(1)); - - assertEquals(vocabA.numWords(), vocabB.numWords()); - assertEquals(vocabA.totalNumberOfDocs(), vocabB.totalNumberOfDocs()); - assertEquals(vocabA.totalWordOccurrences(), vocabB.totalWordOccurrences()); - } - - @Test - public void testMalformedLabels1() throws Exception { - List words = new ArrayList<>(); - words.add("test A"); - words.add("test B"); - words.add("test\nC"); - words.add("test`D"); - words.add("test_E"); - words.add("test 5"); - - AbstractCache vocabCache = new AbstractCache<>(); - int cnt = 0; - for (String word : words) { - vocabCache.addToken(new VocabWord(1.0, word)); - vocabCache.addWordToIndex(cnt, word); - cnt++; - } - - vocabCache.elementAtIndex(1).markAsLabel(true); - - InMemoryLookupTable lookupTable = - new InMemoryLookupTable<>(vocabCache, 10, false, 0.01, Nd4j.getRandom(), 0.0); - lookupTable.resetWeights(true); - - assertNotEquals(null, lookupTable.getSyn0()); - assertNotEquals(null, lookupTable.getSyn1()); - assertNotEquals(null, lookupTable.getExpTable()); - assertEquals(null, lookupTable.getSyn1Neg()); - - ParagraphVectors vec = new ParagraphVectors.Builder().lookupTable(lookupTable).vocabCache(vocabCache).build(); - - File tempFile = File.createTempFile("temp", "w2v"); - tempFile.deleteOnExit(); - - WordVectorSerializer.writeParagraphVectors(vec, tempFile); - - - ParagraphVectors restoredVec = WordVectorSerializer.readParagraphVectors(tempFile); - - for (String word : words) { - assertEquals(true, restoredVec.hasWord(word)); - } - - assertTrue(restoredVec.getVocab().elementAtIndex(1).isLabel()); - } - - @Test - public void testB64_1() throws Exception { - String wordA = "night"; - String wordB = "night day"; - String encA = 
WordVectorSerializer.ReadHelper.encodeB64(wordA); - String encB = WordVectorSerializer.ReadHelper.encodeB64(wordB); - - assertEquals(wordA, WordVectorSerializer.ReadHelper.decodeB64(encA)); - assertEquals(wordB, WordVectorSerializer.ReadHelper.decodeB64(encB)); - - assertEquals(wordA, WordVectorSerializer.ReadHelper.decodeB64(wordA)); - assertEquals(wordB, WordVectorSerializer.ReadHelper.decodeB64(wordB)); - - } - - @Test - public void testFastText() { - File[] files = { fastTextRaw, fastTextZip, fastTextGzip }; - for (File file : files) { - try { - Word2Vec word2Vec = WordVectorSerializer.readAsCsv(file); - assertEquals(99, word2Vec.getVocab().numWords()); - } catch (Exception readCsvException) { - fail("Failure for input file " + file.getAbsolutePath() + " " + readCsvException.getMessage()); - } - } - } - - @Test - public void testFastText_readWord2VecModel() { - File[] files = { fastTextRaw, fastTextZip, fastTextGzip }; - for (File file : files) { - try { - Word2Vec word2Vec = WordVectorSerializer.readWord2VecModel(file); - assertEquals(99, word2Vec.getVocab().numWords()); - } catch (Exception readCsvException) { - fail("Failure for input file " + file.getAbsolutePath() + " " + readCsvException.getMessage()); - } - } - } - - @Test - public void testBackwardsCompatibleWord2Vec() { - File model_v3 = Resources.asFile("deeplearning4j-nlp/model_beta3.zip"); - File model_v4 = Resources.asFile("deeplearning4j-nlp/model_beta4.zip"); - Word2Vec word2Vec1 = WordVectorSerializer.readWord2VecModel(model_v3, true); - Word2Vec word2Vec2 = WordVectorSerializer.readWord2VecModel(model_v4, true); - try { - assertEquals(word2Vec1.toJson(), word2Vec2.toJson()); - } catch (Exception e) { - fail(e.getMessage()); - } - } - - @Test - public void testBackwardsCompatibleSequenceVectors() { - File model_v3 = Resources.asFile("deeplearning4j-nlp/seqv_beta3.csv"); - File model_v4 = Resources.asFile("deeplearning4j-nlp/seqv_beta4.csv"); - try { - SequenceVectors vectors1 = 
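`testB64_1` above exercises two properties of `ReadHelper`'s label encoding: labels containing whitespace round-trip through `encodeB64`/`decodeB64`, and plain single-token words pass through `decodeB64` unchanged. A hypothetical standalone sketch of such a codec — the `B64:` marker prefix and the whitespace test are assumptions for illustration, not necessarily DL4J's exact wire format:

```java
import java.nio.charset.StandardCharsets;
import java.util.Base64;

// Hypothetical label codec: words with whitespace (which would break a
// space-delimited vector file) get Base64-wrapped behind a marker prefix,
// so that plain words survive decode as-is.
public class LabelCodec {
    static final String PREFIX = "B64:"; // assumed marker, for illustration

    static String encodeB64(String word) {
        if (!word.contains(" ") && !word.contains("\n"))
            return word; // safe token: store verbatim
        return PREFIX + Base64.getEncoder()
                .encodeToString(word.getBytes(StandardCharsets.UTF_8));
    }

    static String decodeB64(String encoded) {
        if (!encoded.startsWith(PREFIX))
            return encoded; // plain word: decode is the identity
        byte[] raw = Base64.getDecoder().decode(encoded.substring(PREFIX.length()));
        return new String(raw, StandardCharsets.UTF_8);
    }

    public static void main(String[] args) {
        System.out.println(decodeB64(encodeB64("night")));     // prints "night"
        System.out.println(decodeB64(encodeB64("night day"))); // prints "night day"
    }
}
```

The design mirrors the test's assertions: `decodeB64(encodeB64(w))` equals `w` for both word shapes, and `decodeB64(w)` on an unencoded word is a no-op.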
WordVectorSerializer.readSequenceVectors(new VocabWordFactory(), model_v3); - SequenceVectors vectors2 = WordVectorSerializer.readSequenceVectors(new VocabWordFactory(), model_v4); - - assertEquals(vectors1.vocab().numWords(), vectors2.vocab().numWords()); - for (int i = 0; i < vectors1.vocab().numWords(); ++i) { - assertEquals(vectors1.vocab().words().toArray()[i], vectors2.vocab().words().toArray()[i]); - } - } catch (Exception e) { - fail(e.getMessage()); - } - } -} diff --git a/contrib/deeplearning4j-nlp-uima/src/test/java/org/deeplearning4j/models/embeddings/loader/VectorsConfigurationTest.java b/contrib/deeplearning4j-nlp-uima/src/test/java/org/deeplearning4j/models/embeddings/loader/VectorsConfigurationTest.java deleted file mode 100644 index 880a93eb6..000000000 --- a/contrib/deeplearning4j-nlp-uima/src/test/java/org/deeplearning4j/models/embeddings/loader/VectorsConfigurationTest.java +++ /dev/null @@ -1,87 +0,0 @@ -/* - * ****************************************************************************** - * * Copyright (c) 2021 Deeplearning4j Contributors - * * - * * This program and the accompanying materials are made available under the - * * terms of the Apache License, Version 2.0 which is available at - * * https://www.apache.org/licenses/LICENSE-2.0. - * * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * * License for the specific language governing permissions and limitations - * * under the License. 
- * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ - -package org.deeplearning4j.models.embeddings.loader; - -import org.deeplearning4j.BaseDL4JTest; -import org.deeplearning4j.models.word2vec.Word2Vec; -import org.deeplearning4j.text.sentenceiterator.SentenceIterator; -import org.deeplearning4j.nlp.uima.sentenceiterator.UimaSentenceIterator; -import org.junit.Before; -import org.junit.Test; -import org.nd4j.common.resources.Resources; -import org.slf4j.Logger; -import org.slf4j.LoggerFactory; - -import java.io.File; - -import static org.junit.Assert.assertEquals; - -/** - * Created by fartovii on 21.11.15. - */ -public class VectorsConfigurationTest extends BaseDL4JTest { - - protected static final Logger log = LoggerFactory.getLogger(VectorsConfigurationTest.class); - - @Before - public void setUp() throws Exception { - - } - - @Test - public void testFromJson() throws Exception { - VectorsConfiguration configuration = new VectorsConfiguration(); - configuration.setHugeModelExpected(true); - configuration.setWindow(5); - configuration.setIterations(3); - configuration.setLayersSize(200); - configuration.setLearningRate(1.4d); - configuration.setSampling(0.0005d); - configuration.setMinLearningRate(0.25d); - configuration.setEpochs(1); - - String json = configuration.toJson(); - log.info("Conf. 
JSON: " + json); - VectorsConfiguration configuration2 = VectorsConfiguration.fromJson(json); - - assertEquals(configuration, configuration2); - } - - @Test(timeout = 300000) - public void testFromW2V() throws Exception { - VectorsConfiguration configuration = new VectorsConfiguration(); - configuration.setHugeModelExpected(true); - configuration.setWindow(5); - configuration.setIterations(3); - configuration.setLayersSize(200); - configuration.setLearningRate(1.4d); - configuration.setSampling(0.0005d); - configuration.setMinLearningRate(0.25d); - configuration.setEpochs(1); - - File inputFile = Resources.asFile("big/raw_sentences.txt"); - SentenceIterator iter = UimaSentenceIterator.createWithPath(inputFile.getAbsolutePath()); - - Word2Vec vec = new Word2Vec.Builder(configuration).iterate(iter).build(); - - VectorsConfiguration configuration2 = vec.getConfiguration(); - - assertEquals(configuration, configuration2); - } -} diff --git a/contrib/deeplearning4j-nlp-uima/src/test/java/org/deeplearning4j/models/word2vec/Word2VecTests.java b/contrib/deeplearning4j-nlp-uima/src/test/java/org/deeplearning4j/models/word2vec/Word2VecTests.java deleted file mode 100644 index 33034c449..000000000 --- a/contrib/deeplearning4j-nlp-uima/src/test/java/org/deeplearning4j/models/word2vec/Word2VecTests.java +++ /dev/null @@ -1,967 +0,0 @@ -/* - * ****************************************************************************** - * * Copyright (c) 2021 Deeplearning4j Contributors - * * - * * This program and the accompanying materials are made available under the - * * terms of the Apache License, Version 2.0 which is available at - * * https://www.apache.org/licenses/LICENSE-2.0. - * * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. 
See the - * * License for the specific language governing permissions and limitations - * * under the License. - * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ - -package org.deeplearning4j.models.word2vec; - -import lombok.extern.slf4j.Slf4j; -import org.apache.commons.io.IOUtils; -import org.apache.commons.io.LineIterator; -import org.deeplearning4j.text.sentenceiterator.CollectionSentenceIterator; -import org.junit.Rule; -import org.junit.rules.Timeout; -import org.nd4j.shade.guava.primitives.Doubles; -import org.nd4j.shade.guava.primitives.Ints; -import lombok.val; -import org.apache.commons.io.FileUtils; -import org.deeplearning4j.BaseDL4JTest; -import org.deeplearning4j.models.embeddings.inmemory.InMemoryLookupTable; -import org.deeplearning4j.models.embeddings.loader.VectorsConfiguration; -import org.deeplearning4j.models.word2vec.wordstore.inmemory.AbstractCache; -import org.nd4j.common.io.ClassPathResource; -import org.deeplearning4j.models.embeddings.learning.impl.elements.CBOW; -import org.deeplearning4j.models.embeddings.learning.impl.elements.SkipGram; -import org.deeplearning4j.models.embeddings.loader.WordVectorSerializer; -import org.deeplearning4j.models.embeddings.reader.impl.BasicModelUtils; -import org.deeplearning4j.models.embeddings.reader.impl.FlatModelUtils; -import org.deeplearning4j.models.embeddings.wordvectors.WordVectors; -import org.deeplearning4j.text.sentenceiterator.BasicLineIterator; -import org.deeplearning4j.text.sentenceiterator.SentenceIterator; -import org.deeplearning4j.nlp.uima.sentenceiterator.UimaSentenceIterator; -import org.deeplearning4j.text.tokenization.tokenizer.preprocessor.CommonPreprocessor; -import org.deeplearning4j.text.tokenization.tokenizerfactory.DefaultTokenizerFactory; -import org.deeplearning4j.text.tokenization.tokenizerfactory.TokenizerFactory; -import org.junit.Before; -import org.junit.Ignore; -import org.junit.Test; 
-import org.nd4j.linalg.api.ndarray.INDArray; -import org.nd4j.linalg.factory.Nd4j; -import org.nd4j.linalg.ops.transforms.Transforms; -import org.nd4j.common.resources.Resources; -import org.slf4j.Logger; -import org.slf4j.LoggerFactory; - -import java.io.*; -import java.nio.charset.StandardCharsets; -import java.util.*; - -import static org.junit.Assert.*; - - -/** - * @author jeffreytang - */ -@Slf4j -public class Word2VecTests extends BaseDL4JTest { - - private static final Logger log = LoggerFactory.getLogger(Word2VecTests.class); - - private File inputFile; - private File inputFile2; - private String pathToWriteto; - private WordVectors googleModel; - - @Rule - public Timeout timeout = Timeout.seconds(300); - - @Before - public void before() throws Exception { - File googleModelTextFile = new ClassPathResource("word2vecserialization/google_news_30.txt").getFile(); - googleModel = WordVectorSerializer.readWord2VecModel(googleModelTextFile); - inputFile = Resources.asFile("big/raw_sentences.txt"); - inputFile2 = Resources.asFile("big/raw_sentences_2.txt"); - - File ptwt = new File(System.getProperty("java.io.tmpdir"), "testing_word2vec_serialization.txt"); - - pathToWriteto = ptwt.getAbsolutePath(); - - - - FileUtils.deleteDirectory(new File("word2vec-index")); - } - - @Test - public void testGoogleModelLoaded() throws Exception { - assertEquals(googleModel.vocab().numWords(), 30); - assertTrue(googleModel.hasWord("Morgan_Freeman")); - double[] wordVector = googleModel.getWordVector("Morgan_Freeman"); - assertTrue(wordVector.length == 300); - assertEquals(Doubles.asList(wordVector).get(0), 0.044423, 1e-3); - } - - @Test - public void testSimilarity() throws Exception { - testGoogleModelLoaded(); - assertEquals(googleModel.similarity("Benkovic", "Boeremag_trialists"), 0.1204, 1e-2); - assertEquals(googleModel.similarity("Benkovic", "Gopie"), 0.3350, 1e-2); - assertEquals(googleModel.similarity("Benkovic", "Youku.com"), 0.0116, 1e-2); - } - - @Test - public void 
testWordsNearest() throws Exception { - testGoogleModelLoaded(); - List lst = Arrays.asList(googleModel.wordsNearest("Benkovic", 10).toArray()); - - assertTrue(lst.contains("Gopie")); - assertTrue(lst.contains("JIM_HOOK_Senior")); - /* - assertEquals(lst.get(0), "Gopie"); - assertEquals(lst.get(1), "JIM_HOOK_Senior"); - */ - } - - @Test - public void testUIMAIterator() throws Exception { - SentenceIterator iter = UimaSentenceIterator.createWithPath(inputFile.getAbsolutePath()); - assertEquals(iter.nextSentence(), "No , he says now ."); - } - - @Test - @Ignore // no adagrad these days - public void testWord2VecAdaGrad() throws Exception { - SentenceIterator iter = new BasicLineIterator(inputFile.getAbsolutePath()); - - TokenizerFactory t = new DefaultTokenizerFactory(); - t.setTokenPreProcessor(new CommonPreprocessor()); - - Word2Vec vec = new Word2Vec.Builder().minWordFrequency(5).iterations(5).learningRate(0.025).layerSize(100) - .seed(42).batchSize(13500).sampling(0).negativeSample(0) - //.epochs(10) - .windowSize(5).modelUtils(new BasicModelUtils()).useAdaGrad(false) - .useHierarchicSoftmax(true).iterate(iter).workers(4).tokenizerFactory(t).build(); - - vec.fit(); - - Collection lst = vec.wordsNearest("day", 10); - log.info(Arrays.toString(lst.toArray())); - - // assertEquals(10, lst.size()); - - double sim = vec.similarity("day", "night"); - log.info("Day/night similarity: " + sim); - - assertTrue(lst.contains("week")); - assertTrue(lst.contains("night")); - assertTrue(lst.contains("year")); - } - - @Test - public void testWord2VecCBOW() throws Exception { - String backend = Nd4j.getExecutioner().getEnvironmentInformation().getProperty("backend"); - if(!isIntegrationTests() && "CUDA".equalsIgnoreCase(backend)) { - skipUnlessIntegrationTests(); //AB 2020/02/06 Skip CUDA except for integration tests due to very slow test speed - > 5 minutes on Titan X - } - - SentenceIterator iter = new BasicLineIterator(inputFile.getAbsolutePath()); - - TokenizerFactory t = new 
DefaultTokenizerFactory(); - t.setTokenPreProcessor(new CommonPreprocessor()); - - Word2Vec vec = new Word2Vec.Builder().minWordFrequency(1).iterations(1).learningRate(0.025).layerSize(150) - .seed(42).sampling(0).negativeSample(0).useHierarchicSoftmax(true).windowSize(5) - .modelUtils(new BasicModelUtils()).useAdaGrad(false).iterate(iter).workers(4) - .tokenizerFactory(t).elementsLearningAlgorithm(new CBOW()).build(); - - vec.fit(); - - Collection lst = vec.wordsNearest("day", 10); - log.info(Arrays.toString(lst.toArray())); - - // assertEquals(10, lst.size()); - - double sim = vec.similarity("day", "night"); - log.info("Day/night similarity: " + sim); - - assertTrue(lst.contains("week")); - assertTrue(lst.contains("night")); - assertTrue(lst.contains("year")); - assertTrue(sim > 0.65f); - } - - - @Test - public void testWord2VecMultiEpoch() throws Exception { - String backend = Nd4j.getExecutioner().getEnvironmentInformation().getProperty("backend"); - if(!isIntegrationTests() && "CUDA".equalsIgnoreCase(backend)) { - skipUnlessIntegrationTests(); //AB 2020/02/06 Skip CUDA except for integration tests due to very slow test speed - > 5 minutes on Titan X - } - - SentenceIterator iter; - if(isIntegrationTests()){ - iter = new BasicLineIterator(inputFile.getAbsolutePath()); - } else { - iter = new CollectionSentenceIterator(firstNLines(inputFile, 50000)); - } - - TokenizerFactory t = new DefaultTokenizerFactory(); - t.setTokenPreProcessor(new CommonPreprocessor()); - - Word2Vec vec = new Word2Vec.Builder().minWordFrequency(1).iterations(5).learningRate(0.025).layerSize(150) - .seed(42).sampling(0).negativeSample(0).useHierarchicSoftmax(true).windowSize(5).epochs(3) - .modelUtils(new BasicModelUtils()).useAdaGrad(false).iterate(iter).workers(8) - .tokenizerFactory(t).elementsLearningAlgorithm(new CBOW()).build(); - - vec.fit(); - - Collection lst = vec.wordsNearest("day", 10); - log.info(Arrays.toString(lst.toArray())); - - // assertEquals(10, lst.size()); - - double 
sim = vec.similarity("day", "night"); - log.info("Day/night similarity: " + sim); - - assertTrue(lst.contains("week")); - assertTrue(lst.contains("night")); - assertTrue(lst.contains("year")); - } - - @Test - public void reproducibleResults_ForMultipleRuns() throws Exception { - String backend = Nd4j.getExecutioner().getEnvironmentInformation().getProperty("backend"); - if(!isIntegrationTests() && "CUDA".equalsIgnoreCase(backend)) { - skipUnlessIntegrationTests(); //AB 2020/02/06 Skip CUDA except for integration tests due to very slow test speed - > 5 minutes on Titan X - } - - log.info("reproducibleResults_ForMultipleRuns"); - val shakespear = new ClassPathResource("big/rnj.txt"); - val basic = new ClassPathResource("big/rnj.txt"); - SentenceIterator iter = new BasicLineIterator(inputFile); - TokenizerFactory t = new DefaultTokenizerFactory(); - t.setTokenPreProcessor(new CommonPreprocessor()); - - Word2Vec vec1 = new Word2Vec.Builder().minWordFrequency(1).iterations(1).batchSize(8192).layerSize(100) - .stopWords(new ArrayList()).seed(42).learningRate(0.025).minLearningRate(0.001) - .sampling(0).elementsLearningAlgorithm(new SkipGram()) - .epochs(1).windowSize(5).allowParallelTokenization(true) - .workers(1) - .useHierarchicSoftmax(true) - .modelUtils(new BasicModelUtils()).iterate(iter).tokenizerFactory(t).build(); - - Word2Vec vec2 = new Word2Vec.Builder().minWordFrequency(1).iterations(1).batchSize(8192).layerSize(100) - .stopWords(new ArrayList()).seed(42).learningRate(0.025).minLearningRate(0.001) - .sampling(0).elementsLearningAlgorithm(new SkipGram()) - .epochs(1).windowSize(5).allowParallelTokenization(true) - .workers(1) - .useHierarchicSoftmax(true) - .modelUtils(new BasicModelUtils()).iterate(iter).tokenizerFactory(t).build(); - - vec1.fit(); - - iter.reset(); - - vec2.fit(); - - for (int e = 0; e < vec1.getVocab().numWords(); e++) { - val w1 = vec1.getVocab().elementAtIndex(e); - val w2 = vec2.getVocab().elementAtIndex(e); - - assertNotNull(w1); - 
assertNotNull(w2); - - assertEquals(w1.getLabel(), w2.getLabel()); - - assertArrayEquals("Failed for token [" + w1.getLabel() + "] at index [" + e + "]", Ints.toArray(w1.getPoints()), Ints.toArray(w2.getPoints())); - assertArrayEquals("Failed for token [" + w1.getLabel() + "] at index [" + e + "]", Ints.toArray(w1.getCodes()), Ints.toArray(w2.getCodes())); - } - - val syn0_from_vec1 = ((InMemoryLookupTable) vec1.getLookupTable()).getSyn0(); - val syn0_from_vec2 = ((InMemoryLookupTable) vec2.getLookupTable()).getSyn0(); - - assertEquals(syn0_from_vec1, syn0_from_vec2); - - log.info("Day/night similarity: {}", vec1.similarity("day", "night")); - val result = vec1.wordsNearest("day", 10); - printWords("day", result, vec1); - } - - @Test - public void testRunWord2Vec() throws Exception { - String backend = Nd4j.getExecutioner().getEnvironmentInformation().getProperty("backend"); - if(!isIntegrationTests() && "CUDA".equalsIgnoreCase(backend)) { - skipUnlessIntegrationTests(); //AB 2020/02/06 Skip CUDA except for integration tests due to very slow test speed - > 5 minutes on Titan X - } - - // Strip white space before and after for each line - /*val shakespear = new ClassPathResource("big/rnj.txt"); - SentenceIterator iter = new BasicLineIterator(shakespear.getFile());*/ - SentenceIterator iter = new BasicLineIterator(inputFile.getAbsolutePath()); - // Split on white spaces in the line to get words - TokenizerFactory t = new DefaultTokenizerFactory(); - t.setTokenPreProcessor(new CommonPreprocessor()); - - - Word2Vec vec = new Word2Vec.Builder().minWordFrequency(1).iterations(1).batchSize(8192).layerSize(100) - .stopWords(new ArrayList()).seed(42).learningRate(0.025).minLearningRate(0.001) - .sampling(0).elementsLearningAlgorithm(new SkipGram()) - //.negativeSample(10) - .epochs(1).windowSize(5).allowParallelTokenization(true) - .workers(6) - .usePreciseMode(true) - .modelUtils(new BasicModelUtils()).iterate(iter).tokenizerFactory(t).build(); - - assertEquals(new 
ArrayList(), vec.getStopWords()); - vec.fit(); - File tempFile = File.createTempFile("temp", "temp"); - tempFile.deleteOnExit(); - - WordVectorSerializer.writeFullModel(vec, tempFile.getAbsolutePath()); - Collection lst = vec.wordsNearest("day", 10); - //log.info(Arrays.toString(lst.toArray())); - printWords("day", lst, vec); - - assertEquals(10, lst.size()); - - double sim = vec.similarity("day", "night"); - log.info("Day/night similarity: " + sim); - - assertTrue(sim < 1.0); - assertTrue(sim > 0.4); - - - assertTrue(lst.contains("week")); - assertTrue(lst.contains("night")); - assertTrue(lst.contains("year")); - - assertFalse(lst.contains(null)); - - - lst = vec.wordsNearest("day", 10); - //log.info(Arrays.toString(lst.toArray())); - printWords("day", lst, vec); - - assertTrue(lst.contains("week")); - assertTrue(lst.contains("night")); - assertTrue(lst.contains("year")); - - new File("cache.ser").delete(); - - ArrayList labels = new ArrayList<>(); - labels.add("day"); - labels.add("night"); - labels.add("week"); - - INDArray matrix = vec.getWordVectors(labels); - assertEquals(matrix.getRow(0, true), vec.getWordVectorMatrix("day")); - assertEquals(matrix.getRow(1, true), vec.getWordVectorMatrix("night")); - assertEquals(matrix.getRow(2, true), vec.getWordVectorMatrix("week")); - - WordVectorSerializer.writeWordVectors(vec, pathToWriteto); - } - - /** - * Adding test for cosine similarity, to track changes in Transforms.cosineSim() - */ - @Test - public void testCosineSim() { - double[] array1 = new double[] {1.01, 0.91, 0.81, 0.71}; - double[] array2 = new double[] {1.01, 0.91, 0.81, 0.71}; - double[] array3 = new double[] {1.0, 0.9, 0.8, 0.7}; - - double sim12 = Transforms.cosineSim(Nd4j.create(array1), Nd4j.create(array2)); - double sim23 = Transforms.cosineSim(Nd4j.create(array2), Nd4j.create(array3)); - log.info("Arrays 1/2 cosineSim: " + sim12); - log.info("Arrays 2/3 cosineSim: " + sim23); - log.info("Arrays 1/2 dot: " + 
Nd4j.getBlasWrapper().dot(Nd4j.create(array1), Nd4j.create(array2))); - log.info("Arrays 2/3 dot: " + Nd4j.getBlasWrapper().dot(Nd4j.create(array2), Nd4j.create(array3))); - - assertEquals(1.0d, sim12, 0.01d); - assertEquals(0.99d, sim23, 0.01d); - } - - @Test - public void testLoadingWordVectors() throws Exception { - String backend = Nd4j.getExecutioner().getEnvironmentInformation().getProperty("backend"); - if(!isIntegrationTests() && "CUDA".equalsIgnoreCase(backend)) { - skipUnlessIntegrationTests(); //AB 2020/02/06 Skip CUDA except for integration tests due to very slow test speed - > 5 minutes on Titan X - } - - File modelFile = new File(pathToWriteto); - if (!modelFile.exists()) { - testRunWord2Vec(); - } - WordVectors wordVectors = WordVectorSerializer.loadTxtVectors(modelFile); - Collection lst = wordVectors.wordsNearest("day", 10); - System.out.println(Arrays.toString(lst.toArray())); - } - - @Ignore - @Test - public void testWord2VecGoogleModelUptraining() throws Exception { - long time1 = System.currentTimeMillis(); - Word2Vec vec = WordVectorSerializer.readWord2VecModel( - new File("C:\\Users\\raver\\Downloads\\GoogleNews-vectors-negative300.bin.gz"), false); - long time2 = System.currentTimeMillis(); - log.info("Model loaded in {} msec", time2 - time1); - SentenceIterator iter = new BasicLineIterator(inputFile.getAbsolutePath()); - // Split on white spaces in the line to get words - TokenizerFactory t = new DefaultTokenizerFactory(); - t.setTokenPreProcessor(new CommonPreprocessor()); - - vec.setTokenizerFactory(t); - vec.setSentenceIterator(iter); - vec.getConfiguration().setUseHierarchicSoftmax(false); - vec.getConfiguration().setNegative(5.0); - vec.setElementsLearningAlgorithm(new CBOW()); - - vec.fit(); - } - - @Test - public void testW2VnegativeOnRestore() throws Exception { - String backend = Nd4j.getExecutioner().getEnvironmentInformation().getProperty("backend"); - if(!isIntegrationTests() && "CUDA".equalsIgnoreCase(backend)) { - 
skipUnlessIntegrationTests(); //AB 2020/02/06 Skip CUDA except for integration tests due to very slow test speed - > 5 minutes on Titan X - } - - // Strip white space before and after for each line - SentenceIterator iter; - if(isIntegrationTests()){ - iter = new BasicLineIterator(inputFile.getAbsolutePath()); - } else { - iter = new CollectionSentenceIterator(firstNLines(inputFile, 300)); - } - // Split on white spaces in the line to get words - TokenizerFactory t = new DefaultTokenizerFactory(); - t.setTokenPreProcessor(new CommonPreprocessor()); - - - Word2Vec vec = new Word2Vec.Builder().minWordFrequency(1).iterations(3).batchSize(8192).layerSize(100) - .stopWords(new ArrayList()).seed(42).learningRate(0.025).minLearningRate(0.001) - .sampling(0).elementsLearningAlgorithm(new SkipGram()).negativeSample(10).epochs(1) - .windowSize(5).useHierarchicSoftmax(false).allowParallelTokenization(true) - .modelUtils(new FlatModelUtils()).iterate(iter).tokenizerFactory(t).build(); - - - assertEquals(false, vec.getConfiguration().isUseHierarchicSoftmax()); - - log.info("Fit 1"); - vec.fit(); - - File tmpFile = File.createTempFile("temp", "file"); - tmpFile.deleteOnExit(); - - WordVectorSerializer.writeWord2VecModel(vec, tmpFile); - - iter.reset(); - - Word2Vec restoredVec = WordVectorSerializer.readWord2VecModel(tmpFile, true); - restoredVec.setTokenizerFactory(t); - restoredVec.setSentenceIterator(iter); - - assertEquals(false, restoredVec.getConfiguration().isUseHierarchicSoftmax()); - assertTrue(restoredVec.getModelUtils() instanceof FlatModelUtils); - assertTrue(restoredVec.getConfiguration().isAllowParallelTokenization()); - - log.info("Fit 2"); - restoredVec.fit(); - - - iter.reset(); - restoredVec = WordVectorSerializer.readWord2VecModel(tmpFile, false); - restoredVec.setTokenizerFactory(t); - restoredVec.setSentenceIterator(iter); - - assertEquals(false, restoredVec.getConfiguration().isUseHierarchicSoftmax()); - assertTrue(restoredVec.getModelUtils() instanceof 
BasicModelUtils); - - log.info("Fit 3"); - restoredVec.fit(); - } - - @Test - public void testUnknown1() throws Exception { - String backend = Nd4j.getExecutioner().getEnvironmentInformation().getProperty("backend"); - if(!isIntegrationTests() && "CUDA".equalsIgnoreCase(backend)) { - skipUnlessIntegrationTests(); //AB 2020/02/06 Skip CUDA except for integration tests due to very slow test speed - > 5 minutes on Titan X - } - - // Strip white space before and after for each line - SentenceIterator iter = new BasicLineIterator(inputFile.getAbsolutePath()); - // Split on white spaces in the line to get words - TokenizerFactory t = new DefaultTokenizerFactory(); - t.setTokenPreProcessor(new CommonPreprocessor()); - - Word2Vec vec = new Word2Vec.Builder().minWordFrequency(10).useUnknown(true) - .unknownElement(new VocabWord(1.0, "PEWPEW")).iterations(1).layerSize(100) - .stopWords(new ArrayList()).seed(42).learningRate(0.025).minLearningRate(0.001) - .sampling(0).elementsLearningAlgorithm(new CBOW()).epochs(1).windowSize(5) - .useHierarchicSoftmax(true).allowParallelTokenization(true) - .modelUtils(new FlatModelUtils()).iterate(iter).tokenizerFactory(t).build(); - - vec.fit(); - - assertTrue(vec.hasWord("PEWPEW")); - assertTrue(vec.getVocab().containsWord("PEWPEW")); - - INDArray unk = vec.getWordVectorMatrix("PEWPEW"); - assertNotEquals(null, unk); - - File tempFile = File.createTempFile("temp", "file"); - tempFile.deleteOnExit(); - - WordVectorSerializer.writeWord2VecModel(vec, tempFile); - - log.info("Original configuration: {}", vec.getConfiguration()); - - Word2Vec restored = WordVectorSerializer.readWord2VecModel(tempFile); - - assertTrue(restored.hasWord("PEWPEW")); - assertTrue(restored.getVocab().containsWord("PEWPEW")); - INDArray unk_restored = restored.getWordVectorMatrix("PEWPEW"); - - assertEquals(unk, unk_restored); - - - - // now we're getting some junk word - INDArray random = vec.getWordVectorMatrix("hhsd7d7sdnnmxc_SDsda"); - INDArray randomRestored = 
restored.getWordVectorMatrix("hhsd7d7sdnnmxc_SDsda"); - - log.info("Restored configuration: {}", restored.getConfiguration()); - - assertEquals(unk, random); - assertEquals(unk, randomRestored); - } - - @Test - public void orderIsCorrect_WhenParallelized() throws Exception { - // Strip white space before and after for each line - SentenceIterator iter; - if(isIntegrationTests()){ - iter = new BasicLineIterator(inputFile.getAbsolutePath()); - } else { - iter = new CollectionSentenceIterator(firstNLines(inputFile, 300)); - } - // Split on white spaces in the line to get words - TokenizerFactory t = new DefaultTokenizerFactory(); - t.setTokenPreProcessor(new CommonPreprocessor()); - - - Word2Vec vec = new Word2Vec.Builder().minWordFrequency(1).iterations(3).batchSize(64).layerSize(100) - .stopWords(new ArrayList()).seed(42).learningRate(0.025).minLearningRate(0.001) - .sampling(0).elementsLearningAlgorithm(new SkipGram()) - //.negativeSample(10) - .epochs(1).windowSize(5).allowParallelTokenization(true) - .workers(1) - .modelUtils(new BasicModelUtils()).iterate(iter).tokenizerFactory(t).build(); - - - vec.fit(); - System.out.println(vec.getVocab().numWords()); - - val words = vec.getVocab().words(); - assertTrue(words.size() > 0); -// for (val word : words) { -// System.out.println(word); -// } - } - - @Test - public void testJSONSerialization() { - Word2Vec word2Vec = new Word2Vec.Builder() - .layerSize(1000) - .limitVocabularySize(1000) - .elementsLearningAlgorithm(CBOW.class.getCanonicalName()) - .allowParallelTokenization(true) - .modelUtils(new FlatModelUtils()) - .usePreciseMode(true) - .batchSize(1024) - .windowSize(23) - .minWordFrequency(24) - .iterations(54) - .seed(45) - .learningRate(0.08) - .epochs(45) - .stopWords(Collections.singletonList("NOT")) - .sampling(44) - .workers(45) - .negativeSample(56) - .useAdaGrad(true) - .useHierarchicSoftmax(false) - .minLearningRate(0.002) - .resetModel(true) - .useUnknown(true) - .enableScavenger(true) - 
.usePreciseWeightInit(true) - .build(); - - - AbstractCache cache = new AbstractCache.Builder().build(); - - val words = new VocabWord[3]; - words[0] = new VocabWord(1.0, "word"); - words[1] = new VocabWord(2.0, "test"); - words[2] = new VocabWord(3.0, "tester"); - - for (int i = 0; i < words.length; ++i) { - cache.addToken(words[i]); - cache.addWordToIndex(i, words[i].getLabel()); - } - word2Vec.setVocab(cache); - - String json = null; - Word2Vec unserialized = null; - try { - json = word2Vec.toJson(); - log.info("{}", json); - - unserialized = Word2Vec.fromJson(json); - } - catch (Exception e) { - log.error("",e); - fail(); - } - - assertEquals(cache.totalWordOccurrences(), unserialized.getVocab().totalWordOccurrences()); - assertEquals(cache.totalNumberOfDocs(), unserialized.getVocab().totalNumberOfDocs()); - - for (int i = 0; i < words.length; ++i) { - val cached = cache.wordAtIndex(i); - val restored = unserialized.getVocab().wordAtIndex(i); - assertNotNull(cached); - assertEquals(cached, restored); - } - } - - @Test - public void testWord2VecConfigurationConsistency() { - VectorsConfiguration configuration = new VectorsConfiguration(); - - assertEquals(configuration.getLayersSize(), 200); - assert(configuration.getElementsLearningAlgorithm() == null); - assertEquals(configuration.isAllowParallelTokenization(), false); - assertEquals(configuration.isPreciseMode(), false); - assertEquals(configuration.getBatchSize(), 512); - assert(configuration.getModelUtils() == null); - assertEquals(configuration.getWindow(), 5); - assertEquals(configuration.getMinWordFrequency(), 5); - assertEquals(configuration.getIterations(), 1); - assertEquals(configuration.getSeed(), 0); - assertEquals(configuration.getLearningRate(), 0.025, 1e-5f); - assertEquals(configuration.getEpochs(), 
1); - assertTrue(configuration.getStopList().isEmpty()); - assertEquals(configuration.getSampling(), 0.0, 1e-5f); - assertEquals(configuration.getNegative(), 0, 1e-5f); - assertTrue(!configuration.isUseAdaGrad()); - assertTrue(configuration.isUseHierarchicSoftmax()); - assertEquals(configuration.getMinLearningRate(), 1.0E-4, 1e-5f); - assertTrue(!configuration.isUseUnknown()); - - - Word2Vec word2Vec = new Word2Vec.Builder(configuration) - .layerSize(1000) - .limitVocabularySize(1000) - .elementsLearningAlgorithm(CBOW.class.getCanonicalName()) - .allowParallelTokenization(true) - .modelUtils(new FlatModelUtils()) - .usePreciseMode(true) - .batchSize(1024) - .windowSize(23) - .minWordFrequency(24) - .iterations(54) - .seed(45) - .learningRate(0.08) - .epochs(45) - .stopWords(Collections.singletonList("NOT")) - .sampling(44) - .workers(45) - .negativeSample(56) - .useAdaGrad(true) - .useHierarchicSoftmax(false) - .minLearningRate(0.002) - .resetModel(true) - .useUnknown(true) - .enableScavenger(true) - .usePreciseWeightInit(true) - .build(); - - assertEquals(word2Vec.getConfiguration().getLayersSize(), word2Vec.getLayerSize()); - assertEquals(word2Vec.getConfiguration().getLayersSize(), 1000); - assertEquals(word2Vec.getConfiguration().getElementsLearningAlgorithm(), CBOW.class.getCanonicalName()); - assertEquals(word2Vec.getConfiguration().isAllowParallelTokenization(), true); - assertEquals(word2Vec.getConfiguration().isPreciseMode(), true); - assertEquals(word2Vec.getConfiguration().getBatchSize(), 1024); - - String modelUtilsName = word2Vec.getConfiguration().getModelUtils(); - assertEquals(modelUtilsName, FlatModelUtils.class.getCanonicalName()); - - assertTrue(word2Vec.getConfiguration().isPreciseMode()); - assertEquals(word2Vec.getConfiguration().getBatchSize(), 1024); - - assertEquals(word2Vec.getConfiguration().getWindow(), 23); - assertEquals(word2Vec.getConfiguration().getMinWordFrequency(), 24); - assertEquals(word2Vec.getConfiguration().getIterations(), 
54); - assertEquals(word2Vec.getConfiguration().getSeed(), 45); - assertEquals(word2Vec.getConfiguration().getLearningRate(), 0.08, 1e-5f); - assertEquals(word2Vec.getConfiguration().getEpochs(), 45); - - assertEquals(word2Vec.getConfiguration().getStopList().size(), 1); - - assertEquals(configuration.getSampling(), 44.0, 1e-5f); - assertEquals(configuration.getNegative(), 56.0, 1e-5f); - assertTrue(configuration.isUseAdaGrad()); - assertTrue(!configuration.isUseHierarchicSoftmax()); - assertEquals(configuration.getMinLearningRate(), 0.002, 1e-5f); - assertTrue(configuration.isUseUnknown()); - } - - @Test - public void testWordVectorsPartiallyAbsentLabels() throws Exception { - String backend = Nd4j.getExecutioner().getEnvironmentInformation().getProperty("backend"); - if(!isIntegrationTests() && "CUDA".equalsIgnoreCase(backend)) { - skipUnlessIntegrationTests(); //AB 2020/02/06 Skip CUDA except for integration tests due to very slow test speed - > 5 minutes on Titan X - } - - SentenceIterator iter = new BasicLineIterator(inputFile.getAbsolutePath()); - // Split on white spaces in the line to get words - TokenizerFactory t = new DefaultTokenizerFactory(); - t.setTokenPreProcessor(new CommonPreprocessor()); - - Word2Vec vec = new Word2Vec.Builder().minWordFrequency(10).useUnknown(true) - .iterations(1).layerSize(100) - .stopWords(new ArrayList()).seed(42).learningRate(0.025).minLearningRate(0.001) - .sampling(0).elementsLearningAlgorithm(new CBOW()).epochs(1).windowSize(5) - .useHierarchicSoftmax(true).allowParallelTokenization(true) - .useUnknown(false) - .modelUtils(new FlatModelUtils()).iterate(iter).tokenizerFactory(t).build(); - - vec.fit(); - - ArrayList labels = new ArrayList<>(); - labels.add("fewfew"); - labels.add("day"); - labels.add("night"); - labels.add("week"); - - INDArray matrix = vec.getWordVectors(labels); - assertEquals(3, matrix.rows()); - assertEquals(matrix.getRow(0, true), vec.getWordVectorMatrix("day")); - assertEquals(matrix.getRow(1, 
true), vec.getWordVectorMatrix("night")); - assertEquals(matrix.getRow(2, true), vec.getWordVectorMatrix("week")); - } - - - @Test - public void testWordVectorsAbsentLabels() throws Exception { - String backend = Nd4j.getExecutioner().getEnvironmentInformation().getProperty("backend"); - if(!isIntegrationTests() && "CUDA".equalsIgnoreCase(backend)) { - skipUnlessIntegrationTests(); //AB 2020/02/06 Skip CUDA except for integration tests due to very slow test speed - > 5 minutes on Titan X - } - - SentenceIterator iter = new BasicLineIterator(inputFile.getAbsolutePath()); - // Split on white spaces in the line to get words - TokenizerFactory t = new DefaultTokenizerFactory(); - t.setTokenPreProcessor(new CommonPreprocessor()); - - Word2Vec vec = new Word2Vec.Builder().minWordFrequency(10).useUnknown(true) - .iterations(1).layerSize(100) - .stopWords(new ArrayList()).seed(42).learningRate(0.025).minLearningRate(0.001) - .sampling(0).elementsLearningAlgorithm(new CBOW()).epochs(1).windowSize(5) - .useHierarchicSoftmax(true).allowParallelTokenization(true) - .useUnknown(false) - .modelUtils(new FlatModelUtils()).iterate(iter).tokenizerFactory(t).build(); - - vec.fit(); - - ArrayList labels = new ArrayList<>(); - labels.add("fewfew"); - - INDArray matrix = vec.getWordVectors(labels); - assertTrue(matrix.isEmpty()); - } - - @Test - public void testWordVectorsAbsentLabels_WithUnknown() throws Exception { - String backend = Nd4j.getExecutioner().getEnvironmentInformation().getProperty("backend"); - if(!isIntegrationTests() && "CUDA".equalsIgnoreCase(backend)) { - skipUnlessIntegrationTests(); //AB 2020/02/06 Skip CUDA except for integration tests due to very slow test speed - > 5 minutes on Titan X - } - - SentenceIterator iter = new BasicLineIterator(inputFile.getAbsolutePath()); - // Split on white spaces in the line to get words - TokenizerFactory t = new DefaultTokenizerFactory(); - t.setTokenPreProcessor(new CommonPreprocessor()); - - Word2Vec vec = new 
Word2Vec.Builder().minWordFrequency(1).iterations(1).batchSize(8192).layerSize(100) - .stopWords(new ArrayList()).seed(42).learningRate(0.025).minLearningRate(0.001) - .sampling(0).elementsLearningAlgorithm(new SkipGram()) - //.negativeSample(10) - .epochs(1).windowSize(5).allowParallelTokenization(true) - .workers(4) - .modelUtils(new BasicModelUtils()).iterate(iter).tokenizerFactory(t) - .useUnknown(true).unknownElement(new VocabWord(1, "UNKNOWN")).build(); - - vec.fit(); - - ArrayList labels = new ArrayList<>(); - labels.add("bus"); - labels.add("car"); - - INDArray matrix = vec.getWordVectors(labels); - for (int i = 0; i < labels.size(); ++i) - assertEquals(matrix.getRow(i, true), vec.getWordVectorMatrix("UNKNOWN")); - } - - @Test - public void weightsNotUpdated_WhenLocked() throws Exception { - - boolean isIntegration = isIntegrationTests(); - SentenceIterator iter; - SentenceIterator iter2; - if(isIntegration){ - iter = new BasicLineIterator(inputFile); - iter2 = new BasicLineIterator(inputFile2.getAbsolutePath()); - } else { - iter = new CollectionSentenceIterator(firstNLines(inputFile, 300)); - iter2 = new CollectionSentenceIterator(firstNLines(inputFile2, 300)); - } - - Word2Vec vec1 = new Word2Vec.Builder().minWordFrequency(1).iterations(3).batchSize(64).layerSize(100) - .stopWords(new ArrayList()).seed(42).learningRate(0.025).minLearningRate(0.001) - .sampling(0).elementsLearningAlgorithm(new SkipGram()) - .epochs(1).windowSize(5).allowParallelTokenization(true) - .workers(1) - .iterate(iter) - .modelUtils(new BasicModelUtils()).build(); - - vec1.fit(); - - Word2Vec vec2 = new Word2Vec.Builder().minWordFrequency(1).iterations(3).batchSize(32).layerSize(100) - .stopWords(new ArrayList()).seed(32).learningRate(0.021).minLearningRate(0.001) - .sampling(0).elementsLearningAlgorithm(new SkipGram()) - .epochs(1).windowSize(5).allowParallelTokenization(true) - .workers(1) - .iterate(iter2) - .intersectModel(vec1, true) - .modelUtils(new 
BasicModelUtils()).build(); - - vec2.fit(); - - assertEquals(vec1.getWordVectorMatrix("put"), vec2.getWordVectorMatrix("put")); - assertEquals(vec1.getWordVectorMatrix("part"), vec2.getWordVectorMatrix("part")); - assertEquals(vec1.getWordVectorMatrix("made"), vec2.getWordVectorMatrix("made")); - assertEquals(vec1.getWordVectorMatrix("money"), vec2.getWordVectorMatrix("money")); - } - - @Test - public void weightsNotUpdated_WhenLocked_CBOW() throws Exception { - String backend = Nd4j.getExecutioner().getEnvironmentInformation().getProperty("backend"); - if(!isIntegrationTests() && "CUDA".equalsIgnoreCase(backend)) { - skipUnlessIntegrationTests(); //AB 2020/02/06 Skip CUDA except for integration tests due to very slow test speed - > 5 minutes on Titan X - } - - SentenceIterator iter = new BasicLineIterator(inputFile.getAbsolutePath()); - - Word2Vec vec1 = new Word2Vec.Builder().minWordFrequency(1).iterations(1).batchSize(8192).layerSize(100) - .stopWords(new ArrayList()).seed(42).learningRate(0.025).minLearningRate(0.001) - .sampling(0).elementsLearningAlgorithm(new CBOW()) - .epochs(1).windowSize(5).allowParallelTokenization(true) - .workers(1) - .iterate(iter) - .modelUtils(new BasicModelUtils()).build(); - - vec1.fit(); - - log.info("Fit 1 finished"); - - iter = new BasicLineIterator(inputFile2.getAbsolutePath()); - Word2Vec vec2 = new Word2Vec.Builder().minWordFrequency(1).iterations(1).batchSize(8192).layerSize(100) - .stopWords(new ArrayList()).seed(32).learningRate(0.021).minLearningRate(0.001) - .sampling(0).elementsLearningAlgorithm(new CBOW()) - .epochs(1).windowSize(5).allowParallelTokenization(true) - .workers(1) - .iterate(iter) - .intersectModel(vec1, true) - .modelUtils(new BasicModelUtils()).build(); - - vec2.fit(); - - log.info("Fit 2 finished"); - - assertEquals(vec1.getWordVectorMatrix("put"), vec2.getWordVectorMatrix("put")); - assertEquals(vec1.getWordVectorMatrix("part"), vec2.getWordVectorMatrix("part")); - 
assertEquals(vec1.getWordVectorMatrix("made"), vec2.getWordVectorMatrix("made")); - assertEquals(vec1.getWordVectorMatrix("money"), vec2.getWordVectorMatrix("money")); - } - - @Test - public void testWordsNearestSum() throws IOException { - String backend = Nd4j.getExecutioner().getEnvironmentInformation().getProperty("backend"); - if(!isIntegrationTests() && "CUDA".equalsIgnoreCase(backend)) { - skipUnlessIntegrationTests(); //AB 2020/02/06 Skip CUDA except for integration tests due to very slow test speed - > 5 minutes on Titan X - } - - log.info("Load & Vectorize Sentences...."); - SentenceIterator iter = new BasicLineIterator(inputFile); - TokenizerFactory t = new DefaultTokenizerFactory(); - t.setTokenPreProcessor(new CommonPreprocessor()); - - log.info("Building model...."); - Word2Vec vec = new Word2Vec.Builder() - .minWordFrequency(5) - .iterations(1) - .layerSize(100) - .seed(42) - .windowSize(5) - .iterate(iter) - .tokenizerFactory(t) - .build(); - - log.info("Fitting Word2Vec model...."); - vec.fit(); - log.info("Writing word vectors to text file...."); - log.info("Closest Words:"); - Collection lst = vec.wordsNearestSum("day", 10); - log.info("10 Words closest to 'day': {}", lst); - assertTrue(lst.contains("week")); - assertTrue(lst.contains("night")); - assertTrue(lst.contains("year")); - assertTrue(lst.contains("years")); - assertTrue(lst.contains("time")); - } - - private static void printWords(String target, Collection list, Word2Vec vec) { - System.out.println("Words close to [" + target + "]:"); - for (String word : list) { - double sim = vec.similarity(target, word); - System.out.print("'" + word + "': [" + sim + "]"); - } - System.out.print("\n"); - } - - public static List firstNLines(File f, int n){ - List lines = new ArrayList<>(); - try(InputStream is = new BufferedInputStream(new FileInputStream(f))){ - LineIterator lineIter = IOUtils.lineIterator(is, StandardCharsets.UTF_8); - try{ - for( int i=0; 
i()).useUnknown(true).windowSize(5).iterate(iter) - .tokenizerFactory(t).build(); - vec.fit(); - - } - } - - @Test - public void testLabeledExample() throws Exception { - - INDArray unk = vec.getWordVectorMatrix(Word2Vec.DEFAULT_UNK); - assertNotEquals(null, unk); - - unk = vec.getWordVectorMatrix("2131241sdasdas"); - assertNotEquals(null, unk); - - ClassPathResource resource = new ClassPathResource("/labeled/"); - File dir = testDir.newFolder(); - resource.copyDirectory(dir); - - Word2VecDataSetIterator iter = new Word2VecDataSetIterator(vec, - new LabelAwareFileSentenceIterator(null, dir), - Arrays.asList("negative", "positive", "neutral")); - DataSet next = iter.next(); - - } - -} - diff --git a/contrib/deeplearning4j-nlp-uima/src/test/java/org/deeplearning4j/text/sentenceiterator/SentenceIteratorTest.java b/contrib/deeplearning4j-nlp-uima/src/test/java/org/deeplearning4j/text/sentenceiterator/SentenceIteratorTest.java deleted file mode 100644 index 50e1a2073..000000000 --- a/contrib/deeplearning4j-nlp-uima/src/test/java/org/deeplearning4j/text/sentenceiterator/SentenceIteratorTest.java +++ /dev/null @@ -1,116 +0,0 @@ -/* - * ****************************************************************************** - * * Copyright (c) 2021 Deeplearning4j Contributors - * * - * * This program and the accompanying materials are made available under the - * * terms of the Apache License, Version 2.0 which is available at - * * https://www.apache.org/licenses/LICENSE-2.0. - * * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * * License for the specific language governing permissions and limitations - * * under the License. 
- * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ - -package org.deeplearning4j.text.sentenceiterator; - -import org.apache.commons.io.FileUtils; -import org.deeplearning4j.BaseDL4JTest; -import org.deeplearning4j.nlp.uima.sentenceiterator.UimaSentenceIterator; -import org.junit.After; -import org.junit.Before; -import org.junit.Rule; -import org.junit.Test; -import org.junit.rules.TemporaryFolder; - -import java.io.File; -import java.util.Arrays; - -import static org.junit.Assert.*; - -/** - * Created by agibsonccc on 9/9/14. - */ -public class SentenceIteratorTest extends BaseDL4JTest { - - @Rule - public TemporaryFolder testDir = new TemporaryFolder(); - - public File testTxt; - public File testSingle; - public File testMulti; - - @Before - public void before() throws Exception { - testSingle = testDir.newFolder(); - testTxt = new File(testSingle, "test.txt"); - FileUtils.writeLines(testTxt, Arrays.asList("Hello", "My", "Name")); - - - testMulti = testDir.newFolder(); - for (int i = 0; i < 2; i++) { - File newTestFile = new File(testMulti, "testfile-" + i); - FileUtils.writeLines(newTestFile, Arrays.asList("Sentence 1.", "Sentence 2.", "Sentence 3.")); - - } - - } - - - @Test - public void testUimaSentenceIterator() throws Exception { - SentenceIterator multiIter = UimaSentenceIterator.createWithPath(testMulti.getAbsolutePath()); - SentenceIterator iter = UimaSentenceIterator.createWithPath(testSingle.getAbsolutePath()); - testMulti(multiIter, 1); - testMulti(iter, 1); - - } - - @Test - public void testFileSentenceIterator() throws Exception { - SentenceIterator iter = new FileSentenceIterator(testSingle); - SentenceIterator multiIter = new FileSentenceIterator(testMulti); - testSingle(iter); - testMulti(multiIter, 3); - - } - - - - public void testSingle(SentenceIterator iter) { - assertTrue(iter.hasNext()); - - String sentence = iter.nextSentence(); - 
assertTrue(iter.hasNext()); - assertEquals("Hello", sentence); - assertEquals("My", iter.nextSentence()); - assertEquals("Name", iter.nextSentence()); - assertFalse(iter.hasNext()); - - } - - public void testMulti(SentenceIterator iter, int expectedSentences) { - assertTrue(iter.hasNext()); - for (int i = 0; i < expectedSentences * 2; i++) { - iter.nextSentence(); - } - - assertFalse(iter.hasNext()); - - } - - @After - public void after() throws Exception { - File test = testSingle; - test.mkdir(); - FileUtils.deleteQuietly(test); - FileUtils.deleteQuietly(testMulti); - } - - - -} diff --git a/contrib/deeplearning4j-nlp-uima/src/test/java/org/deeplearning4j/text/sentenceiterator/UimaResultSetIteratorTest.java b/contrib/deeplearning4j-nlp-uima/src/test/java/org/deeplearning4j/text/sentenceiterator/UimaResultSetIteratorTest.java deleted file mode 100644 index 3b3983073..000000000 --- a/contrib/deeplearning4j-nlp-uima/src/test/java/org/deeplearning4j/text/sentenceiterator/UimaResultSetIteratorTest.java +++ /dev/null @@ -1,143 +0,0 @@ -/* - * ****************************************************************************** - * * Copyright (c) 2021 Deeplearning4j Contributors - * * - * * This program and the accompanying materials are made available under the - * * terms of the Apache License, Version 2.0 which is available at - * * https://www.apache.org/licenses/LICENSE-2.0. - * * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * * License for the specific language governing permissions and limitations - * * under the License. 
- * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ - -package org.deeplearning4j.text.sentenceiterator; - -import org.deeplearning4j.BaseDL4JTest; -import org.deeplearning4j.nlp.uima.sentenceiterator.UimaResultSetIterator; -import org.junit.Before; -import org.junit.Test; - -import java.sql.ResultSet; - -import static org.junit.Assert.assertEquals; -import static org.mockito.Mockito.mock; -import static org.mockito.Mockito.when; - -/** - * @author Brad Heap nzv8fan@gmail.com - */ -public class UimaResultSetIteratorTest extends BaseDL4JTest { - - @Before - public void setUp() throws Exception { - - } - - @Test - public void testSingleSentenceRow() throws Exception { - - // Setup a mock ResultSet object - ResultSet resultSetMock = mock(ResultSet.class); - - // when .next() is called, first time true, then false - when(resultSetMock.next()).thenReturn(true).thenReturn(false); - when(resultSetMock.getString("line")).thenReturn("The quick brown fox."); - - UimaResultSetIterator iterator = new UimaResultSetIterator(resultSetMock, "line"); - - int cnt = 0; - while (iterator.hasNext()) { - String line = iterator.nextSentence(); - cnt++; - } - - assertEquals(1, cnt); - - } - - @Test - public void testMultipleSentenceRow() throws Exception { - - // Setup a mock ResultSet object - ResultSet resultSetMock = mock(ResultSet.class); - - // when .next() is called, first time true, then false - when(resultSetMock.next()).thenReturn(true).thenReturn(false); - when(resultSetMock.getString("line")).thenReturn("The quick brown fox. The lazy dog. 
Over a fence."); - - UimaResultSetIterator iterator = new UimaResultSetIterator(resultSetMock, "line"); - - int cnt = 0; - while (iterator.hasNext()) { - String line = iterator.nextSentence(); - cnt++; - } - - assertEquals(3, cnt); - - } - - @Test - public void testMultipleSentencesAndMultipleRows() throws Exception { - - // Setup a mock ResultSet object - ResultSet resultSetMock = mock(ResultSet.class); - - // when .next() is called, first time true, then false - when(resultSetMock.next()).thenReturn(true).thenReturn(true).thenReturn(false); - when(resultSetMock.getString("line")).thenReturn("The quick brown fox.") - .thenReturn("The lazy dog. Over a fence."); - - UimaResultSetIterator iterator = new UimaResultSetIterator(resultSetMock, "line"); - - int cnt = 0; - while (iterator.hasNext()) { - String line = iterator.nextSentence(); - cnt++; - } - - assertEquals(3, cnt); - - } - - @Test - public void testMultipleSentencesAndMultipleRowsAndReset() throws Exception { - - // Setup a mock ResultSet object - ResultSet resultSetMock = mock(ResultSet.class); - - // when .next() is called, first time true, then false - when(resultSetMock.next()).thenReturn(true).thenReturn(true).thenReturn(false).thenReturn(true).thenReturn(true) - .thenReturn(false); - when(resultSetMock.getString("line")).thenReturn("The quick brown fox.") - .thenReturn("The lazy dog. Over a fence.").thenReturn("The quick brown fox.") - .thenReturn("The lazy dog. 
Over a fence."); - - UimaResultSetIterator iterator = new UimaResultSetIterator(resultSetMock, "line"); - - int cnt = 0; - while (iterator.hasNext()) { - String line = iterator.nextSentence(); - cnt++; - } - - assertEquals(3, cnt); - - iterator.reset(); - - cnt = 0; - while (iterator.hasNext()) { - String line = iterator.nextSentence(); - cnt++; - } - - assertEquals(3, cnt); - } - -} diff --git a/contrib/deeplearning4j-nlp-uima/src/test/java/org/deeplearning4j/text/tokenization/tokenizer/preprocessor/StemmingPreprocessorTest.java b/contrib/deeplearning4j-nlp-uima/src/test/java/org/deeplearning4j/text/tokenization/tokenizer/preprocessor/StemmingPreprocessorTest.java deleted file mode 100644 index 7f33a9823..000000000 --- a/contrib/deeplearning4j-nlp-uima/src/test/java/org/deeplearning4j/text/tokenization/tokenizer/preprocessor/StemmingPreprocessorTest.java +++ /dev/null @@ -1,43 +0,0 @@ -/* - * ****************************************************************************** - * * Copyright (c) 2021 Deeplearning4j Contributors - * * - * * This program and the accompanying materials are made available under the - * * terms of the Apache License, Version 2.0 which is available at - * * https://www.apache.org/licenses/LICENSE-2.0. - * * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * * License for the specific language governing permissions and limitations - * * under the License. 
- * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ - -package org.deeplearning4j.text.tokenization.tokenizer.preprocessor; - -import org.deeplearning4j.BaseDL4JTest; -import org.deeplearning4j.nlp.uima.tokenization.tokenizer.preprocessor.StemmingPreprocessor; -import org.junit.Test; - -import static org.junit.Assert.assertEquals; - -/** - * @author raver119@gmail.com - */ -public class StemmingPreprocessorTest extends BaseDL4JTest { - - @Test - public void testPreProcess() throws Exception { - StemmingPreprocessor preprocessor = new StemmingPreprocessor(); - - String test = "TESTING."; - - String output = preprocessor.preProcess(test); - - System.out.println("Output: " + output); - assertEquals("test", output); - } -} diff --git a/contrib/deeplearning4j-nlp-uima/src/test/java/org/deeplearning4j/text/tokenization/tokenizerfactory/PosUimaTokenizerFactoryTest.java b/contrib/deeplearning4j-nlp-uima/src/test/java/org/deeplearning4j/text/tokenization/tokenizerfactory/PosUimaTokenizerFactoryTest.java deleted file mode 100644 index db7e4caf8..000000000 --- a/contrib/deeplearning4j-nlp-uima/src/test/java/org/deeplearning4j/text/tokenization/tokenizerfactory/PosUimaTokenizerFactoryTest.java +++ /dev/null @@ -1,68 +0,0 @@ -/* - * ****************************************************************************** - * * Copyright (c) 2021 Deeplearning4j Contributors - * * - * * This program and the accompanying materials are made available under the - * * terms of the Apache License, Version 2.0 which is available at - * * https://www.apache.org/licenses/LICENSE-2.0. - * * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. 
See the - * * License for the specific language governing permissions and limitations - * * under the License. - * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ - -package org.deeplearning4j.text.tokenization.tokenizerfactory; - -import org.deeplearning4j.BaseDL4JTest; -import org.deeplearning4j.nlp.uima.tokenization.tokenizerfactory.PosUimaTokenizerFactory; -import org.deeplearning4j.text.tokenization.tokenizer.Tokenizer; -import org.junit.Assert; -import org.junit.Before; -import org.junit.Test; - -import java.util.Arrays; -import java.util.List; - - -/** - * @author raver119@gmail.com - */ -public class PosUimaTokenizerFactoryTest extends BaseDL4JTest { - - @Before - public void setUp() throws Exception { - - } - - @Test - public void testCreate1() throws Exception { - String[] posTags = new String[] {"NN"}; - PosUimaTokenizerFactory factory = new PosUimaTokenizerFactory(Arrays.asList(posTags)); - Tokenizer tokenizer = factory.create("some test string"); - List tokens = tokenizer.getTokens(); - System.out.println("Tokens: " + tokens); - - Assert.assertEquals(3, tokens.size()); - Assert.assertEquals("NONE", tokens.get(0)); - Assert.assertEquals("test", tokens.get(1)); - Assert.assertEquals("string", tokens.get(2)); - } - - @Test - public void testCreate2() throws Exception { - String[] posTags = new String[] {"NN"}; - PosUimaTokenizerFactory factory = new PosUimaTokenizerFactory(Arrays.asList(posTags), true); - Tokenizer tokenizer = factory.create("some test string"); - List tokens = tokenizer.getTokens(); - System.out.println("Tokens: " + tokens); - - Assert.assertEquals(2, tokens.size()); - Assert.assertEquals("test", tokens.get(0)); - Assert.assertEquals("string", tokens.get(1)); - } -} diff --git a/contrib/deeplearning4j-nlp-uima/src/test/java/org/deeplearning4j/text/treeparser/TreeParserTest.java 
b/contrib/deeplearning4j-nlp-uima/src/test/java/org/deeplearning4j/text/treeparser/TreeParserTest.java deleted file mode 100644 index f48cfc8f7..000000000 --- a/contrib/deeplearning4j-nlp-uima/src/test/java/org/deeplearning4j/text/treeparser/TreeParserTest.java +++ /dev/null @@ -1,67 +0,0 @@ -/* - * ****************************************************************************** - * * Copyright (c) 2021 Deeplearning4j Contributors - * * - * * This program and the accompanying materials are made available under the - * * terms of the Apache License, Version 2.0 which is available at - * * https://www.apache.org/licenses/LICENSE-2.0. - * * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * * License for the specific language governing permissions and limitations - * * under the License. - * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ - -package org.deeplearning4j.text.treeparser; - -import org.cleartk.syntax.constituent.type.TreebankNode; -import org.deeplearning4j.BaseDL4JTest; -import org.deeplearning4j.nn.layers.feedforward.autoencoder.recursive.Tree; -import org.deeplearning4j.nlp.uima.corpora.treeparser.TreeParser; -import org.junit.Before; -import org.junit.Test; - -import java.util.List; - -import static org.junit.Assert.assertEquals; - -/** - * Basic Tree parser tests - * @author Adam Gibson - */ -public class TreeParserTest extends BaseDL4JTest { - private TreeParser parser; - - @Before - public void init() throws Exception { - parser = new TreeParser(); - } - - - @Test - public void testNumTrees() throws Exception { - List trees = parser.getTrees("This is one sentence. 
This is another sentence."); - assertEquals(2, trees.size()); - - } - - - @Test - public void testHierarchy() throws Exception { - List trees = parser.getTrees("This is one sentence. This is another sentence."); - List treebankTrees = parser.getTreebankTrees("This is one sentence. This is another sentence."); - assertEquals(treebankTrees.size(), trees.size()); - - for (int i = 0; i < treebankTrees.size(); i++) { - Tree t = trees.get(i); - TreebankNode t2 = treebankTrees.get(i); - assertEquals(t.children().size(), t2.getChildren().size()); - } - - } - -} diff --git a/contrib/deeplearning4j-nlp-uima/src/test/java/org/deeplearning4j/text/treeparser/TreeTransformerTests.java b/contrib/deeplearning4j-nlp-uima/src/test/java/org/deeplearning4j/text/treeparser/TreeTransformerTests.java deleted file mode 100644 index 831fff4ec..000000000 --- a/contrib/deeplearning4j-nlp-uima/src/test/java/org/deeplearning4j/text/treeparser/TreeTransformerTests.java +++ /dev/null @@ -1,88 +0,0 @@ -/* - * ****************************************************************************** - * * Copyright (c) 2021 Deeplearning4j Contributors - * * - * * This program and the accompanying materials are made available under the - * * terms of the Apache License, Version 2.0 which is available at - * * https://www.apache.org/licenses/LICENSE-2.0. - * * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * * License for the specific language governing permissions and limitations - * * under the License. 
- * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ - -package org.deeplearning4j.text.treeparser; - - -import org.deeplearning4j.BaseDL4JTest; -import org.deeplearning4j.nn.layers.feedforward.autoencoder.recursive.Tree; -import org.deeplearning4j.nlp.uima.corpora.treeparser.BinarizeTreeTransformer; -import org.deeplearning4j.nlp.uima.corpora.treeparser.CollapseUnaries; -import org.deeplearning4j.nlp.uima.corpora.treeparser.TreeParser; -import org.deeplearning4j.nlp.uima.corpora.treeparser.transformer.TreeTransformer; -import org.junit.Before; -import org.junit.Test; -import org.slf4j.Logger; -import org.slf4j.LoggerFactory; - -import java.util.List; - -import static org.junit.Assert.assertEquals; - -/** - * Created by agibsonccc on 7/1/14. - */ -public class TreeTransformerTests extends BaseDL4JTest { - - private static final Logger log = LoggerFactory.getLogger(TreeTransformerTests.class); - private TreeParser parser; - - @Before - public void init() throws Exception { - parser = new TreeParser(); - } - - - - @Test - public void testBinarize() throws Exception { - List trees = parser.getTrees("Is so sad for my apl friend. 
i missed the new moon trailer."); - TreeTransformer t = new BinarizeTreeTransformer(); - TreeTransformer cnf = new CollapseUnaries(); - for (Tree t2 : trees) { - t2 = t.transform(t2); - assertChildSize(t2); - for (Tree child : t2.children()) - if (child.isLeaf()) - assertEquals("Found leaf node with parent that was not a preterminal", true, t2.isPreTerminal()); - t2 = cnf.transform(t2); - assertCollapsedUnaries(t2); - } - } - - - private void assertCollapsedUnaries(Tree tree) { - for (Tree child : tree.children()) - assertCollapsedUnaries(child); - if (tree.children().size() == 1 && !tree.isPreTerminal()) - throw new IllegalStateException("Trees with size of 1 and non preterminals should have been collapsed"); - } - - private void assertChildSize(Tree tree) { - for (Tree child : tree.children()) { - assertChildSize(child); - } - - assertEquals("Tree is not valid " + tree + " tree children size was " + tree.children().size(), true, - tree.isLeaf() || tree.isPreTerminal() || tree.children().size() <= 2); - - - } - - -} diff --git a/contrib/deeplearning4j-nlp-uima/src/test/java/org/deeplearning4j/util/ContextLabelTest.java b/contrib/deeplearning4j-nlp-uima/src/test/java/org/deeplearning4j/util/ContextLabelTest.java deleted file mode 100644 index 1bef3fe3d..000000000 --- a/contrib/deeplearning4j-nlp-uima/src/test/java/org/deeplearning4j/util/ContextLabelTest.java +++ /dev/null @@ -1,65 +0,0 @@ -/* - * ****************************************************************************** - * * Copyright (c) 2021 Deeplearning4j Contributors - * * - * * This program and the accompanying materials are made available under the - * * terms of the Apache License, Version 2.0 which is available at - * * https://www.apache.org/licenses/LICENSE-2.0. - * * - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. 
See the - * * License for the specific language governing permissions and limitations - * * under the License. - * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ - -package org.deeplearning4j.util; - -import org.deeplearning4j.BaseDL4JTest; -import org.deeplearning4j.text.movingwindow.ContextLabelRetriever; -import org.deeplearning4j.text.tokenization.tokenizerfactory.TokenizerFactory; -import org.deeplearning4j.nlp.uima.tokenization.tokenizerfactory.UimaTokenizerFactory; -import org.junit.Before; -import org.junit.Test; -import org.nd4j.common.collection.MultiDimensionalMap; -import org.nd4j.common.primitives.Pair; -import org.slf4j.Logger; -import org.slf4j.LoggerFactory; - -import java.util.ArrayList; -import java.util.List; - -import static org.junit.Assert.assertEquals; - -/** - * Basic test case for the context label test - */ -public class ContextLabelTest extends BaseDL4JTest { - private static final Logger log = LoggerFactory.getLogger(ContextLabelTest.class); - private TokenizerFactory tokenizerFactory; - - @Before - public void init() throws Exception { - if (tokenizerFactory == null) { - tokenizerFactory = new UimaTokenizerFactory(false); - } - } - - @Test - public void testBasicLabel() { - String labeledSentence = " This sucks really bad ."; - Pair> ret = - ContextLabelRetriever.stringWithLabels(labeledSentence, tokenizerFactory); - //positive and none - assertEquals(2, ret.getSecond().size()); - List vals = new ArrayList<>(ret.getSecond().values()); - assertEquals(true, vals.contains("NEGATIVE")); - assertEquals(true, vals.contains("none")); - assertEquals("This sucks really bad .", ret.getFirst()); - } - - -} diff --git a/datavec/datavec-data/datavec-data-image/src/main/resources/META-INF/services/javax.imageio.spi.ImageReaderSpi b/datavec/datavec-data/datavec-data-image/src/main/resources/META-INF/services/javax.imageio.spi.ImageReaderSpi index fd9350845..6796fafc1 
100644 --- a/datavec/datavec-data/datavec-data-image/src/main/resources/META-INF/services/javax.imageio.spi.ImageReaderSpi +++ b/datavec/datavec-data/datavec-data-image/src/main/resources/META-INF/services/javax.imageio.spi.ImageReaderSpi @@ -1,3 +1,43 @@ +# +# /* ****************************************************************************** +# * +# * +# * This program and the accompanying materials are made available under the +# * terms of the Apache License, Version 2.0 which is available at +# * https://www.apache.org/licenses/LICENSE-2.0. +# * +# * See the NOTICE file distributed with this work for additional +# * information regarding copyright ownership. +# * Unless required by applicable law or agreed to in writing, software +# * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT +# * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the +# * License for the specific language governing permissions and limitations +# * under the License. +# * +# * SPDX-License-Identifier: Apache-2.0 +# ******************************************************************************/ +# + +# +# /* ****************************************************************************** +# * +# * +# * This program and the accompanying materials are made available under the +# * terms of the Apache License, Version 2.0 which is available at +# * https://www.apache.org/licenses/LICENSE-2.0. +# * +# * See the NOTICE file distributed with this work for additional +# * information regarding copyright ownership. +# * Unless required by applicable law or agreed to in writing, software +# * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT +# * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the +# * License for the specific language governing permissions and limitations +# * under the License. 
+# * +# * SPDX-License-Identifier: Apache-2.0 +# ******************************************************************************/ +# + # # /* ****************************************************************************** # * diff --git a/datavec/datavec-data/datavec-data-image/src/main/resources/META-INF/services/javax.imageio.spi.ImageWriterSpi b/datavec/datavec-data/datavec-data-image/src/main/resources/META-INF/services/javax.imageio.spi.ImageWriterSpi index 26c5f2471..d61854537 100644 --- a/datavec/datavec-data/datavec-data-image/src/main/resources/META-INF/services/javax.imageio.spi.ImageWriterSpi +++ b/datavec/datavec-data/datavec-data-image/src/main/resources/META-INF/services/javax.imageio.spi.ImageWriterSpi @@ -1,3 +1,43 @@ +# +# /* ****************************************************************************** +# * +# * +# * This program and the accompanying materials are made available under the +# * terms of the Apache License, Version 2.0 which is available at +# * https://www.apache.org/licenses/LICENSE-2.0. +# * +# * See the NOTICE file distributed with this work for additional +# * information regarding copyright ownership. +# * Unless required by applicable law or agreed to in writing, software +# * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT +# * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the +# * License for the specific language governing permissions and limitations +# * under the License. +# * +# * SPDX-License-Identifier: Apache-2.0 +# ******************************************************************************/ +# + +# +# /* ****************************************************************************** +# * +# * +# * This program and the accompanying materials are made available under the +# * terms of the Apache License, Version 2.0 which is available at +# * https://www.apache.org/licenses/LICENSE-2.0. 
+# * +# * See the NOTICE file distributed with this work for additional +# * information regarding copyright ownership. +# * Unless required by applicable law or agreed to in writing, software +# * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT +# * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the +# * License for the specific language governing permissions and limitations +# * under the License. +# * +# * SPDX-License-Identifier: Apache-2.0 +# ******************************************************************************/ +# + # # /* ****************************************************************************** # * diff --git a/deeplearning4j/deeplearning4j-cuda/src/main/java/org/deeplearning4j/cuda/BaseCudnnHelper.java b/deeplearning4j/deeplearning4j-cuda/src/main/java/org/deeplearning4j/cuda/BaseCudnnHelper.java index faea99273..f7ac730b4 100644 --- a/deeplearning4j/deeplearning4j-cuda/src/main/java/org/deeplearning4j/cuda/BaseCudnnHelper.java +++ b/deeplearning4j/deeplearning4j-cuda/src/main/java/org/deeplearning4j/cuda/BaseCudnnHelper.java @@ -1,18 +1,22 @@ -/******************************************************************************* - * Copyright (c) 2015-2018 Skymind, Inc. - * - * This program and the accompanying materials are made available under the - * terms of the Apache License, Version 2.0 which is available at - * https://www.apache.org/licenses/LICENSE-2.0. - * - * Unless required by applicable law or agreed to in writing, software - * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * License for the specific language governing permissions and limitations - * under the License. 
- * - * SPDX-License-Identifier: Apache-2.0 - ******************************************************************************/ +/* + * ****************************************************************************** + * * + * * + * * This program and the accompanying materials are made available under the + * * terms of the Apache License, Version 2.0 which is available at + * * https://www.apache.org/licenses/LICENSE-2.0. + * * + * * See the NOTICE file distributed with this work for additional + * * information regarding copyright ownership. + * * Unless required by applicable law or agreed to in writing, software + * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT + * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the + * * License for the specific language governing permissions and limitations + * * under the License. + * * + * * SPDX-License-Identifier: Apache-2.0 + * ***************************************************************************** + */ package org.deeplearning4j.cuda; diff --git a/deeplearning4j/deeplearning4j-cuda/src/main/java/org/deeplearning4j/cuda/convolution/CudnnConvolutionHelper.java b/deeplearning4j/deeplearning4j-cuda/src/main/java/org/deeplearning4j/cuda/convolution/CudnnConvolutionHelper.java index 91e3c4829..7aa9a62fe 100644 --- a/deeplearning4j/deeplearning4j-cuda/src/main/java/org/deeplearning4j/cuda/convolution/CudnnConvolutionHelper.java +++ b/deeplearning4j/deeplearning4j-cuda/src/main/java/org/deeplearning4j/cuda/convolution/CudnnConvolutionHelper.java @@ -1,19 +1,22 @@ -/******************************************************************************* - * Copyright (c) 2015-2018 Skymind, Inc. - * Copyright (c) 2020 Konduit K.K. - * - * This program and the accompanying materials are made available under the - * terms of the Apache License, Version 2.0 which is available at - * https://www.apache.org/licenses/LICENSE-2.0. 
- * - * Unless required by applicable law or agreed to in writing, software - * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * License for the specific language governing permissions and limitations - * under the License. - * - * SPDX-License-Identifier: Apache-2.0 - ******************************************************************************/ +/* + * ****************************************************************************** + * * + * * + * * This program and the accompanying materials are made available under the + * * terms of the Apache License, Version 2.0 which is available at + * * https://www.apache.org/licenses/LICENSE-2.0. + * * + * * See the NOTICE file distributed with this work for additional + * * information regarding copyright ownership. + * * Unless required by applicable law or agreed to in writing, software + * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT + * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the + * * License for the specific language governing permissions and limitations + * * under the License. 
+ * * + * * SPDX-License-Identifier: Apache-2.0 + * ***************************************************************************** + */ package org.deeplearning4j.cuda.convolution; diff --git a/deeplearning4j/deeplearning4j-cuda/src/main/java/org/deeplearning4j/cuda/convolution/subsampling/CudnnSubsamplingHelper.java b/deeplearning4j/deeplearning4j-cuda/src/main/java/org/deeplearning4j/cuda/convolution/subsampling/CudnnSubsamplingHelper.java index c733022a7..b92810959 100644 --- a/deeplearning4j/deeplearning4j-cuda/src/main/java/org/deeplearning4j/cuda/convolution/subsampling/CudnnSubsamplingHelper.java +++ b/deeplearning4j/deeplearning4j-cuda/src/main/java/org/deeplearning4j/cuda/convolution/subsampling/CudnnSubsamplingHelper.java @@ -1,18 +1,22 @@ -/******************************************************************************* - * Copyright (c) 2015-2018 Skymind, Inc. - * - * This program and the accompanying materials are made available under the - * terms of the Apache License, Version 2.0 which is available at - * https://www.apache.org/licenses/LICENSE-2.0. - * - * Unless required by applicable law or agreed to in writing, software - * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * License for the specific language governing permissions and limitations - * under the License. - * - * SPDX-License-Identifier: Apache-2.0 - ******************************************************************************/ +/* + * ****************************************************************************** + * * + * * + * * This program and the accompanying materials are made available under the + * * terms of the Apache License, Version 2.0 which is available at + * * https://www.apache.org/licenses/LICENSE-2.0. + * * + * * See the NOTICE file distributed with this work for additional + * * information regarding copyright ownership. 
+ * * Unless required by applicable law or agreed to in writing, software + * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT + * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the + * * License for the specific language governing permissions and limitations + * * under the License. + * * + * * SPDX-License-Identifier: Apache-2.0 + * ***************************************************************************** + */ package org.deeplearning4j.cuda.convolution.subsampling; diff --git a/deeplearning4j/deeplearning4j-cuda/src/main/java/org/deeplearning4j/cuda/dropout/CudnnDropoutHelper.java b/deeplearning4j/deeplearning4j-cuda/src/main/java/org/deeplearning4j/cuda/dropout/CudnnDropoutHelper.java index 05bf4ca44..9b3414d95 100644 --- a/deeplearning4j/deeplearning4j-cuda/src/main/java/org/deeplearning4j/cuda/dropout/CudnnDropoutHelper.java +++ b/deeplearning4j/deeplearning4j-cuda/src/main/java/org/deeplearning4j/cuda/dropout/CudnnDropoutHelper.java @@ -1,18 +1,22 @@ -/******************************************************************************* - * Copyright (c) 2015-2018 Skymind, Inc. - * - * This program and the accompanying materials are made available under the - * terms of the Apache License, Version 2.0 which is available at - * https://www.apache.org/licenses/LICENSE-2.0. - * - * Unless required by applicable law or agreed to in writing, software - * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * License for the specific language governing permissions and limitations - * under the License. 
- * - * SPDX-License-Identifier: Apache-2.0 - ******************************************************************************/ +/* + * ****************************************************************************** + * * + * * + * * This program and the accompanying materials are made available under the + * * terms of the Apache License, Version 2.0 which is available at + * * https://www.apache.org/licenses/LICENSE-2.0. + * * + * * See the NOTICE file distributed with this work for additional + * * information regarding copyright ownership. + * * Unless required by applicable law or agreed to in writing, software + * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT + * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the + * * License for the specific language governing permissions and limitations + * * under the License. + * * + * * SPDX-License-Identifier: Apache-2.0 + * ***************************************************************************** + */ package org.deeplearning4j.cuda.dropout; diff --git a/deeplearning4j/deeplearning4j-cuda/src/main/java/org/deeplearning4j/cuda/normalization/CudnnBatchNormalizationHelper.java b/deeplearning4j/deeplearning4j-cuda/src/main/java/org/deeplearning4j/cuda/normalization/CudnnBatchNormalizationHelper.java index 6c5664930..fea813aa0 100644 --- a/deeplearning4j/deeplearning4j-cuda/src/main/java/org/deeplearning4j/cuda/normalization/CudnnBatchNormalizationHelper.java +++ b/deeplearning4j/deeplearning4j-cuda/src/main/java/org/deeplearning4j/cuda/normalization/CudnnBatchNormalizationHelper.java @@ -1,18 +1,22 @@ -/******************************************************************************* - * Copyright (c) 2015-2018 Skymind, Inc. - * - * This program and the accompanying materials are made available under the - * terms of the Apache License, Version 2.0 which is available at - * https://www.apache.org/licenses/LICENSE-2.0. 
- * - * Unless required by applicable law or agreed to in writing, software - * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * License for the specific language governing permissions and limitations - * under the License. - * - * SPDX-License-Identifier: Apache-2.0 - ******************************************************************************/ +/* + * ****************************************************************************** + * * + * * + * * This program and the accompanying materials are made available under the + * * terms of the Apache License, Version 2.0 which is available at + * * https://www.apache.org/licenses/LICENSE-2.0. + * * + * * See the NOTICE file distributed with this work for additional + * * information regarding copyright ownership. + * * Unless required by applicable law or agreed to in writing, software + * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT + * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the + * * License for the specific language governing permissions and limitations + * * under the License. 
+ * * + * * SPDX-License-Identifier: Apache-2.0 + * ***************************************************************************** + */ package org.deeplearning4j.cuda.normalization; diff --git a/deeplearning4j/deeplearning4j-cuda/src/main/java/org/deeplearning4j/cuda/normalization/CudnnLocalResponseNormalizationHelper.java b/deeplearning4j/deeplearning4j-cuda/src/main/java/org/deeplearning4j/cuda/normalization/CudnnLocalResponseNormalizationHelper.java index fcc924928..e0257a3ec 100644 --- a/deeplearning4j/deeplearning4j-cuda/src/main/java/org/deeplearning4j/cuda/normalization/CudnnLocalResponseNormalizationHelper.java +++ b/deeplearning4j/deeplearning4j-cuda/src/main/java/org/deeplearning4j/cuda/normalization/CudnnLocalResponseNormalizationHelper.java @@ -1,18 +1,22 @@ -/******************************************************************************* - * Copyright (c) 2015-2018 Skymind, Inc. - * - * This program and the accompanying materials are made available under the - * terms of the Apache License, Version 2.0 which is available at - * https://www.apache.org/licenses/LICENSE-2.0. - * - * Unless required by applicable law or agreed to in writing, software - * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * License for the specific language governing permissions and limitations - * under the License. - * - * SPDX-License-Identifier: Apache-2.0 - ******************************************************************************/ +/* + * ****************************************************************************** + * * + * * + * * This program and the accompanying materials are made available under the + * * terms of the Apache License, Version 2.0 which is available at + * * https://www.apache.org/licenses/LICENSE-2.0. + * * + * * See the NOTICE file distributed with this work for additional + * * information regarding copyright ownership. 
+ * * Unless required by applicable law or agreed to in writing, software + * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT + * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the + * * License for the specific language governing permissions and limitations + * * under the License. + * * + * * SPDX-License-Identifier: Apache-2.0 + * ***************************************************************************** + */ package org.deeplearning4j.cuda.normalization; diff --git a/deeplearning4j/deeplearning4j-cuda/src/main/java/org/deeplearning4j/cuda/recurrent/CudnnLSTMHelper.java b/deeplearning4j/deeplearning4j-cuda/src/main/java/org/deeplearning4j/cuda/recurrent/CudnnLSTMHelper.java index 88e5a0d0c..120078d07 100644 --- a/deeplearning4j/deeplearning4j-cuda/src/main/java/org/deeplearning4j/cuda/recurrent/CudnnLSTMHelper.java +++ b/deeplearning4j/deeplearning4j-cuda/src/main/java/org/deeplearning4j/cuda/recurrent/CudnnLSTMHelper.java @@ -1,18 +1,22 @@ -/******************************************************************************* - * Copyright (c) 2015-2018 Skymind, Inc. - * - * This program and the accompanying materials are made available under the - * terms of the Apache License, Version 2.0 which is available at - * https://www.apache.org/licenses/LICENSE-2.0. - * - * Unless required by applicable law or agreed to in writing, software - * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * License for the specific language governing permissions and limitations - * under the License. 
- * - * SPDX-License-Identifier: Apache-2.0 - ******************************************************************************/ +/* + * ****************************************************************************** + * * + * * + * * This program and the accompanying materials are made available under the + * * terms of the Apache License, Version 2.0 which is available at + * * https://www.apache.org/licenses/LICENSE-2.0. + * * + * * See the NOTICE file distributed with this work for additional + * * information regarding copyright ownership. + * * Unless required by applicable law or agreed to in writing, software + * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT + * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the + * * License for the specific language governing permissions and limitations + * * under the License. + * * + * * SPDX-License-Identifier: Apache-2.0 + * ***************************************************************************** + */ package org.deeplearning4j.cuda.recurrent; diff --git a/deeplearning4j/deeplearning4j-cuda/src/test/java/org/deeplearning4j/cuda/CuDNNTestUtils.java b/deeplearning4j/deeplearning4j-cuda/src/test/java/org/deeplearning4j/cuda/CuDNNTestUtils.java index a8d64b7b3..473449dbb 100644 --- a/deeplearning4j/deeplearning4j-cuda/src/test/java/org/deeplearning4j/cuda/CuDNNTestUtils.java +++ b/deeplearning4j/deeplearning4j-cuda/src/test/java/org/deeplearning4j/cuda/CuDNNTestUtils.java @@ -1,18 +1,22 @@ -/******************************************************************************* - * Copyright (c) 2015-2018 Skymind, Inc. - * - * This program and the accompanying materials are made available under the - * terms of the Apache License, Version 2.0 which is available at - * https://www.apache.org/licenses/LICENSE-2.0. 
- * - * Unless required by applicable law or agreed to in writing, software - * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * License for the specific language governing permissions and limitations - * under the License. - * - * SPDX-License-Identifier: Apache-2.0 - ******************************************************************************/ +/* + * ****************************************************************************** + * * + * * + * * This program and the accompanying materials are made available under the + * * terms of the Apache License, Version 2.0 which is available at + * * https://www.apache.org/licenses/LICENSE-2.0. + * * + * * See the NOTICE file distributed with this work for additional + * * information regarding copyright ownership. + * * Unless required by applicable law or agreed to in writing, software + * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT + * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the + * * License for the specific language governing permissions and limitations + * * under the License. + * * + * * SPDX-License-Identifier: Apache-2.0 + * ***************************************************************************** + */ package org.deeplearning4j.cuda; diff --git a/deeplearning4j/deeplearning4j-cuda/src/test/java/org/deeplearning4j/cuda/TestDataTypes.java b/deeplearning4j/deeplearning4j-cuda/src/test/java/org/deeplearning4j/cuda/TestDataTypes.java index 851b75413..b0a60a76c 100644 --- a/deeplearning4j/deeplearning4j-cuda/src/test/java/org/deeplearning4j/cuda/TestDataTypes.java +++ b/deeplearning4j/deeplearning4j-cuda/src/test/java/org/deeplearning4j/cuda/TestDataTypes.java @@ -1,18 +1,22 @@ -/******************************************************************************* - * Copyright (c) 2015-2018 Skymind, Inc. 
- * - * This program and the accompanying materials are made available under the - * terms of the Apache License, Version 2.0 which is available at - * https://www.apache.org/licenses/LICENSE-2.0. - * - * Unless required by applicable law or agreed to in writing, software - * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * License for the specific language governing permissions and limitations - * under the License. - * - * SPDX-License-Identifier: Apache-2.0 - ******************************************************************************/ +/* + * ****************************************************************************** + * * + * * + * * This program and the accompanying materials are made available under the + * * terms of the Apache License, Version 2.0 which is available at + * * https://www.apache.org/licenses/LICENSE-2.0. + * * + * * See the NOTICE file distributed with this work for additional + * * information regarding copyright ownership. + * * Unless required by applicable law or agreed to in writing, software + * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT + * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the + * * License for the specific language governing permissions and limitations + * * under the License. 
+ * * + * * SPDX-License-Identifier: Apache-2.0 + * ***************************************************************************** + */ package org.deeplearning4j.cuda; diff --git a/deeplearning4j/deeplearning4j-cuda/src/test/java/org/deeplearning4j/cuda/TestUtils.java b/deeplearning4j/deeplearning4j-cuda/src/test/java/org/deeplearning4j/cuda/TestUtils.java index 0ac35996d..6954f57c1 100644 --- a/deeplearning4j/deeplearning4j-cuda/src/test/java/org/deeplearning4j/cuda/TestUtils.java +++ b/deeplearning4j/deeplearning4j-cuda/src/test/java/org/deeplearning4j/cuda/TestUtils.java @@ -1,18 +1,22 @@ -/******************************************************************************* - * Copyright (c) 2015-2018 Skymind, Inc. - * - * This program and the accompanying materials are made available under the - * terms of the Apache License, Version 2.0 which is available at - * https://www.apache.org/licenses/LICENSE-2.0. - * - * Unless required by applicable law or agreed to in writing, software - * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * License for the specific language governing permissions and limitations - * under the License. - * - * SPDX-License-Identifier: Apache-2.0 - ******************************************************************************/ +/* + * ****************************************************************************** + * * + * * + * * This program and the accompanying materials are made available under the + * * terms of the Apache License, Version 2.0 which is available at + * * https://www.apache.org/licenses/LICENSE-2.0. + * * + * * See the NOTICE file distributed with this work for additional + * * information regarding copyright ownership. 
+ * * Unless required by applicable law or agreed to in writing, software + * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT + * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the + * * License for the specific language governing permissions and limitations + * * under the License. + * * + * * SPDX-License-Identifier: Apache-2.0 + * ***************************************************************************** + */ package org.deeplearning4j.cuda; diff --git a/deeplearning4j/deeplearning4j-cuda/src/test/java/org/deeplearning4j/cuda/ValidateCuDNN.java b/deeplearning4j/deeplearning4j-cuda/src/test/java/org/deeplearning4j/cuda/ValidateCuDNN.java index b2093868b..dd2b341e7 100644 --- a/deeplearning4j/deeplearning4j-cuda/src/test/java/org/deeplearning4j/cuda/ValidateCuDNN.java +++ b/deeplearning4j/deeplearning4j-cuda/src/test/java/org/deeplearning4j/cuda/ValidateCuDNN.java @@ -1,18 +1,22 @@ -/******************************************************************************* - * Copyright (c) 2015-2018 Skymind, Inc. - * - * This program and the accompanying materials are made available under the - * terms of the Apache License, Version 2.0 which is available at - * https://www.apache.org/licenses/LICENSE-2.0. - * - * Unless required by applicable law or agreed to in writing, software - * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * License for the specific language governing permissions and limitations - * under the License. 
- * - * SPDX-License-Identifier: Apache-2.0 - ******************************************************************************/ +/* + * ****************************************************************************** + * * + * * + * * This program and the accompanying materials are made available under the + * * terms of the Apache License, Version 2.0 which is available at + * * https://www.apache.org/licenses/LICENSE-2.0. + * * + * * See the NOTICE file distributed with this work for additional + * * information regarding copyright ownership. + * * Unless required by applicable law or agreed to in writing, software + * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT + * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the + * * License for the specific language governing permissions and limitations + * * under the License. + * * + * * SPDX-License-Identifier: Apache-2.0 + * ***************************************************************************** + */ package org.deeplearning4j.cuda; diff --git a/deeplearning4j/deeplearning4j-cuda/src/test/java/org/deeplearning4j/cuda/convolution/ConvDataFormatTests.java b/deeplearning4j/deeplearning4j-cuda/src/test/java/org/deeplearning4j/cuda/convolution/ConvDataFormatTests.java index fd46c1a8b..412f1b7ca 100644 --- a/deeplearning4j/deeplearning4j-cuda/src/test/java/org/deeplearning4j/cuda/convolution/ConvDataFormatTests.java +++ b/deeplearning4j/deeplearning4j-cuda/src/test/java/org/deeplearning4j/cuda/convolution/ConvDataFormatTests.java @@ -1,18 +1,23 @@ -/* ****************************************************************************** - * Copyright (c) 2015-2018 Skymind, Inc. - * - * This program and the accompanying materials are made available under the - * terms of the Apache License, Version 2.0 which is available at - * https://www.apache.org/licenses/LICENSE-2.0. 
- * - * Unless required by applicable law or agreed to in writing, software - * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * License for the specific language governing permissions and limitations - * under the License. - * - * SPDX-License-Identifier: Apache-2.0 - ******************************************************************************/ +/* + * ****************************************************************************** + * * + * * + * * This program and the accompanying materials are made available under the + * * terms of the Apache License, Version 2.0 which is available at + * * https://www.apache.org/licenses/LICENSE-2.0. + * * + * * See the NOTICE file distributed with this work for additional + * * information regarding copyright ownership. + * * Unless required by applicable law or agreed to in writing, software + * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT + * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the + * * License for the specific language governing permissions and limitations + * * under the License. 
+ * * + * * SPDX-License-Identifier: Apache-2.0 + * ***************************************************************************** + */ + package org.deeplearning4j.cuda.convolution; import lombok.*; diff --git a/deeplearning4j/deeplearning4j-cuda/src/test/java/org/deeplearning4j/cuda/convolution/TestConvolution.java b/deeplearning4j/deeplearning4j-cuda/src/test/java/org/deeplearning4j/cuda/convolution/TestConvolution.java index 7af68ec94..c5c12c5b8 100644 --- a/deeplearning4j/deeplearning4j-cuda/src/test/java/org/deeplearning4j/cuda/convolution/TestConvolution.java +++ b/deeplearning4j/deeplearning4j-cuda/src/test/java/org/deeplearning4j/cuda/convolution/TestConvolution.java @@ -1,18 +1,22 @@ -/******************************************************************************* - * Copyright (c) 2015-2018 Skymind, Inc. - * - * This program and the accompanying materials are made available under the - * terms of the Apache License, Version 2.0 which is available at - * https://www.apache.org/licenses/LICENSE-2.0. - * - * Unless required by applicable law or agreed to in writing, software - * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * License for the specific language governing permissions and limitations - * under the License. - * - * SPDX-License-Identifier: Apache-2.0 - ******************************************************************************/ +/* + * ****************************************************************************** + * * + * * + * * This program and the accompanying materials are made available under the + * * terms of the Apache License, Version 2.0 which is available at + * * https://www.apache.org/licenses/LICENSE-2.0. + * * + * * See the NOTICE file distributed with this work for additional + * * information regarding copyright ownership. 
+ * * Unless required by applicable law or agreed to in writing, software + * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT + * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the + * * License for the specific language governing permissions and limitations + * * under the License. + * * + * * SPDX-License-Identifier: Apache-2.0 + * ***************************************************************************** + */ package org.deeplearning4j.cuda.convolution; diff --git a/deeplearning4j/deeplearning4j-cuda/src/test/java/org/deeplearning4j/cuda/gradientcheck/CNNGradientCheckTest.java b/deeplearning4j/deeplearning4j-cuda/src/test/java/org/deeplearning4j/cuda/gradientcheck/CNNGradientCheckTest.java index 41b69f2a2..cb8311be6 100644 --- a/deeplearning4j/deeplearning4j-cuda/src/test/java/org/deeplearning4j/cuda/gradientcheck/CNNGradientCheckTest.java +++ b/deeplearning4j/deeplearning4j-cuda/src/test/java/org/deeplearning4j/cuda/gradientcheck/CNNGradientCheckTest.java @@ -1,18 +1,22 @@ -/******************************************************************************* - * Copyright (c) 2015-2018 Skymind, Inc. - * - * This program and the accompanying materials are made available under the - * terms of the Apache License, Version 2.0 which is available at - * https://www.apache.org/licenses/LICENSE-2.0. - * - * Unless required by applicable law or agreed to in writing, software - * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * License for the specific language governing permissions and limitations - * under the License. 
- * - * SPDX-License-Identifier: Apache-2.0 - ******************************************************************************/ +/* + * ****************************************************************************** + * * + * * + * * This program and the accompanying materials are made available under the + * * terms of the Apache License, Version 2.0 which is available at + * * https://www.apache.org/licenses/LICENSE-2.0. + * * + * * See the NOTICE file distributed with this work for additional + * * information regarding copyright ownership. + * * Unless required by applicable law or agreed to in writing, software + * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT + * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the + * * License for the specific language governing permissions and limitations + * * under the License. + * * + * * SPDX-License-Identifier: Apache-2.0 + * ***************************************************************************** + */ package org.deeplearning4j.cuda.gradientcheck; diff --git a/deeplearning4j/deeplearning4j-cuda/src/test/java/org/deeplearning4j/cuda/gradientcheck/CuDNNGradientChecks.java b/deeplearning4j/deeplearning4j-cuda/src/test/java/org/deeplearning4j/cuda/gradientcheck/CuDNNGradientChecks.java index 7a06fb627..3ed85ad14 100644 --- a/deeplearning4j/deeplearning4j-cuda/src/test/java/org/deeplearning4j/cuda/gradientcheck/CuDNNGradientChecks.java +++ b/deeplearning4j/deeplearning4j-cuda/src/test/java/org/deeplearning4j/cuda/gradientcheck/CuDNNGradientChecks.java @@ -1,18 +1,22 @@ -/******************************************************************************* - * Copyright (c) 2015-2018 Skymind, Inc. - * - * This program and the accompanying materials are made available under the - * terms of the Apache License, Version 2.0 which is available at - * https://www.apache.org/licenses/LICENSE-2.0. 
- * - * Unless required by applicable law or agreed to in writing, software - * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * License for the specific language governing permissions and limitations - * under the License. - * - * SPDX-License-Identifier: Apache-2.0 - ******************************************************************************/ +/* + * ****************************************************************************** + * * + * * + * * This program and the accompanying materials are made available under the + * * terms of the Apache License, Version 2.0 which is available at + * * https://www.apache.org/licenses/LICENSE-2.0. + * * + * * See the NOTICE file distributed with this work for additional + * * information regarding copyright ownership. + * * Unless required by applicable law or agreed to in writing, software + * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT + * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the + * * License for the specific language governing permissions and limitations + * * under the License. 
+ * * + * * SPDX-License-Identifier: Apache-2.0 + * ***************************************************************************** + */ package org.deeplearning4j.cuda.gradientcheck; diff --git a/deeplearning4j/deeplearning4j-cuda/src/test/java/org/deeplearning4j/cuda/lstm/ValidateCudnnDropout.java b/deeplearning4j/deeplearning4j-cuda/src/test/java/org/deeplearning4j/cuda/lstm/ValidateCudnnDropout.java index b59bd64d6..30aa80f0a 100644 --- a/deeplearning4j/deeplearning4j-cuda/src/test/java/org/deeplearning4j/cuda/lstm/ValidateCudnnDropout.java +++ b/deeplearning4j/deeplearning4j-cuda/src/test/java/org/deeplearning4j/cuda/lstm/ValidateCudnnDropout.java @@ -1,18 +1,22 @@ -/******************************************************************************* - * Copyright (c) 2015-2018 Skymind, Inc. - * - * This program and the accompanying materials are made available under the - * terms of the Apache License, Version 2.0 which is available at - * https://www.apache.org/licenses/LICENSE-2.0. - * - * Unless required by applicable law or agreed to in writing, software - * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * License for the specific language governing permissions and limitations - * under the License. - * - * SPDX-License-Identifier: Apache-2.0 - ******************************************************************************/ +/* + * ****************************************************************************** + * * + * * + * * This program and the accompanying materials are made available under the + * * terms of the Apache License, Version 2.0 which is available at + * * https://www.apache.org/licenses/LICENSE-2.0. + * * + * * See the NOTICE file distributed with this work for additional + * * information regarding copyright ownership. 
+ * * Unless required by applicable law or agreed to in writing, software + * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT + * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the + * * License for the specific language governing permissions and limitations + * * under the License. + * * + * * SPDX-License-Identifier: Apache-2.0 + * ***************************************************************************** + */ package org.deeplearning4j.cuda.lstm; diff --git a/deeplearning4j/deeplearning4j-cuda/src/test/java/org/deeplearning4j/cuda/lstm/ValidateCudnnLSTM.java b/deeplearning4j/deeplearning4j-cuda/src/test/java/org/deeplearning4j/cuda/lstm/ValidateCudnnLSTM.java index 636071a28..07af247f1 100644 --- a/deeplearning4j/deeplearning4j-cuda/src/test/java/org/deeplearning4j/cuda/lstm/ValidateCudnnLSTM.java +++ b/deeplearning4j/deeplearning4j-cuda/src/test/java/org/deeplearning4j/cuda/lstm/ValidateCudnnLSTM.java @@ -1,18 +1,22 @@ -/******************************************************************************* - * Copyright (c) 2015-2018 Skymind, Inc. - * - * This program and the accompanying materials are made available under the - * terms of the Apache License, Version 2.0 which is available at - * https://www.apache.org/licenses/LICENSE-2.0. - * - * Unless required by applicable law or agreed to in writing, software - * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * License for the specific language governing permissions and limitations - * under the License. 
- * - * SPDX-License-Identifier: Apache-2.0 - ******************************************************************************/ +/* + * ****************************************************************************** + * * + * * + * * This program and the accompanying materials are made available under the + * * terms of the Apache License, Version 2.0 which is available at + * * https://www.apache.org/licenses/LICENSE-2.0. + * * + * * See the NOTICE file distributed with this work for additional + * * information regarding copyright ownership. + * * Unless required by applicable law or agreed to in writing, software + * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT + * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the + * * License for the specific language governing permissions and limitations + * * under the License. + * * + * * SPDX-License-Identifier: Apache-2.0 + * ***************************************************************************** + */ package org.deeplearning4j.cuda.lstm; diff --git a/deeplearning4j/deeplearning4j-cuda/src/test/java/org/deeplearning4j/cuda/util/CuDNNValidationUtil.java b/deeplearning4j/deeplearning4j-cuda/src/test/java/org/deeplearning4j/cuda/util/CuDNNValidationUtil.java index 775c1fe9e..184b03cb2 100644 --- a/deeplearning4j/deeplearning4j-cuda/src/test/java/org/deeplearning4j/cuda/util/CuDNNValidationUtil.java +++ b/deeplearning4j/deeplearning4j-cuda/src/test/java/org/deeplearning4j/cuda/util/CuDNNValidationUtil.java @@ -1,18 +1,22 @@ -/******************************************************************************* - * Copyright (c) 2015-2018 Skymind, Inc. - * - * This program and the accompanying materials are made available under the - * terms of the Apache License, Version 2.0 which is available at - * https://www.apache.org/licenses/LICENSE-2.0. 
- * - * Unless required by applicable law or agreed to in writing, software - * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * License for the specific language governing permissions and limitations - * under the License. - * - * SPDX-License-Identifier: Apache-2.0 - ******************************************************************************/ +/* + * ****************************************************************************** + * * + * * + * * This program and the accompanying materials are made available under the + * * terms of the Apache License, Version 2.0 which is available at + * * https://www.apache.org/licenses/LICENSE-2.0. + * * + * * See the NOTICE file distributed with this work for additional + * * information regarding copyright ownership. + * * Unless required by applicable law or agreed to in writing, software + * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT + * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the + * * License for the specific language governing permissions and limitations + * * under the License. 
+ * * + * * SPDX-License-Identifier: Apache-2.0 + * ***************************************************************************** + */ package org.deeplearning4j.cuda.util; diff --git a/deeplearning4j/deeplearning4j-manifold/deeplearning4j-tsne/src/test/java/org/deeplearning4j/plot/TsneTest.java b/deeplearning4j/deeplearning4j-manifold/deeplearning4j-tsne/src/test/java/org/deeplearning4j/plot/TsneTest.java index faaa0943d..2b34d1b33 100644 --- a/deeplearning4j/deeplearning4j-manifold/deeplearning4j-tsne/src/test/java/org/deeplearning4j/plot/TsneTest.java +++ b/deeplearning4j/deeplearning4j-manifold/deeplearning4j-tsne/src/test/java/org/deeplearning4j/plot/TsneTest.java @@ -1,18 +1,22 @@ -/******************************************************************************* - * Copyright (c) 2015-2018 Skymind, Inc. - * - * This program and the accompanying materials are made available under the - * terms of the Apache License, Version 2.0 which is available at - * https://www.apache.org/licenses/LICENSE-2.0. - * - * Unless required by applicable law or agreed to in writing, software - * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * License for the specific language governing permissions and limitations - * under the License. - * - * SPDX-License-Identifier: Apache-2.0 - ******************************************************************************/ +/* + * ****************************************************************************** + * * + * * + * * This program and the accompanying materials are made available under the + * * terms of the Apache License, Version 2.0 which is available at + * * https://www.apache.org/licenses/LICENSE-2.0. + * * + * * See the NOTICE file distributed with this work for additional + * * information regarding copyright ownership. 
+ * * Unless required by applicable law or agreed to in writing, software + * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT + * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the + * * License for the specific language governing permissions and limitations + * * under the License. + * * + * * SPDX-License-Identifier: Apache-2.0 + * ***************************************************************************** + */ //package org.deeplearning4j.plot; // diff --git a/deeplearning4j/deeplearning4j-ui-parent/deeplearning4j-vertx/src/main/resources/META-INF/services/org.deeplearning4j.ui.api.UIModule b/deeplearning4j/deeplearning4j-ui-parent/deeplearning4j-vertx/src/main/resources/META-INF/services/org.deeplearning4j.ui.api.UIModule index c208a5da2..ae919e65f 100644 --- a/deeplearning4j/deeplearning4j-ui-parent/deeplearning4j-vertx/src/main/resources/META-INF/services/org.deeplearning4j.ui.api.UIModule +++ b/deeplearning4j/deeplearning4j-ui-parent/deeplearning4j-vertx/src/main/resources/META-INF/services/org.deeplearning4j.ui.api.UIModule @@ -1,3 +1,43 @@ +# +# /* ****************************************************************************** +# * +# * +# * This program and the accompanying materials are made available under the +# * terms of the Apache License, Version 2.0 which is available at +# * https://www.apache.org/licenses/LICENSE-2.0. +# * +# * See the NOTICE file distributed with this work for additional +# * information regarding copyright ownership. +# * Unless required by applicable law or agreed to in writing, software +# * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT +# * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the +# * License for the specific language governing permissions and limitations +# * under the License. 
+# * +# * SPDX-License-Identifier: Apache-2.0 +# ******************************************************************************/ +# + +# +# /* ****************************************************************************** +# * +# * +# * This program and the accompanying materials are made available under the +# * terms of the Apache License, Version 2.0 which is available at +# * https://www.apache.org/licenses/LICENSE-2.0. +# * +# * See the NOTICE file distributed with this work for additional +# * information regarding copyright ownership. +# * Unless required by applicable law or agreed to in writing, software +# * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT +# * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the +# * License for the specific language governing permissions and limitations +# * under the License. +# * +# * SPDX-License-Identifier: Apache-2.0 +# ******************************************************************************/ +# + # # /* ****************************************************************************** # * diff --git a/libnd4j/include/array/ConstantHolder.h b/libnd4j/include/array/ConstantHolder.h index a404e5808..46709cfb7 100644 --- a/libnd4j/include/array/ConstantHolder.h +++ b/libnd4j/include/array/ConstantHolder.h @@ -1,19 +1,22 @@ -/******************************************************************************* - * Copyright (c) 2015-2018 Skymind, Inc. - * Copyright (c) 2019 Konduit K.K. - * - * This program and the accompanying materials are made available under the - * terms of the Apache License, Version 2.0 which is available at - * https://www.apache.org/licenses/LICENSE-2.0. - * - * Unless required by applicable law or agreed to in writing, software - * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. 
See the - * License for the specific language governing permissions and limitations - * under the License. - * - * SPDX-License-Identifier: Apache-2.0 - ******************************************************************************/ +/* + * ****************************************************************************** + * * + * * + * * This program and the accompanying materials are made available under the + * * terms of the Apache License, Version 2.0 which is available at + * * https://www.apache.org/licenses/LICENSE-2.0. + * * + * * See the NOTICE file distributed with this work for additional + * * information regarding copyright ownership. + * * Unless required by applicable law or agreed to in writing, software + * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT + * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the + * * License for the specific language governing permissions and limitations + * * under the License. + * * + * * SPDX-License-Identifier: Apache-2.0 + * ***************************************************************************** + */ // // @author raver119@gmail.com diff --git a/libnd4j/include/array/ConstantOffsetsBuffer.h b/libnd4j/include/array/ConstantOffsetsBuffer.h index 61c1e381f..51eda590e 100644 --- a/libnd4j/include/array/ConstantOffsetsBuffer.h +++ b/libnd4j/include/array/ConstantOffsetsBuffer.h @@ -1,18 +1,22 @@ -/******************************************************************************* - * Copyright (c) 2019-2020 Konduit K.K. - * - * This program and the accompanying materials are made available under the - * terms of the Apache License, Version 2.0 which is available at - * https://www.apache.org/licenses/LICENSE-2.0. - * - * Unless required by applicable law or agreed to in writing, software - * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. 
See the - * License for the specific language governing permissions and limitations - * under the License. - * - * SPDX-License-Identifier: Apache-2.0 - ******************************************************************************/ +/* + * ****************************************************************************** + * * + * * + * * This program and the accompanying materials are made available under the + * * terms of the Apache License, Version 2.0 which is available at + * * https://www.apache.org/licenses/LICENSE-2.0. + * * + * * See the NOTICE file distributed with this work for additional + * * information regarding copyright ownership. + * * Unless required by applicable law or agreed to in writing, software + * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT + * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the + * * License for the specific language governing permissions and limitations + * * under the License. + * * + * * SPDX-License-Identifier: Apache-2.0 + * ***************************************************************************** + */ // // @author raver119@gmail.com diff --git a/libnd4j/include/array/ConstantShapeBuffer.h b/libnd4j/include/array/ConstantShapeBuffer.h index 299653271..a3494ddaf 100644 --- a/libnd4j/include/array/ConstantShapeBuffer.h +++ b/libnd4j/include/array/ConstantShapeBuffer.h @@ -1,18 +1,22 @@ -/******************************************************************************* - * Copyright (c) 2019-2020 Konduit K.K. - * - * This program and the accompanying materials are made available under the - * terms of the Apache License, Version 2.0 which is available at - * https://www.apache.org/licenses/LICENSE-2.0. - * - * Unless required by applicable law or agreed to in writing, software - * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. 
See the - * License for the specific language governing permissions and limitations - * under the License. - * - * SPDX-License-Identifier: Apache-2.0 - ******************************************************************************/ +/* + * ****************************************************************************** + * * + * * + * * This program and the accompanying materials are made available under the + * * terms of the Apache License, Version 2.0 which is available at + * * https://www.apache.org/licenses/LICENSE-2.0. + * * + * * See the NOTICE file distributed with this work for additional + * * information regarding copyright ownership. + * * Unless required by applicable law or agreed to in writing, software + * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT + * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the + * * License for the specific language governing permissions and limitations + * * under the License. + * * + * * SPDX-License-Identifier: Apache-2.0 + * ***************************************************************************** + */ // // @author raver119@gmail.com diff --git a/libnd4j/include/array/CudaPointerDeallocator.h b/libnd4j/include/array/CudaPointerDeallocator.h index c5c817aeb..0163f49ea 100644 --- a/libnd4j/include/array/CudaPointerDeallocator.h +++ b/libnd4j/include/array/CudaPointerDeallocator.h @@ -1,18 +1,22 @@ -/******************************************************************************* - * Copyright (c) 2019-2020 Konduit K.K. - * - * This program and the accompanying materials are made available under the - * terms of the Apache License, Version 2.0 which is available at - * https://www.apache.org/licenses/LICENSE-2.0. - * - * Unless required by applicable law or agreed to in writing, software - * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. 
See the - * License for the specific language governing permissions and limitations - * under the License. - * - * SPDX-License-Identifier: Apache-2.0 - ******************************************************************************/ +/* + * ****************************************************************************** + * * + * * + * * This program and the accompanying materials are made available under the + * * terms of the Apache License, Version 2.0 which is available at + * * https://www.apache.org/licenses/LICENSE-2.0. + * * + * * See the NOTICE file distributed with this work for additional + * * information regarding copyright ownership. + * * Unless required by applicable law or agreed to in writing, software + * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT + * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the + * * License for the specific language governing permissions and limitations + * * under the License. + * * + * * SPDX-License-Identifier: Apache-2.0 + * ***************************************************************************** + */ // // @author raver119@gmail.com diff --git a/libnd4j/include/array/DataTypeUtils.h b/libnd4j/include/array/DataTypeUtils.h index 686b5bc97..caf43d7ba 100644 --- a/libnd4j/include/array/DataTypeUtils.h +++ b/libnd4j/include/array/DataTypeUtils.h @@ -1,19 +1,22 @@ -/******************************************************************************* - * Copyright (c) 2015-2018 Skymind, Inc. - * Copyright (c) 2019 Konduit K.K. - * - * This program and the accompanying materials are made available under the - * terms of the Apache License, Version 2.0 which is available at - * https://www.apache.org/licenses/LICENSE-2.0. - * - * Unless required by applicable law or agreed to in writing, software - * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. 
See the - * License for the specific language governing permissions and limitations - * under the License. - * - * SPDX-License-Identifier: Apache-2.0 - ******************************************************************************/ +/* + * ****************************************************************************** + * * + * * + * * This program and the accompanying materials are made available under the + * * terms of the Apache License, Version 2.0 which is available at + * * https://www.apache.org/licenses/LICENSE-2.0. + * * + * * See the NOTICE file distributed with this work for additional + * * information regarding copyright ownership. + * * Unless required by applicable law or agreed to in writing, software + * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT + * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the + * * License for the specific language governing permissions and limitations + * * under the License. + * * + * * SPDX-License-Identifier: Apache-2.0 + * ***************************************************************************** + */ // // @author raver119@gmail.com diff --git a/libnd4j/include/array/NDArray.hXX b/libnd4j/include/array/NDArray.hXX index e9c3bc104..6ec86a8da 100644 --- a/libnd4j/include/array/NDArray.hXX +++ b/libnd4j/include/array/NDArray.hXX @@ -1,19 +1,22 @@ -/******************************************************************************* - * Copyright (c) 2015-2018 Skymind, Inc. - * Copyright (c) 2019 Konduit K.K. - * - * This program and the accompanying materials are made available under the - * terms of the Apache License, Version 2.0 which is available at - * https://www.apache.org/licenses/LICENSE-2.0. - * - * Unless required by applicable law or agreed to in writing, software - * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. 
See the - * License for the specific language governing permissions and limitations - * under the License. - * - * SPDX-License-Identifier: Apache-2.0 - ******************************************************************************/ +/* + * ****************************************************************************** + * * + * * + * * This program and the accompanying materials are made available under the + * * terms of the Apache License, Version 2.0 which is available at + * * https://www.apache.org/licenses/LICENSE-2.0. + * * + * * See the NOTICE file distributed with this work for additional + * * information regarding copyright ownership. + * * Unless required by applicable law or agreed to in writing, software + * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT + * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the + * * License for the specific language governing permissions and limitations + * * under the License. + * * + * * SPDX-License-Identifier: Apache-2.0 + * ***************************************************************************** + */ // $NDArray.hpp - architech-independent implementations (both cuda and cpu). // diff --git a/libnd4j/include/array/NDArrayFactory.h b/libnd4j/include/array/NDArrayFactory.h index f778f3adb..f8cd5e4a6 100644 --- a/libnd4j/include/array/NDArrayFactory.h +++ b/libnd4j/include/array/NDArrayFactory.h @@ -1,19 +1,22 @@ -/******************************************************************************* - * Copyright (c) 2015-2018 Skymind, Inc. - * Copyright (c) 2019-2020 Konduit K.K. - * - * This program and the accompanying materials are made available under the - * terms of the Apache License, Version 2.0 which is available at - * https://www.apache.org/licenses/LICENSE-2.0. 
- * - * Unless required by applicable law or agreed to in writing, software - * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * License for the specific language governing permissions and limitations - * under the License. - * - * SPDX-License-Identifier: Apache-2.0 - ******************************************************************************/ +/* + * ****************************************************************************** + * * + * * + * * This program and the accompanying materials are made available under the + * * terms of the Apache License, Version 2.0 which is available at + * * https://www.apache.org/licenses/LICENSE-2.0. + * * + * * See the NOTICE file distributed with this work for additional + * * information regarding copyright ownership. + * * Unless required by applicable law or agreed to in writing, software + * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT + * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the + * * License for the specific language governing permissions and limitations + * * under the License. + * * + * * SPDX-License-Identifier: Apache-2.0 + * ***************************************************************************** + */ // // Created by raver119 on 2018-09-16. diff --git a/libnd4j/include/array/PointerDeallocator.h b/libnd4j/include/array/PointerDeallocator.h index 5bf820421..5c72c82f5 100644 --- a/libnd4j/include/array/PointerDeallocator.h +++ b/libnd4j/include/array/PointerDeallocator.h @@ -1,18 +1,22 @@ -/******************************************************************************* - * Copyright (c) 2019-2020 Konduit K.K. - * - * This program and the accompanying materials are made available under the - * terms of the Apache License, Version 2.0 which is available at - * https://www.apache.org/licenses/LICENSE-2.0. 
- * - * Unless required by applicable law or agreed to in writing, software - * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * License for the specific language governing permissions and limitations - * under the License. - * - * SPDX-License-Identifier: Apache-2.0 - ******************************************************************************/ +/* + * ****************************************************************************** + * * + * * + * * This program and the accompanying materials are made available under the + * * terms of the Apache License, Version 2.0 which is available at + * * https://www.apache.org/licenses/LICENSE-2.0. + * * + * * See the NOTICE file distributed with this work for additional + * * information regarding copyright ownership. + * * Unless required by applicable law or agreed to in writing, software + * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT + * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the + * * License for the specific language governing permissions and limitations + * * under the License. + * * + * * SPDX-License-Identifier: Apache-2.0 + * ***************************************************************************** + */ // // @author raver119@gmail.com diff --git a/libnd4j/include/array/PointerWrapper.h b/libnd4j/include/array/PointerWrapper.h index 9e15aaaa3..98c0f03d2 100644 --- a/libnd4j/include/array/PointerWrapper.h +++ b/libnd4j/include/array/PointerWrapper.h @@ -1,18 +1,22 @@ -/******************************************************************************* - * Copyright (c) 2019-2020 Konduit K.K. - * - * This program and the accompanying materials are made available under the - * terms of the Apache License, Version 2.0 which is available at - * https://www.apache.org/licenses/LICENSE-2.0. 
- * - * Unless required by applicable law or agreed to in writing, software - * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * License for the specific language governing permissions and limitations - * under the License. - * - * SPDX-License-Identifier: Apache-2.0 - ******************************************************************************/ +/* + * ****************************************************************************** + * * + * * + * * This program and the accompanying materials are made available under the + * * terms of the Apache License, Version 2.0 which is available at + * * https://www.apache.org/licenses/LICENSE-2.0. + * * + * * See the NOTICE file distributed with this work for additional + * * information regarding copyright ownership. + * * Unless required by applicable law or agreed to in writing, software + * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT + * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the + * * License for the specific language governing permissions and limitations + * * under the License. + * * + * * SPDX-License-Identifier: Apache-2.0 + * ***************************************************************************** + */ // // @author raver119@gmail.com diff --git a/libnd4j/include/array/PrimaryPointerDeallocator.h b/libnd4j/include/array/PrimaryPointerDeallocator.h index b4fe34764..9ba9da60c 100644 --- a/libnd4j/include/array/PrimaryPointerDeallocator.h +++ b/libnd4j/include/array/PrimaryPointerDeallocator.h @@ -1,18 +1,22 @@ -/******************************************************************************* - * Copyright (c) 2019-2020 Konduit K.K. - * - * This program and the accompanying materials are made available under the - * terms of the Apache License, Version 2.0 which is available at - * https://www.apache.org/licenses/LICENSE-2.0. 
- * - * Unless required by applicable law or agreed to in writing, software - * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * License for the specific language governing permissions and limitations - * under the License. - * - * SPDX-License-Identifier: Apache-2.0 - ******************************************************************************/ +/* + * ****************************************************************************** + * * + * * + * * This program and the accompanying materials are made available under the + * * terms of the Apache License, Version 2.0 which is available at + * * https://www.apache.org/licenses/LICENSE-2.0. + * * + * * See the NOTICE file distributed with this work for additional + * * information regarding copyright ownership. + * * Unless required by applicable law or agreed to in writing, software + * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT + * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the + * * License for the specific language governing permissions and limitations + * * under the License. + * * + * * SPDX-License-Identifier: Apache-2.0 + * ***************************************************************************** + */ // // @author raver119@gmail.com diff --git a/libnd4j/include/array/ShapeDescriptor.h b/libnd4j/include/array/ShapeDescriptor.h index 0ce9de25a..5d0c4f249 100644 --- a/libnd4j/include/array/ShapeDescriptor.h +++ b/libnd4j/include/array/ShapeDescriptor.h @@ -1,11 +1,12 @@ -/******************************************************************************* - * Copyright (c) 2015-2018 Skymind, Inc. - * Copyright (c) 2019-2020 Konduit K.K. 
- * +/* ****************************************************************************** + * + * * This program and the accompanying materials are made available under the * terms of the Apache License, Version 2.0 which is available at * https://www.apache.org/licenses/LICENSE-2.0. * + * See the NOTICE file distributed with this work for additional + * information regarding copyright ownership. * Unless required by applicable law or agreed to in writing, software * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the diff --git a/libnd4j/include/array/cpu/NDArray.macro b/libnd4j/include/array/cpu/NDArray.macro index 5fbb56378..67d09125d 100644 --- a/libnd4j/include/array/cpu/NDArray.macro +++ b/libnd4j/include/array/cpu/NDArray.macro @@ -1,10 +1,13 @@ ################################################################################ -# Copyright (c) 2015-2018 Skymind, Inc. +# # # This program and the accompanying materials are made available under the # terms of the Apache License, Version 2.0 which is available at # https://www.apache.org/licenses/LICENSE-2.0. # +# See the NOTICE file distributed with this work for additional +# information regarding copyright ownership. + # Unless required by applicable law or agreed to in writing, software # distributed under the License is distributed on an "AS IS" BASIS, WITHOUT # WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. 
See the diff --git a/libnd4j/include/array/cpu/NDArrayLambda.hpp b/libnd4j/include/array/cpu/NDArrayLambda.hpp index bd8742288..c3899628b 100644 --- a/libnd4j/include/array/cpu/NDArrayLambda.hpp +++ b/libnd4j/include/array/cpu/NDArrayLambda.hpp @@ -1,6 +1,22 @@ - - - +/* + * ****************************************************************************** + * * + * * + * * This program and the accompanying materials are made available under the + * * terms of the Apache License, Version 2.0 which is available at + * * https://www.apache.org/licenses/LICENSE-2.0. + * * + * * See the NOTICE file distributed with this work for additional + * * information regarding copyright ownership. + * * Unless required by applicable law or agreed to in writing, software + * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT + * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the + * * License for the specific language governing permissions and limitations + * * under the License. + * * + * * SPDX-License-Identifier: Apache-2.0 + * ***************************************************************************** + */ template void NDArray::applyTriplewiseLambda(NDArray& second, NDArray& third, const std::function& func, NDArray& target) { diff --git a/libnd4j/include/array/cuda/CudaPointerDeallocator.cu b/libnd4j/include/array/cuda/CudaPointerDeallocator.cu index 7367382ba..8e84595e0 100644 --- a/libnd4j/include/array/cuda/CudaPointerDeallocator.cu +++ b/libnd4j/include/array/cuda/CudaPointerDeallocator.cu @@ -1,18 +1,22 @@ -/******************************************************************************* - * Copyright (c) 2019-2020 Konduit K.K. - * - * This program and the accompanying materials are made available under the - * terms of the Apache License, Version 2.0 which is available at - * https://www.apache.org/licenses/LICENSE-2.0. 
- * - * Unless required by applicable law or agreed to in writing, software - * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * License for the specific language governing permissions and limitations - * under the License. - * - * SPDX-License-Identifier: Apache-2.0 - ******************************************************************************/ +/* + * ****************************************************************************** + * * + * * + * * This program and the accompanying materials are made available under the + * * terms of the Apache License, Version 2.0 which is available at + * * https://www.apache.org/licenses/LICENSE-2.0. + * * + * * See the NOTICE file distributed with this work for additional + * * information regarding copyright ownership. + * * Unless required by applicable law or agreed to in writing, software + * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT + * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the + * * License for the specific language governing permissions and limitations + * * under the License. + * * + * * SPDX-License-Identifier: Apache-2.0 + * ***************************************************************************** + */ // // @author raver119@gmail.com diff --git a/libnd4j/include/array/impl/ConstantOffsetsBuffer.cpp b/libnd4j/include/array/impl/ConstantOffsetsBuffer.cpp index 38b516a84..a4a3dba3e 100644 --- a/libnd4j/include/array/impl/ConstantOffsetsBuffer.cpp +++ b/libnd4j/include/array/impl/ConstantOffsetsBuffer.cpp @@ -1,18 +1,22 @@ -/******************************************************************************* - * Copyright (c) 2019-2020 Konduit K.K. - * - * This program and the accompanying materials are made available under the - * terms of the Apache License, Version 2.0 which is available at - * https://www.apache.org/licenses/LICENSE-2.0. 
- * - * Unless required by applicable law or agreed to in writing, software - * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * License for the specific language governing permissions and limitations - * under the License. - * - * SPDX-License-Identifier: Apache-2.0 - ******************************************************************************/ +/* + * ****************************************************************************** + * * + * * + * * This program and the accompanying materials are made available under the + * * terms of the Apache License, Version 2.0 which is available at + * * https://www.apache.org/licenses/LICENSE-2.0. + * * + * * See the NOTICE file distributed with this work for additional + * * information regarding copyright ownership. + * * Unless required by applicable law or agreed to in writing, software + * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT + * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the + * * License for the specific language governing permissions and limitations + * * under the License. + * * + * * SPDX-License-Identifier: Apache-2.0 + * ***************************************************************************** + */ // // @author raver119@gmail.com diff --git a/libnd4j/include/array/impl/ConstantShapeBuffer.cpp b/libnd4j/include/array/impl/ConstantShapeBuffer.cpp index 528101100..f27ce3da1 100644 --- a/libnd4j/include/array/impl/ConstantShapeBuffer.cpp +++ b/libnd4j/include/array/impl/ConstantShapeBuffer.cpp @@ -1,18 +1,22 @@ -/******************************************************************************* - * Copyright (c) 2019-2020 Konduit K.K. - * - * This program and the accompanying materials are made available under the - * terms of the Apache License, Version 2.0 which is available at - * https://www.apache.org/licenses/LICENSE-2.0. 
- * - * Unless required by applicable law or agreed to in writing, software - * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * License for the specific language governing permissions and limitations - * under the License. - * - * SPDX-License-Identifier: Apache-2.0 - ******************************************************************************/ +/* + * ****************************************************************************** + * * + * * + * * This program and the accompanying materials are made available under the + * * terms of the Apache License, Version 2.0 which is available at + * * https://www.apache.org/licenses/LICENSE-2.0. + * * + * * See the NOTICE file distributed with this work for additional + * * information regarding copyright ownership. + * * Unless required by applicable law or agreed to in writing, software + * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT + * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the + * * License for the specific language governing permissions and limitations + * * under the License. + * * + * * SPDX-License-Identifier: Apache-2.0 + * ***************************************************************************** + */ // // @author raver119@gmail.com diff --git a/libnd4j/include/array/impl/NDArrayFactory.cpp b/libnd4j/include/array/impl/NDArrayFactory.cpp index 8dcaff0fe..570d95c1c 100644 --- a/libnd4j/include/array/impl/NDArrayFactory.cpp +++ b/libnd4j/include/array/impl/NDArrayFactory.cpp @@ -1,19 +1,22 @@ -/******************************************************************************* - * Copyright (c) 2015-2018 Skymind, Inc. - * Copyright (c) 2019-2020 Konduit K.K. 
- * - * This program and the accompanying materials are made available under the - * terms of the Apache License, Version 2.0 which is available at - * https://www.apache.org/licenses/LICENSE-2.0. - * - * Unless required by applicable law or agreed to in writing, software - * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * License for the specific language governing permissions and limitations - * under the License. - * - * SPDX-License-Identifier: Apache-2.0 - ******************************************************************************/ +/* + * ****************************************************************************** + * * + * * + * * This program and the accompanying materials are made available under the + * * terms of the Apache License, Version 2.0 which is available at + * * https://www.apache.org/licenses/LICENSE-2.0. + * * + * * See the NOTICE file distributed with this work for additional + * * information regarding copyright ownership. + * * Unless required by applicable law or agreed to in writing, software + * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT + * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the + * * License for the specific language governing permissions and limitations + * * under the License. + * * + * * SPDX-License-Identifier: Apache-2.0 + * ***************************************************************************** + */ // // Created by GS on 2018-12-20. diff --git a/libnd4j/include/array/impl/PointerDeallocator.cpp b/libnd4j/include/array/impl/PointerDeallocator.cpp index 2cd41cdda..caabd0427 100644 --- a/libnd4j/include/array/impl/PointerDeallocator.cpp +++ b/libnd4j/include/array/impl/PointerDeallocator.cpp @@ -1,18 +1,22 @@ -/******************************************************************************* - * Copyright (c) 2019-2020 Konduit K.K. 
- * - * This program and the accompanying materials are made available under the - * terms of the Apache License, Version 2.0 which is available at - * https://www.apache.org/licenses/LICENSE-2.0. - * - * Unless required by applicable law or agreed to in writing, software - * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * License for the specific language governing permissions and limitations - * under the License. - * - * SPDX-License-Identifier: Apache-2.0 - ******************************************************************************/ +/* + * ****************************************************************************** + * * + * * + * * This program and the accompanying materials are made available under the + * * terms of the Apache License, Version 2.0 which is available at + * * https://www.apache.org/licenses/LICENSE-2.0. + * * + * * See the NOTICE file distributed with this work for additional + * * information regarding copyright ownership. + * * Unless required by applicable law or agreed to in writing, software + * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT + * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the + * * License for the specific language governing permissions and limitations + * * under the License. + * * + * * SPDX-License-Identifier: Apache-2.0 + * ***************************************************************************** + */ // // @author raver119@gmail.com diff --git a/libnd4j/include/array/impl/PointerWrapper.cpp b/libnd4j/include/array/impl/PointerWrapper.cpp index b39cb54aa..e9bc0f31c 100644 --- a/libnd4j/include/array/impl/PointerWrapper.cpp +++ b/libnd4j/include/array/impl/PointerWrapper.cpp @@ -1,18 +1,22 @@ -/******************************************************************************* - * Copyright (c) 2019-2020 Konduit K.K. 
- * - * This program and the accompanying materials are made available under the - * terms of the Apache License, Version 2.0 which is available at - * https://www.apache.org/licenses/LICENSE-2.0. - * - * Unless required by applicable law or agreed to in writing, software - * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * License for the specific language governing permissions and limitations - * under the License. - * - * SPDX-License-Identifier: Apache-2.0 - ******************************************************************************/ +/* + * ****************************************************************************** + * * + * * + * * This program and the accompanying materials are made available under the + * * terms of the Apache License, Version 2.0 which is available at + * * https://www.apache.org/licenses/LICENSE-2.0. + * * + * * See the NOTICE file distributed with this work for additional + * * information regarding copyright ownership. + * * Unless required by applicable law or agreed to in writing, software + * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT + * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the + * * License for the specific language governing permissions and limitations + * * under the License. 
+ * * + * * SPDX-License-Identifier: Apache-2.0 + * ***************************************************************************** + */ // // @author raver119@gmail.com diff --git a/libnd4j/include/array/impl/PrimaryPointerDeallocator.cpp b/libnd4j/include/array/impl/PrimaryPointerDeallocator.cpp index edd58d610..5c47bbea0 100644 --- a/libnd4j/include/array/impl/PrimaryPointerDeallocator.cpp +++ b/libnd4j/include/array/impl/PrimaryPointerDeallocator.cpp @@ -1,18 +1,22 @@ -/******************************************************************************* - * Copyright (c) 2019-2020 Konduit K.K. - * - * This program and the accompanying materials are made available under the - * terms of the Apache License, Version 2.0 which is available at - * https://www.apache.org/licenses/LICENSE-2.0. - * - * Unless required by applicable law or agreed to in writing, software - * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * License for the specific language governing permissions and limitations - * under the License. - * - * SPDX-License-Identifier: Apache-2.0 - ******************************************************************************/ +/* + * ****************************************************************************** + * * + * * + * * This program and the accompanying materials are made available under the + * * terms of the Apache License, Version 2.0 which is available at + * * https://www.apache.org/licenses/LICENSE-2.0. + * * + * * See the NOTICE file distributed with this work for additional + * * information regarding copyright ownership. + * * Unless required by applicable law or agreed to in writing, software + * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT + * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. 
See the + * * License for the specific language governing permissions and limitations + * * under the License. + * * + * * SPDX-License-Identifier: Apache-2.0 + * ***************************************************************************** + */ // // @author raver119@gmail.com diff --git a/libnd4j/include/execution/ExecutionMode.h b/libnd4j/include/execution/ExecutionMode.h index ea97e3fc9..deebdde25 100644 --- a/libnd4j/include/execution/ExecutionMode.h +++ b/libnd4j/include/execution/ExecutionMode.h @@ -1,10 +1,12 @@ -/******************************************************************************* - * Copyright (c) 2019-2020 Konduit K.K +/* ****************************************************************************** + * * * This program and the accompanying materials are made available under the * terms of the Apache License, Version 2.0 which is available at * https://www.apache.org/licenses/LICENSE-2.0. * + * See the NOTICE file distributed with this work for additional + * information regarding copyright ownership. * Unless required by applicable law or agreed to in writing, software * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. 
See the @@ -14,6 +16,7 @@ * SPDX-License-Identifier: Apache-2.0 ******************************************************************************/ + // // @author raver119@gmail.com // diff --git a/libnd4j/include/execution/Threads.h b/libnd4j/include/execution/Threads.h index bf35de089..be0ef8c2a 100644 --- a/libnd4j/include/execution/Threads.h +++ b/libnd4j/include/execution/Threads.h @@ -1,10 +1,12 @@ -/******************************************************************************* - * Copyright (c) 2019 Konduit +/* ****************************************************************************** + * * * This program and the accompanying materials are made available under the * terms of the Apache License, Version 2.0 which is available at * https://www.apache.org/licenses/LICENSE-2.0. * + * See the NOTICE file distributed with this work for additional + * information regarding copyright ownership. * Unless required by applicable law or agreed to in writing, software * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the diff --git a/libnd4j/include/execution/cuda/LaunchContext.cu b/libnd4j/include/execution/cuda/LaunchContext.cu index bd51c3504..1588f903e 100644 --- a/libnd4j/include/execution/cuda/LaunchContext.cu +++ b/libnd4j/include/execution/cuda/LaunchContext.cu @@ -1,11 +1,12 @@ -/******************************************************************************* - * Copyright (c) 2015-2018 Skymind, Inc. - * Copyright (c) 2020 Konduit K.K. +/* ****************************************************************************** + * * * This program and the accompanying materials are made available under the * terms of the Apache License, Version 2.0 which is available at * https://www.apache.org/licenses/LICENSE-2.0. * + * See the NOTICE file distributed with this work for additional + * information regarding copyright ownership. 
* Unless required by applicable law or agreed to in writing, software * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the diff --git a/libnd4j/include/graph/Context.h b/libnd4j/include/graph/Context.h index de6608b46..30a83c668 100644 --- a/libnd4j/include/graph/Context.h +++ b/libnd4j/include/graph/Context.h @@ -1,19 +1,22 @@ -/******************************************************************************* - * Copyright (c) 2015-2018 Skymind, Inc. - * Copyright (c) 2019-2020 Konduit K.K. - * - * This program and the accompanying materials are made available under the - * terms of the Apache License, Version 2.0 which is available at - * https://www.apache.org/licenses/LICENSE-2.0. - * - * Unless required by applicable law or agreed to in writing, software - * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * License for the specific language governing permissions and limitations - * under the License. - * - * SPDX-License-Identifier: Apache-2.0 - ******************************************************************************/ +/* + * ****************************************************************************** + * * + * * + * * This program and the accompanying materials are made available under the + * * terms of the Apache License, Version 2.0 which is available at + * * https://www.apache.org/licenses/LICENSE-2.0. + * * + * * See the NOTICE file distributed with this work for additional + * * information regarding copyright ownership. + * * Unless required by applicable law or agreed to in writing, software + * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT + * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. 
See the + * * License for the specific language governing permissions and limitations + * * under the License. + * * + * * SPDX-License-Identifier: Apache-2.0 + * ***************************************************************************** + */ // // @author raver119@gmail.com diff --git a/libnd4j/include/graph/ContextPrototype.h b/libnd4j/include/graph/ContextPrototype.h index e61831fa7..18d64af9d 100644 --- a/libnd4j/include/graph/ContextPrototype.h +++ b/libnd4j/include/graph/ContextPrototype.h @@ -1,19 +1,22 @@ -/******************************************************************************* - * Copyright (c) 2015-2018 Skymind, Inc. - * Copyright (c) 2019-2020 Konduit K.K. - * - * This program and the accompanying materials are made available under the - * terms of the Apache License, Version 2.0 which is available at - * https://www.apache.org/licenses/LICENSE-2.0. - * - * Unless required by applicable law or agreed to in writing, software - * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * License for the specific language governing permissions and limitations - * under the License. - * - * SPDX-License-Identifier: Apache-2.0 - ******************************************************************************/ +/* + * ****************************************************************************** + * * + * * + * * This program and the accompanying materials are made available under the + * * terms of the Apache License, Version 2.0 which is available at + * * https://www.apache.org/licenses/LICENSE-2.0. + * * + * * See the NOTICE file distributed with this work for additional + * * information regarding copyright ownership. + * * Unless required by applicable law or agreed to in writing, software + * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT + * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. 
See the + * * License for the specific language governing permissions and limitations + * * under the License. + * * + * * SPDX-License-Identifier: Apache-2.0 + * ***************************************************************************** + */ // // @author raver119@gmail.com diff --git a/libnd4j/include/graph/generated/array_generated.h b/libnd4j/include/graph/generated/array_generated.h deleted file mode 100644 index 5c4c0d7af..000000000 --- a/libnd4j/include/graph/generated/array_generated.h +++ /dev/null @@ -1,280 +0,0 @@ -// automatically generated by the FlatBuffers compiler, do not modify - - -#ifndef FLATBUFFERS_GENERATED_ARRAY_ND4J_GRAPH_H_ -#define FLATBUFFERS_GENERATED_ARRAY_ND4J_GRAPH_H_ - -#include "flatbuffers/flatbuffers.h" - -namespace sd { -namespace graph { - -struct FlatArray; - -enum ByteOrder { - ByteOrder_LE = 0, - ByteOrder_BE = 1, - ByteOrder_MIN = ByteOrder_LE, - ByteOrder_MAX = ByteOrder_BE -}; - -inline const ByteOrder (&EnumValuesByteOrder())[2] { - static const ByteOrder values[] = { - ByteOrder_LE, - ByteOrder_BE - }; - return values; -} - -inline const char * const *EnumNamesByteOrder() { - static const char * const names[] = { - "LE", - "BE", - nullptr - }; - return names; -} - -inline const char *EnumNameByteOrder(ByteOrder e) { - const size_t index = static_cast(e); - return EnumNamesByteOrder()[index]; -} - -enum DType { - DType_INHERIT = 0, - DType_BOOL = 1, - DType_FLOAT8 = 2, - DType_HALF = 3, - DType_HALF2 = 4, - DType_FLOAT = 5, - DType_DOUBLE = 6, - DType_INT8 = 7, - DType_INT16 = 8, - DType_INT32 = 9, - DType_INT64 = 10, - DType_UINT8 = 11, - DType_UINT16 = 12, - DType_UINT32 = 13, - DType_UINT64 = 14, - DType_QINT8 = 15, - DType_QINT16 = 16, - DType_BFLOAT16 = 17, - DType_UTF8 = 50, - DType_UTF16 = 51, - DType_UTF32 = 52, - DType_MIN = DType_INHERIT, - DType_MAX = DType_UTF32 -}; - -inline const DType (&EnumValuesDType())[21] { - static const DType values[] = { - DType_INHERIT, - DType_BOOL, - DType_FLOAT8, - DType_HALF, 
- DType_HALF2, - DType_FLOAT, - DType_DOUBLE, - DType_INT8, - DType_INT16, - DType_INT32, - DType_INT64, - DType_UINT8, - DType_UINT16, - DType_UINT32, - DType_UINT64, - DType_QINT8, - DType_QINT16, - DType_BFLOAT16, - DType_UTF8, - DType_UTF16, - DType_UTF32 - }; - return values; -} - -inline const char * const *EnumNamesDType() { - static const char * const names[] = { - "INHERIT", - "BOOL", - "FLOAT8", - "HALF", - "HALF2", - "FLOAT", - "DOUBLE", - "INT8", - "INT16", - "INT32", - "INT64", - "UINT8", - "UINT16", - "UINT32", - "UINT64", - "QINT8", - "QINT16", - "BFLOAT16", - "", - "", - "", - "", - "", - "", - "", - "", - "", - "", - "", - "", - "", - "", - "", - "", - "", - "", - "", - "", - "", - "", - "", - "", - "", - "", - "", - "", - "", - "", - "", - "", - "UTF8", - "UTF16", - "UTF32", - nullptr - }; - return names; -} - -inline const char *EnumNameDType(DType e) { - const size_t index = static_cast<size_t>(e); - return EnumNamesDType()[index]; -} - -struct FlatArray FLATBUFFERS_FINAL_CLASS : private flatbuffers::Table { - enum { - VT_SHAPE = 4, - VT_BUFFER = 6, - VT_DTYPE = 8, - VT_BYTEORDER = 10 - }; - const flatbuffers::Vector<int64_t> *shape() const { - return GetPointer<const flatbuffers::Vector<int64_t> *>(VT_SHAPE); - } - const flatbuffers::Vector<int8_t> *buffer() const { - return GetPointer<const flatbuffers::Vector<int8_t> *>(VT_BUFFER); - } - DType dtype() const { - return static_cast<DType>(GetField<int8_t>(VT_DTYPE, 0)); - } - ByteOrder byteOrder() const { - return static_cast<ByteOrder>(GetField<int8_t>(VT_BYTEORDER, 0)); - } - bool Verify(flatbuffers::Verifier &verifier) const { - return VerifyTableStart(verifier) && - VerifyOffset(verifier, VT_SHAPE) && - verifier.VerifyVector(shape()) && - VerifyOffset(verifier, VT_BUFFER) && - verifier.VerifyVector(buffer()) && - VerifyField<int8_t>(verifier, VT_DTYPE) && - VerifyField<int8_t>(verifier, VT_BYTEORDER) && - verifier.EndTable(); - } -}; - -struct FlatArrayBuilder { - flatbuffers::FlatBufferBuilder &fbb_; - flatbuffers::uoffset_t start_; - void add_shape(flatbuffers::Offset<flatbuffers::Vector<int64_t>> shape) { - fbb_.AddOffset(FlatArray::VT_SHAPE, shape); - } - void add_buffer(flatbuffers::Offset<flatbuffers::Vector<int8_t>> buffer) { - fbb_.AddOffset(FlatArray::VT_BUFFER, buffer); - } - void add_dtype(DType dtype) { - fbb_.AddElement<int8_t>(FlatArray::VT_DTYPE, static_cast<int8_t>(dtype), 0); - } - void add_byteOrder(ByteOrder byteOrder) { - fbb_.AddElement<int8_t>(FlatArray::VT_BYTEORDER, static_cast<int8_t>(byteOrder), 0); - } - explicit FlatArrayBuilder(flatbuffers::FlatBufferBuilder &_fbb) - : fbb_(_fbb) { - start_ = fbb_.StartTable(); - } - FlatArrayBuilder &operator=(const FlatArrayBuilder &); - flatbuffers::Offset<FlatArray> Finish() { - const auto end = fbb_.EndTable(start_); - auto o = flatbuffers::Offset<FlatArray>(end); - return o; - } -}; - -inline flatbuffers::Offset<FlatArray> CreateFlatArray( - flatbuffers::FlatBufferBuilder &_fbb, - flatbuffers::Offset<flatbuffers::Vector<int64_t>> shape = 0, - flatbuffers::Offset<flatbuffers::Vector<int8_t>> buffer = 0, - DType dtype = DType_INHERIT, - ByteOrder byteOrder = ByteOrder_LE) { - FlatArrayBuilder builder_(_fbb); - builder_.add_buffer(buffer); - builder_.add_shape(shape); - builder_.add_byteOrder(byteOrder); - builder_.add_dtype(dtype); - return builder_.Finish(); -} - -inline flatbuffers::Offset<FlatArray> CreateFlatArrayDirect( - flatbuffers::FlatBufferBuilder &_fbb, - const std::vector<int64_t> *shape = nullptr, - const std::vector<int8_t> *buffer = nullptr, - DType dtype = DType_INHERIT, - ByteOrder byteOrder = ByteOrder_LE) { - return sd::graph::CreateFlatArray( - _fbb, - shape ? _fbb.CreateVector<int64_t>(*shape) : 0, - buffer ? _fbb.CreateVector<int8_t>(*buffer) : 0, - dtype, - byteOrder); -} - -inline const sd::graph::FlatArray *GetFlatArray(const void *buf) { - return flatbuffers::GetRoot<sd::graph::FlatArray>(buf); -} - -inline const sd::graph::FlatArray *GetSizePrefixedFlatArray(const void *buf) { - return flatbuffers::GetSizePrefixedRoot<sd::graph::FlatArray>(buf); -} - -inline bool VerifyFlatArrayBuffer( - flatbuffers::Verifier &verifier) { - return verifier.VerifyBuffer<sd::graph::FlatArray>(nullptr); -} - -inline bool VerifySizePrefixedFlatArrayBuffer( - flatbuffers::Verifier &verifier) { - return verifier.VerifySizePrefixedBuffer<sd::graph::FlatArray>(nullptr); -} - -inline void FinishFlatArrayBuffer( - flatbuffers::FlatBufferBuilder &fbb, - flatbuffers::Offset<sd::graph::FlatArray> root) { - fbb.Finish(root); -} - -inline void FinishSizePrefixedFlatArrayBuffer( - flatbuffers::FlatBufferBuilder &fbb, - flatbuffers::Offset<sd::graph::FlatArray> root) { - fbb.FinishSizePrefixed(root); -} - -} // namespace graph -} // namespace sd - -#endif // FLATBUFFERS_GENERATED_ARRAY_ND4J_GRAPH_H_ diff --git a/libnd4j/include/graph/generated/array_generated.js b/libnd4j/include/graph/generated/array_generated.js deleted file mode 100644 index 758b0c092..000000000 --- a/libnd4j/include/graph/generated/array_generated.js +++ /dev/null @@ -1,260 +0,0 @@ -/* - * ****************************************************************************** - * * - * * - * * This program and the accompanying materials are made available under the - * * terms of the Apache License, Version 2.0 which is available at - * * https://www.apache.org/licenses/LICENSE-2.0. - * * - * * See the NOTICE file distributed with this work for additional - * * information regarding copyright ownership. - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * * License for the specific language governing permissions and limitations - * * under the License.
- * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ - -/** - * @const - * @namespace - */ -var nd4j = nd4j || {}; - -/** - * @const - * @namespace - */ -nd4j.graph = nd4j.graph || {}; - -/** - * @enum - */ -nd4j.graph.ByteOrder = { - LE: 0, - BE: 1 -}; - -/** - * @enum - */ -nd4j.graph.DType = { - INHERIT: 0, - BOOL: 1, - FLOAT8: 2, - HALF: 3, - HALF2: 4, - FLOAT: 5, - DOUBLE: 6, - INT8: 7, - INT16: 8, - INT32: 9, - INT64: 10, - UINT8: 11, - UINT16: 12, - UINT32: 13, - UINT64: 14, - QINT8: 15, - QINT16: 16, - BFLOAT16: 17, - UTF8: 50, - UTF16: 51, - UTF32: 52 -}; - -/** - * @constructor - */ -nd4j.graph.FlatArray = function() { - /** - * @type {flatbuffers.ByteBuffer} - */ - this.bb = null; - - /** - * @type {number} - */ - this.bb_pos = 0; -}; - -/** - * @param {number} i - * @param {flatbuffers.ByteBuffer} bb - * @returns {nd4j.graph.FlatArray} - */ -nd4j.graph.FlatArray.prototype.__init = function(i, bb) { - this.bb_pos = i; - this.bb = bb; - return this; -}; - -/** - * @param {flatbuffers.ByteBuffer} bb - * @param {nd4j.graph.FlatArray=} obj - * @returns {nd4j.graph.FlatArray} - */ -nd4j.graph.FlatArray.getRootAsFlatArray = function(bb, obj) { - return (obj || new nd4j.graph.FlatArray).__init(bb.readInt32(bb.position()) + bb.position(), bb); -}; - -/** - * @param {number} index - * @returns {flatbuffers.Long} - */ -nd4j.graph.FlatArray.prototype.shape = function(index) { - var offset = this.bb.__offset(this.bb_pos, 4); - return offset ? this.bb.readInt64(this.bb.__vector(this.bb_pos + offset) + index * 8) : this.bb.createLong(0, 0); -}; - -/** - * @returns {number} - */ -nd4j.graph.FlatArray.prototype.shapeLength = function() { - var offset = this.bb.__offset(this.bb_pos, 4); - return offset ? 
this.bb.__vector_len(this.bb_pos + offset) : 0; -}; - -/** - * @param {number} index - * @returns {number} - */ -nd4j.graph.FlatArray.prototype.buffer = function(index) { - var offset = this.bb.__offset(this.bb_pos, 6); - return offset ? this.bb.readInt8(this.bb.__vector(this.bb_pos + offset) + index) : 0; -}; - -/** - * @returns {number} - */ -nd4j.graph.FlatArray.prototype.bufferLength = function() { - var offset = this.bb.__offset(this.bb_pos, 6); - return offset ? this.bb.__vector_len(this.bb_pos + offset) : 0; -}; - -/** - * @returns {Int8Array} - */ -nd4j.graph.FlatArray.prototype.bufferArray = function() { - var offset = this.bb.__offset(this.bb_pos, 6); - return offset ? new Int8Array(this.bb.bytes().buffer, this.bb.bytes().byteOffset + this.bb.__vector(this.bb_pos + offset), this.bb.__vector_len(this.bb_pos + offset)) : null; -}; - -/** - * @returns {nd4j.graph.DType} - */ -nd4j.graph.FlatArray.prototype.dtype = function() { - var offset = this.bb.__offset(this.bb_pos, 8); - return offset ? /** @type {nd4j.graph.DType} */ (this.bb.readInt8(this.bb_pos + offset)) : nd4j.graph.DType.INHERIT; -}; - -/** - * @returns {nd4j.graph.ByteOrder} - */ -nd4j.graph.FlatArray.prototype.byteOrder = function() { - var offset = this.bb.__offset(this.bb_pos, 10); - return offset ? 
/** @type {nd4j.graph.ByteOrder} */ (this.bb.readInt8(this.bb_pos + offset)) : nd4j.graph.ByteOrder.LE; -}; - -/** - * @param {flatbuffers.Builder} builder - */ -nd4j.graph.FlatArray.startFlatArray = function(builder) { - builder.startObject(4); -}; - -/** - * @param {flatbuffers.Builder} builder - * @param {flatbuffers.Offset} shapeOffset - */ -nd4j.graph.FlatArray.addShape = function(builder, shapeOffset) { - builder.addFieldOffset(0, shapeOffset, 0); -}; - -/** - * @param {flatbuffers.Builder} builder - * @param {Array.} data - * @returns {flatbuffers.Offset} - */ -nd4j.graph.FlatArray.createShapeVector = function(builder, data) { - builder.startVector(8, data.length, 8); - for (var i = data.length - 1; i >= 0; i--) { - builder.addInt64(data[i]); - } - return builder.endVector(); -}; - -/** - * @param {flatbuffers.Builder} builder - * @param {number} numElems - */ -nd4j.graph.FlatArray.startShapeVector = function(builder, numElems) { - builder.startVector(8, numElems, 8); -}; - -/** - * @param {flatbuffers.Builder} builder - * @param {flatbuffers.Offset} bufferOffset - */ -nd4j.graph.FlatArray.addBuffer = function(builder, bufferOffset) { - builder.addFieldOffset(1, bufferOffset, 0); -}; - -/** - * @param {flatbuffers.Builder} builder - * @param {Array.} data - * @returns {flatbuffers.Offset} - */ -nd4j.graph.FlatArray.createBufferVector = function(builder, data) { - builder.startVector(1, data.length, 1); - for (var i = data.length - 1; i >= 0; i--) { - builder.addInt8(data[i]); - } - return builder.endVector(); -}; - -/** - * @param {flatbuffers.Builder} builder - * @param {number} numElems - */ -nd4j.graph.FlatArray.startBufferVector = function(builder, numElems) { - builder.startVector(1, numElems, 1); -}; - -/** - * @param {flatbuffers.Builder} builder - * @param {nd4j.graph.DType} dtype - */ -nd4j.graph.FlatArray.addDtype = function(builder, dtype) { - builder.addFieldInt8(2, dtype, nd4j.graph.DType.INHERIT); -}; - -/** - * @param {flatbuffers.Builder} 
builder - * @param {nd4j.graph.ByteOrder} byteOrder - */ -nd4j.graph.FlatArray.addByteOrder = function(builder, byteOrder) { - builder.addFieldInt8(3, byteOrder, nd4j.graph.ByteOrder.LE); -}; - -/** - * @param {flatbuffers.Builder} builder - * @returns {flatbuffers.Offset} - */ -nd4j.graph.FlatArray.endFlatArray = function(builder) { - var offset = builder.endObject(); - return offset; -}; - -/** - * @param {flatbuffers.Builder} builder - * @param {flatbuffers.Offset} offset - */ -nd4j.graph.FlatArray.finishFlatArrayBuffer = function(builder, offset) { - builder.finish(offset); -}; - -// Exports for Node.js and RequireJS -this.nd4j = nd4j; diff --git a/libnd4j/include/graph/generated/config_generated.h b/libnd4j/include/graph/generated/config_generated.h deleted file mode 100644 index 2c12027a2..000000000 --- a/libnd4j/include/graph/generated/config_generated.h +++ /dev/null @@ -1,294 +0,0 @@ -// automatically generated by the FlatBuffers compiler, do not modify - - -#ifndef FLATBUFFERS_GENERATED_CONFIG_ND4J_GRAPH_H_ -#define FLATBUFFERS_GENERATED_CONFIG_ND4J_GRAPH_H_ - -#include "flatbuffers/flatbuffers.h" - -namespace sd { -namespace graph { - -struct FlatConfiguration; - -enum ProfilingMode { - ProfilingMode_NONE = 0, - ProfilingMode_NAN_PANIC = 1, - ProfilingMode_INF_PANIC = 2, - ProfilingMode_ANY_PANIC = 3, - ProfilingMode_MIN = ProfilingMode_NONE, - ProfilingMode_MAX = ProfilingMode_ANY_PANIC -}; - -inline const ProfilingMode (&EnumValuesProfilingMode())[4] { - static const ProfilingMode values[] = { - ProfilingMode_NONE, - ProfilingMode_NAN_PANIC, - ProfilingMode_INF_PANIC, - ProfilingMode_ANY_PANIC - }; - return values; -} - -inline const char * const *EnumNamesProfilingMode() { - static const char * const names[] = { - "NONE", - "NAN_PANIC", - "INF_PANIC", - "ANY_PANIC", - nullptr - }; - return names; -} - -inline const char *EnumNameProfilingMode(ProfilingMode e) { - const size_t index = static_cast(e); - return EnumNamesProfilingMode()[index]; -} - -enum 
ExecutionMode { - ExecutionMode_SEQUENTIAL = 0, - ExecutionMode_STRICT = 1, - ExecutionMode_AUTO = 2, - ExecutionMode_MIN = ExecutionMode_SEQUENTIAL, - ExecutionMode_MAX = ExecutionMode_AUTO -}; - -inline const ExecutionMode (&EnumValuesExecutionMode())[3] { - static const ExecutionMode values[] = { - ExecutionMode_SEQUENTIAL, - ExecutionMode_STRICT, - ExecutionMode_AUTO - }; - return values; -} - -inline const char * const *EnumNamesExecutionMode() { - static const char * const names[] = { - "SEQUENTIAL", - "STRICT", - "AUTO", - nullptr - }; - return names; -} - -inline const char *EnumNameExecutionMode(ExecutionMode e) { - const size_t index = static_cast(e); - return EnumNamesExecutionMode()[index]; -} - -enum OutputMode { - OutputMode_IMPLICIT = 0, - OutputMode_EXPLICIT = 1, - OutputMode_EXPLICIT_AND_IMPLICIT = 2, - OutputMode_VARIABLE_SPACE = 3, - OutputMode_OPTIMIZED = 4, - OutputMode_MIN = OutputMode_IMPLICIT, - OutputMode_MAX = OutputMode_OPTIMIZED -}; - -inline const OutputMode (&EnumValuesOutputMode())[5] { - static const OutputMode values[] = { - OutputMode_IMPLICIT, - OutputMode_EXPLICIT, - OutputMode_EXPLICIT_AND_IMPLICIT, - OutputMode_VARIABLE_SPACE, - OutputMode_OPTIMIZED - }; - return values; -} - -inline const char * const *EnumNamesOutputMode() { - static const char * const names[] = { - "IMPLICIT", - "EXPLICIT", - "EXPLICIT_AND_IMPLICIT", - "VARIABLE_SPACE", - "OPTIMIZED", - nullptr - }; - return names; -} - -inline const char *EnumNameOutputMode(OutputMode e) { - const size_t index = static_cast(e); - return EnumNamesOutputMode()[index]; -} - -enum Direction { - Direction_FORWARD_ONLY = 0, - Direction_FORWARD_AND_BACKWARD = 1, - Direction_BACKWARD_ONLY = 2, - Direction_MIN = Direction_FORWARD_ONLY, - Direction_MAX = Direction_BACKWARD_ONLY -}; - -inline const Direction (&EnumValuesDirection())[3] { - static const Direction values[] = { - Direction_FORWARD_ONLY, - Direction_FORWARD_AND_BACKWARD, - Direction_BACKWARD_ONLY - }; - return values; -} 
- -inline const char * const *EnumNamesDirection() { - static const char * const names[] = { - "FORWARD_ONLY", - "FORWARD_AND_BACKWARD", - "BACKWARD_ONLY", - nullptr - }; - return names; -} - -inline const char *EnumNameDirection(Direction e) { - const size_t index = static_cast<size_t>(e); - return EnumNamesDirection()[index]; -} - -struct FlatConfiguration FLATBUFFERS_FINAL_CLASS : private flatbuffers::Table { - enum { - VT_ID = 4, - VT_EXECUTIONMODE = 6, - VT_PROFILINGMODE = 8, - VT_OUTPUTMODE = 10, - VT_TIMESTATS = 12, - VT_FOOTPRINTFORWARD = 14, - VT_FOOTPRINTBACKWARD = 16, - VT_DIRECTION = 18 - }; - int64_t id() const { - return GetField<int64_t>(VT_ID, 0); - } - ExecutionMode executionMode() const { - return static_cast<ExecutionMode>(GetField<int8_t>(VT_EXECUTIONMODE, 0)); - } - ProfilingMode profilingMode() const { - return static_cast<ProfilingMode>(GetField<int8_t>(VT_PROFILINGMODE, 0)); - } - OutputMode outputMode() const { - return static_cast<OutputMode>(GetField<int8_t>(VT_OUTPUTMODE, 0)); - } - bool timestats() const { - return GetField<uint8_t>(VT_TIMESTATS, 0) != 0; - } - int64_t footprintForward() const { - return GetField<int64_t>(VT_FOOTPRINTFORWARD, 0); - } - int64_t footprintBackward() const { - return GetField<int64_t>(VT_FOOTPRINTBACKWARD, 0); - } - Direction direction() const { - return static_cast<Direction>(GetField<int8_t>(VT_DIRECTION, 0)); - } - bool Verify(flatbuffers::Verifier &verifier) const { - return VerifyTableStart(verifier) && - VerifyField<int64_t>(verifier, VT_ID) && - VerifyField<int8_t>(verifier, VT_EXECUTIONMODE) && - VerifyField<int8_t>(verifier, VT_PROFILINGMODE) && - VerifyField<int8_t>(verifier, VT_OUTPUTMODE) && - VerifyField<uint8_t>(verifier, VT_TIMESTATS) && - VerifyField<int64_t>(verifier, VT_FOOTPRINTFORWARD) && - VerifyField<int64_t>(verifier, VT_FOOTPRINTBACKWARD) && - VerifyField<int8_t>(verifier, VT_DIRECTION) && - verifier.EndTable(); - } -}; - -struct FlatConfigurationBuilder { - flatbuffers::FlatBufferBuilder &fbb_; - flatbuffers::uoffset_t start_; - void add_id(int64_t id) { - fbb_.AddElement<int64_t>(FlatConfiguration::VT_ID, id, 0); - } - void add_executionMode(ExecutionMode executionMode) { - 
fbb_.AddElement<int8_t>(FlatConfiguration::VT_EXECUTIONMODE, static_cast<int8_t>(executionMode), 0); - } - void add_profilingMode(ProfilingMode profilingMode) { - fbb_.AddElement<int8_t>(FlatConfiguration::VT_PROFILINGMODE, static_cast<int8_t>(profilingMode), 0); - } - void add_outputMode(OutputMode outputMode) { - fbb_.AddElement<int8_t>(FlatConfiguration::VT_OUTPUTMODE, static_cast<int8_t>(outputMode), 0); - } - void add_timestats(bool timestats) { - fbb_.AddElement<uint8_t>(FlatConfiguration::VT_TIMESTATS, static_cast<uint8_t>(timestats), 0); - } - void add_footprintForward(int64_t footprintForward) { - fbb_.AddElement<int64_t>(FlatConfiguration::VT_FOOTPRINTFORWARD, footprintForward, 0); - } - void add_footprintBackward(int64_t footprintBackward) { - fbb_.AddElement<int64_t>(FlatConfiguration::VT_FOOTPRINTBACKWARD, footprintBackward, 0); - } - void add_direction(Direction direction) { - fbb_.AddElement<int8_t>(FlatConfiguration::VT_DIRECTION, static_cast<int8_t>(direction), 0); - } - explicit FlatConfigurationBuilder(flatbuffers::FlatBufferBuilder &_fbb) - : fbb_(_fbb) { - start_ = fbb_.StartTable(); - } - FlatConfigurationBuilder &operator=(const FlatConfigurationBuilder &); - flatbuffers::Offset<FlatConfiguration> Finish() { - const auto end = fbb_.EndTable(start_); - auto o = flatbuffers::Offset<FlatConfiguration>(end); - return o; - } -}; - -inline flatbuffers::Offset<FlatConfiguration> CreateFlatConfiguration( - flatbuffers::FlatBufferBuilder &_fbb, - int64_t id = 0, - ExecutionMode executionMode = ExecutionMode_SEQUENTIAL, - ProfilingMode profilingMode = ProfilingMode_NONE, - OutputMode outputMode = OutputMode_IMPLICIT, - bool timestats = false, - int64_t footprintForward = 0, - int64_t footprintBackward = 0, - Direction direction = Direction_FORWARD_ONLY) { - FlatConfigurationBuilder builder_(_fbb); - builder_.add_footprintBackward(footprintBackward); - builder_.add_footprintForward(footprintForward); - builder_.add_id(id); - builder_.add_direction(direction); - builder_.add_timestats(timestats); - builder_.add_outputMode(outputMode); - builder_.add_profilingMode(profilingMode); - 
builder_.add_executionMode(executionMode); - return builder_.Finish(); -} - -inline const sd::graph::FlatConfiguration *GetFlatConfiguration(const void *buf) { - return flatbuffers::GetRoot<sd::graph::FlatConfiguration>(buf); -} - -inline const sd::graph::FlatConfiguration *GetSizePrefixedFlatConfiguration(const void *buf) { - return flatbuffers::GetSizePrefixedRoot<sd::graph::FlatConfiguration>(buf); -} - -inline bool VerifyFlatConfigurationBuffer( - flatbuffers::Verifier &verifier) { - return verifier.VerifyBuffer<sd::graph::FlatConfiguration>(nullptr); -} - -inline bool VerifySizePrefixedFlatConfigurationBuffer( - flatbuffers::Verifier &verifier) { - return verifier.VerifySizePrefixedBuffer<sd::graph::FlatConfiguration>(nullptr); -} - -inline void FinishFlatConfigurationBuffer( - flatbuffers::FlatBufferBuilder &fbb, - flatbuffers::Offset<sd::graph::FlatConfiguration> root) { - fbb.Finish(root); -} - -inline void FinishSizePrefixedFlatConfigurationBuffer( - flatbuffers::FlatBufferBuilder &fbb, - flatbuffers::Offset<sd::graph::FlatConfiguration> root) { - fbb.FinishSizePrefixed(root); -} - -} // namespace graph -} // namespace sd - -#endif // FLATBUFFERS_GENERATED_CONFIG_ND4J_GRAPH_H_ diff --git a/libnd4j/include/graph/generated/config_generated.js b/libnd4j/include/graph/generated/config_generated.js deleted file mode 100644 index 093851187..000000000 --- a/libnd4j/include/graph/generated/config_generated.js +++ /dev/null @@ -1,260 +0,0 @@ -/* - * ****************************************************************************** - * * - * * - * * This program and the accompanying materials are made available under the - * * terms of the Apache License, Version 2.0 which is available at - * * https://www.apache.org/licenses/LICENSE-2.0. - * * - * * See the NOTICE file distributed with this work for additional - * * information regarding copyright ownership. - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. 
See the - * * License for the specific language governing permissions and limitations - * * under the License. - * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ - -/** - * @const - * @namespace - */ -var nd4j = nd4j || {}; - -/** - * @const - * @namespace - */ -nd4j.graph = nd4j.graph || {}; - -/** - * @enum - */ -nd4j.graph.ProfilingMode = { - NONE: 0, - NAN_PANIC: 1, - INF_PANIC: 2, - ANY_PANIC: 3 -}; - -/** - * @enum - */ -nd4j.graph.ExecutionMode = { - SEQUENTIAL: 0, - STRICT: 1, - AUTO: 2 -}; - -/** - * @enum - */ -nd4j.graph.OutputMode = { - IMPLICIT: 0, - EXPLICIT: 1, - EXPLICIT_AND_IMPLICIT: 2, - VARIABLE_SPACE: 3, - OPTIMIZED: 4 -}; - -/** - * @enum - */ -nd4j.graph.Direction = { - FORWARD_ONLY: 0, - FORWARD_AND_BACKWARD: 1, - BACKWARD_ONLY: 2 -}; - -/** - * @constructor - */ -nd4j.graph.FlatConfiguration = function() { - /** - * @type {flatbuffers.ByteBuffer} - */ - this.bb = null; - - /** - * @type {number} - */ - this.bb_pos = 0; -}; - -/** - * @param {number} i - * @param {flatbuffers.ByteBuffer} bb - * @returns {nd4j.graph.FlatConfiguration} - */ -nd4j.graph.FlatConfiguration.prototype.__init = function(i, bb) { - this.bb_pos = i; - this.bb = bb; - return this; -}; - -/** - * @param {flatbuffers.ByteBuffer} bb - * @param {nd4j.graph.FlatConfiguration=} obj - * @returns {nd4j.graph.FlatConfiguration} - */ -nd4j.graph.FlatConfiguration.getRootAsFlatConfiguration = function(bb, obj) { - return (obj || new nd4j.graph.FlatConfiguration).__init(bb.readInt32(bb.position()) + bb.position(), bb); -}; - -/** - * @returns {flatbuffers.Long} - */ -nd4j.graph.FlatConfiguration.prototype.id = function() { - var offset = this.bb.__offset(this.bb_pos, 4); - return offset ? 
this.bb.readInt64(this.bb_pos + offset) : this.bb.createLong(0, 0); -}; - -/** - * @returns {nd4j.graph.ExecutionMode} - */ -nd4j.graph.FlatConfiguration.prototype.executionMode = function() { - var offset = this.bb.__offset(this.bb_pos, 6); - return offset ? /** @type {nd4j.graph.ExecutionMode} */ (this.bb.readInt8(this.bb_pos + offset)) : nd4j.graph.ExecutionMode.SEQUENTIAL; -}; - -/** - * @returns {nd4j.graph.ProfilingMode} - */ -nd4j.graph.FlatConfiguration.prototype.profilingMode = function() { - var offset = this.bb.__offset(this.bb_pos, 8); - return offset ? /** @type {nd4j.graph.ProfilingMode} */ (this.bb.readInt8(this.bb_pos + offset)) : nd4j.graph.ProfilingMode.NONE; -}; - -/** - * @returns {nd4j.graph.OutputMode} - */ -nd4j.graph.FlatConfiguration.prototype.outputMode = function() { - var offset = this.bb.__offset(this.bb_pos, 10); - return offset ? /** @type {nd4j.graph.OutputMode} */ (this.bb.readInt8(this.bb_pos + offset)) : nd4j.graph.OutputMode.IMPLICIT; -}; - -/** - * @returns {boolean} - */ -nd4j.graph.FlatConfiguration.prototype.timestats = function() { - var offset = this.bb.__offset(this.bb_pos, 12); - return offset ? !!this.bb.readInt8(this.bb_pos + offset) : false; -}; - -/** - * @returns {flatbuffers.Long} - */ -nd4j.graph.FlatConfiguration.prototype.footprintForward = function() { - var offset = this.bb.__offset(this.bb_pos, 14); - return offset ? this.bb.readInt64(this.bb_pos + offset) : this.bb.createLong(0, 0); -}; - -/** - * @returns {flatbuffers.Long} - */ -nd4j.graph.FlatConfiguration.prototype.footprintBackward = function() { - var offset = this.bb.__offset(this.bb_pos, 16); - return offset ? this.bb.readInt64(this.bb_pos + offset) : this.bb.createLong(0, 0); -}; - -/** - * @returns {nd4j.graph.Direction} - */ -nd4j.graph.FlatConfiguration.prototype.direction = function() { - var offset = this.bb.__offset(this.bb_pos, 18); - return offset ? 
/** @type {nd4j.graph.Direction} */ (this.bb.readInt8(this.bb_pos + offset)) : nd4j.graph.Direction.FORWARD_ONLY; -}; - -/** - * @param {flatbuffers.Builder} builder - */ -nd4j.graph.FlatConfiguration.startFlatConfiguration = function(builder) { - builder.startObject(8); -}; - -/** - * @param {flatbuffers.Builder} builder - * @param {flatbuffers.Long} id - */ -nd4j.graph.FlatConfiguration.addId = function(builder, id) { - builder.addFieldInt64(0, id, builder.createLong(0, 0)); -}; - -/** - * @param {flatbuffers.Builder} builder - * @param {nd4j.graph.ExecutionMode} executionMode - */ -nd4j.graph.FlatConfiguration.addExecutionMode = function(builder, executionMode) { - builder.addFieldInt8(1, executionMode, nd4j.graph.ExecutionMode.SEQUENTIAL); -}; - -/** - * @param {flatbuffers.Builder} builder - * @param {nd4j.graph.ProfilingMode} profilingMode - */ -nd4j.graph.FlatConfiguration.addProfilingMode = function(builder, profilingMode) { - builder.addFieldInt8(2, profilingMode, nd4j.graph.ProfilingMode.NONE); -}; - -/** - * @param {flatbuffers.Builder} builder - * @param {nd4j.graph.OutputMode} outputMode - */ -nd4j.graph.FlatConfiguration.addOutputMode = function(builder, outputMode) { - builder.addFieldInt8(3, outputMode, nd4j.graph.OutputMode.IMPLICIT); -}; - -/** - * @param {flatbuffers.Builder} builder - * @param {boolean} timestats - */ -nd4j.graph.FlatConfiguration.addTimestats = function(builder, timestats) { - builder.addFieldInt8(4, +timestats, +false); -}; - -/** - * @param {flatbuffers.Builder} builder - * @param {flatbuffers.Long} footprintForward - */ -nd4j.graph.FlatConfiguration.addFootprintForward = function(builder, footprintForward) { - builder.addFieldInt64(5, footprintForward, builder.createLong(0, 0)); -}; - -/** - * @param {flatbuffers.Builder} builder - * @param {flatbuffers.Long} footprintBackward - */ -nd4j.graph.FlatConfiguration.addFootprintBackward = function(builder, footprintBackward) { - builder.addFieldInt64(6, footprintBackward, 
builder.createLong(0, 0)); -}; - -/** - * @param {flatbuffers.Builder} builder - * @param {nd4j.graph.Direction} direction - */ -nd4j.graph.FlatConfiguration.addDirection = function(builder, direction) { - builder.addFieldInt8(7, direction, nd4j.graph.Direction.FORWARD_ONLY); -}; - -/** - * @param {flatbuffers.Builder} builder - * @returns {flatbuffers.Offset} - */ -nd4j.graph.FlatConfiguration.endFlatConfiguration = function(builder) { - var offset = builder.endObject(); - return offset; -}; - -/** - * @param {flatbuffers.Builder} builder - * @param {flatbuffers.Offset} offset - */ -nd4j.graph.FlatConfiguration.finishFlatConfigurationBuffer = function(builder, offset) { - builder.finish(offset); -}; - -// Exports for Node.js and RequireJS -this.nd4j = nd4j; diff --git a/libnd4j/include/graph/generated/graph.grpc.fb.cc b/libnd4j/include/graph/generated/graph.grpc.fb.cc deleted file mode 100644 index 5146f49f2..000000000 --- a/libnd4j/include/graph/generated/graph.grpc.fb.cc +++ /dev/null @@ -1,143 +0,0 @@ -// Generated by the gRPC C++ plugin. -// If you make any local change, they will be lost. 
-// source: graph - -#include "graph_generated.h" -#include "graph.grpc.fb.h" - -#include -#include -#include -#include -#include -#include -#include -#include -namespace sd { -namespace graph { - -static const char* GraphInferenceServer_method_names[] = { - "/nd4j.graph.GraphInferenceServer/RegisterGraph", - "/nd4j.graph.GraphInferenceServer/ForgetGraph", - "/nd4j.graph.GraphInferenceServer/ReplaceGraph", - "/nd4j.graph.GraphInferenceServer/InferenceRequest", -}; - -std::unique_ptr< GraphInferenceServer::Stub> GraphInferenceServer::NewStub(const std::shared_ptr< ::grpc::ChannelInterface>& channel, const ::grpc::StubOptions& options) { - std::unique_ptr< GraphInferenceServer::Stub> stub(new GraphInferenceServer::Stub(channel)); - return stub; -} - -GraphInferenceServer::Stub::Stub(const std::shared_ptr< ::grpc::ChannelInterface>& channel) - : channel_(channel) , rpcmethod_RegisterGraph_(GraphInferenceServer_method_names[0], ::grpc::internal::RpcMethod::NORMAL_RPC, channel) - , rpcmethod_ForgetGraph_(GraphInferenceServer_method_names[1], ::grpc::internal::RpcMethod::NORMAL_RPC, channel) - , rpcmethod_ReplaceGraph_(GraphInferenceServer_method_names[2], ::grpc::internal::RpcMethod::NORMAL_RPC, channel) - , rpcmethod_InferenceRequest_(GraphInferenceServer_method_names[3], ::grpc::internal::RpcMethod::NORMAL_RPC, channel) - {} - -::grpc::Status GraphInferenceServer::Stub::RegisterGraph(::grpc::ClientContext* context, const flatbuffers::grpc::Message& request, flatbuffers::grpc::Message* response) { - return ::grpc::internal::BlockingUnaryCall(channel_.get(), rpcmethod_RegisterGraph_, context, request, response); -} - -::grpc::ClientAsyncResponseReader< flatbuffers::grpc::Message>* GraphInferenceServer::Stub::AsyncRegisterGraphRaw(::grpc::ClientContext* context, const flatbuffers::grpc::Message& request, ::grpc::CompletionQueue* cq) { - return ::grpc::internal::ClientAsyncResponseReaderFactory< flatbuffers::grpc::Message>::Create(channel_.get(), cq, 
rpcmethod_RegisterGraph_, context, request, true); -} - -::grpc::ClientAsyncResponseReader< flatbuffers::grpc::Message>* GraphInferenceServer::Stub::PrepareAsyncRegisterGraphRaw(::grpc::ClientContext* context, const flatbuffers::grpc::Message& request, ::grpc::CompletionQueue* cq) { - return ::grpc::internal::ClientAsyncResponseReaderFactory< flatbuffers::grpc::Message>::Create(channel_.get(), cq, rpcmethod_RegisterGraph_, context, request, false); -} - -::grpc::Status GraphInferenceServer::Stub::ForgetGraph(::grpc::ClientContext* context, const flatbuffers::grpc::Message& request, flatbuffers::grpc::Message* response) { - return ::grpc::internal::BlockingUnaryCall(channel_.get(), rpcmethod_ForgetGraph_, context, request, response); -} - -::grpc::ClientAsyncResponseReader< flatbuffers::grpc::Message>* GraphInferenceServer::Stub::AsyncForgetGraphRaw(::grpc::ClientContext* context, const flatbuffers::grpc::Message& request, ::grpc::CompletionQueue* cq) { - return ::grpc::internal::ClientAsyncResponseReaderFactory< flatbuffers::grpc::Message>::Create(channel_.get(), cq, rpcmethod_ForgetGraph_, context, request, true); -} - -::grpc::ClientAsyncResponseReader< flatbuffers::grpc::Message>* GraphInferenceServer::Stub::PrepareAsyncForgetGraphRaw(::grpc::ClientContext* context, const flatbuffers::grpc::Message& request, ::grpc::CompletionQueue* cq) { - return ::grpc::internal::ClientAsyncResponseReaderFactory< flatbuffers::grpc::Message>::Create(channel_.get(), cq, rpcmethod_ForgetGraph_, context, request, false); -} - -::grpc::Status GraphInferenceServer::Stub::ReplaceGraph(::grpc::ClientContext* context, const flatbuffers::grpc::Message& request, flatbuffers::grpc::Message* response) { - return ::grpc::internal::BlockingUnaryCall(channel_.get(), rpcmethod_ReplaceGraph_, context, request, response); -} - -::grpc::ClientAsyncResponseReader< flatbuffers::grpc::Message>* GraphInferenceServer::Stub::AsyncReplaceGraphRaw(::grpc::ClientContext* context, const 
flatbuffers::grpc::Message& request, ::grpc::CompletionQueue* cq) { - return ::grpc::internal::ClientAsyncResponseReaderFactory< flatbuffers::grpc::Message>::Create(channel_.get(), cq, rpcmethod_ReplaceGraph_, context, request, true); -} - -::grpc::ClientAsyncResponseReader< flatbuffers::grpc::Message>* GraphInferenceServer::Stub::PrepareAsyncReplaceGraphRaw(::grpc::ClientContext* context, const flatbuffers::grpc::Message& request, ::grpc::CompletionQueue* cq) { - return ::grpc::internal::ClientAsyncResponseReaderFactory< flatbuffers::grpc::Message>::Create(channel_.get(), cq, rpcmethod_ReplaceGraph_, context, request, false); -} - -::grpc::Status GraphInferenceServer::Stub::InferenceRequest(::grpc::ClientContext* context, const flatbuffers::grpc::Message& request, flatbuffers::grpc::Message* response) { - return ::grpc::internal::BlockingUnaryCall(channel_.get(), rpcmethod_InferenceRequest_, context, request, response); -} - -::grpc::ClientAsyncResponseReader< flatbuffers::grpc::Message>* GraphInferenceServer::Stub::AsyncInferenceRequestRaw(::grpc::ClientContext* context, const flatbuffers::grpc::Message& request, ::grpc::CompletionQueue* cq) { - return ::grpc::internal::ClientAsyncResponseReaderFactory< flatbuffers::grpc::Message>::Create(channel_.get(), cq, rpcmethod_InferenceRequest_, context, request, true); -} - -::grpc::ClientAsyncResponseReader< flatbuffers::grpc::Message>* GraphInferenceServer::Stub::PrepareAsyncInferenceRequestRaw(::grpc::ClientContext* context, const flatbuffers::grpc::Message& request, ::grpc::CompletionQueue* cq) { - return ::grpc::internal::ClientAsyncResponseReaderFactory< flatbuffers::grpc::Message>::Create(channel_.get(), cq, rpcmethod_InferenceRequest_, context, request, false); -} - -GraphInferenceServer::Service::Service() { - AddMethod(new ::grpc::internal::RpcServiceMethod( - GraphInferenceServer_method_names[0], - ::grpc::internal::RpcMethod::NORMAL_RPC, - new ::grpc::internal::RpcMethodHandler< GraphInferenceServer::Service, 
flatbuffers::grpc::Message, flatbuffers::grpc::Message>( - std::mem_fn(&GraphInferenceServer::Service::RegisterGraph), this))); - AddMethod(new ::grpc::internal::RpcServiceMethod( - GraphInferenceServer_method_names[1], - ::grpc::internal::RpcMethod::NORMAL_RPC, - new ::grpc::internal::RpcMethodHandler< GraphInferenceServer::Service, flatbuffers::grpc::Message, flatbuffers::grpc::Message>( - std::mem_fn(&GraphInferenceServer::Service::ForgetGraph), this))); - AddMethod(new ::grpc::internal::RpcServiceMethod( - GraphInferenceServer_method_names[2], - ::grpc::internal::RpcMethod::NORMAL_RPC, - new ::grpc::internal::RpcMethodHandler< GraphInferenceServer::Service, flatbuffers::grpc::Message, flatbuffers::grpc::Message>( - std::mem_fn(&GraphInferenceServer::Service::ReplaceGraph), this))); - AddMethod(new ::grpc::internal::RpcServiceMethod( - GraphInferenceServer_method_names[3], - ::grpc::internal::RpcMethod::NORMAL_RPC, - new ::grpc::internal::RpcMethodHandler< GraphInferenceServer::Service, flatbuffers::grpc::Message, flatbuffers::grpc::Message>( - std::mem_fn(&GraphInferenceServer::Service::InferenceRequest), this))); -} - -GraphInferenceServer::Service::~Service() { -} - -::grpc::Status GraphInferenceServer::Service::RegisterGraph(::grpc::ServerContext* context, const flatbuffers::grpc::Message* request, flatbuffers::grpc::Message* response) { - (void) context; - (void) request; - (void) response; - return ::grpc::Status(::grpc::StatusCode::UNIMPLEMENTED, ""); -} - -::grpc::Status GraphInferenceServer::Service::ForgetGraph(::grpc::ServerContext* context, const flatbuffers::grpc::Message* request, flatbuffers::grpc::Message* response) { - (void) context; - (void) request; - (void) response; - return ::grpc::Status(::grpc::StatusCode::UNIMPLEMENTED, ""); -} - -::grpc::Status GraphInferenceServer::Service::ReplaceGraph(::grpc::ServerContext* context, const flatbuffers::grpc::Message* request, flatbuffers::grpc::Message* response) { - (void) context; - (void) request; 
- (void) response; - return ::grpc::Status(::grpc::StatusCode::UNIMPLEMENTED, ""); -} - -::grpc::Status GraphInferenceServer::Service::InferenceRequest(::grpc::ServerContext* context, const flatbuffers::grpc::Message* request, flatbuffers::grpc::Message* response) { - (void) context; - (void) request; - (void) response; - return ::grpc::Status(::grpc::StatusCode::UNIMPLEMENTED, ""); -} - - -} // namespace graph -} // namespace sd - diff --git a/libnd4j/include/graph/generated/graph.grpc.fb.h b/libnd4j/include/graph/generated/graph.grpc.fb.h deleted file mode 100644 index 0167f48d5..000000000 --- a/libnd4j/include/graph/generated/graph.grpc.fb.h +++ /dev/null @@ -1,372 +0,0 @@ -// Generated by the gRPC C++ plugin. -// If you make any local change, they will be lost. -// source: graph -#ifndef GRPC_graph__INCLUDED -#define GRPC_graph__INCLUDED - -#include "graph_generated.h" -#include "flatbuffers/grpc.h" - -#include -#include -#include -#include -#include -#include -#include -#include -#include - -namespace grpc { -class CompletionQueue; -class Channel; -class ServerCompletionQueue; -class ServerContext; -} // namespace grpc - -namespace sd { -namespace graph { - -class GraphInferenceServer final { - public: - static constexpr char const* service_full_name() { - return "nd4j.graph.GraphInferenceServer"; - } - class StubInterface { - public: - virtual ~StubInterface() {} - virtual ::grpc::Status RegisterGraph(::grpc::ClientContext* context, const flatbuffers::grpc::Message& request, flatbuffers::grpc::Message* response) = 0; - std::unique_ptr< ::grpc::ClientAsyncResponseReaderInterface< flatbuffers::grpc::Message>> AsyncRegisterGraph(::grpc::ClientContext* context, const flatbuffers::grpc::Message& request, ::grpc::CompletionQueue* cq) { - return std::unique_ptr< ::grpc::ClientAsyncResponseReaderInterface< flatbuffers::grpc::Message>>(AsyncRegisterGraphRaw(context, request, cq)); - } - std::unique_ptr< ::grpc::ClientAsyncResponseReaderInterface< 
flatbuffers::grpc::Message>> PrepareAsyncRegisterGraph(::grpc::ClientContext* context, const flatbuffers::grpc::Message& request, ::grpc::CompletionQueue* cq) { - return std::unique_ptr< ::grpc::ClientAsyncResponseReaderInterface< flatbuffers::grpc::Message>>(PrepareAsyncRegisterGraphRaw(context, request, cq)); - } - virtual ::grpc::Status ForgetGraph(::grpc::ClientContext* context, const flatbuffers::grpc::Message& request, flatbuffers::grpc::Message* response) = 0; - std::unique_ptr< ::grpc::ClientAsyncResponseReaderInterface< flatbuffers::grpc::Message>> AsyncForgetGraph(::grpc::ClientContext* context, const flatbuffers::grpc::Message& request, ::grpc::CompletionQueue* cq) { - return std::unique_ptr< ::grpc::ClientAsyncResponseReaderInterface< flatbuffers::grpc::Message>>(AsyncForgetGraphRaw(context, request, cq)); - } - std::unique_ptr< ::grpc::ClientAsyncResponseReaderInterface< flatbuffers::grpc::Message>> PrepareAsyncForgetGraph(::grpc::ClientContext* context, const flatbuffers::grpc::Message& request, ::grpc::CompletionQueue* cq) { - return std::unique_ptr< ::grpc::ClientAsyncResponseReaderInterface< flatbuffers::grpc::Message>>(PrepareAsyncForgetGraphRaw(context, request, cq)); - } - virtual ::grpc::Status ReplaceGraph(::grpc::ClientContext* context, const flatbuffers::grpc::Message& request, flatbuffers::grpc::Message* response) = 0; - std::unique_ptr< ::grpc::ClientAsyncResponseReaderInterface< flatbuffers::grpc::Message>> AsyncReplaceGraph(::grpc::ClientContext* context, const flatbuffers::grpc::Message& request, ::grpc::CompletionQueue* cq) { - return std::unique_ptr< ::grpc::ClientAsyncResponseReaderInterface< flatbuffers::grpc::Message>>(AsyncReplaceGraphRaw(context, request, cq)); - } - std::unique_ptr< ::grpc::ClientAsyncResponseReaderInterface< flatbuffers::grpc::Message>> PrepareAsyncReplaceGraph(::grpc::ClientContext* context, const flatbuffers::grpc::Message& request, ::grpc::CompletionQueue* cq) { - return std::unique_ptr< 
::grpc::ClientAsyncResponseReaderInterface< flatbuffers::grpc::Message>>(PrepareAsyncReplaceGraphRaw(context, request, cq)); - } - virtual ::grpc::Status InferenceRequest(::grpc::ClientContext* context, const flatbuffers::grpc::Message& request, flatbuffers::grpc::Message* response) = 0; - std::unique_ptr< ::grpc::ClientAsyncResponseReaderInterface< flatbuffers::grpc::Message>> AsyncInferenceRequest(::grpc::ClientContext* context, const flatbuffers::grpc::Message& request, ::grpc::CompletionQueue* cq) { - return std::unique_ptr< ::grpc::ClientAsyncResponseReaderInterface< flatbuffers::grpc::Message>>(AsyncInferenceRequestRaw(context, request, cq)); - } - std::unique_ptr< ::grpc::ClientAsyncResponseReaderInterface< flatbuffers::grpc::Message>> PrepareAsyncInferenceRequest(::grpc::ClientContext* context, const flatbuffers::grpc::Message& request, ::grpc::CompletionQueue* cq) { - return std::unique_ptr< ::grpc::ClientAsyncResponseReaderInterface< flatbuffers::grpc::Message>>(PrepareAsyncInferenceRequestRaw(context, request, cq)); - } - private: - virtual ::grpc::ClientAsyncResponseReaderInterface< flatbuffers::grpc::Message>* AsyncRegisterGraphRaw(::grpc::ClientContext* context, const flatbuffers::grpc::Message& request, ::grpc::CompletionQueue* cq) = 0; - virtual ::grpc::ClientAsyncResponseReaderInterface< flatbuffers::grpc::Message>* PrepareAsyncRegisterGraphRaw(::grpc::ClientContext* context, const flatbuffers::grpc::Message& request, ::grpc::CompletionQueue* cq) = 0; - virtual ::grpc::ClientAsyncResponseReaderInterface< flatbuffers::grpc::Message>* AsyncForgetGraphRaw(::grpc::ClientContext* context, const flatbuffers::grpc::Message& request, ::grpc::CompletionQueue* cq) = 0; - virtual ::grpc::ClientAsyncResponseReaderInterface< flatbuffers::grpc::Message>* PrepareAsyncForgetGraphRaw(::grpc::ClientContext* context, const flatbuffers::grpc::Message& request, ::grpc::CompletionQueue* cq) = 0; - virtual ::grpc::ClientAsyncResponseReaderInterface< 
flatbuffers::grpc::Message>* AsyncReplaceGraphRaw(::grpc::ClientContext* context, const flatbuffers::grpc::Message& request, ::grpc::CompletionQueue* cq) = 0; - virtual ::grpc::ClientAsyncResponseReaderInterface< flatbuffers::grpc::Message>* PrepareAsyncReplaceGraphRaw(::grpc::ClientContext* context, const flatbuffers::grpc::Message& request, ::grpc::CompletionQueue* cq) = 0; - virtual ::grpc::ClientAsyncResponseReaderInterface< flatbuffers::grpc::Message>* AsyncInferenceRequestRaw(::grpc::ClientContext* context, const flatbuffers::grpc::Message& request, ::grpc::CompletionQueue* cq) = 0; - virtual ::grpc::ClientAsyncResponseReaderInterface< flatbuffers::grpc::Message>* PrepareAsyncInferenceRequestRaw(::grpc::ClientContext* context, const flatbuffers::grpc::Message& request, ::grpc::CompletionQueue* cq) = 0; - }; - class Stub final : public StubInterface { - public: - Stub(const std::shared_ptr< ::grpc::ChannelInterface>& channel); - ::grpc::Status RegisterGraph(::grpc::ClientContext* context, const flatbuffers::grpc::Message& request, flatbuffers::grpc::Message* response) override; - std::unique_ptr< ::grpc::ClientAsyncResponseReader< flatbuffers::grpc::Message>> AsyncRegisterGraph(::grpc::ClientContext* context, const flatbuffers::grpc::Message& request, ::grpc::CompletionQueue* cq) { - return std::unique_ptr< ::grpc::ClientAsyncResponseReader< flatbuffers::grpc::Message>>(AsyncRegisterGraphRaw(context, request, cq)); - } - std::unique_ptr< ::grpc::ClientAsyncResponseReader< flatbuffers::grpc::Message>> PrepareAsyncRegisterGraph(::grpc::ClientContext* context, const flatbuffers::grpc::Message& request, ::grpc::CompletionQueue* cq) { - return std::unique_ptr< ::grpc::ClientAsyncResponseReader< flatbuffers::grpc::Message>>(PrepareAsyncRegisterGraphRaw(context, request, cq)); - } - ::grpc::Status ForgetGraph(::grpc::ClientContext* context, const flatbuffers::grpc::Message& request, flatbuffers::grpc::Message* response) override; - std::unique_ptr< 
::grpc::ClientAsyncResponseReader< flatbuffers::grpc::Message>> AsyncForgetGraph(::grpc::ClientContext* context, const flatbuffers::grpc::Message& request, ::grpc::CompletionQueue* cq) { - return std::unique_ptr< ::grpc::ClientAsyncResponseReader< flatbuffers::grpc::Message>>(AsyncForgetGraphRaw(context, request, cq)); - } - std::unique_ptr< ::grpc::ClientAsyncResponseReader< flatbuffers::grpc::Message>> PrepareAsyncForgetGraph(::grpc::ClientContext* context, const flatbuffers::grpc::Message& request, ::grpc::CompletionQueue* cq) { - return std::unique_ptr< ::grpc::ClientAsyncResponseReader< flatbuffers::grpc::Message>>(PrepareAsyncForgetGraphRaw(context, request, cq)); - } - ::grpc::Status ReplaceGraph(::grpc::ClientContext* context, const flatbuffers::grpc::Message& request, flatbuffers::grpc::Message* response) override; - std::unique_ptr< ::grpc::ClientAsyncResponseReader< flatbuffers::grpc::Message>> AsyncReplaceGraph(::grpc::ClientContext* context, const flatbuffers::grpc::Message& request, ::grpc::CompletionQueue* cq) { - return std::unique_ptr< ::grpc::ClientAsyncResponseReader< flatbuffers::grpc::Message>>(AsyncReplaceGraphRaw(context, request, cq)); - } - std::unique_ptr< ::grpc::ClientAsyncResponseReader< flatbuffers::grpc::Message>> PrepareAsyncReplaceGraph(::grpc::ClientContext* context, const flatbuffers::grpc::Message& request, ::grpc::CompletionQueue* cq) { - return std::unique_ptr< ::grpc::ClientAsyncResponseReader< flatbuffers::grpc::Message>>(PrepareAsyncReplaceGraphRaw(context, request, cq)); - } - ::grpc::Status InferenceRequest(::grpc::ClientContext* context, const flatbuffers::grpc::Message& request, flatbuffers::grpc::Message* response) override; - std::unique_ptr< ::grpc::ClientAsyncResponseReader< flatbuffers::grpc::Message>> AsyncInferenceRequest(::grpc::ClientContext* context, const flatbuffers::grpc::Message& request, ::grpc::CompletionQueue* cq) { - return std::unique_ptr< ::grpc::ClientAsyncResponseReader< 
flatbuffers::grpc::Message>>(AsyncInferenceRequestRaw(context, request, cq)); - } - std::unique_ptr< ::grpc::ClientAsyncResponseReader< flatbuffers::grpc::Message>> PrepareAsyncInferenceRequest(::grpc::ClientContext* context, const flatbuffers::grpc::Message& request, ::grpc::CompletionQueue* cq) { - return std::unique_ptr< ::grpc::ClientAsyncResponseReader< flatbuffers::grpc::Message>>(PrepareAsyncInferenceRequestRaw(context, request, cq)); - } - - private: - std::shared_ptr< ::grpc::ChannelInterface> channel_; - ::grpc::ClientAsyncResponseReader< flatbuffers::grpc::Message>* AsyncRegisterGraphRaw(::grpc::ClientContext* context, const flatbuffers::grpc::Message& request, ::grpc::CompletionQueue* cq) override; - ::grpc::ClientAsyncResponseReader< flatbuffers::grpc::Message>* PrepareAsyncRegisterGraphRaw(::grpc::ClientContext* context, const flatbuffers::grpc::Message& request, ::grpc::CompletionQueue* cq) override; - ::grpc::ClientAsyncResponseReader< flatbuffers::grpc::Message>* AsyncForgetGraphRaw(::grpc::ClientContext* context, const flatbuffers::grpc::Message& request, ::grpc::CompletionQueue* cq) override; - ::grpc::ClientAsyncResponseReader< flatbuffers::grpc::Message>* PrepareAsyncForgetGraphRaw(::grpc::ClientContext* context, const flatbuffers::grpc::Message& request, ::grpc::CompletionQueue* cq) override; - ::grpc::ClientAsyncResponseReader< flatbuffers::grpc::Message>* AsyncReplaceGraphRaw(::grpc::ClientContext* context, const flatbuffers::grpc::Message& request, ::grpc::CompletionQueue* cq) override; - ::grpc::ClientAsyncResponseReader< flatbuffers::grpc::Message>* PrepareAsyncReplaceGraphRaw(::grpc::ClientContext* context, const flatbuffers::grpc::Message& request, ::grpc::CompletionQueue* cq) override; - ::grpc::ClientAsyncResponseReader< flatbuffers::grpc::Message>* AsyncInferenceRequestRaw(::grpc::ClientContext* context, const flatbuffers::grpc::Message& request, ::grpc::CompletionQueue* cq) override; - ::grpc::ClientAsyncResponseReader< 
flatbuffers::grpc::Message>* PrepareAsyncInferenceRequestRaw(::grpc::ClientContext* context, const flatbuffers::grpc::Message& request, ::grpc::CompletionQueue* cq) override; - const ::grpc::internal::RpcMethod rpcmethod_RegisterGraph_; - const ::grpc::internal::RpcMethod rpcmethod_ForgetGraph_; - const ::grpc::internal::RpcMethod rpcmethod_ReplaceGraph_; - const ::grpc::internal::RpcMethod rpcmethod_InferenceRequest_; - }; - static std::unique_ptr<Stub> NewStub(const std::shared_ptr< ::grpc::ChannelInterface>& channel, const ::grpc::StubOptions& options = ::grpc::StubOptions()); - - class Service : public ::grpc::Service { - public: - Service(); - virtual ~Service(); - virtual ::grpc::Status RegisterGraph(::grpc::ServerContext* context, const flatbuffers::grpc::Message* request, flatbuffers::grpc::Message* response); - virtual ::grpc::Status ForgetGraph(::grpc::ServerContext* context, const flatbuffers::grpc::Message* request, flatbuffers::grpc::Message* response); - virtual ::grpc::Status ReplaceGraph(::grpc::ServerContext* context, const flatbuffers::grpc::Message* request, flatbuffers::grpc::Message* response); - virtual ::grpc::Status InferenceRequest(::grpc::ServerContext* context, const flatbuffers::grpc::Message* request, flatbuffers::grpc::Message* response); - }; - template <class BaseClass> - class WithAsyncMethod_RegisterGraph : public BaseClass { - private: - void BaseClassMustBeDerivedFromService(const Service *service) {} - public: - WithAsyncMethod_RegisterGraph() { - ::grpc::Service::MarkMethodAsync(0); - } - ~WithAsyncMethod_RegisterGraph() override { - BaseClassMustBeDerivedFromService(this); - } - // disable synchronous version of this method - ::grpc::Status RegisterGraph(::grpc::ServerContext* context, const flatbuffers::grpc::Message* request, flatbuffers::grpc::Message* response) final override { - abort(); - return ::grpc::Status(::grpc::StatusCode::UNIMPLEMENTED, ""); - } - void RequestRegisterGraph(::grpc::ServerContext* context, flatbuffers::grpc::Message*
request, ::grpc::ServerAsyncResponseWriter< flatbuffers::grpc::Message>* response, ::grpc::CompletionQueue* new_call_cq, ::grpc::ServerCompletionQueue* notification_cq, void *tag) { - ::grpc::Service::RequestAsyncUnary(0, context, request, response, new_call_cq, notification_cq, tag); - } - }; - template <class BaseClass> - class WithAsyncMethod_ForgetGraph : public BaseClass { - private: - void BaseClassMustBeDerivedFromService(const Service *service) {} - public: - WithAsyncMethod_ForgetGraph() { - ::grpc::Service::MarkMethodAsync(1); - } - ~WithAsyncMethod_ForgetGraph() override { - BaseClassMustBeDerivedFromService(this); - } - // disable synchronous version of this method - ::grpc::Status ForgetGraph(::grpc::ServerContext* context, const flatbuffers::grpc::Message* request, flatbuffers::grpc::Message* response) final override { - abort(); - return ::grpc::Status(::grpc::StatusCode::UNIMPLEMENTED, ""); - } - void RequestForgetGraph(::grpc::ServerContext* context, flatbuffers::grpc::Message* request, ::grpc::ServerAsyncResponseWriter< flatbuffers::grpc::Message>* response, ::grpc::CompletionQueue* new_call_cq, ::grpc::ServerCompletionQueue* notification_cq, void *tag) { - ::grpc::Service::RequestAsyncUnary(1, context, request, response, new_call_cq, notification_cq, tag); - } - }; - template <class BaseClass> - class WithAsyncMethod_ReplaceGraph : public BaseClass { - private: - void BaseClassMustBeDerivedFromService(const Service *service) {} - public: - WithAsyncMethod_ReplaceGraph() { - ::grpc::Service::MarkMethodAsync(2); - } - ~WithAsyncMethod_ReplaceGraph() override { - BaseClassMustBeDerivedFromService(this); - } - // disable synchronous version of this method - ::grpc::Status ReplaceGraph(::grpc::ServerContext* context, const flatbuffers::grpc::Message* request, flatbuffers::grpc::Message* response) final override { - abort(); - return ::grpc::Status(::grpc::StatusCode::UNIMPLEMENTED, ""); - } - void RequestReplaceGraph(::grpc::ServerContext* context, flatbuffers::grpc::Message* request,
::grpc::ServerAsyncResponseWriter< flatbuffers::grpc::Message>* response, ::grpc::CompletionQueue* new_call_cq, ::grpc::ServerCompletionQueue* notification_cq, void *tag) { - ::grpc::Service::RequestAsyncUnary(2, context, request, response, new_call_cq, notification_cq, tag); - } - }; - template <class BaseClass> - class WithAsyncMethod_InferenceRequest : public BaseClass { - private: - void BaseClassMustBeDerivedFromService(const Service *service) {} - public: - WithAsyncMethod_InferenceRequest() { - ::grpc::Service::MarkMethodAsync(3); - } - ~WithAsyncMethod_InferenceRequest() override { - BaseClassMustBeDerivedFromService(this); - } - // disable synchronous version of this method - ::grpc::Status InferenceRequest(::grpc::ServerContext* context, const flatbuffers::grpc::Message* request, flatbuffers::grpc::Message* response) final override { - abort(); - return ::grpc::Status(::grpc::StatusCode::UNIMPLEMENTED, ""); - } - void RequestInferenceRequest(::grpc::ServerContext* context, flatbuffers::grpc::Message* request, ::grpc::ServerAsyncResponseWriter< flatbuffers::grpc::Message>* response, ::grpc::CompletionQueue* new_call_cq, ::grpc::ServerCompletionQueue* notification_cq, void *tag) { - ::grpc::Service::RequestAsyncUnary(3, context, request, response, new_call_cq, notification_cq, tag); - } - }; - typedef WithAsyncMethod_RegisterGraph< WithAsyncMethod_ForgetGraph< WithAsyncMethod_ReplaceGraph< WithAsyncMethod_InferenceRequest< Service > > > > AsyncService; - template <class BaseClass> - class WithGenericMethod_RegisterGraph : public BaseClass { - private: - void BaseClassMustBeDerivedFromService(const Service *service) {} - public: - WithGenericMethod_RegisterGraph() { - ::grpc::Service::MarkMethodGeneric(0); - } - ~WithGenericMethod_RegisterGraph() override { - BaseClassMustBeDerivedFromService(this); - } - // disable synchronous version of this method - ::grpc::Status RegisterGraph(::grpc::ServerContext* context, const flatbuffers::grpc::Message* request, flatbuffers::grpc::Message* response)
final override { - abort(); - return ::grpc::Status(::grpc::StatusCode::UNIMPLEMENTED, ""); - } - }; - template <class BaseClass> - class WithGenericMethod_ForgetGraph : public BaseClass { - private: - void BaseClassMustBeDerivedFromService(const Service *service) {} - public: - WithGenericMethod_ForgetGraph() { - ::grpc::Service::MarkMethodGeneric(1); - } - ~WithGenericMethod_ForgetGraph() override { - BaseClassMustBeDerivedFromService(this); - } - // disable synchronous version of this method - ::grpc::Status ForgetGraph(::grpc::ServerContext* context, const flatbuffers::grpc::Message* request, flatbuffers::grpc::Message* response) final override { - abort(); - return ::grpc::Status(::grpc::StatusCode::UNIMPLEMENTED, ""); - } - }; - template <class BaseClass> - class WithGenericMethod_ReplaceGraph : public BaseClass { - private: - void BaseClassMustBeDerivedFromService(const Service *service) {} - public: - WithGenericMethod_ReplaceGraph() { - ::grpc::Service::MarkMethodGeneric(2); - } - ~WithGenericMethod_ReplaceGraph() override { - BaseClassMustBeDerivedFromService(this); - } - // disable synchronous version of this method - ::grpc::Status ReplaceGraph(::grpc::ServerContext* context, const flatbuffers::grpc::Message* request, flatbuffers::grpc::Message* response) final override { - abort(); - return ::grpc::Status(::grpc::StatusCode::UNIMPLEMENTED, ""); - } - }; - template <class BaseClass> - class WithGenericMethod_InferenceRequest : public BaseClass { - private: - void BaseClassMustBeDerivedFromService(const Service *service) {} - public: - WithGenericMethod_InferenceRequest() { - ::grpc::Service::MarkMethodGeneric(3); - } - ~WithGenericMethod_InferenceRequest() override { - BaseClassMustBeDerivedFromService(this); - } - // disable synchronous version of this method - ::grpc::Status InferenceRequest(::grpc::ServerContext* context, const flatbuffers::grpc::Message* request, flatbuffers::grpc::Message* response) final override { - abort(); - return ::grpc::Status(::grpc::StatusCode::UNIMPLEMENTED, ""); - } - }; -
template <class BaseClass> - class WithStreamedUnaryMethod_RegisterGraph : public BaseClass { - private: - void BaseClassMustBeDerivedFromService(const Service *service) {} - public: - WithStreamedUnaryMethod_RegisterGraph() { - ::grpc::Service::MarkMethodStreamed(0, - new ::grpc::internal::StreamedUnaryHandler< flatbuffers::grpc::Message, flatbuffers::grpc::Message>(std::bind(&WithStreamedUnaryMethod_RegisterGraph::StreamedRegisterGraph, this, std::placeholders::_1, std::placeholders::_2))); - } - ~WithStreamedUnaryMethod_RegisterGraph() override { - BaseClassMustBeDerivedFromService(this); - } - // disable regular version of this method - ::grpc::Status RegisterGraph(::grpc::ServerContext* context, const flatbuffers::grpc::Message* request, flatbuffers::grpc::Message* response) final override { - abort(); - return ::grpc::Status(::grpc::StatusCode::UNIMPLEMENTED, ""); - } - // replace default version of method with streamed unary - virtual ::grpc::Status StreamedRegisterGraph(::grpc::ServerContext* context, ::grpc::ServerUnaryStreamer< flatbuffers::grpc::Message,flatbuffers::grpc::Message>* server_unary_streamer) = 0; - }; - template <class BaseClass> - class WithStreamedUnaryMethod_ForgetGraph : public BaseClass { - private: - void BaseClassMustBeDerivedFromService(const Service *service) {} - public: - WithStreamedUnaryMethod_ForgetGraph() { - ::grpc::Service::MarkMethodStreamed(1, - new ::grpc::internal::StreamedUnaryHandler< flatbuffers::grpc::Message, flatbuffers::grpc::Message>(std::bind(&WithStreamedUnaryMethod_ForgetGraph::StreamedForgetGraph, this, std::placeholders::_1, std::placeholders::_2))); - } - ~WithStreamedUnaryMethod_ForgetGraph() override { - BaseClassMustBeDerivedFromService(this); - } - // disable regular version of this method - ::grpc::Status ForgetGraph(::grpc::ServerContext* context, const flatbuffers::grpc::Message* request, flatbuffers::grpc::Message* response) final override { - abort(); - return ::grpc::Status(::grpc::StatusCode::UNIMPLEMENTED, ""); - } - // replace
default version of method with streamed unary - virtual ::grpc::Status StreamedForgetGraph(::grpc::ServerContext* context, ::grpc::ServerUnaryStreamer< flatbuffers::grpc::Message,flatbuffers::grpc::Message>* server_unary_streamer) = 0; - }; - template <class BaseClass> - class WithStreamedUnaryMethod_ReplaceGraph : public BaseClass { - private: - void BaseClassMustBeDerivedFromService(const Service *service) {} - public: - WithStreamedUnaryMethod_ReplaceGraph() { - ::grpc::Service::MarkMethodStreamed(2, - new ::grpc::internal::StreamedUnaryHandler< flatbuffers::grpc::Message, flatbuffers::grpc::Message>(std::bind(&WithStreamedUnaryMethod_ReplaceGraph::StreamedReplaceGraph, this, std::placeholders::_1, std::placeholders::_2))); - } - ~WithStreamedUnaryMethod_ReplaceGraph() override { - BaseClassMustBeDerivedFromService(this); - } - // disable regular version of this method - ::grpc::Status ReplaceGraph(::grpc::ServerContext* context, const flatbuffers::grpc::Message* request, flatbuffers::grpc::Message* response) final override { - abort(); - return ::grpc::Status(::grpc::StatusCode::UNIMPLEMENTED, ""); - } - // replace default version of method with streamed unary - virtual ::grpc::Status StreamedReplaceGraph(::grpc::ServerContext* context, ::grpc::ServerUnaryStreamer< flatbuffers::grpc::Message,flatbuffers::grpc::Message>* server_unary_streamer) = 0; - }; - template <class BaseClass> - class WithStreamedUnaryMethod_InferenceRequest : public BaseClass { - private: - void BaseClassMustBeDerivedFromService(const Service *service) {} - public: - WithStreamedUnaryMethod_InferenceRequest() { - ::grpc::Service::MarkMethodStreamed(3, - new ::grpc::internal::StreamedUnaryHandler< flatbuffers::grpc::Message, flatbuffers::grpc::Message>(std::bind(&WithStreamedUnaryMethod_InferenceRequest::StreamedInferenceRequest, this, std::placeholders::_1, std::placeholders::_2))); - } - ~WithStreamedUnaryMethod_InferenceRequest() override { - BaseClassMustBeDerivedFromService(this); - } - // disable regular version of this
method - ::grpc::Status InferenceRequest(::grpc::ServerContext* context, const flatbuffers::grpc::Message* request, flatbuffers::grpc::Message* response) final override { - abort(); - return ::grpc::Status(::grpc::StatusCode::UNIMPLEMENTED, ""); - } - // replace default version of method with streamed unary - virtual ::grpc::Status StreamedInferenceRequest(::grpc::ServerContext* context, ::grpc::ServerUnaryStreamer< flatbuffers::grpc::Message,flatbuffers::grpc::Message>* server_unary_streamer) = 0; - }; - typedef WithStreamedUnaryMethod_RegisterGraph< WithStreamedUnaryMethod_ForgetGraph< WithStreamedUnaryMethod_ReplaceGraph< WithStreamedUnaryMethod_InferenceRequest< Service > > > > StreamedUnaryService; - typedef Service SplitStreamedService; - typedef WithStreamedUnaryMethod_RegisterGraph< WithStreamedUnaryMethod_ForgetGraph< WithStreamedUnaryMethod_ReplaceGraph< WithStreamedUnaryMethod_InferenceRequest< Service > > > > StreamedService; -}; - -} // namespace graph -} // namespace sd - - -#endif // GRPC_graph__INCLUDED diff --git a/libnd4j/include/graph/generated/graph_generated.h b/libnd4j/include/graph/generated/graph_generated.h deleted file mode 100644 index 1285e4607..000000000 --- a/libnd4j/include/graph/generated/graph_generated.h +++ /dev/null @@ -1,377 +0,0 @@ -// automatically generated by the FlatBuffers compiler, do not modify - - -#ifndef FLATBUFFERS_GENERATED_GRAPH_ND4J_GRAPH_H_ -#define FLATBUFFERS_GENERATED_GRAPH_ND4J_GRAPH_H_ - -#include "flatbuffers/flatbuffers.h" - -#include "array_generated.h" -#include "config_generated.h" -#include "node_generated.h" -#include "properties_generated.h" -#include "request_generated.h" -#include "result_generated.h" -#include "utils_generated.h" -#include "variable_generated.h" - -namespace sd { -namespace graph { - -struct UpdaterState; - -struct FlatGraph; - -struct FlatDropRequest; - -struct FlatResponse; - -struct UpdaterState FLATBUFFERS_FINAL_CLASS : private flatbuffers::Table { - enum { - VT_PARAMNAME = 4, 
- VT_UPDATERSTATEKEYS = 6, - VT_UPDATERSTATEVALUES = 8 - }; - const flatbuffers::String *paramName() const { - return GetPointer<const flatbuffers::String *>(VT_PARAMNAME); - } - const flatbuffers::Vector<flatbuffers::Offset<flatbuffers::String>> *updaterStateKeys() const { - return GetPointer<const flatbuffers::Vector<flatbuffers::Offset<flatbuffers::String>> *>(VT_UPDATERSTATEKEYS); - } - const flatbuffers::Vector<flatbuffers::Offset<FlatArray>> *updaterStateValues() const { - return GetPointer<const flatbuffers::Vector<flatbuffers::Offset<FlatArray>> *>(VT_UPDATERSTATEVALUES); - } - bool Verify(flatbuffers::Verifier &verifier) const { - return VerifyTableStart(verifier) && - VerifyOffset(verifier, VT_PARAMNAME) && - verifier.VerifyString(paramName()) && - VerifyOffset(verifier, VT_UPDATERSTATEKEYS) && - verifier.VerifyVector(updaterStateKeys()) && - verifier.VerifyVectorOfStrings(updaterStateKeys()) && - VerifyOffset(verifier, VT_UPDATERSTATEVALUES) && - verifier.VerifyVector(updaterStateValues()) && - verifier.VerifyVectorOfTables(updaterStateValues()) && - verifier.EndTable(); - } -}; - -struct UpdaterStateBuilder { - flatbuffers::FlatBufferBuilder &fbb_; - flatbuffers::uoffset_t start_; - void add_paramName(flatbuffers::Offset<flatbuffers::String> paramName) { - fbb_.AddOffset(UpdaterState::VT_PARAMNAME, paramName); - } - void add_updaterStateKeys(flatbuffers::Offset<flatbuffers::Vector<flatbuffers::Offset<flatbuffers::String>>> updaterStateKeys) { - fbb_.AddOffset(UpdaterState::VT_UPDATERSTATEKEYS, updaterStateKeys); - } - void add_updaterStateValues(flatbuffers::Offset<flatbuffers::Vector<flatbuffers::Offset<FlatArray>>> updaterStateValues) { - fbb_.AddOffset(UpdaterState::VT_UPDATERSTATEVALUES, updaterStateValues); - } - explicit UpdaterStateBuilder(flatbuffers::FlatBufferBuilder &_fbb) - : fbb_(_fbb) { - start_ = fbb_.StartTable(); - } - UpdaterStateBuilder &operator=(const UpdaterStateBuilder &); - flatbuffers::Offset<UpdaterState> Finish() { - const auto end = fbb_.EndTable(start_); - auto o = flatbuffers::Offset<UpdaterState>(end); - return o; - } -}; - -inline flatbuffers::Offset<UpdaterState> CreateUpdaterState( - flatbuffers::FlatBufferBuilder &_fbb, - flatbuffers::Offset<flatbuffers::String> paramName = 0, - flatbuffers::Offset<flatbuffers::Vector<flatbuffers::Offset<flatbuffers::String>>> updaterStateKeys = 0, - flatbuffers::Offset<flatbuffers::Vector<flatbuffers::Offset<FlatArray>>> updaterStateValues = 0) { - UpdaterStateBuilder builder_(_fbb); -
builder_.add_updaterStateValues(updaterStateValues); - builder_.add_updaterStateKeys(updaterStateKeys); - builder_.add_paramName(paramName); - return builder_.Finish(); -} - -inline flatbuffers::Offset<UpdaterState> CreateUpdaterStateDirect( - flatbuffers::FlatBufferBuilder &_fbb, - const char *paramName = nullptr, - const std::vector<flatbuffers::Offset<flatbuffers::String>> *updaterStateKeys = nullptr, - const std::vector<flatbuffers::Offset<FlatArray>> *updaterStateValues = nullptr) { - return sd::graph::CreateUpdaterState( - _fbb, - paramName ? _fbb.CreateString(paramName) : 0, - updaterStateKeys ? _fbb.CreateVector<flatbuffers::Offset<flatbuffers::String>>(*updaterStateKeys) : 0, - updaterStateValues ? _fbb.CreateVector<flatbuffers::Offset<FlatArray>>(*updaterStateValues) : 0); -} - -struct FlatGraph FLATBUFFERS_FINAL_CLASS : private flatbuffers::Table { - enum { - VT_ID = 4, - VT_VARIABLES = 6, - VT_NODES = 8, - VT_OUTPUTS = 10, - VT_CONFIGURATION = 12, - VT_PLACEHOLDERS = 14, - VT_LOSSVARIABLES = 16, - VT_TRAININGCONFIG = 18, - VT_UPDATERSTATE = 20 - }; - int64_t id() const { - return GetField<int64_t>(VT_ID, 0); - } - const flatbuffers::Vector<flatbuffers::Offset<FlatVariable>> *variables() const { - return GetPointer<const flatbuffers::Vector<flatbuffers::Offset<FlatVariable>> *>(VT_VARIABLES); - } - const flatbuffers::Vector<flatbuffers::Offset<FlatNode>> *nodes() const { - return GetPointer<const flatbuffers::Vector<flatbuffers::Offset<FlatNode>> *>(VT_NODES); - } - const flatbuffers::Vector<flatbuffers::Offset<IntPair>> *outputs() const { - return GetPointer<const flatbuffers::Vector<flatbuffers::Offset<IntPair>> *>(VT_OUTPUTS); - } - const FlatConfiguration *configuration() const { - return GetPointer<const FlatConfiguration *>(VT_CONFIGURATION); - } - const flatbuffers::Vector<flatbuffers::Offset<flatbuffers::String>> *placeholders() const { - return GetPointer<const flatbuffers::Vector<flatbuffers::Offset<flatbuffers::String>> *>(VT_PLACEHOLDERS); - } - const flatbuffers::Vector<flatbuffers::Offset<flatbuffers::String>> *lossVariables() const { - return GetPointer<const flatbuffers::Vector<flatbuffers::Offset<flatbuffers::String>> *>(VT_LOSSVARIABLES); - } - const flatbuffers::String *trainingConfig() const { - return GetPointer<const flatbuffers::String *>(VT_TRAININGCONFIG); - } - const flatbuffers::Vector<flatbuffers::Offset<UpdaterState>> *updaterState() const { - return GetPointer<const flatbuffers::Vector<flatbuffers::Offset<UpdaterState>> *>(VT_UPDATERSTATE); - } - bool Verify(flatbuffers::Verifier &verifier) const { - return VerifyTableStart(verifier) && - VerifyField<int64_t>(verifier, VT_ID) && - VerifyOffset(verifier, VT_VARIABLES) && - verifier.VerifyVector(variables()) && - verifier.VerifyVectorOfTables(variables()) && -
VerifyOffset(verifier, VT_NODES) && - verifier.VerifyVector(nodes()) && - verifier.VerifyVectorOfTables(nodes()) && - VerifyOffset(verifier, VT_OUTPUTS) && - verifier.VerifyVector(outputs()) && - verifier.VerifyVectorOfTables(outputs()) && - VerifyOffset(verifier, VT_CONFIGURATION) && - verifier.VerifyTable(configuration()) && - VerifyOffset(verifier, VT_PLACEHOLDERS) && - verifier.VerifyVector(placeholders()) && - verifier.VerifyVectorOfStrings(placeholders()) && - VerifyOffset(verifier, VT_LOSSVARIABLES) && - verifier.VerifyVector(lossVariables()) && - verifier.VerifyVectorOfStrings(lossVariables()) && - VerifyOffset(verifier, VT_TRAININGCONFIG) && - verifier.VerifyString(trainingConfig()) && - VerifyOffset(verifier, VT_UPDATERSTATE) && - verifier.VerifyVector(updaterState()) && - verifier.VerifyVectorOfTables(updaterState()) && - verifier.EndTable(); - } -}; - -struct FlatGraphBuilder { - flatbuffers::FlatBufferBuilder &fbb_; - flatbuffers::uoffset_t start_; - void add_id(int64_t id) { - fbb_.AddElement<int64_t>(FlatGraph::VT_ID, id, 0); - } - void add_variables(flatbuffers::Offset<flatbuffers::Vector<flatbuffers::Offset<FlatVariable>>> variables) { - fbb_.AddOffset(FlatGraph::VT_VARIABLES, variables); - } - void add_nodes(flatbuffers::Offset<flatbuffers::Vector<flatbuffers::Offset<FlatNode>>> nodes) { - fbb_.AddOffset(FlatGraph::VT_NODES, nodes); - } - void add_outputs(flatbuffers::Offset<flatbuffers::Vector<flatbuffers::Offset<IntPair>>> outputs) { - fbb_.AddOffset(FlatGraph::VT_OUTPUTS, outputs); - } - void add_configuration(flatbuffers::Offset<FlatConfiguration> configuration) { - fbb_.AddOffset(FlatGraph::VT_CONFIGURATION, configuration); - } - void add_placeholders(flatbuffers::Offset<flatbuffers::Vector<flatbuffers::Offset<flatbuffers::String>>> placeholders) { - fbb_.AddOffset(FlatGraph::VT_PLACEHOLDERS, placeholders); - } - void add_lossVariables(flatbuffers::Offset<flatbuffers::Vector<flatbuffers::Offset<flatbuffers::String>>> lossVariables) { - fbb_.AddOffset(FlatGraph::VT_LOSSVARIABLES, lossVariables); - } - void add_trainingConfig(flatbuffers::Offset<flatbuffers::String> trainingConfig) { - fbb_.AddOffset(FlatGraph::VT_TRAININGCONFIG, trainingConfig); - } - void add_updaterState(flatbuffers::Offset<flatbuffers::Vector<flatbuffers::Offset<UpdaterState>>> updaterState) { - fbb_.AddOffset(FlatGraph::VT_UPDATERSTATE,
updaterState); - } - explicit FlatGraphBuilder(flatbuffers::FlatBufferBuilder &_fbb) - : fbb_(_fbb) { - start_ = fbb_.StartTable(); - } - FlatGraphBuilder &operator=(const FlatGraphBuilder &); - flatbuffers::Offset<FlatGraph> Finish() { - const auto end = fbb_.EndTable(start_); - auto o = flatbuffers::Offset<FlatGraph>(end); - return o; - } -}; - -inline flatbuffers::Offset<FlatGraph> CreateFlatGraph( - flatbuffers::FlatBufferBuilder &_fbb, - int64_t id = 0, - flatbuffers::Offset<flatbuffers::Vector<flatbuffers::Offset<FlatVariable>>> variables = 0, - flatbuffers::Offset<flatbuffers::Vector<flatbuffers::Offset<FlatNode>>> nodes = 0, - flatbuffers::Offset<flatbuffers::Vector<flatbuffers::Offset<IntPair>>> outputs = 0, - flatbuffers::Offset<FlatConfiguration> configuration = 0, - flatbuffers::Offset<flatbuffers::Vector<flatbuffers::Offset<flatbuffers::String>>> placeholders = 0, - flatbuffers::Offset<flatbuffers::Vector<flatbuffers::Offset<flatbuffers::String>>> lossVariables = 0, - flatbuffers::Offset<flatbuffers::String> trainingConfig = 0, - flatbuffers::Offset<flatbuffers::Vector<flatbuffers::Offset<UpdaterState>>> updaterState = 0) { - FlatGraphBuilder builder_(_fbb); - builder_.add_id(id); - builder_.add_updaterState(updaterState); - builder_.add_trainingConfig(trainingConfig); - builder_.add_lossVariables(lossVariables); - builder_.add_placeholders(placeholders); - builder_.add_configuration(configuration); - builder_.add_outputs(outputs); - builder_.add_nodes(nodes); - builder_.add_variables(variables); - return builder_.Finish(); -} - -inline flatbuffers::Offset<FlatGraph> CreateFlatGraphDirect( - flatbuffers::FlatBufferBuilder &_fbb, - int64_t id = 0, - const std::vector<flatbuffers::Offset<FlatVariable>> *variables = nullptr, - const std::vector<flatbuffers::Offset<FlatNode>> *nodes = nullptr, - const std::vector<flatbuffers::Offset<IntPair>> *outputs = nullptr, - flatbuffers::Offset<FlatConfiguration> configuration = 0, - const std::vector<flatbuffers::Offset<flatbuffers::String>> *placeholders = nullptr, - const std::vector<flatbuffers::Offset<flatbuffers::String>> *lossVariables = nullptr, - const char *trainingConfig = nullptr, - const std::vector<flatbuffers::Offset<UpdaterState>> *updaterState = nullptr) { - return sd::graph::CreateFlatGraph( - _fbb, - id, - variables ? _fbb.CreateVector<flatbuffers::Offset<FlatVariable>>(*variables) : 0, - nodes ? _fbb.CreateVector<flatbuffers::Offset<FlatNode>>(*nodes) : 0, - outputs ? _fbb.CreateVector<flatbuffers::Offset<IntPair>>(*outputs) : 0, - configuration, - placeholders ? _fbb.CreateVector<flatbuffers::Offset<flatbuffers::String>>(*placeholders) : 0, - lossVariables ? _fbb.CreateVector<flatbuffers::Offset<flatbuffers::String>>(*lossVariables) : 0, - trainingConfig ?
_fbb.CreateString(trainingConfig) : 0, - updaterState ? _fbb.CreateVector<flatbuffers::Offset<UpdaterState>>(*updaterState) : 0); -} - -struct FlatDropRequest FLATBUFFERS_FINAL_CLASS : private flatbuffers::Table { - enum { - VT_ID = 4 - }; - int64_t id() const { - return GetField<int64_t>(VT_ID, 0); - } - bool Verify(flatbuffers::Verifier &verifier) const { - return VerifyTableStart(verifier) && - VerifyField<int64_t>(verifier, VT_ID) && - verifier.EndTable(); - } -}; - -struct FlatDropRequestBuilder { - flatbuffers::FlatBufferBuilder &fbb_; - flatbuffers::uoffset_t start_; - void add_id(int64_t id) { - fbb_.AddElement<int64_t>(FlatDropRequest::VT_ID, id, 0); - } - explicit FlatDropRequestBuilder(flatbuffers::FlatBufferBuilder &_fbb) - : fbb_(_fbb) { - start_ = fbb_.StartTable(); - } - FlatDropRequestBuilder &operator=(const FlatDropRequestBuilder &); - flatbuffers::Offset<FlatDropRequest> Finish() { - const auto end = fbb_.EndTable(start_); - auto o = flatbuffers::Offset<FlatDropRequest>(end); - return o; - } -}; - -inline flatbuffers::Offset<FlatDropRequest> CreateFlatDropRequest( - flatbuffers::FlatBufferBuilder &_fbb, - int64_t id = 0) { - FlatDropRequestBuilder builder_(_fbb); - builder_.add_id(id); - return builder_.Finish(); -} - -struct FlatResponse FLATBUFFERS_FINAL_CLASS : private flatbuffers::Table { - enum { - VT_STATUS = 4 - }; - int32_t status() const { - return GetField<int32_t>(VT_STATUS, 0); - } - bool Verify(flatbuffers::Verifier &verifier) const { - return VerifyTableStart(verifier) && - VerifyField<int32_t>(verifier, VT_STATUS) && - verifier.EndTable(); - } -}; - -struct FlatResponseBuilder { - flatbuffers::FlatBufferBuilder &fbb_; - flatbuffers::uoffset_t start_; - void add_status(int32_t status) { - fbb_.AddElement<int32_t>(FlatResponse::VT_STATUS, status, 0); - } - explicit FlatResponseBuilder(flatbuffers::FlatBufferBuilder &_fbb) - : fbb_(_fbb) { - start_ = fbb_.StartTable(); - } - FlatResponseBuilder &operator=(const FlatResponseBuilder &); - flatbuffers::Offset<FlatResponse> Finish() { - const auto end = fbb_.EndTable(start_); - auto o = flatbuffers::Offset<FlatResponse>(end); - return o; - } -}; -
-inline flatbuffers::Offset<FlatResponse> CreateFlatResponse( - flatbuffers::FlatBufferBuilder &_fbb, - int32_t status = 0) { - FlatResponseBuilder builder_(_fbb); - builder_.add_status(status); - return builder_.Finish(); -} - -inline const sd::graph::FlatGraph *GetFlatGraph(const void *buf) { - return flatbuffers::GetRoot<sd::graph::FlatGraph>(buf); -} - -inline const sd::graph::FlatGraph *GetSizePrefixedFlatGraph(const void *buf) { - return flatbuffers::GetSizePrefixedRoot<sd::graph::FlatGraph>(buf); -} - -inline bool VerifyFlatGraphBuffer( - flatbuffers::Verifier &verifier) { - return verifier.VerifyBuffer<sd::graph::FlatGraph>(nullptr); -} - -inline bool VerifySizePrefixedFlatGraphBuffer( - flatbuffers::Verifier &verifier) { - return verifier.VerifySizePrefixedBuffer<sd::graph::FlatGraph>(nullptr); -} - -inline void FinishFlatGraphBuffer( - flatbuffers::FlatBufferBuilder &fbb, - flatbuffers::Offset<sd::graph::FlatGraph> root) { - fbb.Finish(root); -} - -inline void FinishSizePrefixedFlatGraphBuffer( - flatbuffers::FlatBufferBuilder &fbb, - flatbuffers::Offset<sd::graph::FlatGraph> root) { - fbb.FinishSizePrefixed(root); -} - -} // namespace graph -} // namespace sd - -#endif  // FLATBUFFERS_GENERATED_GRAPH_ND4J_GRAPH_H_ diff --git a/libnd4j/include/graph/generated/graph_generated.js b/libnd4j/include/graph/generated/graph_generated.js deleted file mode 100644 index 24e5cea9d..000000000 --- a/libnd4j/include/graph/generated/graph_generated.js +++ /dev/null @@ -1,721 +0,0 @@ -/* - * ****************************************************************************** - * * - * * - * * This program and the accompanying materials are made available under the - * * terms of the Apache License, Version 2.0 which is available at - * * https://www.apache.org/licenses/LICENSE-2.0. - * * - * * See the NOTICE file distributed with this work for additional - * * information regarding copyright ownership.
- * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * * License for the specific language governing permissions and limitations - * * under the License. - * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ - -/** - * @const - * @namespace - */ -var nd4j = nd4j || {}; - -/** - * @const - * @namespace - */ -nd4j.graph = nd4j.graph || {}; - -/** - * @constructor - */ -nd4j.graph.UpdaterState = function() { - /** - * @type {flatbuffers.ByteBuffer} - */ - this.bb = null; - - /** - * @type {number} - */ - this.bb_pos = 0; -}; - -/** - * @param {number} i - * @param {flatbuffers.ByteBuffer} bb - * @returns {nd4j.graph.UpdaterState} - */ -nd4j.graph.UpdaterState.prototype.__init = function(i, bb) { - this.bb_pos = i; - this.bb = bb; - return this; -}; - -/** - * @param {flatbuffers.ByteBuffer} bb - * @param {nd4j.graph.UpdaterState=} obj - * @returns {nd4j.graph.UpdaterState} - */ -nd4j.graph.UpdaterState.getRootAsUpdaterState = function(bb, obj) { - return (obj || new nd4j.graph.UpdaterState).__init(bb.readInt32(bb.position()) + bb.position(), bb); -}; - -/** - * @param {flatbuffers.Encoding=} optionalEncoding - * @returns {string|Uint8Array|null} - */ -nd4j.graph.UpdaterState.prototype.paramName = function(optionalEncoding) { - var offset = this.bb.__offset(this.bb_pos, 4); - return offset ? this.bb.__string(this.bb_pos + offset, optionalEncoding) : null; -}; - -/** - * @param {number} index - * @param {flatbuffers.Encoding=} optionalEncoding - * @returns {string|Uint8Array} - */ -nd4j.graph.UpdaterState.prototype.updaterStateKeys = function(index, optionalEncoding) { - var offset = this.bb.__offset(this.bb_pos, 6); - return offset ? 
this.bb.__string(this.bb.__vector(this.bb_pos + offset) + index * 4, optionalEncoding) : null; -}; - -/** - * @returns {number} - */ -nd4j.graph.UpdaterState.prototype.updaterStateKeysLength = function() { - var offset = this.bb.__offset(this.bb_pos, 6); - return offset ? this.bb.__vector_len(this.bb_pos + offset) : 0; -}; - -/** - * @param {number} index - * @param {nd4j.graph.FlatArray=} obj - * @returns {nd4j.graph.FlatArray} - */ -nd4j.graph.UpdaterState.prototype.updaterStateValues = function(index, obj) { - var offset = this.bb.__offset(this.bb_pos, 8); - return offset ? (obj || new nd4j.graph.FlatArray).__init(this.bb.__indirect(this.bb.__vector(this.bb_pos + offset) + index * 4), this.bb) : null; -}; - -/** - * @returns {number} - */ -nd4j.graph.UpdaterState.prototype.updaterStateValuesLength = function() { - var offset = this.bb.__offset(this.bb_pos, 8); - return offset ? this.bb.__vector_len(this.bb_pos + offset) : 0; -}; - -/** - * @param {flatbuffers.Builder} builder - */ -nd4j.graph.UpdaterState.startUpdaterState = function(builder) { - builder.startObject(3); -}; - -/** - * @param {flatbuffers.Builder} builder - * @param {flatbuffers.Offset} paramNameOffset - */ -nd4j.graph.UpdaterState.addParamName = function(builder, paramNameOffset) { - builder.addFieldOffset(0, paramNameOffset, 0); -}; - -/** - * @param {flatbuffers.Builder} builder - * @param {flatbuffers.Offset} updaterStateKeysOffset - */ -nd4j.graph.UpdaterState.addUpdaterStateKeys = function(builder, updaterStateKeysOffset) { - builder.addFieldOffset(1, updaterStateKeysOffset, 0); -}; - -/** - * @param {flatbuffers.Builder} builder - * @param {Array.<flatbuffers.Offset>} data - * @returns {flatbuffers.Offset} - */ -nd4j.graph.UpdaterState.createUpdaterStateKeysVector = function(builder, data) { - builder.startVector(4, data.length, 4); - for (var i = data.length - 1; i >= 0; i--) { - builder.addOffset(data[i]); - } - return builder.endVector(); -}; - -/** - * @param {flatbuffers.Builder} builder - * @param
{number} numElems - */ -nd4j.graph.UpdaterState.startUpdaterStateKeysVector = function(builder, numElems) { - builder.startVector(4, numElems, 4); -}; - -/** - * @param {flatbuffers.Builder} builder - * @param {flatbuffers.Offset} updaterStateValuesOffset - */ -nd4j.graph.UpdaterState.addUpdaterStateValues = function(builder, updaterStateValuesOffset) { - builder.addFieldOffset(2, updaterStateValuesOffset, 0); -}; - -/** - * @param {flatbuffers.Builder} builder - * @param {Array.<flatbuffers.Offset>} data - * @returns {flatbuffers.Offset} - */ -nd4j.graph.UpdaterState.createUpdaterStateValuesVector = function(builder, data) { - builder.startVector(4, data.length, 4); - for (var i = data.length - 1; i >= 0; i--) { - builder.addOffset(data[i]); - } - return builder.endVector(); -}; - -/** - * @param {flatbuffers.Builder} builder - * @param {number} numElems - */ -nd4j.graph.UpdaterState.startUpdaterStateValuesVector = function(builder, numElems) { - builder.startVector(4, numElems, 4); -}; - -/** - * @param {flatbuffers.Builder} builder - * @returns {flatbuffers.Offset} - */ -nd4j.graph.UpdaterState.endUpdaterState = function(builder) { - var offset = builder.endObject(); - return offset; -}; - -/** - * @constructor - */ -nd4j.graph.FlatGraph = function() { - /** - * @type {flatbuffers.ByteBuffer} - */ - this.bb = null; - - /** - * @type {number} - */ - this.bb_pos = 0; -}; - -/** - * @param {number} i - * @param {flatbuffers.ByteBuffer} bb - * @returns {nd4j.graph.FlatGraph} - */ -nd4j.graph.FlatGraph.prototype.__init = function(i, bb) { - this.bb_pos = i; - this.bb = bb; - return this; -}; - -/** - * @param {flatbuffers.ByteBuffer} bb - * @param {nd4j.graph.FlatGraph=} obj - * @returns {nd4j.graph.FlatGraph} - */ -nd4j.graph.FlatGraph.getRootAsFlatGraph = function(bb, obj) { - return (obj || new nd4j.graph.FlatGraph).__init(bb.readInt32(bb.position()) + bb.position(), bb); -}; - -/** - * @returns {flatbuffers.Long} - */ -nd4j.graph.FlatGraph.prototype.id = function() { - var offset =
this.bb.__offset(this.bb_pos, 4); - return offset ? this.bb.readInt64(this.bb_pos + offset) : this.bb.createLong(0, 0); -}; - -/** - * @param {number} index - * @param {nd4j.graph.FlatVariable=} obj - * @returns {nd4j.graph.FlatVariable} - */ -nd4j.graph.FlatGraph.prototype.variables = function(index, obj) { - var offset = this.bb.__offset(this.bb_pos, 6); - return offset ? (obj || new nd4j.graph.FlatVariable).__init(this.bb.__indirect(this.bb.__vector(this.bb_pos + offset) + index * 4), this.bb) : null; -}; - -/** - * @returns {number} - */ -nd4j.graph.FlatGraph.prototype.variablesLength = function() { - var offset = this.bb.__offset(this.bb_pos, 6); - return offset ? this.bb.__vector_len(this.bb_pos + offset) : 0; -}; - -/** - * @param {number} index - * @param {nd4j.graph.FlatNode=} obj - * @returns {nd4j.graph.FlatNode} - */ -nd4j.graph.FlatGraph.prototype.nodes = function(index, obj) { - var offset = this.bb.__offset(this.bb_pos, 8); - return offset ? (obj || new nd4j.graph.FlatNode).__init(this.bb.__indirect(this.bb.__vector(this.bb_pos + offset) + index * 4), this.bb) : null; -}; - -/** - * @returns {number} - */ -nd4j.graph.FlatGraph.prototype.nodesLength = function() { - var offset = this.bb.__offset(this.bb_pos, 8); - return offset ? this.bb.__vector_len(this.bb_pos + offset) : 0; -}; - -/** - * @param {number} index - * @param {nd4j.graph.IntPair=} obj - * @returns {nd4j.graph.IntPair} - */ -nd4j.graph.FlatGraph.prototype.outputs = function(index, obj) { - var offset = this.bb.__offset(this.bb_pos, 10); - return offset ? (obj || new nd4j.graph.IntPair).__init(this.bb.__indirect(this.bb.__vector(this.bb_pos + offset) + index * 4), this.bb) : null; -}; - -/** - * @returns {number} - */ -nd4j.graph.FlatGraph.prototype.outputsLength = function() { - var offset = this.bb.__offset(this.bb_pos, 10); - return offset ? 
this.bb.__vector_len(this.bb_pos + offset) : 0; -}; - -/** - * @param {nd4j.graph.FlatConfiguration=} obj - * @returns {nd4j.graph.FlatConfiguration|null} - */ -nd4j.graph.FlatGraph.prototype.configuration = function(obj) { - var offset = this.bb.__offset(this.bb_pos, 12); - return offset ? (obj || new nd4j.graph.FlatConfiguration).__init(this.bb.__indirect(this.bb_pos + offset), this.bb) : null; -}; - -/** - * @param {number} index - * @param {flatbuffers.Encoding=} optionalEncoding - * @returns {string|Uint8Array} - */ -nd4j.graph.FlatGraph.prototype.placeholders = function(index, optionalEncoding) { - var offset = this.bb.__offset(this.bb_pos, 14); - return offset ? this.bb.__string(this.bb.__vector(this.bb_pos + offset) + index * 4, optionalEncoding) : null; -}; - -/** - * @returns {number} - */ -nd4j.graph.FlatGraph.prototype.placeholdersLength = function() { - var offset = this.bb.__offset(this.bb_pos, 14); - return offset ? this.bb.__vector_len(this.bb_pos + offset) : 0; -}; - -/** - * @param {number} index - * @param {flatbuffers.Encoding=} optionalEncoding - * @returns {string|Uint8Array} - */ -nd4j.graph.FlatGraph.prototype.lossVariables = function(index, optionalEncoding) { - var offset = this.bb.__offset(this.bb_pos, 16); - return offset ? this.bb.__string(this.bb.__vector(this.bb_pos + offset) + index * 4, optionalEncoding) : null; -}; - -/** - * @returns {number} - */ -nd4j.graph.FlatGraph.prototype.lossVariablesLength = function() { - var offset = this.bb.__offset(this.bb_pos, 16); - return offset ? this.bb.__vector_len(this.bb_pos + offset) : 0; -}; - -/** - * @param {flatbuffers.Encoding=} optionalEncoding - * @returns {string|Uint8Array|null} - */ -nd4j.graph.FlatGraph.prototype.trainingConfig = function(optionalEncoding) { - var offset = this.bb.__offset(this.bb_pos, 18); - return offset ? 
this.bb.__string(this.bb_pos + offset, optionalEncoding) : null; -}; - -/** - * @param {number} index - * @param {nd4j.graph.UpdaterState=} obj - * @returns {nd4j.graph.UpdaterState} - */ -nd4j.graph.FlatGraph.prototype.updaterState = function(index, obj) { - var offset = this.bb.__offset(this.bb_pos, 20); - return offset ? (obj || new nd4j.graph.UpdaterState).__init(this.bb.__indirect(this.bb.__vector(this.bb_pos + offset) + index * 4), this.bb) : null; -}; - -/** - * @returns {number} - */ -nd4j.graph.FlatGraph.prototype.updaterStateLength = function() { - var offset = this.bb.__offset(this.bb_pos, 20); - return offset ? this.bb.__vector_len(this.bb_pos + offset) : 0; -}; - -/** - * @param {flatbuffers.Builder} builder - */ -nd4j.graph.FlatGraph.startFlatGraph = function(builder) { - builder.startObject(9); -}; - -/** - * @param {flatbuffers.Builder} builder - * @param {flatbuffers.Long} id - */ -nd4j.graph.FlatGraph.addId = function(builder, id) { - builder.addFieldInt64(0, id, builder.createLong(0, 0)); -}; - -/** - * @param {flatbuffers.Builder} builder - * @param {flatbuffers.Offset} variablesOffset - */ -nd4j.graph.FlatGraph.addVariables = function(builder, variablesOffset) { - builder.addFieldOffset(1, variablesOffset, 0); -}; - -/** - * @param {flatbuffers.Builder} builder - * @param {Array.<flatbuffers.Offset>} data - * @returns {flatbuffers.Offset} - */ -nd4j.graph.FlatGraph.createVariablesVector = function(builder, data) { - builder.startVector(4, data.length, 4); - for (var i = data.length - 1; i >= 0; i--) { - builder.addOffset(data[i]); - } - return builder.endVector(); -}; - -/** - * @param {flatbuffers.Builder} builder - * @param {number} numElems - */ -nd4j.graph.FlatGraph.startVariablesVector = function(builder, numElems) { - builder.startVector(4, numElems, 4); -}; - -/** - * @param {flatbuffers.Builder} builder - * @param {flatbuffers.Offset} nodesOffset - */ -nd4j.graph.FlatGraph.addNodes = function(builder, nodesOffset) { - builder.addFieldOffset(2, nodesOffset, 
0); -}; - -/** - * @param {flatbuffers.Builder} builder - * @param {Array.<flatbuffers.Offset>} data - * @returns {flatbuffers.Offset} - */ -nd4j.graph.FlatGraph.createNodesVector = function(builder, data) { - builder.startVector(4, data.length, 4); - for (var i = data.length - 1; i >= 0; i--) { - builder.addOffset(data[i]); - } - return builder.endVector(); -}; - -/** - * @param {flatbuffers.Builder} builder - * @param {number} numElems - */ -nd4j.graph.FlatGraph.startNodesVector = function(builder, numElems) { - builder.startVector(4, numElems, 4); -}; - -/** - * @param {flatbuffers.Builder} builder - * @param {flatbuffers.Offset} outputsOffset - */ -nd4j.graph.FlatGraph.addOutputs = function(builder, outputsOffset) { - builder.addFieldOffset(3, outputsOffset, 0); -}; - -/** - * @param {flatbuffers.Builder} builder - * @param {Array.<flatbuffers.Offset>} data - * @returns {flatbuffers.Offset} - */ -nd4j.graph.FlatGraph.createOutputsVector = function(builder, data) { - builder.startVector(4, data.length, 4); - for (var i = data.length - 1; i >= 0; i--) { - builder.addOffset(data[i]); - } - return builder.endVector(); -}; - -/** - * @param {flatbuffers.Builder} builder - * @param {number} numElems - */ -nd4j.graph.FlatGraph.startOutputsVector = function(builder, numElems) { - builder.startVector(4, numElems, 4); -}; - -/** - * @param {flatbuffers.Builder} builder - * @param {flatbuffers.Offset} configurationOffset - */ -nd4j.graph.FlatGraph.addConfiguration = function(builder, configurationOffset) { - builder.addFieldOffset(4, configurationOffset, 0); -}; - -/** - * @param {flatbuffers.Builder} builder - * @param {flatbuffers.Offset} placeholdersOffset - */ -nd4j.graph.FlatGraph.addPlaceholders = function(builder, placeholdersOffset) { - builder.addFieldOffset(5, placeholdersOffset, 0); -}; - -/** - * @param {flatbuffers.Builder} builder - * @param {Array.<flatbuffers.Offset>} data - * @returns {flatbuffers.Offset} - */ -nd4j.graph.FlatGraph.createPlaceholdersVector = function(builder, data) { - builder.startVector(4, 
data.length, 4); - for (var i = data.length - 1; i >= 0; i--) { - builder.addOffset(data[i]); - } - return builder.endVector(); -}; - -/** - * @param {flatbuffers.Builder} builder - * @param {number} numElems - */ -nd4j.graph.FlatGraph.startPlaceholdersVector = function(builder, numElems) { - builder.startVector(4, numElems, 4); -}; - -/** - * @param {flatbuffers.Builder} builder - * @param {flatbuffers.Offset} lossVariablesOffset - */ -nd4j.graph.FlatGraph.addLossVariables = function(builder, lossVariablesOffset) { - builder.addFieldOffset(6, lossVariablesOffset, 0); -}; - -/** - * @param {flatbuffers.Builder} builder - * @param {Array.<flatbuffers.Offset>} data - * @returns {flatbuffers.Offset} - */ -nd4j.graph.FlatGraph.createLossVariablesVector = function(builder, data) { - builder.startVector(4, data.length, 4); - for (var i = data.length - 1; i >= 0; i--) { - builder.addOffset(data[i]); - } - return builder.endVector(); -}; - -/** - * @param {flatbuffers.Builder} builder - * @param {number} numElems - */ -nd4j.graph.FlatGraph.startLossVariablesVector = function(builder, numElems) { - builder.startVector(4, numElems, 4); -}; - -/** - * @param {flatbuffers.Builder} builder - * @param {flatbuffers.Offset} trainingConfigOffset - */ -nd4j.graph.FlatGraph.addTrainingConfig = function(builder, trainingConfigOffset) { - builder.addFieldOffset(7, trainingConfigOffset, 0); -}; - -/** - * @param {flatbuffers.Builder} builder - * @param {flatbuffers.Offset} updaterStateOffset - */ -nd4j.graph.FlatGraph.addUpdaterState = function(builder, updaterStateOffset) { - builder.addFieldOffset(8, updaterStateOffset, 0); -}; - -/** - * @param {flatbuffers.Builder} builder - * @param {Array.<flatbuffers.Offset>} data - * @returns {flatbuffers.Offset} - */ -nd4j.graph.FlatGraph.createUpdaterStateVector = function(builder, data) { - builder.startVector(4, data.length, 4); - for (var i = data.length - 1; i >= 0; i--) { - builder.addOffset(data[i]); - } - return builder.endVector(); -}; - -/** - * @param {flatbuffers.Builder} 
builder - * @param {number} numElems - */ -nd4j.graph.FlatGraph.startUpdaterStateVector = function(builder, numElems) { - builder.startVector(4, numElems, 4); -}; - -/** - * @param {flatbuffers.Builder} builder - * @returns {flatbuffers.Offset} - */ -nd4j.graph.FlatGraph.endFlatGraph = function(builder) { - var offset = builder.endObject(); - return offset; -}; - -/** - * @param {flatbuffers.Builder} builder - * @param {flatbuffers.Offset} offset - */ -nd4j.graph.FlatGraph.finishFlatGraphBuffer = function(builder, offset) { - builder.finish(offset); -}; - -/** - * @constructor - */ -nd4j.graph.FlatDropRequest = function() { - /** - * @type {flatbuffers.ByteBuffer} - */ - this.bb = null; - - /** - * @type {number} - */ - this.bb_pos = 0; -}; - -/** - * @param {number} i - * @param {flatbuffers.ByteBuffer} bb - * @returns {nd4j.graph.FlatDropRequest} - */ -nd4j.graph.FlatDropRequest.prototype.__init = function(i, bb) { - this.bb_pos = i; - this.bb = bb; - return this; -}; - -/** - * @param {flatbuffers.ByteBuffer} bb - * @param {nd4j.graph.FlatDropRequest=} obj - * @returns {nd4j.graph.FlatDropRequest} - */ -nd4j.graph.FlatDropRequest.getRootAsFlatDropRequest = function(bb, obj) { - return (obj || new nd4j.graph.FlatDropRequest).__init(bb.readInt32(bb.position()) + bb.position(), bb); -}; - -/** - * @returns {flatbuffers.Long} - */ -nd4j.graph.FlatDropRequest.prototype.id = function() { - var offset = this.bb.__offset(this.bb_pos, 4); - return offset ? 
this.bb.readInt64(this.bb_pos + offset) : this.bb.createLong(0, 0); -}; - -/** - * @param {flatbuffers.Builder} builder - */ -nd4j.graph.FlatDropRequest.startFlatDropRequest = function(builder) { - builder.startObject(1); -}; - -/** - * @param {flatbuffers.Builder} builder - * @param {flatbuffers.Long} id - */ -nd4j.graph.FlatDropRequest.addId = function(builder, id) { - builder.addFieldInt64(0, id, builder.createLong(0, 0)); -}; - -/** - * @param {flatbuffers.Builder} builder - * @returns {flatbuffers.Offset} - */ -nd4j.graph.FlatDropRequest.endFlatDropRequest = function(builder) { - var offset = builder.endObject(); - return offset; -}; - -/** - * @constructor - */ -nd4j.graph.FlatResponse = function() { - /** - * @type {flatbuffers.ByteBuffer} - */ - this.bb = null; - - /** - * @type {number} - */ - this.bb_pos = 0; -}; - -/** - * @param {number} i - * @param {flatbuffers.ByteBuffer} bb - * @returns {nd4j.graph.FlatResponse} - */ -nd4j.graph.FlatResponse.prototype.__init = function(i, bb) { - this.bb_pos = i; - this.bb = bb; - return this; -}; - -/** - * @param {flatbuffers.ByteBuffer} bb - * @param {nd4j.graph.FlatResponse=} obj - * @returns {nd4j.graph.FlatResponse} - */ -nd4j.graph.FlatResponse.getRootAsFlatResponse = function(bb, obj) { - return (obj || new nd4j.graph.FlatResponse).__init(bb.readInt32(bb.position()) + bb.position(), bb); -}; - -/** - * @returns {number} - */ -nd4j.graph.FlatResponse.prototype.status = function() { - var offset = this.bb.__offset(this.bb_pos, 4); - return offset ? 
this.bb.readInt32(this.bb_pos + offset) : 0; -}; - -/** - * @param {flatbuffers.Builder} builder - */ -nd4j.graph.FlatResponse.startFlatResponse = function(builder) { - builder.startObject(1); -}; - -/** - * @param {flatbuffers.Builder} builder - * @param {number} status - */ -nd4j.graph.FlatResponse.addStatus = function(builder, status) { - builder.addFieldInt32(0, status, 0); -}; - -/** - * @param {flatbuffers.Builder} builder - * @returns {flatbuffers.Offset} - */ -nd4j.graph.FlatResponse.endFlatResponse = function(builder) { - var offset = builder.endObject(); - return offset; -}; - -// Exports for Node.js and RequireJS -this.nd4j = nd4j; diff --git a/libnd4j/include/graph/generated/nd4j/__init__.py b/libnd4j/include/graph/generated/nd4j/__init__.py deleted file mode 100644 index ecf2a1c25..000000000 --- a/libnd4j/include/graph/generated/nd4j/__init__.py +++ /dev/null @@ -1,18 +0,0 @@ -# /* ****************************************************************************** -# * -# * -# * This program and the accompanying materials are made available under the -# * terms of the Apache License, Version 2.0 which is available at -# * https://www.apache.org/licenses/LICENSE-2.0. -# * -# * See the NOTICE file distributed with this work for additional -# * information regarding copyright ownership. -# * Unless required by applicable law or agreed to in writing, software -# * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT -# * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the -# * License for the specific language governing permissions and limitations -# * under the License. 
-# * -# * SPDX-License-Identifier: Apache-2.0 -# ******************************************************************************/ - diff --git a/libnd4j/include/graph/generated/nd4j/graph/ByteOrder.cs b/libnd4j/include/graph/generated/nd4j/graph/ByteOrder.cs deleted file mode 100644 index 5b20d143f..000000000 --- a/libnd4j/include/graph/generated/nd4j/graph/ByteOrder.cs +++ /dev/null @@ -1,15 +0,0 @@ -// -// automatically generated by the FlatBuffers compiler, do not modify -// - -namespace sd.graph -{ - -public enum ByteOrder : sbyte -{ - LE = 0, - BE = 1, -}; - - -} diff --git a/libnd4j/include/graph/generated/nd4j/graph/ByteOrder.java b/libnd4j/include/graph/generated/nd4j/graph/ByteOrder.java deleted file mode 100644 index b0d703b61..000000000 --- a/libnd4j/include/graph/generated/nd4j/graph/ByteOrder.java +++ /dev/null @@ -1,14 +0,0 @@ -// automatically generated by the FlatBuffers compiler, do not modify - -package nd4j.graph; - -public final class ByteOrder { - private ByteOrder() { } - public static final byte LE = 0; - public static final byte BE = 1; - - public static final String[] names = { "LE", "BE", }; - - public static String name(int e) { return names[e]; } -} - diff --git a/libnd4j/include/graph/generated/nd4j/graph/ByteOrder.py b/libnd4j/include/graph/generated/nd4j/graph/ByteOrder.py deleted file mode 100644 index 8e3a0f0fb..000000000 --- a/libnd4j/include/graph/generated/nd4j/graph/ByteOrder.py +++ /dev/null @@ -1,22 +0,0 @@ -# /* ****************************************************************************** -# * -# * -# * This program and the accompanying materials are made available under the -# * terms of the Apache License, Version 2.0 which is available at -# * https://www.apache.org/licenses/LICENSE-2.0. -# * -# * See the NOTICE file distributed with this work for additional -# * information regarding copyright ownership. 
-# * Unless required by applicable law or agreed to in writing, software -# * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT -# * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the -# * License for the specific language governing permissions and limitations -# * under the License. -# * -# * SPDX-License-Identifier: Apache-2.0 -# ******************************************************************************/ - -class ByteOrder(object): - LE = 0 - BE = 1 - diff --git a/libnd4j/include/graph/generated/nd4j/graph/DType.cs b/libnd4j/include/graph/generated/nd4j/graph/DType.cs deleted file mode 100644 index f6fd2778c..000000000 --- a/libnd4j/include/graph/generated/nd4j/graph/DType.cs +++ /dev/null @@ -1,34 +0,0 @@ -// -// automatically generated by the FlatBuffers compiler, do not modify -// - -namespace sd.graph -{ - -public enum DType : sbyte -{ - INHERIT = 0, - BOOL = 1, - FLOAT8 = 2, - HALF = 3, - HALF2 = 4, - FLOAT = 5, - DOUBLE = 6, - INT8 = 7, - INT16 = 8, - INT32 = 9, - INT64 = 10, - UINT8 = 11, - UINT16 = 12, - UINT32 = 13, - UINT64 = 14, - QINT8 = 15, - QINT16 = 16, - BFLOAT16 = 17, - UTF8 = 50, - UTF16 = 51, - UTF32 = 52, -}; - - -} diff --git a/libnd4j/include/graph/generated/nd4j/graph/DType.java b/libnd4j/include/graph/generated/nd4j/graph/DType.java deleted file mode 100644 index c1b394ca7..000000000 --- a/libnd4j/include/graph/generated/nd4j/graph/DType.java +++ /dev/null @@ -1,33 +0,0 @@ -// automatically generated by the FlatBuffers compiler, do not modify - -package nd4j.graph; - -public final class DType { - private DType() { } - public static final byte INHERIT = 0; - public static final byte BOOL = 1; - public static final byte FLOAT8 = 2; - public static final byte HALF = 3; - public static final byte HALF2 = 4; - public static final byte FLOAT = 5; - public static final byte DOUBLE = 6; - public static final byte INT8 = 7; - public static final byte INT16 = 8; - public static final byte INT32 = 9; 
- public static final byte INT64 = 10; - public static final byte UINT8 = 11; - public static final byte UINT16 = 12; - public static final byte UINT32 = 13; - public static final byte UINT64 = 14; - public static final byte QINT8 = 15; - public static final byte QINT16 = 16; - public static final byte BFLOAT16 = 17; - public static final byte UTF8 = 50; - public static final byte UTF16 = 51; - public static final byte UTF32 = 52; - - public static final String[] names = { "INHERIT", "BOOL", "FLOAT8", "HALF", "HALF2", "FLOAT", "DOUBLE", "INT8", "INT16", "INT32", "INT64", "UINT8", "UINT16", "UINT32", "UINT64", "QINT8", "QINT16", "BFLOAT16", "", "", "", "", "", "", "", "", "", "", "", "", "", "", "", "", "", "", "", "", "", "", "", "", "", "", "", "", "", "", "", "", "UTF8", "UTF16", "UTF32", }; - - public static String name(int e) { return names[e]; } -} - diff --git a/libnd4j/include/graph/generated/nd4j/graph/DType.py b/libnd4j/include/graph/generated/nd4j/graph/DType.py deleted file mode 100644 index d15188647..000000000 --- a/libnd4j/include/graph/generated/nd4j/graph/DType.py +++ /dev/null @@ -1,41 +0,0 @@ -# /* ****************************************************************************** -# * -# * -# * This program and the accompanying materials are made available under the -# * terms of the Apache License, Version 2.0 which is available at -# * https://www.apache.org/licenses/LICENSE-2.0. -# * -# * See the NOTICE file distributed with this work for additional -# * information regarding copyright ownership. -# * Unless required by applicable law or agreed to in writing, software -# * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT -# * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the -# * License for the specific language governing permissions and limitations -# * under the License. 
-# * -# * SPDX-License-Identifier: Apache-2.0 -# ******************************************************************************/ - -class DType(object): - INHERIT = 0 - BOOL = 1 - FLOAT8 = 2 - HALF = 3 - HALF2 = 4 - FLOAT = 5 - DOUBLE = 6 - INT8 = 7 - INT16 = 8 - INT32 = 9 - INT64 = 10 - UINT8 = 11 - UINT16 = 12 - UINT32 = 13 - UINT64 = 14 - QINT8 = 15 - QINT16 = 16 - BFLOAT16 = 17 - UTF8 = 50 - UTF16 = 51 - UTF32 = 52 - diff --git a/libnd4j/include/graph/generated/nd4j/graph/Direction.cs b/libnd4j/include/graph/generated/nd4j/graph/Direction.cs deleted file mode 100644 index d93c1e947..000000000 --- a/libnd4j/include/graph/generated/nd4j/graph/Direction.cs +++ /dev/null @@ -1,16 +0,0 @@ -// -// automatically generated by the FlatBuffers compiler, do not modify -// - -namespace sd.graph -{ - -public enum Direction : sbyte -{ - FORWARD_ONLY = 0, - FORWARD_AND_BACKWARD = 1, - BACKWARD_ONLY = 2, -}; - - -} diff --git a/libnd4j/include/graph/generated/nd4j/graph/Direction.java b/libnd4j/include/graph/generated/nd4j/graph/Direction.java deleted file mode 100644 index dd53517f6..000000000 --- a/libnd4j/include/graph/generated/nd4j/graph/Direction.java +++ /dev/null @@ -1,15 +0,0 @@ -// automatically generated by the FlatBuffers compiler, do not modify - -package nd4j.graph; - -public final class Direction { - private Direction() { } - public static final byte FORWARD_ONLY = 0; - public static final byte FORWARD_AND_BACKWARD = 1; - public static final byte BACKWARD_ONLY = 2; - - public static final String[] names = { "FORWARD_ONLY", "FORWARD_AND_BACKWARD", "BACKWARD_ONLY", }; - - public static String name(int e) { return names[e]; } -} - diff --git a/libnd4j/include/graph/generated/nd4j/graph/Direction.py b/libnd4j/include/graph/generated/nd4j/graph/Direction.py deleted file mode 100644 index 9c6771d99..000000000 --- a/libnd4j/include/graph/generated/nd4j/graph/Direction.py +++ /dev/null @@ -1,23 +0,0 @@ -# /* 
****************************************************************************** -# * -# * -# * This program and the accompanying materials are made available under the -# * terms of the Apache License, Version 2.0 which is available at -# * https://www.apache.org/licenses/LICENSE-2.0. -# * -# * See the NOTICE file distributed with this work for additional -# * information regarding copyright ownership. -# * Unless required by applicable law or agreed to in writing, software -# * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT -# * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the -# * License for the specific language governing permissions and limitations -# * under the License. -# * -# * SPDX-License-Identifier: Apache-2.0 -# ******************************************************************************/ - -class Direction(object): - FORWARD_ONLY = 0 - FORWARD_AND_BACKWARD = 1 - BACKWARD_ONLY = 2 - diff --git a/libnd4j/include/graph/generated/nd4j/graph/ExecutionMode.cs b/libnd4j/include/graph/generated/nd4j/graph/ExecutionMode.cs deleted file mode 100644 index 77f54df5b..000000000 --- a/libnd4j/include/graph/generated/nd4j/graph/ExecutionMode.cs +++ /dev/null @@ -1,16 +0,0 @@ -// -// automatically generated by the FlatBuffers compiler, do not modify -// - -namespace sd.graph -{ - -public enum ExecutionMode : sbyte -{ - SEQUENTIAL = 0, - STRICT = 1, - AUTO = 2, -}; - - -} diff --git a/libnd4j/include/graph/generated/nd4j/graph/ExecutionMode.java b/libnd4j/include/graph/generated/nd4j/graph/ExecutionMode.java deleted file mode 100644 index 5db2864ef..000000000 --- a/libnd4j/include/graph/generated/nd4j/graph/ExecutionMode.java +++ /dev/null @@ -1,15 +0,0 @@ -// automatically generated by the FlatBuffers compiler, do not modify - -package nd4j.graph; - -public final class ExecutionMode { - private ExecutionMode() { } - public static final byte SEQUENTIAL = 0; - public static final byte STRICT = 1; - public static final 
byte AUTO = 2; - - public static final String[] names = { "SEQUENTIAL", "STRICT", "AUTO", }; - - public static String name(int e) { return names[e]; } -} - diff --git a/libnd4j/include/graph/generated/nd4j/graph/ExecutionMode.py b/libnd4j/include/graph/generated/nd4j/graph/ExecutionMode.py deleted file mode 100644 index ae66b0bf5..000000000 --- a/libnd4j/include/graph/generated/nd4j/graph/ExecutionMode.py +++ /dev/null @@ -1,23 +0,0 @@ -# /* ****************************************************************************** -# * -# * -# * This program and the accompanying materials are made available under the -# * terms of the Apache License, Version 2.0 which is available at -# * https://www.apache.org/licenses/LICENSE-2.0. -# * -# * See the NOTICE file distributed with this work for additional -# * information regarding copyright ownership. -# * Unless required by applicable law or agreed to in writing, software -# * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT -# * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the -# * License for the specific language governing permissions and limitations -# * under the License. 
-# * -# * SPDX-License-Identifier: Apache-2.0 -# ******************************************************************************/ - -class ExecutionMode(object): - SEQUENTIAL = 0 - STRICT = 1 - AUTO = 2 - diff --git a/libnd4j/include/graph/generated/nd4j/graph/FlatArray.cs b/libnd4j/include/graph/generated/nd4j/graph/FlatArray.cs deleted file mode 100644 index 6a4aad3f2..000000000 --- a/libnd4j/include/graph/generated/nd4j/graph/FlatArray.cs +++ /dev/null @@ -1,72 +0,0 @@ -// -// automatically generated by the FlatBuffers compiler, do not modify -// - -namespace sd.graph -{ - -using global::System; -using global::FlatBuffers; - -public struct FlatArray : IFlatbufferObject -{ - private Table __p; - public ByteBuffer ByteBuffer { get { return __p.bb; } } - public static FlatArray GetRootAsFlatArray(ByteBuffer _bb) { return GetRootAsFlatArray(_bb, new FlatArray()); } - public static FlatArray GetRootAsFlatArray(ByteBuffer _bb, FlatArray obj) { return (obj.__assign(_bb.GetInt(_bb.Position) + _bb.Position, _bb)); } - public void __init(int _i, ByteBuffer _bb) { __p.bb_pos = _i; __p.bb = _bb; } - public FlatArray __assign(int _i, ByteBuffer _bb) { __init(_i, _bb); return this; } - - public long Shape(int j) { int o = __p.__offset(4); return o != 0 ? __p.bb.GetLong(__p.__vector(o) + j * 8) : (long)0; } - public int ShapeLength { get { int o = __p.__offset(4); return o != 0 ? __p.__vector_len(o) : 0; } } -#if ENABLE_SPAN_T - public Span<byte> GetShapeBytes() { return __p.__vector_as_span(4); } -#else - public ArraySegment<byte>? GetShapeBytes() { return __p.__vector_as_arraysegment(4); } -#endif - public long[] GetShapeArray() { return __p.__vector_as_array<long>(4); } - public sbyte Buffer(int j) { int o = __p.__offset(6); return o != 0 ? __p.bb.GetSbyte(__p.__vector(o) + j * 1) : (sbyte)0; } - public int BufferLength { get { int o = __p.__offset(6); return o != 0 ? 
__p.__vector_len(o) : 0; } } -#if ENABLE_SPAN_T - public Span<byte> GetBufferBytes() { return __p.__vector_as_span(6); } -#else - public ArraySegment<byte>? GetBufferBytes() { return __p.__vector_as_arraysegment(6); } -#endif - public sbyte[] GetBufferArray() { return __p.__vector_as_array<sbyte>(6); } - public DType Dtype { get { int o = __p.__offset(8); return o != 0 ? (DType)__p.bb.GetSbyte(o + __p.bb_pos) : DType.INHERIT; } } - public ByteOrder ByteOrder { get { int o = __p.__offset(10); return o != 0 ? (ByteOrder)__p.bb.GetSbyte(o + __p.bb_pos) : ByteOrder.LE; } } - - public static Offset<FlatArray> CreateFlatArray(FlatBufferBuilder builder, - VectorOffset shapeOffset = default(VectorOffset), - VectorOffset bufferOffset = default(VectorOffset), - DType dtype = DType.INHERIT, - ByteOrder byteOrder = ByteOrder.LE) { - builder.StartObject(4); - FlatArray.AddBuffer(builder, bufferOffset); - FlatArray.AddShape(builder, shapeOffset); - FlatArray.AddByteOrder(builder, byteOrder); - FlatArray.AddDtype(builder, dtype); - return FlatArray.EndFlatArray(builder); - } - - public static void StartFlatArray(FlatBufferBuilder builder) { builder.StartObject(4); } - public static void AddShape(FlatBufferBuilder builder, VectorOffset shapeOffset) { builder.AddOffset(0, shapeOffset.Value, 0); } - public static VectorOffset CreateShapeVector(FlatBufferBuilder builder, long[] data) { builder.StartVector(8, data.Length, 8); for (int i = data.Length - 1; i >= 0; i--) builder.AddLong(data[i]); return builder.EndVector(); } - public static VectorOffset CreateShapeVectorBlock(FlatBufferBuilder builder, long[] data) { builder.StartVector(8, data.Length, 8); builder.Add(data); return builder.EndVector(); } - public static void StartShapeVector(FlatBufferBuilder builder, int numElems) { builder.StartVector(8, numElems, 8); } - public static void AddBuffer(FlatBufferBuilder builder, VectorOffset bufferOffset) { builder.AddOffset(1, bufferOffset.Value, 0); } - public static VectorOffset 
CreateBufferVector(FlatBufferBuilder builder, sbyte[] data) { builder.StartVector(1, data.Length, 1); for (int i = data.Length - 1; i >= 0; i--) builder.AddSbyte(data[i]); return builder.EndVector(); } - public static VectorOffset CreateBufferVectorBlock(FlatBufferBuilder builder, sbyte[] data) { builder.StartVector(1, data.Length, 1); builder.Add(data); return builder.EndVector(); } - public static void StartBufferVector(FlatBufferBuilder builder, int numElems) { builder.StartVector(1, numElems, 1); } - public static void AddDtype(FlatBufferBuilder builder, DType dtype) { builder.AddSbyte(2, (sbyte)dtype, 0); } - public static void AddByteOrder(FlatBufferBuilder builder, ByteOrder byteOrder) { builder.AddSbyte(3, (sbyte)byteOrder, 0); } - public static Offset<FlatArray> EndFlatArray(FlatBufferBuilder builder) { - int o = builder.EndObject(); - return new Offset<FlatArray>(o); - } - public static void FinishFlatArrayBuffer(FlatBufferBuilder builder, Offset<FlatArray> offset) { builder.Finish(offset.Value); } - public static void FinishSizePrefixedFlatArrayBuffer(FlatBufferBuilder builder, Offset<FlatArray> offset) { builder.FinishSizePrefixed(offset.Value); } -}; - - -} diff --git a/libnd4j/include/graph/generated/nd4j/graph/FlatArray.java b/libnd4j/include/graph/generated/nd4j/graph/FlatArray.java deleted file mode 100644 index 81c43d04a..000000000 --- a/libnd4j/include/graph/generated/nd4j/graph/FlatArray.java +++ /dev/null @@ -1,57 +0,0 @@ -// automatically generated by the FlatBuffers compiler, do not modify - -package nd4j.graph; - -import java.nio.*; - import java.lang.*; -import java.util.*; -import com.google.flatbuffers.*; - -@SuppressWarnings("unused") -public final class FlatArray extends Table { - public static FlatArray getRootAsFlatArray(ByteBuffer _bb) { return getRootAsFlatArray(_bb, new FlatArray()); } - public static FlatArray getRootAsFlatArray(ByteBuffer _bb, FlatArray obj) { _bb.order(ByteOrder.LITTLE_ENDIAN); return (obj.__assign(_bb.getInt(_bb.position()) + _bb.position(), _bb)); } - 
public void __init(int _i, ByteBuffer _bb) { bb_pos = _i; bb = _bb; } - public FlatArray __assign(int _i, ByteBuffer _bb) { __init(_i, _bb); return this; } - - public long shape(int j) { int o = __offset(4); return o != 0 ? bb.getLong(__vector(o) + j * 8) : 0; } - public int shapeLength() { int o = __offset(4); return o != 0 ? __vector_len(o) : 0; } - public ByteBuffer shapeAsByteBuffer() { return __vector_as_bytebuffer(4, 8); } - public ByteBuffer shapeInByteBuffer(ByteBuffer _bb) { return __vector_in_bytebuffer(_bb, 4, 8); } - public byte buffer(int j) { int o = __offset(6); return o != 0 ? bb.get(__vector(o) + j * 1) : 0; } - public int bufferLength() { int o = __offset(6); return o != 0 ? __vector_len(o) : 0; } - public ByteBuffer bufferAsByteBuffer() { return __vector_as_bytebuffer(6, 1); } - public ByteBuffer bufferInByteBuffer(ByteBuffer _bb) { return __vector_in_bytebuffer(_bb, 6, 1); } - public byte dtype() { int o = __offset(8); return o != 0 ? bb.get(o + bb_pos) : 0; } - public byte byteOrder() { int o = __offset(10); return o != 0 ? 
bb.get(o + bb_pos) : 0; } - - public static int createFlatArray(FlatBufferBuilder builder, - int shapeOffset, - int bufferOffset, - byte dtype, - byte byteOrder) { - builder.startObject(4); - FlatArray.addBuffer(builder, bufferOffset); - FlatArray.addShape(builder, shapeOffset); - FlatArray.addByteOrder(builder, byteOrder); - FlatArray.addDtype(builder, dtype); - return FlatArray.endFlatArray(builder); - } - - public static void startFlatArray(FlatBufferBuilder builder) { builder.startObject(4); } - public static void addShape(FlatBufferBuilder builder, int shapeOffset) { builder.addOffset(0, shapeOffset, 0); } - public static int createShapeVector(FlatBufferBuilder builder, long[] data) { builder.startVector(8, data.length, 8); for (int i = data.length - 1; i >= 0; i--) builder.addLong(data[i]); return builder.endVector(); } - public static void startShapeVector(FlatBufferBuilder builder, int numElems) { builder.startVector(8, numElems, 8); } - public static void addBuffer(FlatBufferBuilder builder, int bufferOffset) { builder.addOffset(1, bufferOffset, 0); } - public static int createBufferVector(FlatBufferBuilder builder, byte[] data) { builder.startVector(1, data.length, 1); for (int i = data.length - 1; i >= 0; i--) builder.addByte(data[i]); return builder.endVector(); } - public static void startBufferVector(FlatBufferBuilder builder, int numElems) { builder.startVector(1, numElems, 1); } - public static void addDtype(FlatBufferBuilder builder, byte dtype) { builder.addByte(2, dtype, 0); } - public static void addByteOrder(FlatBufferBuilder builder, byte byteOrder) { builder.addByte(3, byteOrder, 0); } - public static int endFlatArray(FlatBufferBuilder builder) { - int o = builder.endObject(); - return o; - } - public static void finishFlatArrayBuffer(FlatBufferBuilder builder, int offset) { builder.finish(offset); } - public static void finishSizePrefixedFlatArrayBuffer(FlatBufferBuilder builder, int offset) { builder.finishSizePrefixed(offset); } -} - diff 
--git a/libnd4j/include/graph/generated/nd4j/graph/FlatArray.py b/libnd4j/include/graph/generated/nd4j/graph/FlatArray.py deleted file mode 100644 index c6ff07d5f..000000000 --- a/libnd4j/include/graph/generated/nd4j/graph/FlatArray.py +++ /dev/null @@ -1,100 +0,0 @@ -# /* ****************************************************************************** -# * -# * -# * This program and the accompanying materials are made available under the -# * terms of the Apache License, Version 2.0 which is available at -# * https://www.apache.org/licenses/LICENSE-2.0. -# * -# * See the NOTICE file distributed with this work for additional -# * information regarding copyright ownership. -# * Unless required by applicable law or agreed to in writing, software -# * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT -# * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the -# * License for the specific language governing permissions and limitations -# * under the License. 
-# * -# * SPDX-License-Identifier: Apache-2.0 -# ******************************************************************************/ - -import flatbuffers - -class FlatArray(object): - __slots__ = ['_tab'] - - @classmethod - def GetRootAsFlatArray(cls, buf, offset): - n = flatbuffers.encode.Get(flatbuffers.packer.uoffset, buf, offset) - x = FlatArray() - x.Init(buf, n + offset) - return x - - # FlatArray - def Init(self, buf, pos): - self._tab = flatbuffers.table.Table(buf, pos) - - # FlatArray - def Shape(self, j): - o = flatbuffers.number_types.UOffsetTFlags.py_type(self._tab.Offset(4)) - if o != 0: - a = self._tab.Vector(o) - return self._tab.Get(flatbuffers.number_types.Int64Flags, a + flatbuffers.number_types.UOffsetTFlags.py_type(j * 8)) - return 0 - - # FlatArray - def ShapeAsNumpy(self): - o = flatbuffers.number_types.UOffsetTFlags.py_type(self._tab.Offset(4)) - if o != 0: - return self._tab.GetVectorAsNumpy(flatbuffers.number_types.Int64Flags, o) - return 0 - - # FlatArray - def ShapeLength(self): - o = flatbuffers.number_types.UOffsetTFlags.py_type(self._tab.Offset(4)) - if o != 0: - return self._tab.VectorLen(o) - return 0 - - # FlatArray - def Buffer(self, j): - o = flatbuffers.number_types.UOffsetTFlags.py_type(self._tab.Offset(6)) - if o != 0: - a = self._tab.Vector(o) - return self._tab.Get(flatbuffers.number_types.Int8Flags, a + flatbuffers.number_types.UOffsetTFlags.py_type(j * 1)) - return 0 - - # FlatArray - def BufferAsNumpy(self): - o = flatbuffers.number_types.UOffsetTFlags.py_type(self._tab.Offset(6)) - if o != 0: - return self._tab.GetVectorAsNumpy(flatbuffers.number_types.Int8Flags, o) - return 0 - - # FlatArray - def BufferLength(self): - o = flatbuffers.number_types.UOffsetTFlags.py_type(self._tab.Offset(6)) - if o != 0: - return self._tab.VectorLen(o) - return 0 - - # FlatArray - def Dtype(self): - o = flatbuffers.number_types.UOffsetTFlags.py_type(self._tab.Offset(8)) - if o != 0: - return self._tab.Get(flatbuffers.number_types.Int8Flags, o 
+ self._tab.Pos) - return 0 - - # FlatArray - def ByteOrder(self): - o = flatbuffers.number_types.UOffsetTFlags.py_type(self._tab.Offset(10)) - if o != 0: - return self._tab.Get(flatbuffers.number_types.Int8Flags, o + self._tab.Pos) - return 0 - -def FlatArrayStart(builder): builder.StartObject(4) -def FlatArrayAddShape(builder, shape): builder.PrependUOffsetTRelativeSlot(0, flatbuffers.number_types.UOffsetTFlags.py_type(shape), 0) -def FlatArrayStartShapeVector(builder, numElems): return builder.StartVector(8, numElems, 8) -def FlatArrayAddBuffer(builder, buffer): builder.PrependUOffsetTRelativeSlot(1, flatbuffers.number_types.UOffsetTFlags.py_type(buffer), 0) -def FlatArrayStartBufferVector(builder, numElems): return builder.StartVector(1, numElems, 1) -def FlatArrayAddDtype(builder, dtype): builder.PrependInt8Slot(2, dtype, 0) -def FlatArrayAddByteOrder(builder, byteOrder): builder.PrependInt8Slot(3, byteOrder, 0) -def FlatArrayEnd(builder): return builder.EndObject() diff --git a/libnd4j/include/graph/generated/nd4j/graph/FlatArrayList.cs b/libnd4j/include/graph/generated/nd4j/graph/FlatArrayList.cs deleted file mode 100644 index dd2e17d32..000000000 --- a/libnd4j/include/graph/generated/nd4j/graph/FlatArrayList.cs +++ /dev/null @@ -1,42 +0,0 @@ -// -// automatically generated by the FlatBuffers compiler, do not modify -// - -namespace sd.graph -{ - -using global::System; -using global::FlatBuffers; - -public struct FlatArrayList : IFlatbufferObject -{ - private Table __p; - public ByteBuffer ByteBuffer { get { return __p.bb; } } - public static FlatArrayList GetRootAsFlatArrayList(ByteBuffer _bb) { return GetRootAsFlatArrayList(_bb, new FlatArrayList()); } - public static FlatArrayList GetRootAsFlatArrayList(ByteBuffer _bb, FlatArrayList obj) { return (obj.__assign(_bb.GetInt(_bb.Position) + _bb.Position, _bb)); } - public void __init(int _i, ByteBuffer _bb) { __p.bb_pos = _i; __p.bb = _bb; } - public FlatArrayList __assign(int _i, ByteBuffer _bb) { __init(_i, 
_bb); return this; } - - public FlatArray? List(int j) { int o = __p.__offset(4); return o != 0 ? (FlatArray?)(new FlatArray()).__assign(__p.__indirect(__p.__vector(o) + j * 4), __p.bb) : null; } - public int ListLength { get { int o = __p.__offset(4); return o != 0 ? __p.__vector_len(o) : 0; } } - - public static Offset CreateFlatArrayList(FlatBufferBuilder builder, - VectorOffset listOffset = default(VectorOffset)) { - builder.StartObject(1); - FlatArrayList.AddList(builder, listOffset); - return FlatArrayList.EndFlatArrayList(builder); - } - - public static void StartFlatArrayList(FlatBufferBuilder builder) { builder.StartObject(1); } - public static void AddList(FlatBufferBuilder builder, VectorOffset listOffset) { builder.AddOffset(0, listOffset.Value, 0); } - public static VectorOffset CreateListVector(FlatBufferBuilder builder, Offset[] data) { builder.StartVector(4, data.Length, 4); for (int i = data.Length - 1; i >= 0; i--) builder.AddOffset(data[i].Value); return builder.EndVector(); } - public static VectorOffset CreateListVectorBlock(FlatBufferBuilder builder, Offset[] data) { builder.StartVector(4, data.Length, 4); builder.Add(data); return builder.EndVector(); } - public static void StartListVector(FlatBufferBuilder builder, int numElems) { builder.StartVector(4, numElems, 4); } - public static Offset EndFlatArrayList(FlatBufferBuilder builder) { - int o = builder.EndObject(); - return new Offset(o); - } -}; - - -} diff --git a/libnd4j/include/graph/generated/nd4j/graph/FlatArrayList.java b/libnd4j/include/graph/generated/nd4j/graph/FlatArrayList.java deleted file mode 100644 index 74f1dbd29..000000000 --- a/libnd4j/include/graph/generated/nd4j/graph/FlatArrayList.java +++ /dev/null @@ -1,37 +0,0 @@ -// automatically generated by the FlatBuffers compiler, do not modify - -package nd4j.graph; - -import java.nio.*; -import java.lang.*; -import java.util.*; -import com.google.flatbuffers.*; - -@SuppressWarnings("unused") -public final class FlatArrayList 
extends Table { - public static FlatArrayList getRootAsFlatArrayList(ByteBuffer _bb) { return getRootAsFlatArrayList(_bb, new FlatArrayList()); } - public static FlatArrayList getRootAsFlatArrayList(ByteBuffer _bb, FlatArrayList obj) { _bb.order(ByteOrder.LITTLE_ENDIAN); return (obj.__assign(_bb.getInt(_bb.position()) + _bb.position(), _bb)); } - public void __init(int _i, ByteBuffer _bb) { bb_pos = _i; bb = _bb; } - public FlatArrayList __assign(int _i, ByteBuffer _bb) { __init(_i, _bb); return this; } - - public FlatArray list(int j) { return list(new FlatArray(), j); } - public FlatArray list(FlatArray obj, int j) { int o = __offset(4); return o != 0 ? obj.__assign(__indirect(__vector(o) + j * 4), bb) : null; } - public int listLength() { int o = __offset(4); return o != 0 ? __vector_len(o) : 0; } - - public static int createFlatArrayList(FlatBufferBuilder builder, - int listOffset) { - builder.startObject(1); - FlatArrayList.addList(builder, listOffset); - return FlatArrayList.endFlatArrayList(builder); - } - - public static void startFlatArrayList(FlatBufferBuilder builder) { builder.startObject(1); } - public static void addList(FlatBufferBuilder builder, int listOffset) { builder.addOffset(0, listOffset, 0); } - public static int createListVector(FlatBufferBuilder builder, int[] data) { builder.startVector(4, data.length, 4); for (int i = data.length - 1; i >= 0; i--) builder.addOffset(data[i]); return builder.endVector(); } - public static void startListVector(FlatBufferBuilder builder, int numElems) { builder.startVector(4, numElems, 4); } - public static int endFlatArrayList(FlatBufferBuilder builder) { - int o = builder.endObject(); - return o; - } -} - diff --git a/libnd4j/include/graph/generated/nd4j/graph/FlatArrayList.py b/libnd4j/include/graph/generated/nd4j/graph/FlatArrayList.py deleted file mode 100644 index dc0465bc0..000000000 --- a/libnd4j/include/graph/generated/nd4j/graph/FlatArrayList.py +++ /dev/null @@ -1,58 +0,0 @@ -# /* 
****************************************************************************** -# * -# * -# * This program and the accompanying materials are made available under the -# * terms of the Apache License, Version 2.0 which is available at -# * https://www.apache.org/licenses/LICENSE-2.0. -# * -# * See the NOTICE file distributed with this work for additional -# * information regarding copyright ownership. -# * Unless required by applicable law or agreed to in writing, software -# * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT -# * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the -# * License for the specific language governing permissions and limitations -# * under the License. -# * -# * SPDX-License-Identifier: Apache-2.0 -# ******************************************************************************/ - -import flatbuffers - -class FlatArrayList(object): - __slots__ = ['_tab'] - - @classmethod - def GetRootAsFlatArrayList(cls, buf, offset): - n = flatbuffers.encode.Get(flatbuffers.packer.uoffset, buf, offset) - x = FlatArrayList() - x.Init(buf, n + offset) - return x - - # FlatArrayList - def Init(self, buf, pos): - self._tab = flatbuffers.table.Table(buf, pos) - - # FlatArrayList - def List(self, j): - o = flatbuffers.number_types.UOffsetTFlags.py_type(self._tab.Offset(4)) - if o != 0: - x = self._tab.Vector(o) - x += flatbuffers.number_types.UOffsetTFlags.py_type(j) * 4 - x = self._tab.Indirect(x) - from .FlatArray import FlatArray - obj = FlatArray() - obj.Init(self._tab.Bytes, x) - return obj - return None - - # FlatArrayList - def ListLength(self): - o = flatbuffers.number_types.UOffsetTFlags.py_type(self._tab.Offset(4)) - if o != 0: - return self._tab.VectorLen(o) - return 0 - -def FlatArrayListStart(builder): builder.StartObject(1) -def FlatArrayListAddList(builder, list): builder.PrependUOffsetTRelativeSlot(0, flatbuffers.number_types.UOffsetTFlags.py_type(list), 0) -def 
FlatArrayListStartListVector(builder, numElems): return builder.StartVector(4, numElems, 4) -def FlatArrayListEnd(builder): return builder.EndObject() diff --git a/libnd4j/include/graph/generated/nd4j/graph/FlatConfiguration.cs b/libnd4j/include/graph/generated/nd4j/graph/FlatConfiguration.cs deleted file mode 100644 index 5c2edbf9b..000000000 --- a/libnd4j/include/graph/generated/nd4j/graph/FlatConfiguration.cs +++ /dev/null @@ -1,68 +0,0 @@ -// -// automatically generated by the FlatBuffers compiler, do not modify -// - -namespace sd.graph -{ - -using global::System; -using global::FlatBuffers; - -public struct FlatConfiguration : IFlatbufferObject -{ - private Table __p; - public ByteBuffer ByteBuffer { get { return __p.bb; } } - public static FlatConfiguration GetRootAsFlatConfiguration(ByteBuffer _bb) { return GetRootAsFlatConfiguration(_bb, new FlatConfiguration()); } - public static FlatConfiguration GetRootAsFlatConfiguration(ByteBuffer _bb, FlatConfiguration obj) { return (obj.__assign(_bb.GetInt(_bb.Position) + _bb.Position, _bb)); } - public void __init(int _i, ByteBuffer _bb) { __p.bb_pos = _i; __p.bb = _bb; } - public FlatConfiguration __assign(int _i, ByteBuffer _bb) { __init(_i, _bb); return this; } - - public long Id { get { int o = __p.__offset(4); return o != 0 ? __p.bb.GetLong(o + __p.bb_pos) : (long)0; } } - public ExecutionMode ExecutionMode { get { int o = __p.__offset(6); return o != 0 ? (ExecutionMode)__p.bb.GetSbyte(o + __p.bb_pos) : ExecutionMode.SEQUENTIAL; } } - public ProfilingMode ProfilingMode { get { int o = __p.__offset(8); return o != 0 ? (ProfilingMode)__p.bb.GetSbyte(o + __p.bb_pos) : ProfilingMode.NONE; } } - public OutputMode OutputMode { get { int o = __p.__offset(10); return o != 0 ? (OutputMode)__p.bb.GetSbyte(o + __p.bb_pos) : OutputMode.IMPLICIT; } } - public bool Timestats { get { int o = __p.__offset(12); return o != 0 ? 
0!=__p.bb.Get(o + __p.bb_pos) : (bool)false; } } - public long FootprintForward { get { int o = __p.__offset(14); return o != 0 ? __p.bb.GetLong(o + __p.bb_pos) : (long)0; } } - public long FootprintBackward { get { int o = __p.__offset(16); return o != 0 ? __p.bb.GetLong(o + __p.bb_pos) : (long)0; } } - public Direction Direction { get { int o = __p.__offset(18); return o != 0 ? (Direction)__p.bb.GetSbyte(o + __p.bb_pos) : Direction.FORWARD_ONLY; } } - - public static Offset CreateFlatConfiguration(FlatBufferBuilder builder, - long id = 0, - ExecutionMode executionMode = ExecutionMode.SEQUENTIAL, - ProfilingMode profilingMode = ProfilingMode.NONE, - OutputMode outputMode = OutputMode.IMPLICIT, - bool timestats = false, - long footprintForward = 0, - long footprintBackward = 0, - Direction direction = Direction.FORWARD_ONLY) { - builder.StartObject(8); - FlatConfiguration.AddFootprintBackward(builder, footprintBackward); - FlatConfiguration.AddFootprintForward(builder, footprintForward); - FlatConfiguration.AddId(builder, id); - FlatConfiguration.AddDirection(builder, direction); - FlatConfiguration.AddTimestats(builder, timestats); - FlatConfiguration.AddOutputMode(builder, outputMode); - FlatConfiguration.AddProfilingMode(builder, profilingMode); - FlatConfiguration.AddExecutionMode(builder, executionMode); - return FlatConfiguration.EndFlatConfiguration(builder); - } - - public static void StartFlatConfiguration(FlatBufferBuilder builder) { builder.StartObject(8); } - public static void AddId(FlatBufferBuilder builder, long id) { builder.AddLong(0, id, 0); } - public static void AddExecutionMode(FlatBufferBuilder builder, ExecutionMode executionMode) { builder.AddSbyte(1, (sbyte)executionMode, 0); } - public static void AddProfilingMode(FlatBufferBuilder builder, ProfilingMode profilingMode) { builder.AddSbyte(2, (sbyte)profilingMode, 0); } - public static void AddOutputMode(FlatBufferBuilder builder, OutputMode outputMode) { builder.AddSbyte(3, 
(sbyte)outputMode, 0); } - public static void AddTimestats(FlatBufferBuilder builder, bool timestats) { builder.AddBool(4, timestats, false); } - public static void AddFootprintForward(FlatBufferBuilder builder, long footprintForward) { builder.AddLong(5, footprintForward, 0); } - public static void AddFootprintBackward(FlatBufferBuilder builder, long footprintBackward) { builder.AddLong(6, footprintBackward, 0); } - public static void AddDirection(FlatBufferBuilder builder, Direction direction) { builder.AddSbyte(7, (sbyte)direction, 0); } - public static Offset EndFlatConfiguration(FlatBufferBuilder builder) { - int o = builder.EndObject(); - return new Offset(o); - } - public static void FinishFlatConfigurationBuffer(FlatBufferBuilder builder, Offset offset) { builder.Finish(offset.Value); } - public static void FinishSizePrefixedFlatConfigurationBuffer(FlatBufferBuilder builder, Offset offset) { builder.FinishSizePrefixed(offset.Value); } -}; - - -} diff --git a/libnd4j/include/graph/generated/nd4j/graph/FlatConfiguration.java b/libnd4j/include/graph/generated/nd4j/graph/FlatConfiguration.java deleted file mode 100644 index e104f49d6..000000000 --- a/libnd4j/include/graph/generated/nd4j/graph/FlatConfiguration.java +++ /dev/null @@ -1,63 +0,0 @@ -// automatically generated by the FlatBuffers compiler, do not modify - -package nd4j.graph; - -import java.nio.*; -import java.lang.*; -import java.util.*; -import com.google.flatbuffers.*; - -@SuppressWarnings("unused") -public final class FlatConfiguration extends Table { - public static FlatConfiguration getRootAsFlatConfiguration(ByteBuffer _bb) { return getRootAsFlatConfiguration(_bb, new FlatConfiguration()); } - public static FlatConfiguration getRootAsFlatConfiguration(ByteBuffer _bb, FlatConfiguration obj) { _bb.order(ByteOrder.LITTLE_ENDIAN); return (obj.__assign(_bb.getInt(_bb.position()) + _bb.position(), _bb)); } - public void __init(int _i, ByteBuffer _bb) { bb_pos = _i; bb = _bb; } - public 
FlatConfiguration __assign(int _i, ByteBuffer _bb) { __init(_i, _bb); return this; } - - public long id() { int o = __offset(4); return o != 0 ? bb.getLong(o + bb_pos) : 0L; } - public byte executionMode() { int o = __offset(6); return o != 0 ? bb.get(o + bb_pos) : 0; } - public byte profilingMode() { int o = __offset(8); return o != 0 ? bb.get(o + bb_pos) : 0; } - public byte outputMode() { int o = __offset(10); return o != 0 ? bb.get(o + bb_pos) : 0; } - public boolean timestats() { int o = __offset(12); return o != 0 ? 0!=bb.get(o + bb_pos) : false; } - public long footprintForward() { int o = __offset(14); return o != 0 ? bb.getLong(o + bb_pos) : 0L; } - public long footprintBackward() { int o = __offset(16); return o != 0 ? bb.getLong(o + bb_pos) : 0L; } - public byte direction() { int o = __offset(18); return o != 0 ? bb.get(o + bb_pos) : 0; } - - public static int createFlatConfiguration(FlatBufferBuilder builder, - long id, - byte executionMode, - byte profilingMode, - byte outputMode, - boolean timestats, - long footprintForward, - long footprintBackward, - byte direction) { - builder.startObject(8); - FlatConfiguration.addFootprintBackward(builder, footprintBackward); - FlatConfiguration.addFootprintForward(builder, footprintForward); - FlatConfiguration.addId(builder, id); - FlatConfiguration.addDirection(builder, direction); - FlatConfiguration.addTimestats(builder, timestats); - FlatConfiguration.addOutputMode(builder, outputMode); - FlatConfiguration.addProfilingMode(builder, profilingMode); - FlatConfiguration.addExecutionMode(builder, executionMode); - return FlatConfiguration.endFlatConfiguration(builder); - } - - public static void startFlatConfiguration(FlatBufferBuilder builder) { builder.startObject(8); } - public static void addId(FlatBufferBuilder builder, long id) { builder.addLong(0, id, 0L); } - public static void addExecutionMode(FlatBufferBuilder builder, byte executionMode) { builder.addByte(1, executionMode, 0); } - public static void 
addProfilingMode(FlatBufferBuilder builder, byte profilingMode) { builder.addByte(2, profilingMode, 0); } - public static void addOutputMode(FlatBufferBuilder builder, byte outputMode) { builder.addByte(3, outputMode, 0); } - public static void addTimestats(FlatBufferBuilder builder, boolean timestats) { builder.addBoolean(4, timestats, false); } - public static void addFootprintForward(FlatBufferBuilder builder, long footprintForward) { builder.addLong(5, footprintForward, 0L); } - public static void addFootprintBackward(FlatBufferBuilder builder, long footprintBackward) { builder.addLong(6, footprintBackward, 0L); } - public static void addDirection(FlatBufferBuilder builder, byte direction) { builder.addByte(7, direction, 0); } - public static int endFlatConfiguration(FlatBufferBuilder builder) { - int o = builder.endObject(); - return o; - } - public static void finishFlatConfigurationBuffer(FlatBufferBuilder builder, int offset) { builder.finish(offset); } - public static void finishSizePrefixedFlatConfigurationBuffer(FlatBufferBuilder builder, int offset) { builder.finishSizePrefixed(offset); } -} - diff --git a/libnd4j/include/graph/generated/nd4j/graph/FlatConfiguration.py b/libnd4j/include/graph/generated/nd4j/graph/FlatConfiguration.py deleted file mode 100644 index dac07fdec..000000000 --- a/libnd4j/include/graph/generated/nd4j/graph/FlatConfiguration.py +++ /dev/null @@ -1,100 +0,0 @@ -# /* ****************************************************************************** -# * -# * -# * This program and the accompanying materials are made available under the -# * terms of the Apache License, Version 2.0 which is available at -# * https://www.apache.org/licenses/LICENSE-2.0. -# * -# * See the NOTICE file distributed with this work for additional -# * information regarding copyright ownership. 
-# * Unless required by applicable law or agreed to in writing, software -# * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT -# * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the -# * License for the specific language governing permissions and limitations -# * under the License. -# * -# * SPDX-License-Identifier: Apache-2.0 -# ******************************************************************************/ - -import flatbuffers - -class FlatConfiguration(object): - __slots__ = ['_tab'] - - @classmethod - def GetRootAsFlatConfiguration(cls, buf, offset): - n = flatbuffers.encode.Get(flatbuffers.packer.uoffset, buf, offset) - x = FlatConfiguration() - x.Init(buf, n + offset) - return x - - # FlatConfiguration - def Init(self, buf, pos): - self._tab = flatbuffers.table.Table(buf, pos) - - # FlatConfiguration - def Id(self): - o = flatbuffers.number_types.UOffsetTFlags.py_type(self._tab.Offset(4)) - if o != 0: - return self._tab.Get(flatbuffers.number_types.Int64Flags, o + self._tab.Pos) - return 0 - - # FlatConfiguration - def ExecutionMode(self): - o = flatbuffers.number_types.UOffsetTFlags.py_type(self._tab.Offset(6)) - if o != 0: - return self._tab.Get(flatbuffers.number_types.Int8Flags, o + self._tab.Pos) - return 0 - - # FlatConfiguration - def ProfilingMode(self): - o = flatbuffers.number_types.UOffsetTFlags.py_type(self._tab.Offset(8)) - if o != 0: - return self._tab.Get(flatbuffers.number_types.Int8Flags, o + self._tab.Pos) - return 0 - - # FlatConfiguration - def OutputMode(self): - o = flatbuffers.number_types.UOffsetTFlags.py_type(self._tab.Offset(10)) - if o != 0: - return self._tab.Get(flatbuffers.number_types.Int8Flags, o + self._tab.Pos) - return 0 - - # FlatConfiguration - def Timestats(self): - o = flatbuffers.number_types.UOffsetTFlags.py_type(self._tab.Offset(12)) - if o != 0: - return bool(self._tab.Get(flatbuffers.number_types.BoolFlags, o + self._tab.Pos)) - return False - - # FlatConfiguration - 
def FootprintForward(self): - o = flatbuffers.number_types.UOffsetTFlags.py_type(self._tab.Offset(14)) - if o != 0: - return self._tab.Get(flatbuffers.number_types.Int64Flags, o + self._tab.Pos) - return 0 - - # FlatConfiguration - def FootprintBackward(self): - o = flatbuffers.number_types.UOffsetTFlags.py_type(self._tab.Offset(16)) - if o != 0: - return self._tab.Get(flatbuffers.number_types.Int64Flags, o + self._tab.Pos) - return 0 - - # FlatConfiguration - def Direction(self): - o = flatbuffers.number_types.UOffsetTFlags.py_type(self._tab.Offset(18)) - if o != 0: - return self._tab.Get(flatbuffers.number_types.Int8Flags, o + self._tab.Pos) - return 0 - -def FlatConfigurationStart(builder): builder.StartObject(8) -def FlatConfigurationAddId(builder, id): builder.PrependInt64Slot(0, id, 0) -def FlatConfigurationAddExecutionMode(builder, executionMode): builder.PrependInt8Slot(1, executionMode, 0) -def FlatConfigurationAddProfilingMode(builder, profilingMode): builder.PrependInt8Slot(2, profilingMode, 0) -def FlatConfigurationAddOutputMode(builder, outputMode): builder.PrependInt8Slot(3, outputMode, 0) -def FlatConfigurationAddTimestats(builder, timestats): builder.PrependBoolSlot(4, timestats, 0) -def FlatConfigurationAddFootprintForward(builder, footprintForward): builder.PrependInt64Slot(5, footprintForward, 0) -def FlatConfigurationAddFootprintBackward(builder, footprintBackward): builder.PrependInt64Slot(6, footprintBackward, 0) -def FlatConfigurationAddDirection(builder, direction): builder.PrependInt8Slot(7, direction, 0) -def FlatConfigurationEnd(builder): return builder.EndObject() diff --git a/libnd4j/include/graph/generated/nd4j/graph/FlatDropRequest.cs b/libnd4j/include/graph/generated/nd4j/graph/FlatDropRequest.cs deleted file mode 100644 index 6d2f5f68b..000000000 --- a/libnd4j/include/graph/generated/nd4j/graph/FlatDropRequest.cs +++ /dev/null @@ -1,38 +0,0 @@ -// -// automatically generated by the FlatBuffers compiler, do not modify -// - 
-namespace sd.graph -{ - -using global::System; -using global::FlatBuffers; - -public struct FlatDropRequest : IFlatbufferObject -{ - private Table __p; - public ByteBuffer ByteBuffer { get { return __p.bb; } } - public static FlatDropRequest GetRootAsFlatDropRequest(ByteBuffer _bb) { return GetRootAsFlatDropRequest(_bb, new FlatDropRequest()); } - public static FlatDropRequest GetRootAsFlatDropRequest(ByteBuffer _bb, FlatDropRequest obj) { return (obj.__assign(_bb.GetInt(_bb.Position) + _bb.Position, _bb)); } - public void __init(int _i, ByteBuffer _bb) { __p.bb_pos = _i; __p.bb = _bb; } - public FlatDropRequest __assign(int _i, ByteBuffer _bb) { __init(_i, _bb); return this; } - - public long Id { get { int o = __p.__offset(4); return o != 0 ? __p.bb.GetLong(o + __p.bb_pos) : (long)0; } } - - public static Offset CreateFlatDropRequest(FlatBufferBuilder builder, - long id = 0) { - builder.StartObject(1); - FlatDropRequest.AddId(builder, id); - return FlatDropRequest.EndFlatDropRequest(builder); - } - - public static void StartFlatDropRequest(FlatBufferBuilder builder) { builder.StartObject(1); } - public static void AddId(FlatBufferBuilder builder, long id) { builder.AddLong(0, id, 0); } - public static Offset EndFlatDropRequest(FlatBufferBuilder builder) { - int o = builder.EndObject(); - return new Offset(o); - } -}; - - -} diff --git a/libnd4j/include/graph/generated/nd4j/graph/FlatDropRequest.java b/libnd4j/include/graph/generated/nd4j/graph/FlatDropRequest.java deleted file mode 100644 index 548722be3..000000000 --- a/libnd4j/include/graph/generated/nd4j/graph/FlatDropRequest.java +++ /dev/null @@ -1,33 +0,0 @@ -// automatically generated by the FlatBuffers compiler, do not modify - -package nd4j.graph; - -import java.nio.*; -import java.lang.*; -import java.util.*; -import com.google.flatbuffers.*; - -@SuppressWarnings("unused") -public final class FlatDropRequest extends Table { - public static FlatDropRequest getRootAsFlatDropRequest(ByteBuffer _bb) { 
return getRootAsFlatDropRequest(_bb, new FlatDropRequest()); } - public static FlatDropRequest getRootAsFlatDropRequest(ByteBuffer _bb, FlatDropRequest obj) { _bb.order(ByteOrder.LITTLE_ENDIAN); return (obj.__assign(_bb.getInt(_bb.position()) + _bb.position(), _bb)); } - public void __init(int _i, ByteBuffer _bb) { bb_pos = _i; bb = _bb; } - public FlatDropRequest __assign(int _i, ByteBuffer _bb) { __init(_i, _bb); return this; } - - public long id() { int o = __offset(4); return o != 0 ? bb.getLong(o + bb_pos) : 0L; } - - public static int createFlatDropRequest(FlatBufferBuilder builder, - long id) { - builder.startObject(1); - FlatDropRequest.addId(builder, id); - return FlatDropRequest.endFlatDropRequest(builder); - } - - public static void startFlatDropRequest(FlatBufferBuilder builder) { builder.startObject(1); } - public static void addId(FlatBufferBuilder builder, long id) { builder.addLong(0, id, 0L); } - public static int endFlatDropRequest(FlatBufferBuilder builder) { - int o = builder.endObject(); - return o; - } -} - diff --git a/libnd4j/include/graph/generated/nd4j/graph/FlatDropRequest.py b/libnd4j/include/graph/generated/nd4j/graph/FlatDropRequest.py deleted file mode 100644 index 25da10bc5..000000000 --- a/libnd4j/include/graph/generated/nd4j/graph/FlatDropRequest.py +++ /dev/null @@ -1,44 +0,0 @@ -# /* ****************************************************************************** -# * -# * -# * This program and the accompanying materials are made available under the -# * terms of the Apache License, Version 2.0 which is available at -# * https://www.apache.org/licenses/LICENSE-2.0. -# * -# * See the NOTICE file distributed with this work for additional -# * information regarding copyright ownership. -# * Unless required by applicable law or agreed to in writing, software -# * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT -# * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. 
See the -# * License for the specific language governing permissions and limitations -# * under the License. -# * -# * SPDX-License-Identifier: Apache-2.0 -# ******************************************************************************/ - -import flatbuffers - -class FlatDropRequest(object): - __slots__ = ['_tab'] - - @classmethod - def GetRootAsFlatDropRequest(cls, buf, offset): - n = flatbuffers.encode.Get(flatbuffers.packer.uoffset, buf, offset) - x = FlatDropRequest() - x.Init(buf, n + offset) - return x - - # FlatDropRequest - def Init(self, buf, pos): - self._tab = flatbuffers.table.Table(buf, pos) - - # FlatDropRequest - def Id(self): - o = flatbuffers.number_types.UOffsetTFlags.py_type(self._tab.Offset(4)) - if o != 0: - return self._tab.Get(flatbuffers.number_types.Int64Flags, o + self._tab.Pos) - return 0 - -def FlatDropRequestStart(builder): builder.StartObject(1) -def FlatDropRequestAddId(builder, id): builder.PrependInt64Slot(0, id, 0) -def FlatDropRequestEnd(builder): return builder.EndObject() diff --git a/libnd4j/include/graph/generated/nd4j/graph/FlatGraph.cs b/libnd4j/include/graph/generated/nd4j/graph/FlatGraph.cs deleted file mode 100644 index 15978e301..000000000 --- a/libnd4j/include/graph/generated/nd4j/graph/FlatGraph.cs +++ /dev/null @@ -1,102 +0,0 @@ -// -// automatically generated by the FlatBuffers compiler, do not modify -// - -namespace sd.graph -{ - -using global::System; -using global::FlatBuffers; - -public struct FlatGraph : IFlatbufferObject -{ - private Table __p; - public ByteBuffer ByteBuffer { get { return __p.bb; } } - public static FlatGraph GetRootAsFlatGraph(ByteBuffer _bb) { return GetRootAsFlatGraph(_bb, new FlatGraph()); } - public static FlatGraph GetRootAsFlatGraph(ByteBuffer _bb, FlatGraph obj) { return (obj.__assign(_bb.GetInt(_bb.Position) + _bb.Position, _bb)); } - public void __init(int _i, ByteBuffer _bb) { __p.bb_pos = _i; __p.bb = _bb; } - public FlatGraph __assign(int _i, ByteBuffer _bb) { __init(_i, _bb); 
return this; } - - public long Id { get { int o = __p.__offset(4); return o != 0 ? __p.bb.GetLong(o + __p.bb_pos) : (long)0; } } - public FlatVariable? Variables(int j) { int o = __p.__offset(6); return o != 0 ? (FlatVariable?)(new FlatVariable()).__assign(__p.__indirect(__p.__vector(o) + j * 4), __p.bb) : null; } - public int VariablesLength { get { int o = __p.__offset(6); return o != 0 ? __p.__vector_len(o) : 0; } } - public FlatNode? Nodes(int j) { int o = __p.__offset(8); return o != 0 ? (FlatNode?)(new FlatNode()).__assign(__p.__indirect(__p.__vector(o) + j * 4), __p.bb) : null; } - public int NodesLength { get { int o = __p.__offset(8); return o != 0 ? __p.__vector_len(o) : 0; } } - public IntPair? Outputs(int j) { int o = __p.__offset(10); return o != 0 ? (IntPair?)(new IntPair()).__assign(__p.__indirect(__p.__vector(o) + j * 4), __p.bb) : null; } - public int OutputsLength { get { int o = __p.__offset(10); return o != 0 ? __p.__vector_len(o) : 0; } } - public FlatConfiguration? Configuration { get { int o = __p.__offset(12); return o != 0 ? (FlatConfiguration?)(new FlatConfiguration()).__assign(__p.__indirect(o + __p.bb_pos), __p.bb) : null; } } - public string Placeholders(int j) { int o = __p.__offset(14); return o != 0 ? __p.__string(__p.__vector(o) + j * 4) : null; } - public int PlaceholdersLength { get { int o = __p.__offset(14); return o != 0 ? __p.__vector_len(o) : 0; } } - public string LossVariables(int j) { int o = __p.__offset(16); return o != 0 ? __p.__string(__p.__vector(o) + j * 4) : null; } - public int LossVariablesLength { get { int o = __p.__offset(16); return o != 0 ? __p.__vector_len(o) : 0; } } - public string TrainingConfig { get { int o = __p.__offset(18); return o != 0 ? __p.__string(o + __p.bb_pos) : null; } } -#if ENABLE_SPAN_T - public Span GetTrainingConfigBytes() { return __p.__vector_as_span(18); } -#else - public ArraySegment? 
GetTrainingConfigBytes() { return __p.__vector_as_arraysegment(18); } -#endif - public byte[] GetTrainingConfigArray() { return __p.__vector_as_array(18); } - public UpdaterState? UpdaterState(int j) { int o = __p.__offset(20); return o != 0 ? (UpdaterState?)(new UpdaterState()).__assign(__p.__indirect(__p.__vector(o) + j * 4), __p.bb) : null; } - public int UpdaterStateLength { get { int o = __p.__offset(20); return o != 0 ? __p.__vector_len(o) : 0; } } - - public static Offset CreateFlatGraph(FlatBufferBuilder builder, - long id = 0, - VectorOffset variablesOffset = default(VectorOffset), - VectorOffset nodesOffset = default(VectorOffset), - VectorOffset outputsOffset = default(VectorOffset), - Offset configurationOffset = default(Offset), - VectorOffset placeholdersOffset = default(VectorOffset), - VectorOffset lossVariablesOffset = default(VectorOffset), - StringOffset trainingConfigOffset = default(StringOffset), - VectorOffset updaterStateOffset = default(VectorOffset)) { - builder.StartObject(9); - FlatGraph.AddId(builder, id); - FlatGraph.AddUpdaterState(builder, updaterStateOffset); - FlatGraph.AddTrainingConfig(builder, trainingConfigOffset); - FlatGraph.AddLossVariables(builder, lossVariablesOffset); - FlatGraph.AddPlaceholders(builder, placeholdersOffset); - FlatGraph.AddConfiguration(builder, configurationOffset); - FlatGraph.AddOutputs(builder, outputsOffset); - FlatGraph.AddNodes(builder, nodesOffset); - FlatGraph.AddVariables(builder, variablesOffset); - return FlatGraph.EndFlatGraph(builder); - } - - public static void StartFlatGraph(FlatBufferBuilder builder) { builder.StartObject(9); } - public static void AddId(FlatBufferBuilder builder, long id) { builder.AddLong(0, id, 0); } - public static void AddVariables(FlatBufferBuilder builder, VectorOffset variablesOffset) { builder.AddOffset(1, variablesOffset.Value, 0); } - public static VectorOffset CreateVariablesVector(FlatBufferBuilder builder, Offset[] data) { builder.StartVector(4, data.Length, 
4); for (int i = data.Length - 1; i >= 0; i--) builder.AddOffset(data[i].Value); return builder.EndVector(); } - public static VectorOffset CreateVariablesVectorBlock(FlatBufferBuilder builder, Offset[] data) { builder.StartVector(4, data.Length, 4); builder.Add(data); return builder.EndVector(); } - public static void StartVariablesVector(FlatBufferBuilder builder, int numElems) { builder.StartVector(4, numElems, 4); } - public static void AddNodes(FlatBufferBuilder builder, VectorOffset nodesOffset) { builder.AddOffset(2, nodesOffset.Value, 0); } - public static VectorOffset CreateNodesVector(FlatBufferBuilder builder, Offset[] data) { builder.StartVector(4, data.Length, 4); for (int i = data.Length - 1; i >= 0; i--) builder.AddOffset(data[i].Value); return builder.EndVector(); } - public static VectorOffset CreateNodesVectorBlock(FlatBufferBuilder builder, Offset[] data) { builder.StartVector(4, data.Length, 4); builder.Add(data); return builder.EndVector(); } - public static void StartNodesVector(FlatBufferBuilder builder, int numElems) { builder.StartVector(4, numElems, 4); } - public static void AddOutputs(FlatBufferBuilder builder, VectorOffset outputsOffset) { builder.AddOffset(3, outputsOffset.Value, 0); } - public static VectorOffset CreateOutputsVector(FlatBufferBuilder builder, Offset[] data) { builder.StartVector(4, data.Length, 4); for (int i = data.Length - 1; i >= 0; i--) builder.AddOffset(data[i].Value); return builder.EndVector(); } - public static VectorOffset CreateOutputsVectorBlock(FlatBufferBuilder builder, Offset[] data) { builder.StartVector(4, data.Length, 4); builder.Add(data); return builder.EndVector(); } - public static void StartOutputsVector(FlatBufferBuilder builder, int numElems) { builder.StartVector(4, numElems, 4); } - public static void AddConfiguration(FlatBufferBuilder builder, Offset configurationOffset) { builder.AddOffset(4, configurationOffset.Value, 0); } - public static void AddPlaceholders(FlatBufferBuilder builder, 
VectorOffset placeholdersOffset) { builder.AddOffset(5, placeholdersOffset.Value, 0); } - public static VectorOffset CreatePlaceholdersVector(FlatBufferBuilder builder, StringOffset[] data) { builder.StartVector(4, data.Length, 4); for (int i = data.Length - 1; i >= 0; i--) builder.AddOffset(data[i].Value); return builder.EndVector(); } - public static VectorOffset CreatePlaceholdersVectorBlock(FlatBufferBuilder builder, StringOffset[] data) { builder.StartVector(4, data.Length, 4); builder.Add(data); return builder.EndVector(); } - public static void StartPlaceholdersVector(FlatBufferBuilder builder, int numElems) { builder.StartVector(4, numElems, 4); } - public static void AddLossVariables(FlatBufferBuilder builder, VectorOffset lossVariablesOffset) { builder.AddOffset(6, lossVariablesOffset.Value, 0); } - public static VectorOffset CreateLossVariablesVector(FlatBufferBuilder builder, StringOffset[] data) { builder.StartVector(4, data.Length, 4); for (int i = data.Length - 1; i >= 0; i--) builder.AddOffset(data[i].Value); return builder.EndVector(); } - public static VectorOffset CreateLossVariablesVectorBlock(FlatBufferBuilder builder, StringOffset[] data) { builder.StartVector(4, data.Length, 4); builder.Add(data); return builder.EndVector(); } - public static void StartLossVariablesVector(FlatBufferBuilder builder, int numElems) { builder.StartVector(4, numElems, 4); } - public static void AddTrainingConfig(FlatBufferBuilder builder, StringOffset trainingConfigOffset) { builder.AddOffset(7, trainingConfigOffset.Value, 0); } - public static void AddUpdaterState(FlatBufferBuilder builder, VectorOffset updaterStateOffset) { builder.AddOffset(8, updaterStateOffset.Value, 0); } - public static VectorOffset CreateUpdaterStateVector(FlatBufferBuilder builder, Offset[] data) { builder.StartVector(4, data.Length, 4); for (int i = data.Length - 1; i >= 0; i--) builder.AddOffset(data[i].Value); return builder.EndVector(); } - public static VectorOffset 
CreateUpdaterStateVectorBlock(FlatBufferBuilder builder, Offset[] data) { builder.StartVector(4, data.Length, 4); builder.Add(data); return builder.EndVector(); } - public static void StartUpdaterStateVector(FlatBufferBuilder builder, int numElems) { builder.StartVector(4, numElems, 4); } - public static Offset EndFlatGraph(FlatBufferBuilder builder) { - int o = builder.EndObject(); - return new Offset(o); - } - public static void FinishFlatGraphBuffer(FlatBufferBuilder builder, Offset offset) { builder.Finish(offset.Value); } - public static void FinishSizePrefixedFlatGraphBuffer(FlatBufferBuilder builder, Offset offset) { builder.FinishSizePrefixed(offset.Value); } -}; - - -} diff --git a/libnd4j/include/graph/generated/nd4j/graph/FlatGraph.java b/libnd4j/include/graph/generated/nd4j/graph/FlatGraph.java deleted file mode 100644 index 660d9e431..000000000 --- a/libnd4j/include/graph/generated/nd4j/graph/FlatGraph.java +++ /dev/null @@ -1,92 +0,0 @@ -// automatically generated by the FlatBuffers compiler, do not modify - -package nd4j.graph; - -import java.nio.*; -import java.lang.*; -import java.util.*; -import com.google.flatbuffers.*; - -@SuppressWarnings("unused") -public final class FlatGraph extends Table { - public static FlatGraph getRootAsFlatGraph(ByteBuffer _bb) { return getRootAsFlatGraph(_bb, new FlatGraph()); } - public static FlatGraph getRootAsFlatGraph(ByteBuffer _bb, FlatGraph obj) { _bb.order(ByteOrder.LITTLE_ENDIAN); return (obj.__assign(_bb.getInt(_bb.position()) + _bb.position(), _bb)); } - public void __init(int _i, ByteBuffer _bb) { bb_pos = _i; bb = _bb; } - public FlatGraph __assign(int _i, ByteBuffer _bb) { __init(_i, _bb); return this; } - - public long id() { int o = __offset(4); return o != 0 ? bb.getLong(o + bb_pos) : 0L; } - public FlatVariable variables(int j) { return variables(new FlatVariable(), j); } - public FlatVariable variables(FlatVariable obj, int j) { int o = __offset(6); return o != 0 ? 
obj.__assign(__indirect(__vector(o) + j * 4), bb) : null; } - public int variablesLength() { int o = __offset(6); return o != 0 ? __vector_len(o) : 0; } - public FlatNode nodes(int j) { return nodes(new FlatNode(), j); } - public FlatNode nodes(FlatNode obj, int j) { int o = __offset(8); return o != 0 ? obj.__assign(__indirect(__vector(o) + j * 4), bb) : null; } - public int nodesLength() { int o = __offset(8); return o != 0 ? __vector_len(o) : 0; } - public IntPair outputs(int j) { return outputs(new IntPair(), j); } - public IntPair outputs(IntPair obj, int j) { int o = __offset(10); return o != 0 ? obj.__assign(__indirect(__vector(o) + j * 4), bb) : null; } - public int outputsLength() { int o = __offset(10); return o != 0 ? __vector_len(o) : 0; } - public FlatConfiguration configuration() { return configuration(new FlatConfiguration()); } - public FlatConfiguration configuration(FlatConfiguration obj) { int o = __offset(12); return o != 0 ? obj.__assign(__indirect(o + bb_pos), bb) : null; } - public String placeholders(int j) { int o = __offset(14); return o != 0 ? __string(__vector(o) + j * 4) : null; } - public int placeholdersLength() { int o = __offset(14); return o != 0 ? __vector_len(o) : 0; } - public String lossVariables(int j) { int o = __offset(16); return o != 0 ? __string(__vector(o) + j * 4) : null; } - public int lossVariablesLength() { int o = __offset(16); return o != 0 ? __vector_len(o) : 0; } - public String trainingConfig() { int o = __offset(18); return o != 0 ? __string(o + bb_pos) : null; } - public ByteBuffer trainingConfigAsByteBuffer() { return __vector_as_bytebuffer(18, 1); } - public ByteBuffer trainingConfigInByteBuffer(ByteBuffer _bb) { return __vector_in_bytebuffer(_bb, 18, 1); } - public UpdaterState updaterState(int j) { return updaterState(new UpdaterState(), j); } - public UpdaterState updaterState(UpdaterState obj, int j) { int o = __offset(20); return o != 0 ? 
obj.__assign(__indirect(__vector(o) + j * 4), bb) : null; } - public int updaterStateLength() { int o = __offset(20); return o != 0 ? __vector_len(o) : 0; } - - public static int createFlatGraph(FlatBufferBuilder builder, - long id, - int variablesOffset, - int nodesOffset, - int outputsOffset, - int configurationOffset, - int placeholdersOffset, - int lossVariablesOffset, - int trainingConfigOffset, - int updaterStateOffset) { - builder.startObject(9); - FlatGraph.addId(builder, id); - FlatGraph.addUpdaterState(builder, updaterStateOffset); - FlatGraph.addTrainingConfig(builder, trainingConfigOffset); - FlatGraph.addLossVariables(builder, lossVariablesOffset); - FlatGraph.addPlaceholders(builder, placeholdersOffset); - FlatGraph.addConfiguration(builder, configurationOffset); - FlatGraph.addOutputs(builder, outputsOffset); - FlatGraph.addNodes(builder, nodesOffset); - FlatGraph.addVariables(builder, variablesOffset); - return FlatGraph.endFlatGraph(builder); - } - - public static void startFlatGraph(FlatBufferBuilder builder) { builder.startObject(9); } - public static void addId(FlatBufferBuilder builder, long id) { builder.addLong(0, id, 0L); } - public static void addVariables(FlatBufferBuilder builder, int variablesOffset) { builder.addOffset(1, variablesOffset, 0); } - public static int createVariablesVector(FlatBufferBuilder builder, int[] data) { builder.startVector(4, data.length, 4); for (int i = data.length - 1; i >= 0; i--) builder.addOffset(data[i]); return builder.endVector(); } - public static void startVariablesVector(FlatBufferBuilder builder, int numElems) { builder.startVector(4, numElems, 4); } - public static void addNodes(FlatBufferBuilder builder, int nodesOffset) { builder.addOffset(2, nodesOffset, 0); } - public static int createNodesVector(FlatBufferBuilder builder, int[] data) { builder.startVector(4, data.length, 4); for (int i = data.length - 1; i >= 0; i--) builder.addOffset(data[i]); return builder.endVector(); } - public static void 
startNodesVector(FlatBufferBuilder builder, int numElems) { builder.startVector(4, numElems, 4); } - public static void addOutputs(FlatBufferBuilder builder, int outputsOffset) { builder.addOffset(3, outputsOffset, 0); } - public static int createOutputsVector(FlatBufferBuilder builder, int[] data) { builder.startVector(4, data.length, 4); for (int i = data.length - 1; i >= 0; i--) builder.addOffset(data[i]); return builder.endVector(); } - public static void startOutputsVector(FlatBufferBuilder builder, int numElems) { builder.startVector(4, numElems, 4); } - public static void addConfiguration(FlatBufferBuilder builder, int configurationOffset) { builder.addOffset(4, configurationOffset, 0); } - public static void addPlaceholders(FlatBufferBuilder builder, int placeholdersOffset) { builder.addOffset(5, placeholdersOffset, 0); } - public static int createPlaceholdersVector(FlatBufferBuilder builder, int[] data) { builder.startVector(4, data.length, 4); for (int i = data.length - 1; i >= 0; i--) builder.addOffset(data[i]); return builder.endVector(); } - public static void startPlaceholdersVector(FlatBufferBuilder builder, int numElems) { builder.startVector(4, numElems, 4); } - public static void addLossVariables(FlatBufferBuilder builder, int lossVariablesOffset) { builder.addOffset(6, lossVariablesOffset, 0); } - public static int createLossVariablesVector(FlatBufferBuilder builder, int[] data) { builder.startVector(4, data.length, 4); for (int i = data.length - 1; i >= 0; i--) builder.addOffset(data[i]); return builder.endVector(); } - public static void startLossVariablesVector(FlatBufferBuilder builder, int numElems) { builder.startVector(4, numElems, 4); } - public static void addTrainingConfig(FlatBufferBuilder builder, int trainingConfigOffset) { builder.addOffset(7, trainingConfigOffset, 0); } - public static void addUpdaterState(FlatBufferBuilder builder, int updaterStateOffset) { builder.addOffset(8, updaterStateOffset, 0); } - public static int 
createUpdaterStateVector(FlatBufferBuilder builder, int[] data) { builder.startVector(4, data.length, 4); for (int i = data.length - 1; i >= 0; i--) builder.addOffset(data[i]); return builder.endVector(); } - public static void startUpdaterStateVector(FlatBufferBuilder builder, int numElems) { builder.startVector(4, numElems, 4); } - public static int endFlatGraph(FlatBufferBuilder builder) { - int o = builder.endObject(); - return o; - } - public static void finishFlatGraphBuffer(FlatBufferBuilder builder, int offset) { builder.finish(offset); } - public static void finishSizePrefixedFlatGraphBuffer(FlatBufferBuilder builder, int offset) { builder.finishSizePrefixed(offset); } -} - diff --git a/libnd4j/include/graph/generated/nd4j/graph/FlatGraph.py b/libnd4j/include/graph/generated/nd4j/graph/FlatGraph.py deleted file mode 100644 index 1a89489ab..000000000 --- a/libnd4j/include/graph/generated/nd4j/graph/FlatGraph.py +++ /dev/null @@ -1,186 +0,0 @@ -# /* ****************************************************************************** -# * -# * -# * This program and the accompanying materials are made available under the -# * terms of the Apache License, Version 2.0 which is available at -# * https://www.apache.org/licenses/LICENSE-2.0. -# * -# * See the NOTICE file distributed with this work for additional -# * information regarding copyright ownership. -# * Unless required by applicable law or agreed to in writing, software -# * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT -# * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the -# * License for the specific language governing permissions and limitations -# * under the License. 
-# * -# * SPDX-License-Identifier: Apache-2.0 -# ******************************************************************************/ - -import flatbuffers - -class FlatGraph(object): - __slots__ = ['_tab'] - - @classmethod - def GetRootAsFlatGraph(cls, buf, offset): - n = flatbuffers.encode.Get(flatbuffers.packer.uoffset, buf, offset) - x = FlatGraph() - x.Init(buf, n + offset) - return x - - # FlatGraph - def Init(self, buf, pos): - self._tab = flatbuffers.table.Table(buf, pos) - - # FlatGraph - def Id(self): - o = flatbuffers.number_types.UOffsetTFlags.py_type(self._tab.Offset(4)) - if o != 0: - return self._tab.Get(flatbuffers.number_types.Int64Flags, o + self._tab.Pos) - return 0 - - # FlatGraph - def Variables(self, j): - o = flatbuffers.number_types.UOffsetTFlags.py_type(self._tab.Offset(6)) - if o != 0: - x = self._tab.Vector(o) - x += flatbuffers.number_types.UOffsetTFlags.py_type(j) * 4 - x = self._tab.Indirect(x) - from .FlatVariable import FlatVariable - obj = FlatVariable() - obj.Init(self._tab.Bytes, x) - return obj - return None - - # FlatGraph - def VariablesLength(self): - o = flatbuffers.number_types.UOffsetTFlags.py_type(self._tab.Offset(6)) - if o != 0: - return self._tab.VectorLen(o) - return 0 - - # FlatGraph - def Nodes(self, j): - o = flatbuffers.number_types.UOffsetTFlags.py_type(self._tab.Offset(8)) - if o != 0: - x = self._tab.Vector(o) - x += flatbuffers.number_types.UOffsetTFlags.py_type(j) * 4 - x = self._tab.Indirect(x) - from .FlatNode import FlatNode - obj = FlatNode() - obj.Init(self._tab.Bytes, x) - return obj - return None - - # FlatGraph - def NodesLength(self): - o = flatbuffers.number_types.UOffsetTFlags.py_type(self._tab.Offset(8)) - if o != 0: - return self._tab.VectorLen(o) - return 0 - - # FlatGraph - def Outputs(self, j): - o = flatbuffers.number_types.UOffsetTFlags.py_type(self._tab.Offset(10)) - if o != 0: - x = self._tab.Vector(o) - x += flatbuffers.number_types.UOffsetTFlags.py_type(j) * 4 - x = self._tab.Indirect(x) - 
from .IntPair import IntPair - obj = IntPair() - obj.Init(self._tab.Bytes, x) - return obj - return None - - # FlatGraph - def OutputsLength(self): - o = flatbuffers.number_types.UOffsetTFlags.py_type(self._tab.Offset(10)) - if o != 0: - return self._tab.VectorLen(o) - return 0 - - # FlatGraph - def Configuration(self): - o = flatbuffers.number_types.UOffsetTFlags.py_type(self._tab.Offset(12)) - if o != 0: - x = self._tab.Indirect(o + self._tab.Pos) - from .FlatConfiguration import FlatConfiguration - obj = FlatConfiguration() - obj.Init(self._tab.Bytes, x) - return obj - return None - - # FlatGraph - def Placeholders(self, j): - o = flatbuffers.number_types.UOffsetTFlags.py_type(self._tab.Offset(14)) - if o != 0: - a = self._tab.Vector(o) - return self._tab.String(a + flatbuffers.number_types.UOffsetTFlags.py_type(j * 4)) - return "" - - # FlatGraph - def PlaceholdersLength(self): - o = flatbuffers.number_types.UOffsetTFlags.py_type(self._tab.Offset(14)) - if o != 0: - return self._tab.VectorLen(o) - return 0 - - # FlatGraph - def LossVariables(self, j): - o = flatbuffers.number_types.UOffsetTFlags.py_type(self._tab.Offset(16)) - if o != 0: - a = self._tab.Vector(o) - return self._tab.String(a + flatbuffers.number_types.UOffsetTFlags.py_type(j * 4)) - return "" - - # FlatGraph - def LossVariablesLength(self): - o = flatbuffers.number_types.UOffsetTFlags.py_type(self._tab.Offset(16)) - if o != 0: - return self._tab.VectorLen(o) - return 0 - - # FlatGraph - def TrainingConfig(self): - o = flatbuffers.number_types.UOffsetTFlags.py_type(self._tab.Offset(18)) - if o != 0: - return self._tab.String(o + self._tab.Pos) - return None - - # FlatGraph - def UpdaterState(self, j): - o = flatbuffers.number_types.UOffsetTFlags.py_type(self._tab.Offset(20)) - if o != 0: - x = self._tab.Vector(o) - x += flatbuffers.number_types.UOffsetTFlags.py_type(j) * 4 - x = self._tab.Indirect(x) - from .UpdaterState import UpdaterState - obj = UpdaterState() - obj.Init(self._tab.Bytes, x) - 
return obj - return None - - # FlatGraph - def UpdaterStateLength(self): - o = flatbuffers.number_types.UOffsetTFlags.py_type(self._tab.Offset(20)) - if o != 0: - return self._tab.VectorLen(o) - return 0 - -def FlatGraphStart(builder): builder.StartObject(9) -def FlatGraphAddId(builder, id): builder.PrependInt64Slot(0, id, 0) -def FlatGraphAddVariables(builder, variables): builder.PrependUOffsetTRelativeSlot(1, flatbuffers.number_types.UOffsetTFlags.py_type(variables), 0) -def FlatGraphStartVariablesVector(builder, numElems): return builder.StartVector(4, numElems, 4) -def FlatGraphAddNodes(builder, nodes): builder.PrependUOffsetTRelativeSlot(2, flatbuffers.number_types.UOffsetTFlags.py_type(nodes), 0) -def FlatGraphStartNodesVector(builder, numElems): return builder.StartVector(4, numElems, 4) -def FlatGraphAddOutputs(builder, outputs): builder.PrependUOffsetTRelativeSlot(3, flatbuffers.number_types.UOffsetTFlags.py_type(outputs), 0) -def FlatGraphStartOutputsVector(builder, numElems): return builder.StartVector(4, numElems, 4) -def FlatGraphAddConfiguration(builder, configuration): builder.PrependUOffsetTRelativeSlot(4, flatbuffers.number_types.UOffsetTFlags.py_type(configuration), 0) -def FlatGraphAddPlaceholders(builder, placeholders): builder.PrependUOffsetTRelativeSlot(5, flatbuffers.number_types.UOffsetTFlags.py_type(placeholders), 0) -def FlatGraphStartPlaceholdersVector(builder, numElems): return builder.StartVector(4, numElems, 4) -def FlatGraphAddLossVariables(builder, lossVariables): builder.PrependUOffsetTRelativeSlot(6, flatbuffers.number_types.UOffsetTFlags.py_type(lossVariables), 0) -def FlatGraphStartLossVariablesVector(builder, numElems): return builder.StartVector(4, numElems, 4) -def FlatGraphAddTrainingConfig(builder, trainingConfig): builder.PrependUOffsetTRelativeSlot(7, flatbuffers.number_types.UOffsetTFlags.py_type(trainingConfig), 0) -def FlatGraphAddUpdaterState(builder, updaterState): builder.PrependUOffsetTRelativeSlot(8, 
flatbuffers.number_types.UOffsetTFlags.py_type(updaterState), 0) -def FlatGraphStartUpdaterStateVector(builder, numElems): return builder.StartVector(4, numElems, 4) -def FlatGraphEnd(builder): return builder.EndObject() diff --git a/libnd4j/include/graph/generated/nd4j/graph/FlatInferenceRequest.cs b/libnd4j/include/graph/generated/nd4j/graph/FlatInferenceRequest.cs deleted file mode 100644 index 88e36dad5..000000000 --- a/libnd4j/include/graph/generated/nd4j/graph/FlatInferenceRequest.cs +++ /dev/null @@ -1,52 +0,0 @@ -// -// automatically generated by the FlatBuffers compiler, do not modify -// - -namespace sd.graph -{ - -using global::System; -using global::FlatBuffers; - -public struct FlatInferenceRequest : IFlatbufferObject -{ - private Table __p; - public ByteBuffer ByteBuffer { get { return __p.bb; } } - public static FlatInferenceRequest GetRootAsFlatInferenceRequest(ByteBuffer _bb) { return GetRootAsFlatInferenceRequest(_bb, new FlatInferenceRequest()); } - public static FlatInferenceRequest GetRootAsFlatInferenceRequest(ByteBuffer _bb, FlatInferenceRequest obj) { return (obj.__assign(_bb.GetInt(_bb.Position) + _bb.Position, _bb)); } - public void __init(int _i, ByteBuffer _bb) { __p.bb_pos = _i; __p.bb = _bb; } - public FlatInferenceRequest __assign(int _i, ByteBuffer _bb) { __init(_i, _bb); return this; } - - public long Id { get { int o = __p.__offset(4); return o != 0 ? __p.bb.GetLong(o + __p.bb_pos) : (long)0; } } - public FlatVariable? Variables(int j) { int o = __p.__offset(6); return o != 0 ? (FlatVariable?)(new FlatVariable()).__assign(__p.__indirect(__p.__vector(o) + j * 4), __p.bb) : null; } - public int VariablesLength { get { int o = __p.__offset(6); return o != 0 ? __p.__vector_len(o) : 0; } } - public FlatConfiguration? Configuration { get { int o = __p.__offset(8); return o != 0 ? 
(FlatConfiguration?)(new FlatConfiguration()).__assign(__p.__indirect(o + __p.bb_pos), __p.bb) : null; } } - - public static Offset CreateFlatInferenceRequest(FlatBufferBuilder builder, - long id = 0, - VectorOffset variablesOffset = default(VectorOffset), - Offset configurationOffset = default(Offset)) { - builder.StartObject(3); - FlatInferenceRequest.AddId(builder, id); - FlatInferenceRequest.AddConfiguration(builder, configurationOffset); - FlatInferenceRequest.AddVariables(builder, variablesOffset); - return FlatInferenceRequest.EndFlatInferenceRequest(builder); - } - - public static void StartFlatInferenceRequest(FlatBufferBuilder builder) { builder.StartObject(3); } - public static void AddId(FlatBufferBuilder builder, long id) { builder.AddLong(0, id, 0); } - public static void AddVariables(FlatBufferBuilder builder, VectorOffset variablesOffset) { builder.AddOffset(1, variablesOffset.Value, 0); } - public static VectorOffset CreateVariablesVector(FlatBufferBuilder builder, Offset[] data) { builder.StartVector(4, data.Length, 4); for (int i = data.Length - 1; i >= 0; i--) builder.AddOffset(data[i].Value); return builder.EndVector(); } - public static VectorOffset CreateVariablesVectorBlock(FlatBufferBuilder builder, Offset[] data) { builder.StartVector(4, data.Length, 4); builder.Add(data); return builder.EndVector(); } - public static void StartVariablesVector(FlatBufferBuilder builder, int numElems) { builder.StartVector(4, numElems, 4); } - public static void AddConfiguration(FlatBufferBuilder builder, Offset configurationOffset) { builder.AddOffset(2, configurationOffset.Value, 0); } - public static Offset EndFlatInferenceRequest(FlatBufferBuilder builder) { - int o = builder.EndObject(); - return new Offset(o); - } - public static void FinishFlatInferenceRequestBuffer(FlatBufferBuilder builder, Offset offset) { builder.Finish(offset.Value); } - public static void FinishSizePrefixedFlatInferenceRequestBuffer(FlatBufferBuilder builder, Offset offset) { 
builder.FinishSizePrefixed(offset.Value); } -}; - - -} diff --git a/libnd4j/include/graph/generated/nd4j/graph/FlatInferenceRequest.java b/libnd4j/include/graph/generated/nd4j/graph/FlatInferenceRequest.java deleted file mode 100644 index fc907bcd8..000000000 --- a/libnd4j/include/graph/generated/nd4j/graph/FlatInferenceRequest.java +++ /dev/null @@ -1,48 +0,0 @@ -// automatically generated by the FlatBuffers compiler, do not modify - -package nd4j.graph; - -import java.nio.*; -import java.lang.*; -import java.util.*; -import com.google.flatbuffers.*; - -@SuppressWarnings("unused") -public final class FlatInferenceRequest extends Table { - public static FlatInferenceRequest getRootAsFlatInferenceRequest(ByteBuffer _bb) { return getRootAsFlatInferenceRequest(_bb, new FlatInferenceRequest()); } - public static FlatInferenceRequest getRootAsFlatInferenceRequest(ByteBuffer _bb, FlatInferenceRequest obj) { _bb.order(ByteOrder.LITTLE_ENDIAN); return (obj.__assign(_bb.getInt(_bb.position()) + _bb.position(), _bb)); } - public void __init(int _i, ByteBuffer _bb) { bb_pos = _i; bb = _bb; } - public FlatInferenceRequest __assign(int _i, ByteBuffer _bb) { __init(_i, _bb); return this; } - - public long id() { int o = __offset(4); return o != 0 ? bb.getLong(o + bb_pos) : 0L; } - public FlatVariable variables(int j) { return variables(new FlatVariable(), j); } - public FlatVariable variables(FlatVariable obj, int j) { int o = __offset(6); return o != 0 ? obj.__assign(__indirect(__vector(o) + j * 4), bb) : null; } - public int variablesLength() { int o = __offset(6); return o != 0 ? __vector_len(o) : 0; } - public FlatConfiguration configuration() { return configuration(new FlatConfiguration()); } - public FlatConfiguration configuration(FlatConfiguration obj) { int o = __offset(8); return o != 0 ? 
obj.__assign(__indirect(o + bb_pos), bb) : null; } - - public static int createFlatInferenceRequest(FlatBufferBuilder builder, - long id, - int variablesOffset, - int configurationOffset) { - builder.startObject(3); - FlatInferenceRequest.addId(builder, id); - FlatInferenceRequest.addConfiguration(builder, configurationOffset); - FlatInferenceRequest.addVariables(builder, variablesOffset); - return FlatInferenceRequest.endFlatInferenceRequest(builder); - } - - public static void startFlatInferenceRequest(FlatBufferBuilder builder) { builder.startObject(3); } - public static void addId(FlatBufferBuilder builder, long id) { builder.addLong(0, id, 0L); } - public static void addVariables(FlatBufferBuilder builder, int variablesOffset) { builder.addOffset(1, variablesOffset, 0); } - public static int createVariablesVector(FlatBufferBuilder builder, int[] data) { builder.startVector(4, data.length, 4); for (int i = data.length - 1; i >= 0; i--) builder.addOffset(data[i]); return builder.endVector(); } - public static void startVariablesVector(FlatBufferBuilder builder, int numElems) { builder.startVector(4, numElems, 4); } - public static void addConfiguration(FlatBufferBuilder builder, int configurationOffset) { builder.addOffset(2, configurationOffset, 0); } - public static int endFlatInferenceRequest(FlatBufferBuilder builder) { - int o = builder.endObject(); - return o; - } - public static void finishFlatInferenceRequestBuffer(FlatBufferBuilder builder, int offset) { builder.finish(offset); } - public static void finishSizePrefixedFlatInferenceRequestBuffer(FlatBufferBuilder builder, int offset) { builder.finishSizePrefixed(offset); } -} - diff --git a/libnd4j/include/graph/generated/nd4j/graph/FlatInferenceRequest.py b/libnd4j/include/graph/generated/nd4j/graph/FlatInferenceRequest.py deleted file mode 100644 index 7bd30dbb6..000000000 --- a/libnd4j/include/graph/generated/nd4j/graph/FlatInferenceRequest.py +++ /dev/null @@ -1,78 +0,0 @@ -# /* 
****************************************************************************** -# * -# * -# * This program and the accompanying materials are made available under the -# * terms of the Apache License, Version 2.0 which is available at -# * https://www.apache.org/licenses/LICENSE-2.0. -# * -# * See the NOTICE file distributed with this work for additional -# * information regarding copyright ownership. -# * Unless required by applicable law or agreed to in writing, software -# * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT -# * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the -# * License for the specific language governing permissions and limitations -# * under the License. -# * -# * SPDX-License-Identifier: Apache-2.0 -# ******************************************************************************/ - -import flatbuffers - -class FlatInferenceRequest(object): - __slots__ = ['_tab'] - - @classmethod - def GetRootAsFlatInferenceRequest(cls, buf, offset): - n = flatbuffers.encode.Get(flatbuffers.packer.uoffset, buf, offset) - x = FlatInferenceRequest() - x.Init(buf, n + offset) - return x - - # FlatInferenceRequest - def Init(self, buf, pos): - self._tab = flatbuffers.table.Table(buf, pos) - - # FlatInferenceRequest - def Id(self): - o = flatbuffers.number_types.UOffsetTFlags.py_type(self._tab.Offset(4)) - if o != 0: - return self._tab.Get(flatbuffers.number_types.Int64Flags, o + self._tab.Pos) - return 0 - - # FlatInferenceRequest - def Variables(self, j): - o = flatbuffers.number_types.UOffsetTFlags.py_type(self._tab.Offset(6)) - if o != 0: - x = self._tab.Vector(o) - x += flatbuffers.number_types.UOffsetTFlags.py_type(j) * 4 - x = self._tab.Indirect(x) - from .FlatVariable import FlatVariable - obj = FlatVariable() - obj.Init(self._tab.Bytes, x) - return obj - return None - - # FlatInferenceRequest - def VariablesLength(self): - o = flatbuffers.number_types.UOffsetTFlags.py_type(self._tab.Offset(6)) - if o != 0: 
- return self._tab.VectorLen(o) - return 0 - - # FlatInferenceRequest - def Configuration(self): - o = flatbuffers.number_types.UOffsetTFlags.py_type(self._tab.Offset(8)) - if o != 0: - x = self._tab.Indirect(o + self._tab.Pos) - from .FlatConfiguration import FlatConfiguration - obj = FlatConfiguration() - obj.Init(self._tab.Bytes, x) - return obj - return None - -def FlatInferenceRequestStart(builder): builder.StartObject(3) -def FlatInferenceRequestAddId(builder, id): builder.PrependInt64Slot(0, id, 0) -def FlatInferenceRequestAddVariables(builder, variables): builder.PrependUOffsetTRelativeSlot(1, flatbuffers.number_types.UOffsetTFlags.py_type(variables), 0) -def FlatInferenceRequestStartVariablesVector(builder, numElems): return builder.StartVector(4, numElems, 4) -def FlatInferenceRequestAddConfiguration(builder, configuration): builder.PrependUOffsetTRelativeSlot(2, flatbuffers.number_types.UOffsetTFlags.py_type(configuration), 0) -def FlatInferenceRequestEnd(builder): return builder.EndObject() diff --git a/libnd4j/include/graph/generated/nd4j/graph/FlatNode.cs b/libnd4j/include/graph/generated/nd4j/graph/FlatNode.cs deleted file mode 100644 index 3f951c0e9..000000000 --- a/libnd4j/include/graph/generated/nd4j/graph/FlatNode.cs +++ /dev/null @@ -1,250 +0,0 @@ -// -// automatically generated by the FlatBuffers compiler, do not modify -// - -namespace sd.graph -{ - -using global::System; -using global::FlatBuffers; - -public struct FlatNode : IFlatbufferObject -{ - private Table __p; - public ByteBuffer ByteBuffer { get { return __p.bb; } } - public static FlatNode GetRootAsFlatNode(ByteBuffer _bb) { return GetRootAsFlatNode(_bb, new FlatNode()); } - public static FlatNode GetRootAsFlatNode(ByteBuffer _bb, FlatNode obj) { return (obj.__assign(_bb.GetInt(_bb.Position) + _bb.Position, _bb)); } - public void __init(int _i, ByteBuffer _bb) { __p.bb_pos = _i; __p.bb = _bb; } - public FlatNode __assign(int _i, ByteBuffer _bb) { __init(_i, _bb); return this; } - - 
public int Id { get { int o = __p.__offset(4); return o != 0 ? __p.bb.GetInt(o + __p.bb_pos) : (int)0; } } - public string Name { get { int o = __p.__offset(6); return o != 0 ? __p.__string(o + __p.bb_pos) : null; } } -#if ENABLE_SPAN_T - public Span GetNameBytes() { return __p.__vector_as_span(6); } -#else - public ArraySegment? GetNameBytes() { return __p.__vector_as_arraysegment(6); } -#endif - public byte[] GetNameArray() { return __p.__vector_as_array(6); } - public OpType OpType { get { int o = __p.__offset(8); return o != 0 ? (OpType)__p.bb.GetSbyte(o + __p.bb_pos) : OpType.TRANSFORM_FLOAT; } } - public long OpNum { get { int o = __p.__offset(10); return o != 0 ? __p.bb.GetLong(o + __p.bb_pos) : (long)0; } } - public FlatProperties? Properties(int j) { int o = __p.__offset(12); return o != 0 ? (FlatProperties?)(new FlatProperties()).__assign(__p.__indirect(__p.__vector(o) + j * 4), __p.bb) : null; } - public int PropertiesLength { get { int o = __p.__offset(12); return o != 0 ? __p.__vector_len(o) : 0; } } - public int Input(int j) { int o = __p.__offset(14); return o != 0 ? __p.bb.GetInt(__p.__vector(o) + j * 4) : (int)0; } - public int InputLength { get { int o = __p.__offset(14); return o != 0 ? __p.__vector_len(o) : 0; } } -#if ENABLE_SPAN_T - public Span GetInputBytes() { return __p.__vector_as_span(14); } -#else - public ArraySegment? GetInputBytes() { return __p.__vector_as_arraysegment(14); } -#endif - public int[] GetInputArray() { return __p.__vector_as_array(14); } - public IntPair? InputPaired(int j) { int o = __p.__offset(16); return o != 0 ? (IntPair?)(new IntPair()).__assign(__p.__indirect(__p.__vector(o) + j * 4), __p.bb) : null; } - public int InputPairedLength { get { int o = __p.__offset(16); return o != 0 ? __p.__vector_len(o) : 0; } } - public int Output(int j) { int o = __p.__offset(18); return o != 0 ? __p.bb.GetInt(__p.__vector(o) + j * 4) : (int)0; } - public int OutputLength { get { int o = __p.__offset(18); return o != 0 ? 
__p.__vector_len(o) : 0; } } -#if ENABLE_SPAN_T - public Span GetOutputBytes() { return __p.__vector_as_span(18); } -#else - public ArraySegment? GetOutputBytes() { return __p.__vector_as_arraysegment(18); } -#endif - public int[] GetOutputArray() { return __p.__vector_as_array(18); } - public double ExtraParams(int j) { int o = __p.__offset(20); return o != 0 ? __p.bb.GetDouble(__p.__vector(o) + j * 8) : (double)0; } - public int ExtraParamsLength { get { int o = __p.__offset(20); return o != 0 ? __p.__vector_len(o) : 0; } } -#if ENABLE_SPAN_T - public Span GetExtraParamsBytes() { return __p.__vector_as_span(20); } -#else - public ArraySegment? GetExtraParamsBytes() { return __p.__vector_as_arraysegment(20); } -#endif - public double[] GetExtraParamsArray() { return __p.__vector_as_array(20); } - public long ExtraInteger(int j) { int o = __p.__offset(22); return o != 0 ? __p.bb.GetLong(__p.__vector(o) + j * 8) : (long)0; } - public int ExtraIntegerLength { get { int o = __p.__offset(22); return o != 0 ? __p.__vector_len(o) : 0; } } -#if ENABLE_SPAN_T - public Span GetExtraIntegerBytes() { return __p.__vector_as_span(22); } -#else - public ArraySegment? GetExtraIntegerBytes() { return __p.__vector_as_arraysegment(22); } -#endif - public long[] GetExtraIntegerArray() { return __p.__vector_as_array(22); } - public bool ExtraBools(int j) { int o = __p.__offset(24); return o != 0 ? 0!=__p.bb.Get(__p.__vector(o) + j * 1) : false; } - public int ExtraBoolsLength { get { int o = __p.__offset(24); return o != 0 ? __p.__vector_len(o) : 0; } } -#if ENABLE_SPAN_T - public Span GetExtraBoolsBytes() { return __p.__vector_as_span(24); } -#else - public ArraySegment? GetExtraBoolsBytes() { return __p.__vector_as_arraysegment(24); } -#endif - public bool[] GetExtraBoolsArray() { return __p.__vector_as_array(24); } - public int Dimensions(int j) { int o = __p.__offset(26); return o != 0 ? 
__p.bb.GetInt(__p.__vector(o) + j * 4) : (int)0; } - public int DimensionsLength { get { int o = __p.__offset(26); return o != 0 ? __p.__vector_len(o) : 0; } } -#if ENABLE_SPAN_T - public Span GetDimensionsBytes() { return __p.__vector_as_span(26); } -#else - public ArraySegment? GetDimensionsBytes() { return __p.__vector_as_arraysegment(26); } -#endif - public int[] GetDimensionsArray() { return __p.__vector_as_array(26); } - public int Device { get { int o = __p.__offset(28); return o != 0 ? __p.bb.GetInt(o + __p.bb_pos) : (int)0; } } - public int ScopeId { get { int o = __p.__offset(30); return o != 0 ? __p.bb.GetInt(o + __p.bb_pos) : (int)0; } } - public string ScopeName { get { int o = __p.__offset(32); return o != 0 ? __p.__string(o + __p.bb_pos) : null; } } -#if ENABLE_SPAN_T - public Span GetScopeNameBytes() { return __p.__vector_as_span(32); } -#else - public ArraySegment? GetScopeNameBytes() { return __p.__vector_as_arraysegment(32); } -#endif - public byte[] GetScopeNameArray() { return __p.__vector_as_array(32); } - public string OutputNames(int j) { int o = __p.__offset(34); return o != 0 ? __p.__string(__p.__vector(o) + j * 4) : null; } - public int OutputNamesLength { get { int o = __p.__offset(34); return o != 0 ? __p.__vector_len(o) : 0; } } - public string OpName { get { int o = __p.__offset(36); return o != 0 ? __p.__string(o + __p.bb_pos) : null; } } -#if ENABLE_SPAN_T - public Span GetOpNameBytes() { return __p.__vector_as_span(36); } -#else - public ArraySegment? GetOpNameBytes() { return __p.__vector_as_arraysegment(36); } -#endif - public byte[] GetOpNameArray() { return __p.__vector_as_array(36); } - public DType OutputTypes(int j) { int o = __p.__offset(38); return o != 0 ? (DType)__p.bb.GetSbyte(__p.__vector(o) + j * 1) : (DType)0; } - public int OutputTypesLength { get { int o = __p.__offset(38); return o != 0 ? 
__p.__vector_len(o) : 0; } } -#if ENABLE_SPAN_T - public Span GetOutputTypesBytes() { return __p.__vector_as_span(38); } -#else - public ArraySegment? GetOutputTypesBytes() { return __p.__vector_as_arraysegment(38); } -#endif - public DType[] GetOutputTypesArray() { return __p.__vector_as_array(38); } - public FlatArray? Scalar { get { int o = __p.__offset(40); return o != 0 ? (FlatArray?)(new FlatArray()).__assign(__p.__indirect(o + __p.bb_pos), __p.bb) : null; } } - public string ControlDeps(int j) { int o = __p.__offset(42); return o != 0 ? __p.__string(__p.__vector(o) + j * 4) : null; } - public int ControlDepsLength { get { int o = __p.__offset(42); return o != 0 ? __p.__vector_len(o) : 0; } } - public string VarControlDeps(int j) { int o = __p.__offset(44); return o != 0 ? __p.__string(__p.__vector(o) + j * 4) : null; } - public int VarControlDepsLength { get { int o = __p.__offset(44); return o != 0 ? __p.__vector_len(o) : 0; } } - public string ControlDepFor(int j) { int o = __p.__offset(46); return o != 0 ? __p.__string(__p.__vector(o) + j * 4) : null; } - public int ControlDepForLength { get { int o = __p.__offset(46); return o != 0 ? __p.__vector_len(o) : 0; } } - public DType ExtraTypes(int j) { int o = __p.__offset(48); return o != 0 ? (DType)__p.bb.GetSbyte(__p.__vector(o) + j * 1) : (DType)0; } - public int ExtraTypesLength { get { int o = __p.__offset(48); return o != 0 ? __p.__vector_len(o) : 0; } } -#if ENABLE_SPAN_T - public Span GetExtraTypesBytes() { return __p.__vector_as_span(48); } -#else - public ArraySegment? 
GetExtraTypesBytes() { return __p.__vector_as_arraysegment(48); } -#endif - public DType[] GetExtraTypesArray() { return __p.__vector_as_array(48); } - - public static Offset CreateFlatNode(FlatBufferBuilder builder, - int id = 0, - StringOffset nameOffset = default(StringOffset), - OpType opType = OpType.TRANSFORM_FLOAT, - long opNum = 0, - VectorOffset propertiesOffset = default(VectorOffset), - VectorOffset inputOffset = default(VectorOffset), - VectorOffset inputPairedOffset = default(VectorOffset), - VectorOffset outputOffset = default(VectorOffset), - VectorOffset extraParamsOffset = default(VectorOffset), - VectorOffset extraIntegerOffset = default(VectorOffset), - VectorOffset extraBoolsOffset = default(VectorOffset), - VectorOffset dimensionsOffset = default(VectorOffset), - int device = 0, - int scope_id = 0, - StringOffset scope_nameOffset = default(StringOffset), - VectorOffset outputNamesOffset = default(VectorOffset), - StringOffset opNameOffset = default(StringOffset), - VectorOffset outputTypesOffset = default(VectorOffset), - Offset scalarOffset = default(Offset), - VectorOffset controlDepsOffset = default(VectorOffset), - VectorOffset varControlDepsOffset = default(VectorOffset), - VectorOffset controlDepForOffset = default(VectorOffset), - VectorOffset extraTypesOffset = default(VectorOffset)) { - builder.StartObject(23); - FlatNode.AddOpNum(builder, opNum); - FlatNode.AddExtraTypes(builder, extraTypesOffset); - FlatNode.AddControlDepFor(builder, controlDepForOffset); - FlatNode.AddVarControlDeps(builder, varControlDepsOffset); - FlatNode.AddControlDeps(builder, controlDepsOffset); - FlatNode.AddScalar(builder, scalarOffset); - FlatNode.AddOutputTypes(builder, outputTypesOffset); - FlatNode.AddOpName(builder, opNameOffset); - FlatNode.AddOutputNames(builder, outputNamesOffset); - FlatNode.AddScopeName(builder, scope_nameOffset); - FlatNode.AddScopeId(builder, scope_id); - FlatNode.AddDevice(builder, device); - FlatNode.AddDimensions(builder, 
dimensionsOffset); - FlatNode.AddExtraBools(builder, extraBoolsOffset); - FlatNode.AddExtraInteger(builder, extraIntegerOffset); - FlatNode.AddExtraParams(builder, extraParamsOffset); - FlatNode.AddOutput(builder, outputOffset); - FlatNode.AddInputPaired(builder, inputPairedOffset); - FlatNode.AddInput(builder, inputOffset); - FlatNode.AddProperties(builder, propertiesOffset); - FlatNode.AddName(builder, nameOffset); - FlatNode.AddId(builder, id); - FlatNode.AddOpType(builder, opType); - return FlatNode.EndFlatNode(builder); - } - - public static void StartFlatNode(FlatBufferBuilder builder) { builder.StartObject(23); } - public static void AddId(FlatBufferBuilder builder, int id) { builder.AddInt(0, id, 0); } - public static void AddName(FlatBufferBuilder builder, StringOffset nameOffset) { builder.AddOffset(1, nameOffset.Value, 0); } - public static void AddOpType(FlatBufferBuilder builder, OpType opType) { builder.AddSbyte(2, (sbyte)opType, 0); } - public static void AddOpNum(FlatBufferBuilder builder, long opNum) { builder.AddLong(3, opNum, 0); } - public static void AddProperties(FlatBufferBuilder builder, VectorOffset propertiesOffset) { builder.AddOffset(4, propertiesOffset.Value, 0); } - public static VectorOffset CreatePropertiesVector(FlatBufferBuilder builder, Offset[] data) { builder.StartVector(4, data.Length, 4); for (int i = data.Length - 1; i >= 0; i--) builder.AddOffset(data[i].Value); return builder.EndVector(); } - public static VectorOffset CreatePropertiesVectorBlock(FlatBufferBuilder builder, Offset[] data) { builder.StartVector(4, data.Length, 4); builder.Add(data); return builder.EndVector(); } - public static void StartPropertiesVector(FlatBufferBuilder builder, int numElems) { builder.StartVector(4, numElems, 4); } - public static void AddInput(FlatBufferBuilder builder, VectorOffset inputOffset) { builder.AddOffset(5, inputOffset.Value, 0); } - public static VectorOffset CreateInputVector(FlatBufferBuilder builder, int[] data) { 
builder.StartVector(4, data.Length, 4); for (int i = data.Length - 1; i >= 0; i--) builder.AddInt(data[i]); return builder.EndVector(); } - public static VectorOffset CreateInputVectorBlock(FlatBufferBuilder builder, int[] data) { builder.StartVector(4, data.Length, 4); builder.Add(data); return builder.EndVector(); } - public static void StartInputVector(FlatBufferBuilder builder, int numElems) { builder.StartVector(4, numElems, 4); } - public static void AddInputPaired(FlatBufferBuilder builder, VectorOffset inputPairedOffset) { builder.AddOffset(6, inputPairedOffset.Value, 0); } - public static VectorOffset CreateInputPairedVector(FlatBufferBuilder builder, Offset[] data) { builder.StartVector(4, data.Length, 4); for (int i = data.Length - 1; i >= 0; i--) builder.AddOffset(data[i].Value); return builder.EndVector(); } - public static VectorOffset CreateInputPairedVectorBlock(FlatBufferBuilder builder, Offset[] data) { builder.StartVector(4, data.Length, 4); builder.Add(data); return builder.EndVector(); } - public static void StartInputPairedVector(FlatBufferBuilder builder, int numElems) { builder.StartVector(4, numElems, 4); } - public static void AddOutput(FlatBufferBuilder builder, VectorOffset outputOffset) { builder.AddOffset(7, outputOffset.Value, 0); } - public static VectorOffset CreateOutputVector(FlatBufferBuilder builder, int[] data) { builder.StartVector(4, data.Length, 4); for (int i = data.Length - 1; i >= 0; i--) builder.AddInt(data[i]); return builder.EndVector(); } - public static VectorOffset CreateOutputVectorBlock(FlatBufferBuilder builder, int[] data) { builder.StartVector(4, data.Length, 4); builder.Add(data); return builder.EndVector(); } - public static void StartOutputVector(FlatBufferBuilder builder, int numElems) { builder.StartVector(4, numElems, 4); } - public static void AddExtraParams(FlatBufferBuilder builder, VectorOffset extraParamsOffset) { builder.AddOffset(8, extraParamsOffset.Value, 0); } - public static VectorOffset 
CreateExtraParamsVector(FlatBufferBuilder builder, double[] data) { builder.StartVector(8, data.Length, 8); for (int i = data.Length - 1; i >= 0; i--) builder.AddDouble(data[i]); return builder.EndVector(); } - public static VectorOffset CreateExtraParamsVectorBlock(FlatBufferBuilder builder, double[] data) { builder.StartVector(8, data.Length, 8); builder.Add(data); return builder.EndVector(); } - public static void StartExtraParamsVector(FlatBufferBuilder builder, int numElems) { builder.StartVector(8, numElems, 8); } - public static void AddExtraInteger(FlatBufferBuilder builder, VectorOffset extraIntegerOffset) { builder.AddOffset(9, extraIntegerOffset.Value, 0); } - public static VectorOffset CreateExtraIntegerVector(FlatBufferBuilder builder, long[] data) { builder.StartVector(8, data.Length, 8); for (int i = data.Length - 1; i >= 0; i--) builder.AddLong(data[i]); return builder.EndVector(); } - public static VectorOffset CreateExtraIntegerVectorBlock(FlatBufferBuilder builder, long[] data) { builder.StartVector(8, data.Length, 8); builder.Add(data); return builder.EndVector(); } - public static void StartExtraIntegerVector(FlatBufferBuilder builder, int numElems) { builder.StartVector(8, numElems, 8); } - public static void AddExtraBools(FlatBufferBuilder builder, VectorOffset extraBoolsOffset) { builder.AddOffset(10, extraBoolsOffset.Value, 0); } - public static VectorOffset CreateExtraBoolsVector(FlatBufferBuilder builder, bool[] data) { builder.StartVector(1, data.Length, 1); for (int i = data.Length - 1; i >= 0; i--) builder.AddBool(data[i]); return builder.EndVector(); } - public static VectorOffset CreateExtraBoolsVectorBlock(FlatBufferBuilder builder, bool[] data) { builder.StartVector(1, data.Length, 1); builder.Add(data); return builder.EndVector(); } - public static void StartExtraBoolsVector(FlatBufferBuilder builder, int numElems) { builder.StartVector(1, numElems, 1); } - public static void AddDimensions(FlatBufferBuilder builder, VectorOffset 
dimensionsOffset) { builder.AddOffset(11, dimensionsOffset.Value, 0); } - public static VectorOffset CreateDimensionsVector(FlatBufferBuilder builder, int[] data) { builder.StartVector(4, data.Length, 4); for (int i = data.Length - 1; i >= 0; i--) builder.AddInt(data[i]); return builder.EndVector(); } - public static VectorOffset CreateDimensionsVectorBlock(FlatBufferBuilder builder, int[] data) { builder.StartVector(4, data.Length, 4); builder.Add(data); return builder.EndVector(); } - public static void StartDimensionsVector(FlatBufferBuilder builder, int numElems) { builder.StartVector(4, numElems, 4); } - public static void AddDevice(FlatBufferBuilder builder, int device) { builder.AddInt(12, device, 0); } - public static void AddScopeId(FlatBufferBuilder builder, int scopeId) { builder.AddInt(13, scopeId, 0); } - public static void AddScopeName(FlatBufferBuilder builder, StringOffset scopeNameOffset) { builder.AddOffset(14, scopeNameOffset.Value, 0); } - public static void AddOutputNames(FlatBufferBuilder builder, VectorOffset outputNamesOffset) { builder.AddOffset(15, outputNamesOffset.Value, 0); } - public static VectorOffset CreateOutputNamesVector(FlatBufferBuilder builder, StringOffset[] data) { builder.StartVector(4, data.Length, 4); for (int i = data.Length - 1; i >= 0; i--) builder.AddOffset(data[i].Value); return builder.EndVector(); } - public static VectorOffset CreateOutputNamesVectorBlock(FlatBufferBuilder builder, StringOffset[] data) { builder.StartVector(4, data.Length, 4); builder.Add(data); return builder.EndVector(); } - public static void StartOutputNamesVector(FlatBufferBuilder builder, int numElems) { builder.StartVector(4, numElems, 4); } - public static void AddOpName(FlatBufferBuilder builder, StringOffset opNameOffset) { builder.AddOffset(16, opNameOffset.Value, 0); } - public static void AddOutputTypes(FlatBufferBuilder builder, VectorOffset outputTypesOffset) { builder.AddOffset(17, outputTypesOffset.Value, 0); } - public static 
VectorOffset CreateOutputTypesVector(FlatBufferBuilder builder, DType[] data) { builder.StartVector(1, data.Length, 1); for (int i = data.Length - 1; i >= 0; i--) builder.AddSbyte((sbyte)data[i]); return builder.EndVector(); } - public static VectorOffset CreateOutputTypesVectorBlock(FlatBufferBuilder builder, DType[] data) { builder.StartVector(1, data.Length, 1); builder.Add(data); return builder.EndVector(); } - public static void StartOutputTypesVector(FlatBufferBuilder builder, int numElems) { builder.StartVector(1, numElems, 1); } - public static void AddScalar(FlatBufferBuilder builder, Offset scalarOffset) { builder.AddOffset(18, scalarOffset.Value, 0); } - public static void AddControlDeps(FlatBufferBuilder builder, VectorOffset controlDepsOffset) { builder.AddOffset(19, controlDepsOffset.Value, 0); } - public static VectorOffset CreateControlDepsVector(FlatBufferBuilder builder, StringOffset[] data) { builder.StartVector(4, data.Length, 4); for (int i = data.Length - 1; i >= 0; i--) builder.AddOffset(data[i].Value); return builder.EndVector(); } - public static VectorOffset CreateControlDepsVectorBlock(FlatBufferBuilder builder, StringOffset[] data) { builder.StartVector(4, data.Length, 4); builder.Add(data); return builder.EndVector(); } - public static void StartControlDepsVector(FlatBufferBuilder builder, int numElems) { builder.StartVector(4, numElems, 4); } - public static void AddVarControlDeps(FlatBufferBuilder builder, VectorOffset varControlDepsOffset) { builder.AddOffset(20, varControlDepsOffset.Value, 0); } - public static VectorOffset CreateVarControlDepsVector(FlatBufferBuilder builder, StringOffset[] data) { builder.StartVector(4, data.Length, 4); for (int i = data.Length - 1; i >= 0; i--) builder.AddOffset(data[i].Value); return builder.EndVector(); } - public static VectorOffset CreateVarControlDepsVectorBlock(FlatBufferBuilder builder, StringOffset[] data) { builder.StartVector(4, data.Length, 4); builder.Add(data); return 
builder.EndVector(); } - public static void StartVarControlDepsVector(FlatBufferBuilder builder, int numElems) { builder.StartVector(4, numElems, 4); } - public static void AddControlDepFor(FlatBufferBuilder builder, VectorOffset controlDepForOffset) { builder.AddOffset(21, controlDepForOffset.Value, 0); } - public static VectorOffset CreateControlDepForVector(FlatBufferBuilder builder, StringOffset[] data) { builder.StartVector(4, data.Length, 4); for (int i = data.Length - 1; i >= 0; i--) builder.AddOffset(data[i].Value); return builder.EndVector(); } - public static VectorOffset CreateControlDepForVectorBlock(FlatBufferBuilder builder, StringOffset[] data) { builder.StartVector(4, data.Length, 4); builder.Add(data); return builder.EndVector(); } - public static void StartControlDepForVector(FlatBufferBuilder builder, int numElems) { builder.StartVector(4, numElems, 4); } - public static void AddExtraTypes(FlatBufferBuilder builder, VectorOffset extraTypesOffset) { builder.AddOffset(22, extraTypesOffset.Value, 0); } - public static VectorOffset CreateExtraTypesVector(FlatBufferBuilder builder, DType[] data) { builder.StartVector(1, data.Length, 1); for (int i = data.Length - 1; i >= 0; i--) builder.AddSbyte((sbyte)data[i]); return builder.EndVector(); } - public static VectorOffset CreateExtraTypesVectorBlock(FlatBufferBuilder builder, DType[] data) { builder.StartVector(1, data.Length, 1); builder.Add(data); return builder.EndVector(); } - public static void StartExtraTypesVector(FlatBufferBuilder builder, int numElems) { builder.StartVector(1, numElems, 1); } - public static Offset EndFlatNode(FlatBufferBuilder builder) { - int o = builder.EndObject(); - return new Offset(o); - } - public static void FinishFlatNodeBuffer(FlatBufferBuilder builder, Offset offset) { builder.Finish(offset.Value); } - public static void FinishSizePrefixedFlatNodeBuffer(FlatBufferBuilder builder, Offset offset) { builder.FinishSizePrefixed(offset.Value); } -}; - - -} diff --git 
a/libnd4j/include/graph/generated/nd4j/graph/FlatNode.java b/libnd4j/include/graph/generated/nd4j/graph/FlatNode.java deleted file mode 100644 index 2fe0a0ee9..000000000 --- a/libnd4j/include/graph/generated/nd4j/graph/FlatNode.java +++ /dev/null @@ -1,190 +0,0 @@ -// automatically generated by the FlatBuffers compiler, do not modify - -package nd4j.graph; - -import java.nio.*; -import java.lang.*; -import java.util.*; -import com.google.flatbuffers.*; - -@SuppressWarnings("unused") -public final class FlatNode extends Table { - public static FlatNode getRootAsFlatNode(ByteBuffer _bb) { return getRootAsFlatNode(_bb, new FlatNode()); } - public static FlatNode getRootAsFlatNode(ByteBuffer _bb, FlatNode obj) { _bb.order(ByteOrder.LITTLE_ENDIAN); return (obj.__assign(_bb.getInt(_bb.position()) + _bb.position(), _bb)); } - public void __init(int _i, ByteBuffer _bb) { bb_pos = _i; bb = _bb; } - public FlatNode __assign(int _i, ByteBuffer _bb) { __init(_i, _bb); return this; } - - public int id() { int o = __offset(4); return o != 0 ? bb.getInt(o + bb_pos) : 0; } - public String name() { int o = __offset(6); return o != 0 ? __string(o + bb_pos) : null; } - public ByteBuffer nameAsByteBuffer() { return __vector_as_bytebuffer(6, 1); } - public ByteBuffer nameInByteBuffer(ByteBuffer _bb) { return __vector_in_bytebuffer(_bb, 6, 1); } - public byte opType() { int o = __offset(8); return o != 0 ? bb.get(o + bb_pos) : 0; } - public long opNum() { int o = __offset(10); return o != 0 ? bb.getLong(o + bb_pos) : 0L; } - public FlatProperties properties(int j) { return properties(new FlatProperties(), j); } - public FlatProperties properties(FlatProperties obj, int j) { int o = __offset(12); return o != 0 ? obj.__assign(__indirect(__vector(o) + j * 4), bb) : null; } - public int propertiesLength() { int o = __offset(12); return o != 0 ? __vector_len(o) : 0; } - public int input(int j) { int o = __offset(14); return o != 0 ? 
bb.getInt(__vector(o) + j * 4) : 0; } - public int inputLength() { int o = __offset(14); return o != 0 ? __vector_len(o) : 0; } - public ByteBuffer inputAsByteBuffer() { return __vector_as_bytebuffer(14, 4); } - public ByteBuffer inputInByteBuffer(ByteBuffer _bb) { return __vector_in_bytebuffer(_bb, 14, 4); } - public IntPair inputPaired(int j) { return inputPaired(new IntPair(), j); } - public IntPair inputPaired(IntPair obj, int j) { int o = __offset(16); return o != 0 ? obj.__assign(__indirect(__vector(o) + j * 4), bb) : null; } - public int inputPairedLength() { int o = __offset(16); return o != 0 ? __vector_len(o) : 0; } - public int output(int j) { int o = __offset(18); return o != 0 ? bb.getInt(__vector(o) + j * 4) : 0; } - public int outputLength() { int o = __offset(18); return o != 0 ? __vector_len(o) : 0; } - public ByteBuffer outputAsByteBuffer() { return __vector_as_bytebuffer(18, 4); } - public ByteBuffer outputInByteBuffer(ByteBuffer _bb) { return __vector_in_bytebuffer(_bb, 18, 4); } - public double extraParams(int j) { int o = __offset(20); return o != 0 ? bb.getDouble(__vector(o) + j * 8) : 0; } - public int extraParamsLength() { int o = __offset(20); return o != 0 ? __vector_len(o) : 0; } - public ByteBuffer extraParamsAsByteBuffer() { return __vector_as_bytebuffer(20, 8); } - public ByteBuffer extraParamsInByteBuffer(ByteBuffer _bb) { return __vector_in_bytebuffer(_bb, 20, 8); } - public long extraInteger(int j) { int o = __offset(22); return o != 0 ? bb.getLong(__vector(o) + j * 8) : 0; } - public int extraIntegerLength() { int o = __offset(22); return o != 0 ? __vector_len(o) : 0; } - public ByteBuffer extraIntegerAsByteBuffer() { return __vector_as_bytebuffer(22, 8); } - public ByteBuffer extraIntegerInByteBuffer(ByteBuffer _bb) { return __vector_in_bytebuffer(_bb, 22, 8); } - public boolean extraBools(int j) { int o = __offset(24); return o != 0 ? 
0!=bb.get(__vector(o) + j * 1) : false; } - public int extraBoolsLength() { int o = __offset(24); return o != 0 ? __vector_len(o) : 0; } - public ByteBuffer extraBoolsAsByteBuffer() { return __vector_as_bytebuffer(24, 1); } - public ByteBuffer extraBoolsInByteBuffer(ByteBuffer _bb) { return __vector_in_bytebuffer(_bb, 24, 1); } - public int dimensions(int j) { int o = __offset(26); return o != 0 ? bb.getInt(__vector(o) + j * 4) : 0; } - public int dimensionsLength() { int o = __offset(26); return o != 0 ? __vector_len(o) : 0; } - public ByteBuffer dimensionsAsByteBuffer() { return __vector_as_bytebuffer(26, 4); } - public ByteBuffer dimensionsInByteBuffer(ByteBuffer _bb) { return __vector_in_bytebuffer(_bb, 26, 4); } - public int device() { int o = __offset(28); return o != 0 ? bb.getInt(o + bb_pos) : 0; } - public int scopeId() { int o = __offset(30); return o != 0 ? bb.getInt(o + bb_pos) : 0; } - public String scopeName() { int o = __offset(32); return o != 0 ? __string(o + bb_pos) : null; } - public ByteBuffer scopeNameAsByteBuffer() { return __vector_as_bytebuffer(32, 1); } - public ByteBuffer scopeNameInByteBuffer(ByteBuffer _bb) { return __vector_in_bytebuffer(_bb, 32, 1); } - public String outputNames(int j) { int o = __offset(34); return o != 0 ? __string(__vector(o) + j * 4) : null; } - public int outputNamesLength() { int o = __offset(34); return o != 0 ? __vector_len(o) : 0; } - public String opName() { int o = __offset(36); return o != 0 ? __string(o + bb_pos) : null; } - public ByteBuffer opNameAsByteBuffer() { return __vector_as_bytebuffer(36, 1); } - public ByteBuffer opNameInByteBuffer(ByteBuffer _bb) { return __vector_in_bytebuffer(_bb, 36, 1); } - public byte outputTypes(int j) { int o = __offset(38); return o != 0 ? bb.get(__vector(o) + j * 1) : 0; } - public int outputTypesLength() { int o = __offset(38); return o != 0 ? 
__vector_len(o) : 0; } - public ByteBuffer outputTypesAsByteBuffer() { return __vector_as_bytebuffer(38, 1); } - public ByteBuffer outputTypesInByteBuffer(ByteBuffer _bb) { return __vector_in_bytebuffer(_bb, 38, 1); } - public FlatArray scalar() { return scalar(new FlatArray()); } - public FlatArray scalar(FlatArray obj) { int o = __offset(40); return o != 0 ? obj.__assign(__indirect(o + bb_pos), bb) : null; } - public String controlDeps(int j) { int o = __offset(42); return o != 0 ? __string(__vector(o) + j * 4) : null; } - public int controlDepsLength() { int o = __offset(42); return o != 0 ? __vector_len(o) : 0; } - public String varControlDeps(int j) { int o = __offset(44); return o != 0 ? __string(__vector(o) + j * 4) : null; } - public int varControlDepsLength() { int o = __offset(44); return o != 0 ? __vector_len(o) : 0; } - public String controlDepFor(int j) { int o = __offset(46); return o != 0 ? __string(__vector(o) + j * 4) : null; } - public int controlDepForLength() { int o = __offset(46); return o != 0 ? __vector_len(o) : 0; } - public byte extraTypes(int j) { int o = __offset(48); return o != 0 ? bb.get(__vector(o) + j * 1) : 0; } - public int extraTypesLength() { int o = __offset(48); return o != 0 ? 
__vector_len(o) : 0; } - public ByteBuffer extraTypesAsByteBuffer() { return __vector_as_bytebuffer(48, 1); } - public ByteBuffer extraTypesInByteBuffer(ByteBuffer _bb) { return __vector_in_bytebuffer(_bb, 48, 1); } - - public static int createFlatNode(FlatBufferBuilder builder, - int id, - int nameOffset, - byte opType, - long opNum, - int propertiesOffset, - int inputOffset, - int inputPairedOffset, - int outputOffset, - int extraParamsOffset, - int extraIntegerOffset, - int extraBoolsOffset, - int dimensionsOffset, - int device, - int scope_id, - int scope_nameOffset, - int outputNamesOffset, - int opNameOffset, - int outputTypesOffset, - int scalarOffset, - int controlDepsOffset, - int varControlDepsOffset, - int controlDepForOffset, - int extraTypesOffset) { - builder.startObject(23); - FlatNode.addOpNum(builder, opNum); - FlatNode.addExtraTypes(builder, extraTypesOffset); - FlatNode.addControlDepFor(builder, controlDepForOffset); - FlatNode.addVarControlDeps(builder, varControlDepsOffset); - FlatNode.addControlDeps(builder, controlDepsOffset); - FlatNode.addScalar(builder, scalarOffset); - FlatNode.addOutputTypes(builder, outputTypesOffset); - FlatNode.addOpName(builder, opNameOffset); - FlatNode.addOutputNames(builder, outputNamesOffset); - FlatNode.addScopeName(builder, scope_nameOffset); - FlatNode.addScopeId(builder, scope_id); - FlatNode.addDevice(builder, device); - FlatNode.addDimensions(builder, dimensionsOffset); - FlatNode.addExtraBools(builder, extraBoolsOffset); - FlatNode.addExtraInteger(builder, extraIntegerOffset); - FlatNode.addExtraParams(builder, extraParamsOffset); - FlatNode.addOutput(builder, outputOffset); - FlatNode.addInputPaired(builder, inputPairedOffset); - FlatNode.addInput(builder, inputOffset); - FlatNode.addProperties(builder, propertiesOffset); - FlatNode.addName(builder, nameOffset); - FlatNode.addId(builder, id); - FlatNode.addOpType(builder, opType); - return FlatNode.endFlatNode(builder); - } - - public static void 
startFlatNode(FlatBufferBuilder builder) { builder.startObject(23); } - public static void addId(FlatBufferBuilder builder, int id) { builder.addInt(0, id, 0); } - public static void addName(FlatBufferBuilder builder, int nameOffset) { builder.addOffset(1, nameOffset, 0); } - public static void addOpType(FlatBufferBuilder builder, byte opType) { builder.addByte(2, opType, 0); } - public static void addOpNum(FlatBufferBuilder builder, long opNum) { builder.addLong(3, opNum, 0L); } - public static void addProperties(FlatBufferBuilder builder, int propertiesOffset) { builder.addOffset(4, propertiesOffset, 0); } - public static int createPropertiesVector(FlatBufferBuilder builder, int[] data) { builder.startVector(4, data.length, 4); for (int i = data.length - 1; i >= 0; i--) builder.addOffset(data[i]); return builder.endVector(); } - public static void startPropertiesVector(FlatBufferBuilder builder, int numElems) { builder.startVector(4, numElems, 4); } - public static void addInput(FlatBufferBuilder builder, int inputOffset) { builder.addOffset(5, inputOffset, 0); } - public static int createInputVector(FlatBufferBuilder builder, int[] data) { builder.startVector(4, data.length, 4); for (int i = data.length - 1; i >= 0; i--) builder.addInt(data[i]); return builder.endVector(); } - public static void startInputVector(FlatBufferBuilder builder, int numElems) { builder.startVector(4, numElems, 4); } - public static void addInputPaired(FlatBufferBuilder builder, int inputPairedOffset) { builder.addOffset(6, inputPairedOffset, 0); } - public static int createInputPairedVector(FlatBufferBuilder builder, int[] data) { builder.startVector(4, data.length, 4); for (int i = data.length - 1; i >= 0; i--) builder.addOffset(data[i]); return builder.endVector(); } - public static void startInputPairedVector(FlatBufferBuilder builder, int numElems) { builder.startVector(4, numElems, 4); } - public static void addOutput(FlatBufferBuilder builder, int outputOffset) { 
builder.addOffset(7, outputOffset, 0); } - public static int createOutputVector(FlatBufferBuilder builder, int[] data) { builder.startVector(4, data.length, 4); for (int i = data.length - 1; i >= 0; i--) builder.addInt(data[i]); return builder.endVector(); } - public static void startOutputVector(FlatBufferBuilder builder, int numElems) { builder.startVector(4, numElems, 4); } - public static void addExtraParams(FlatBufferBuilder builder, int extraParamsOffset) { builder.addOffset(8, extraParamsOffset, 0); } - public static int createExtraParamsVector(FlatBufferBuilder builder, double[] data) { builder.startVector(8, data.length, 8); for (int i = data.length - 1; i >= 0; i--) builder.addDouble(data[i]); return builder.endVector(); } - public static void startExtraParamsVector(FlatBufferBuilder builder, int numElems) { builder.startVector(8, numElems, 8); } - public static void addExtraInteger(FlatBufferBuilder builder, int extraIntegerOffset) { builder.addOffset(9, extraIntegerOffset, 0); } - public static int createExtraIntegerVector(FlatBufferBuilder builder, long[] data) { builder.startVector(8, data.length, 8); for (int i = data.length - 1; i >= 0; i--) builder.addLong(data[i]); return builder.endVector(); } - public static void startExtraIntegerVector(FlatBufferBuilder builder, int numElems) { builder.startVector(8, numElems, 8); } - public static void addExtraBools(FlatBufferBuilder builder, int extraBoolsOffset) { builder.addOffset(10, extraBoolsOffset, 0); } - public static int createExtraBoolsVector(FlatBufferBuilder builder, boolean[] data) { builder.startVector(1, data.length, 1); for (int i = data.length - 1; i >= 0; i--) builder.addBoolean(data[i]); return builder.endVector(); } - public static void startExtraBoolsVector(FlatBufferBuilder builder, int numElems) { builder.startVector(1, numElems, 1); } - public static void addDimensions(FlatBufferBuilder builder, int dimensionsOffset) { builder.addOffset(11, dimensionsOffset, 0); } - public static int 
createDimensionsVector(FlatBufferBuilder builder, int[] data) { builder.startVector(4, data.length, 4); for (int i = data.length - 1; i >= 0; i--) builder.addInt(data[i]); return builder.endVector(); } - public static void startDimensionsVector(FlatBufferBuilder builder, int numElems) { builder.startVector(4, numElems, 4); } - public static void addDevice(FlatBufferBuilder builder, int device) { builder.addInt(12, device, 0); } - public static void addScopeId(FlatBufferBuilder builder, int scopeId) { builder.addInt(13, scopeId, 0); } - public static void addScopeName(FlatBufferBuilder builder, int scopeNameOffset) { builder.addOffset(14, scopeNameOffset, 0); } - public static void addOutputNames(FlatBufferBuilder builder, int outputNamesOffset) { builder.addOffset(15, outputNamesOffset, 0); } - public static int createOutputNamesVector(FlatBufferBuilder builder, int[] data) { builder.startVector(4, data.length, 4); for (int i = data.length - 1; i >= 0; i--) builder.addOffset(data[i]); return builder.endVector(); } - public static void startOutputNamesVector(FlatBufferBuilder builder, int numElems) { builder.startVector(4, numElems, 4); } - public static void addOpName(FlatBufferBuilder builder, int opNameOffset) { builder.addOffset(16, opNameOffset, 0); } - public static void addOutputTypes(FlatBufferBuilder builder, int outputTypesOffset) { builder.addOffset(17, outputTypesOffset, 0); } - public static int createOutputTypesVector(FlatBufferBuilder builder, byte[] data) { builder.startVector(1, data.length, 1); for (int i = data.length - 1; i >= 0; i--) builder.addByte(data[i]); return builder.endVector(); } - public static void startOutputTypesVector(FlatBufferBuilder builder, int numElems) { builder.startVector(1, numElems, 1); } - public static void addScalar(FlatBufferBuilder builder, int scalarOffset) { builder.addOffset(18, scalarOffset, 0); } - public static void addControlDeps(FlatBufferBuilder builder, int controlDepsOffset) { builder.addOffset(19, 
controlDepsOffset, 0); } - public static int createControlDepsVector(FlatBufferBuilder builder, int[] data) { builder.startVector(4, data.length, 4); for (int i = data.length - 1; i >= 0; i--) builder.addOffset(data[i]); return builder.endVector(); } - public static void startControlDepsVector(FlatBufferBuilder builder, int numElems) { builder.startVector(4, numElems, 4); } - public static void addVarControlDeps(FlatBufferBuilder builder, int varControlDepsOffset) { builder.addOffset(20, varControlDepsOffset, 0); } - public static int createVarControlDepsVector(FlatBufferBuilder builder, int[] data) { builder.startVector(4, data.length, 4); for (int i = data.length - 1; i >= 0; i--) builder.addOffset(data[i]); return builder.endVector(); } - public static void startVarControlDepsVector(FlatBufferBuilder builder, int numElems) { builder.startVector(4, numElems, 4); } - public static void addControlDepFor(FlatBufferBuilder builder, int controlDepForOffset) { builder.addOffset(21, controlDepForOffset, 0); } - public static int createControlDepForVector(FlatBufferBuilder builder, int[] data) { builder.startVector(4, data.length, 4); for (int i = data.length - 1; i >= 0; i--) builder.addOffset(data[i]); return builder.endVector(); } - public static void startControlDepForVector(FlatBufferBuilder builder, int numElems) { builder.startVector(4, numElems, 4); } - public static void addExtraTypes(FlatBufferBuilder builder, int extraTypesOffset) { builder.addOffset(22, extraTypesOffset, 0); } - public static int createExtraTypesVector(FlatBufferBuilder builder, byte[] data) { builder.startVector(1, data.length, 1); for (int i = data.length - 1; i >= 0; i--) builder.addByte(data[i]); return builder.endVector(); } - public static void startExtraTypesVector(FlatBufferBuilder builder, int numElems) { builder.startVector(1, numElems, 1); } - public static int endFlatNode(FlatBufferBuilder builder) { - int o = builder.endObject(); - return o; - } - public static void 
finishFlatNodeBuffer(FlatBufferBuilder builder, int offset) { builder.finish(offset); } - public static void finishSizePrefixedFlatNodeBuffer(FlatBufferBuilder builder, int offset) { builder.finishSizePrefixed(offset); } -} - diff --git a/libnd4j/include/graph/generated/nd4j/graph/FlatNode.py b/libnd4j/include/graph/generated/nd4j/graph/FlatNode.py deleted file mode 100644 index 61b80e28e..000000000 --- a/libnd4j/include/graph/generated/nd4j/graph/FlatNode.py +++ /dev/null @@ -1,416 +0,0 @@ -# /* ****************************************************************************** -# * -# * -# * This program and the accompanying materials are made available under the -# * terms of the Apache License, Version 2.0 which is available at -# * https://www.apache.org/licenses/LICENSE-2.0. -# * -# * See the NOTICE file distributed with this work for additional -# * information regarding copyright ownership. -# * Unless required by applicable law or agreed to in writing, software -# * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT -# * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the -# * License for the specific language governing permissions and limitations -# * under the License. 
-# * -# * SPDX-License-Identifier: Apache-2.0 -# ******************************************************************************/ - -import flatbuffers - -class FlatNode(object): - __slots__ = ['_tab'] - - @classmethod - def GetRootAsFlatNode(cls, buf, offset): - n = flatbuffers.encode.Get(flatbuffers.packer.uoffset, buf, offset) - x = FlatNode() - x.Init(buf, n + offset) - return x - - # FlatNode - def Init(self, buf, pos): - self._tab = flatbuffers.table.Table(buf, pos) - - # FlatNode - def Id(self): - o = flatbuffers.number_types.UOffsetTFlags.py_type(self._tab.Offset(4)) - if o != 0: - return self._tab.Get(flatbuffers.number_types.Int32Flags, o + self._tab.Pos) - return 0 - - # FlatNode - def Name(self): - o = flatbuffers.number_types.UOffsetTFlags.py_type(self._tab.Offset(6)) - if o != 0: - return self._tab.String(o + self._tab.Pos) - return None - - # FlatNode - def OpType(self): - o = flatbuffers.number_types.UOffsetTFlags.py_type(self._tab.Offset(8)) - if o != 0: - return self._tab.Get(flatbuffers.number_types.Int8Flags, o + self._tab.Pos) - return 0 - - # FlatNode - def OpNum(self): - o = flatbuffers.number_types.UOffsetTFlags.py_type(self._tab.Offset(10)) - if o != 0: - return self._tab.Get(flatbuffers.number_types.Int64Flags, o + self._tab.Pos) - return 0 - - # FlatNode - def Properties(self, j): - o = flatbuffers.number_types.UOffsetTFlags.py_type(self._tab.Offset(12)) - if o != 0: - x = self._tab.Vector(o) - x += flatbuffers.number_types.UOffsetTFlags.py_type(j) * 4 - x = self._tab.Indirect(x) - from .FlatProperties import FlatProperties - obj = FlatProperties() - obj.Init(self._tab.Bytes, x) - return obj - return None - - # FlatNode - def PropertiesLength(self): - o = flatbuffers.number_types.UOffsetTFlags.py_type(self._tab.Offset(12)) - if o != 0: - return self._tab.VectorLen(o) - return 0 - - # FlatNode - def Input(self, j): - o = flatbuffers.number_types.UOffsetTFlags.py_type(self._tab.Offset(14)) - if o != 0: - a = self._tab.Vector(o) - return 
self._tab.Get(flatbuffers.number_types.Int32Flags, a + flatbuffers.number_types.UOffsetTFlags.py_type(j * 4)) - return 0 - - # FlatNode - def InputAsNumpy(self): - o = flatbuffers.number_types.UOffsetTFlags.py_type(self._tab.Offset(14)) - if o != 0: - return self._tab.GetVectorAsNumpy(flatbuffers.number_types.Int32Flags, o) - return 0 - - # FlatNode - def InputLength(self): - o = flatbuffers.number_types.UOffsetTFlags.py_type(self._tab.Offset(14)) - if o != 0: - return self._tab.VectorLen(o) - return 0 - - # FlatNode - def InputPaired(self, j): - o = flatbuffers.number_types.UOffsetTFlags.py_type(self._tab.Offset(16)) - if o != 0: - x = self._tab.Vector(o) - x += flatbuffers.number_types.UOffsetTFlags.py_type(j) * 4 - x = self._tab.Indirect(x) - from .IntPair import IntPair - obj = IntPair() - obj.Init(self._tab.Bytes, x) - return obj - return None - - # FlatNode - def InputPairedLength(self): - o = flatbuffers.number_types.UOffsetTFlags.py_type(self._tab.Offset(16)) - if o != 0: - return self._tab.VectorLen(o) - return 0 - - # FlatNode - def Output(self, j): - o = flatbuffers.number_types.UOffsetTFlags.py_type(self._tab.Offset(18)) - if o != 0: - a = self._tab.Vector(o) - return self._tab.Get(flatbuffers.number_types.Int32Flags, a + flatbuffers.number_types.UOffsetTFlags.py_type(j * 4)) - return 0 - - # FlatNode - def OutputAsNumpy(self): - o = flatbuffers.number_types.UOffsetTFlags.py_type(self._tab.Offset(18)) - if o != 0: - return self._tab.GetVectorAsNumpy(flatbuffers.number_types.Int32Flags, o) - return 0 - - # FlatNode - def OutputLength(self): - o = flatbuffers.number_types.UOffsetTFlags.py_type(self._tab.Offset(18)) - if o != 0: - return self._tab.VectorLen(o) - return 0 - - # FlatNode - def ExtraParams(self, j): - o = flatbuffers.number_types.UOffsetTFlags.py_type(self._tab.Offset(20)) - if o != 0: - a = self._tab.Vector(o) - return self._tab.Get(flatbuffers.number_types.Float64Flags, a + flatbuffers.number_types.UOffsetTFlags.py_type(j * 8)) - return 0 - 
- # FlatNode - def ExtraParamsAsNumpy(self): - o = flatbuffers.number_types.UOffsetTFlags.py_type(self._tab.Offset(20)) - if o != 0: - return self._tab.GetVectorAsNumpy(flatbuffers.number_types.Float64Flags, o) - return 0 - - # FlatNode - def ExtraParamsLength(self): - o = flatbuffers.number_types.UOffsetTFlags.py_type(self._tab.Offset(20)) - if o != 0: - return self._tab.VectorLen(o) - return 0 - - # FlatNode - def ExtraInteger(self, j): - o = flatbuffers.number_types.UOffsetTFlags.py_type(self._tab.Offset(22)) - if o != 0: - a = self._tab.Vector(o) - return self._tab.Get(flatbuffers.number_types.Int64Flags, a + flatbuffers.number_types.UOffsetTFlags.py_type(j * 8)) - return 0 - - # FlatNode - def ExtraIntegerAsNumpy(self): - o = flatbuffers.number_types.UOffsetTFlags.py_type(self._tab.Offset(22)) - if o != 0: - return self._tab.GetVectorAsNumpy(flatbuffers.number_types.Int64Flags, o) - return 0 - - # FlatNode - def ExtraIntegerLength(self): - o = flatbuffers.number_types.UOffsetTFlags.py_type(self._tab.Offset(22)) - if o != 0: - return self._tab.VectorLen(o) - return 0 - - # FlatNode - def ExtraBools(self, j): - o = flatbuffers.number_types.UOffsetTFlags.py_type(self._tab.Offset(24)) - if o != 0: - a = self._tab.Vector(o) - return self._tab.Get(flatbuffers.number_types.BoolFlags, a + flatbuffers.number_types.UOffsetTFlags.py_type(j * 1)) - return 0 - - # FlatNode - def ExtraBoolsAsNumpy(self): - o = flatbuffers.number_types.UOffsetTFlags.py_type(self._tab.Offset(24)) - if o != 0: - return self._tab.GetVectorAsNumpy(flatbuffers.number_types.BoolFlags, o) - return 0 - - # FlatNode - def ExtraBoolsLength(self): - o = flatbuffers.number_types.UOffsetTFlags.py_type(self._tab.Offset(24)) - if o != 0: - return self._tab.VectorLen(o) - return 0 - - # FlatNode - def Dimensions(self, j): - o = flatbuffers.number_types.UOffsetTFlags.py_type(self._tab.Offset(26)) - if o != 0: - a = self._tab.Vector(o) - return self._tab.Get(flatbuffers.number_types.Int32Flags, a + 
flatbuffers.number_types.UOffsetTFlags.py_type(j * 4)) - return 0 - - # FlatNode - def DimensionsAsNumpy(self): - o = flatbuffers.number_types.UOffsetTFlags.py_type(self._tab.Offset(26)) - if o != 0: - return self._tab.GetVectorAsNumpy(flatbuffers.number_types.Int32Flags, o) - return 0 - - # FlatNode - def DimensionsLength(self): - o = flatbuffers.number_types.UOffsetTFlags.py_type(self._tab.Offset(26)) - if o != 0: - return self._tab.VectorLen(o) - return 0 - - # FlatNode - def Device(self): - o = flatbuffers.number_types.UOffsetTFlags.py_type(self._tab.Offset(28)) - if o != 0: - return self._tab.Get(flatbuffers.number_types.Int32Flags, o + self._tab.Pos) - return 0 - - # FlatNode - def ScopeId(self): - o = flatbuffers.number_types.UOffsetTFlags.py_type(self._tab.Offset(30)) - if o != 0: - return self._tab.Get(flatbuffers.number_types.Int32Flags, o + self._tab.Pos) - return 0 - - # FlatNode - def ScopeName(self): - o = flatbuffers.number_types.UOffsetTFlags.py_type(self._tab.Offset(32)) - if o != 0: - return self._tab.String(o + self._tab.Pos) - return None - - # FlatNode - def OutputNames(self, j): - o = flatbuffers.number_types.UOffsetTFlags.py_type(self._tab.Offset(34)) - if o != 0: - a = self._tab.Vector(o) - return self._tab.String(a + flatbuffers.number_types.UOffsetTFlags.py_type(j * 4)) - return "" - - # FlatNode - def OutputNamesLength(self): - o = flatbuffers.number_types.UOffsetTFlags.py_type(self._tab.Offset(34)) - if o != 0: - return self._tab.VectorLen(o) - return 0 - - # FlatNode - def OpName(self): - o = flatbuffers.number_types.UOffsetTFlags.py_type(self._tab.Offset(36)) - if o != 0: - return self._tab.String(o + self._tab.Pos) - return None - - # FlatNode - def OutputTypes(self, j): - o = flatbuffers.number_types.UOffsetTFlags.py_type(self._tab.Offset(38)) - if o != 0: - a = self._tab.Vector(o) - return self._tab.Get(flatbuffers.number_types.Int8Flags, a + flatbuffers.number_types.UOffsetTFlags.py_type(j * 1)) - return 0 - - # FlatNode - def 
OutputTypesAsNumpy(self): - o = flatbuffers.number_types.UOffsetTFlags.py_type(self._tab.Offset(38)) - if o != 0: - return self._tab.GetVectorAsNumpy(flatbuffers.number_types.Int8Flags, o) - return 0 - - # FlatNode - def OutputTypesLength(self): - o = flatbuffers.number_types.UOffsetTFlags.py_type(self._tab.Offset(38)) - if o != 0: - return self._tab.VectorLen(o) - return 0 - - # FlatNode - def Scalar(self): - o = flatbuffers.number_types.UOffsetTFlags.py_type(self._tab.Offset(40)) - if o != 0: - x = self._tab.Indirect(o + self._tab.Pos) - from .FlatArray import FlatArray - obj = FlatArray() - obj.Init(self._tab.Bytes, x) - return obj - return None - - # FlatNode - def ControlDeps(self, j): - o = flatbuffers.number_types.UOffsetTFlags.py_type(self._tab.Offset(42)) - if o != 0: - a = self._tab.Vector(o) - return self._tab.String(a + flatbuffers.number_types.UOffsetTFlags.py_type(j * 4)) - return "" - - # FlatNode - def ControlDepsLength(self): - o = flatbuffers.number_types.UOffsetTFlags.py_type(self._tab.Offset(42)) - if o != 0: - return self._tab.VectorLen(o) - return 0 - - # FlatNode - def VarControlDeps(self, j): - o = flatbuffers.number_types.UOffsetTFlags.py_type(self._tab.Offset(44)) - if o != 0: - a = self._tab.Vector(o) - return self._tab.String(a + flatbuffers.number_types.UOffsetTFlags.py_type(j * 4)) - return "" - - # FlatNode - def VarControlDepsLength(self): - o = flatbuffers.number_types.UOffsetTFlags.py_type(self._tab.Offset(44)) - if o != 0: - return self._tab.VectorLen(o) - return 0 - - # FlatNode - def ControlDepFor(self, j): - o = flatbuffers.number_types.UOffsetTFlags.py_type(self._tab.Offset(46)) - if o != 0: - a = self._tab.Vector(o) - return self._tab.String(a + flatbuffers.number_types.UOffsetTFlags.py_type(j * 4)) - return "" - - # FlatNode - def ControlDepForLength(self): - o = flatbuffers.number_types.UOffsetTFlags.py_type(self._tab.Offset(46)) - if o != 0: - return self._tab.VectorLen(o) - return 0 - - # FlatNode - def ExtraTypes(self, 
j): - o = flatbuffers.number_types.UOffsetTFlags.py_type(self._tab.Offset(48)) - if o != 0: - a = self._tab.Vector(o) - return self._tab.Get(flatbuffers.number_types.Int8Flags, a + flatbuffers.number_types.UOffsetTFlags.py_type(j * 1)) - return 0 - - # FlatNode - def ExtraTypesAsNumpy(self): - o = flatbuffers.number_types.UOffsetTFlags.py_type(self._tab.Offset(48)) - if o != 0: - return self._tab.GetVectorAsNumpy(flatbuffers.number_types.Int8Flags, o) - return 0 - - # FlatNode - def ExtraTypesLength(self): - o = flatbuffers.number_types.UOffsetTFlags.py_type(self._tab.Offset(48)) - if o != 0: - return self._tab.VectorLen(o) - return 0 - -def FlatNodeStart(builder): builder.StartObject(23) -def FlatNodeAddId(builder, id): builder.PrependInt32Slot(0, id, 0) -def FlatNodeAddName(builder, name): builder.PrependUOffsetTRelativeSlot(1, flatbuffers.number_types.UOffsetTFlags.py_type(name), 0) -def FlatNodeAddOpType(builder, opType): builder.PrependInt8Slot(2, opType, 0) -def FlatNodeAddOpNum(builder, opNum): builder.PrependInt64Slot(3, opNum, 0) -def FlatNodeAddProperties(builder, properties): builder.PrependUOffsetTRelativeSlot(4, flatbuffers.number_types.UOffsetTFlags.py_type(properties), 0) -def FlatNodeStartPropertiesVector(builder, numElems): return builder.StartVector(4, numElems, 4) -def FlatNodeAddInput(builder, input): builder.PrependUOffsetTRelativeSlot(5, flatbuffers.number_types.UOffsetTFlags.py_type(input), 0) -def FlatNodeStartInputVector(builder, numElems): return builder.StartVector(4, numElems, 4) -def FlatNodeAddInputPaired(builder, inputPaired): builder.PrependUOffsetTRelativeSlot(6, flatbuffers.number_types.UOffsetTFlags.py_type(inputPaired), 0) -def FlatNodeStartInputPairedVector(builder, numElems): return builder.StartVector(4, numElems, 4) -def FlatNodeAddOutput(builder, output): builder.PrependUOffsetTRelativeSlot(7, flatbuffers.number_types.UOffsetTFlags.py_type(output), 0) -def FlatNodeStartOutputVector(builder, numElems): return 
builder.StartVector(4, numElems, 4) -def FlatNodeAddExtraParams(builder, extraParams): builder.PrependUOffsetTRelativeSlot(8, flatbuffers.number_types.UOffsetTFlags.py_type(extraParams), 0) -def FlatNodeStartExtraParamsVector(builder, numElems): return builder.StartVector(8, numElems, 8) -def FlatNodeAddExtraInteger(builder, extraInteger): builder.PrependUOffsetTRelativeSlot(9, flatbuffers.number_types.UOffsetTFlags.py_type(extraInteger), 0) -def FlatNodeStartExtraIntegerVector(builder, numElems): return builder.StartVector(8, numElems, 8) -def FlatNodeAddExtraBools(builder, extraBools): builder.PrependUOffsetTRelativeSlot(10, flatbuffers.number_types.UOffsetTFlags.py_type(extraBools), 0) -def FlatNodeStartExtraBoolsVector(builder, numElems): return builder.StartVector(1, numElems, 1) -def FlatNodeAddDimensions(builder, dimensions): builder.PrependUOffsetTRelativeSlot(11, flatbuffers.number_types.UOffsetTFlags.py_type(dimensions), 0) -def FlatNodeStartDimensionsVector(builder, numElems): return builder.StartVector(4, numElems, 4) -def FlatNodeAddDevice(builder, device): builder.PrependInt32Slot(12, device, 0) -def FlatNodeAddScopeId(builder, scopeId): builder.PrependInt32Slot(13, scopeId, 0) -def FlatNodeAddScopeName(builder, scopeName): builder.PrependUOffsetTRelativeSlot(14, flatbuffers.number_types.UOffsetTFlags.py_type(scopeName), 0) -def FlatNodeAddOutputNames(builder, outputNames): builder.PrependUOffsetTRelativeSlot(15, flatbuffers.number_types.UOffsetTFlags.py_type(outputNames), 0) -def FlatNodeStartOutputNamesVector(builder, numElems): return builder.StartVector(4, numElems, 4) -def FlatNodeAddOpName(builder, opName): builder.PrependUOffsetTRelativeSlot(16, flatbuffers.number_types.UOffsetTFlags.py_type(opName), 0) -def FlatNodeAddOutputTypes(builder, outputTypes): builder.PrependUOffsetTRelativeSlot(17, flatbuffers.number_types.UOffsetTFlags.py_type(outputTypes), 0) -def FlatNodeStartOutputTypesVector(builder, numElems): return builder.StartVector(1, 
numElems, 1) -def FlatNodeAddScalar(builder, scalar): builder.PrependUOffsetTRelativeSlot(18, flatbuffers.number_types.UOffsetTFlags.py_type(scalar), 0) -def FlatNodeAddControlDeps(builder, controlDeps): builder.PrependUOffsetTRelativeSlot(19, flatbuffers.number_types.UOffsetTFlags.py_type(controlDeps), 0) -def FlatNodeStartControlDepsVector(builder, numElems): return builder.StartVector(4, numElems, 4) -def FlatNodeAddVarControlDeps(builder, varControlDeps): builder.PrependUOffsetTRelativeSlot(20, flatbuffers.number_types.UOffsetTFlags.py_type(varControlDeps), 0) -def FlatNodeStartVarControlDepsVector(builder, numElems): return builder.StartVector(4, numElems, 4) -def FlatNodeAddControlDepFor(builder, controlDepFor): builder.PrependUOffsetTRelativeSlot(21, flatbuffers.number_types.UOffsetTFlags.py_type(controlDepFor), 0) -def FlatNodeStartControlDepForVector(builder, numElems): return builder.StartVector(4, numElems, 4) -def FlatNodeAddExtraTypes(builder, extraTypes): builder.PrependUOffsetTRelativeSlot(22, flatbuffers.number_types.UOffsetTFlags.py_type(extraTypes), 0) -def FlatNodeStartExtraTypesVector(builder, numElems): return builder.StartVector(1, numElems, 1) -def FlatNodeEnd(builder): return builder.EndObject() diff --git a/libnd4j/include/graph/generated/nd4j/graph/FlatProperties.cs b/libnd4j/include/graph/generated/nd4j/graph/FlatProperties.cs deleted file mode 100644 index 471dd9f3d..000000000 --- a/libnd4j/include/graph/generated/nd4j/graph/FlatProperties.cs +++ /dev/null @@ -1,132 +0,0 @@ -// -// automatically generated by the FlatBuffers compiler, do not modify -// - -namespace sd.graph -{ - -using global::System; -using global::FlatBuffers; - -public struct FlatProperties : IFlatbufferObject -{ - private Table __p; - public ByteBuffer ByteBuffer { get { return __p.bb; } } - public static FlatProperties GetRootAsFlatProperties(ByteBuffer _bb) { return GetRootAsFlatProperties(_bb, new FlatProperties()); } - public static FlatProperties 
GetRootAsFlatProperties(ByteBuffer _bb, FlatProperties obj) { return (obj.__assign(_bb.GetInt(_bb.Position) + _bb.Position, _bb)); } - public void __init(int _i, ByteBuffer _bb) { __p.bb_pos = _i; __p.bb = _bb; } - public FlatProperties __assign(int _i, ByteBuffer _bb) { __init(_i, _bb); return this; } - - public string Name { get { int o = __p.__offset(4); return o != 0 ? __p.__string(o + __p.bb_pos) : null; } } -#if ENABLE_SPAN_T - public Span<byte> GetNameBytes() { return __p.__vector_as_span(4); } -#else - public ArraySegment<byte>? GetNameBytes() { return __p.__vector_as_arraysegment(4); } -#endif - public byte[] GetNameArray() { return __p.__vector_as_array<byte>(4); } - public int I(int j) { int o = __p.__offset(6); return o != 0 ? __p.bb.GetInt(__p.__vector(o) + j * 4) : (int)0; } - public int ILength { get { int o = __p.__offset(6); return o != 0 ? __p.__vector_len(o) : 0; } } -#if ENABLE_SPAN_T - public Span<byte> GetIBytes() { return __p.__vector_as_span(6); } -#else - public ArraySegment<byte>? GetIBytes() { return __p.__vector_as_arraysegment(6); } -#endif - public int[] GetIArray() { return __p.__vector_as_array<int>(6); } - public long L(int j) { int o = __p.__offset(8); return o != 0 ? __p.bb.GetLong(__p.__vector(o) + j * 8) : (long)0; } - public int LLength { get { int o = __p.__offset(8); return o != 0 ? __p.__vector_len(o) : 0; } } -#if ENABLE_SPAN_T - public Span<byte> GetLBytes() { return __p.__vector_as_span(8); } -#else - public ArraySegment<byte>? GetLBytes() { return __p.__vector_as_arraysegment(8); } -#endif - public long[] GetLArray() { return __p.__vector_as_array<long>(8); } - public double D(int j) { int o = __p.__offset(10); return o != 0 ? __p.bb.GetDouble(__p.__vector(o) + j * 8) : (double)0; } - public int DLength { get { int o = __p.__offset(10); return o != 0 ? __p.__vector_len(o) : 0; } } -#if ENABLE_SPAN_T - public Span<byte> GetDBytes() { return __p.__vector_as_span(10); } -#else - public ArraySegment<byte>?
GetDBytes() { return __p.__vector_as_arraysegment(10); } -#endif - public double[] GetDArray() { return __p.__vector_as_array<double>(10); } - public FlatArray? A(int j) { int o = __p.__offset(12); return o != 0 ? (FlatArray?)(new FlatArray()).__assign(__p.__indirect(__p.__vector(o) + j * 4), __p.bb) : null; } - public int ALength { get { int o = __p.__offset(12); return o != 0 ? __p.__vector_len(o) : 0; } } - public bool B(int j) { int o = __p.__offset(14); return o != 0 ? 0!=__p.bb.Get(__p.__vector(o) + j * 1) : false; } - public int BLength { get { int o = __p.__offset(14); return o != 0 ? __p.__vector_len(o) : 0; } } -#if ENABLE_SPAN_T - public Span<byte> GetBBytes() { return __p.__vector_as_span(14); } -#else - public ArraySegment<byte>? GetBBytes() { return __p.__vector_as_arraysegment(14); } -#endif - public bool[] GetBArray() { return __p.__vector_as_array<bool>(14); } - public string S(int j) { int o = __p.__offset(16); return o != 0 ? __p.__string(__p.__vector(o) + j * 4) : null; } - public int SLength { get { int o = __p.__offset(16); return o != 0 ? __p.__vector_len(o) : 0; } } - public int Shape(int j) { int o = __p.__offset(18); return o != 0 ? __p.bb.GetInt(__p.__vector(o) + j * 4) : (int)0; } - public int ShapeLength { get { int o = __p.__offset(18); return o != 0 ? __p.__vector_len(o) : 0; } } -#if ENABLE_SPAN_T - public Span<byte> GetShapeBytes() { return __p.__vector_as_span(18); } -#else - public ArraySegment<byte>?
GetShapeBytes() { return __p.__vector_as_arraysegment(18); } -#endif - public int[] GetShapeArray() { return __p.__vector_as_array<int>(18); } - - public static Offset<FlatProperties> CreateFlatProperties(FlatBufferBuilder builder, - StringOffset nameOffset = default(StringOffset), - VectorOffset iOffset = default(VectorOffset), - VectorOffset lOffset = default(VectorOffset), - VectorOffset dOffset = default(VectorOffset), - VectorOffset aOffset = default(VectorOffset), - VectorOffset bOffset = default(VectorOffset), - VectorOffset sOffset = default(VectorOffset), - VectorOffset shapeOffset = default(VectorOffset)) { - builder.StartObject(8); - FlatProperties.AddShape(builder, shapeOffset); - FlatProperties.AddS(builder, sOffset); - FlatProperties.AddB(builder, bOffset); - FlatProperties.AddA(builder, aOffset); - FlatProperties.AddD(builder, dOffset); - FlatProperties.AddL(builder, lOffset); - FlatProperties.AddI(builder, iOffset); - FlatProperties.AddName(builder, nameOffset); - return FlatProperties.EndFlatProperties(builder); - } - - public static void StartFlatProperties(FlatBufferBuilder builder) { builder.StartObject(8); } - public static void AddName(FlatBufferBuilder builder, StringOffset nameOffset) { builder.AddOffset(0, nameOffset.Value, 0); } - public static void AddI(FlatBufferBuilder builder, VectorOffset iOffset) { builder.AddOffset(1, iOffset.Value, 0); } - public static VectorOffset CreateIVector(FlatBufferBuilder builder, int[] data) { builder.StartVector(4, data.Length, 4); for (int i = data.Length - 1; i >= 0; i--) builder.AddInt(data[i]); return builder.EndVector(); } - public static VectorOffset CreateIVectorBlock(FlatBufferBuilder builder, int[] data) { builder.StartVector(4, data.Length, 4); builder.Add(data); return builder.EndVector(); } - public static void StartIVector(FlatBufferBuilder builder, int numElems) { builder.StartVector(4, numElems, 4); } - public static void AddL(FlatBufferBuilder builder, VectorOffset lOffset) { builder.AddOffset(2,
lOffset.Value, 0); } - public static VectorOffset CreateLVector(FlatBufferBuilder builder, long[] data) { builder.StartVector(8, data.Length, 8); for (int i = data.Length - 1; i >= 0; i--) builder.AddLong(data[i]); return builder.EndVector(); } - public static VectorOffset CreateLVectorBlock(FlatBufferBuilder builder, long[] data) { builder.StartVector(8, data.Length, 8); builder.Add(data); return builder.EndVector(); } - public static void StartLVector(FlatBufferBuilder builder, int numElems) { builder.StartVector(8, numElems, 8); } - public static void AddD(FlatBufferBuilder builder, VectorOffset dOffset) { builder.AddOffset(3, dOffset.Value, 0); } - public static VectorOffset CreateDVector(FlatBufferBuilder builder, double[] data) { builder.StartVector(8, data.Length, 8); for (int i = data.Length - 1; i >= 0; i--) builder.AddDouble(data[i]); return builder.EndVector(); } - public static VectorOffset CreateDVectorBlock(FlatBufferBuilder builder, double[] data) { builder.StartVector(8, data.Length, 8); builder.Add(data); return builder.EndVector(); } - public static void StartDVector(FlatBufferBuilder builder, int numElems) { builder.StartVector(8, numElems, 8); } - public static void AddA(FlatBufferBuilder builder, VectorOffset aOffset) { builder.AddOffset(4, aOffset.Value, 0); } - public static VectorOffset CreateAVector(FlatBufferBuilder builder, Offset<FlatArray>[] data) { builder.StartVector(4, data.Length, 4); for (int i = data.Length - 1; i >= 0; i--) builder.AddOffset(data[i].Value); return builder.EndVector(); } - public static VectorOffset CreateAVectorBlock(FlatBufferBuilder builder, Offset<FlatArray>[] data) { builder.StartVector(4, data.Length, 4); builder.Add(data); return builder.EndVector(); } - public static void StartAVector(FlatBufferBuilder builder, int numElems) { builder.StartVector(4, numElems, 4); } - public static void AddB(FlatBufferBuilder builder, VectorOffset bOffset) { builder.AddOffset(5, bOffset.Value, 0); } - public static VectorOffset
CreateBVector(FlatBufferBuilder builder, bool[] data) { builder.StartVector(1, data.Length, 1); for (int i = data.Length - 1; i >= 0; i--) builder.AddBool(data[i]); return builder.EndVector(); } - public static VectorOffset CreateBVectorBlock(FlatBufferBuilder builder, bool[] data) { builder.StartVector(1, data.Length, 1); builder.Add(data); return builder.EndVector(); } - public static void StartBVector(FlatBufferBuilder builder, int numElems) { builder.StartVector(1, numElems, 1); } - public static void AddS(FlatBufferBuilder builder, VectorOffset sOffset) { builder.AddOffset(6, sOffset.Value, 0); } - public static VectorOffset CreateSVector(FlatBufferBuilder builder, StringOffset[] data) { builder.StartVector(4, data.Length, 4); for (int i = data.Length - 1; i >= 0; i--) builder.AddOffset(data[i].Value); return builder.EndVector(); } - public static VectorOffset CreateSVectorBlock(FlatBufferBuilder builder, StringOffset[] data) { builder.StartVector(4, data.Length, 4); builder.Add(data); return builder.EndVector(); } - public static void StartSVector(FlatBufferBuilder builder, int numElems) { builder.StartVector(4, numElems, 4); } - public static void AddShape(FlatBufferBuilder builder, VectorOffset shapeOffset) { builder.AddOffset(7, shapeOffset.Value, 0); } - public static VectorOffset CreateShapeVector(FlatBufferBuilder builder, int[] data) { builder.StartVector(4, data.Length, 4); for (int i = data.Length - 1; i >= 0; i--) builder.AddInt(data[i]); return builder.EndVector(); } - public static VectorOffset CreateShapeVectorBlock(FlatBufferBuilder builder, int[] data) { builder.StartVector(4, data.Length, 4); builder.Add(data); return builder.EndVector(); } - public static void StartShapeVector(FlatBufferBuilder builder, int numElems) { builder.StartVector(4, numElems, 4); } - public static Offset<FlatProperties> EndFlatProperties(FlatBufferBuilder builder) { - int o = builder.EndObject(); - return new Offset<FlatProperties>(o); - } - public static void
FinishFlatPropertiesBuffer(FlatBufferBuilder builder, Offset offset) { builder.Finish(offset.Value); } - public static void FinishSizePrefixedFlatPropertiesBuffer(FlatBufferBuilder builder, Offset offset) { builder.FinishSizePrefixed(offset.Value); } -}; - - -} diff --git a/libnd4j/include/graph/generated/nd4j/graph/FlatProperties.java b/libnd4j/include/graph/generated/nd4j/graph/FlatProperties.java deleted file mode 100644 index 72df15cde..000000000 --- a/libnd4j/include/graph/generated/nd4j/graph/FlatProperties.java +++ /dev/null @@ -1,97 +0,0 @@ -// automatically generated by the FlatBuffers compiler, do not modify - -package nd4j.graph; - -import java.nio.*; -import java.lang.*; -import java.util.*; -import com.google.flatbuffers.*; - -@SuppressWarnings("unused") -public final class FlatProperties extends Table { - public static FlatProperties getRootAsFlatProperties(ByteBuffer _bb) { return getRootAsFlatProperties(_bb, new FlatProperties()); } - public static FlatProperties getRootAsFlatProperties(ByteBuffer _bb, FlatProperties obj) { _bb.order(ByteOrder.LITTLE_ENDIAN); return (obj.__assign(_bb.getInt(_bb.position()) + _bb.position(), _bb)); } - public void __init(int _i, ByteBuffer _bb) { bb_pos = _i; bb = _bb; } - public FlatProperties __assign(int _i, ByteBuffer _bb) { __init(_i, _bb); return this; } - - public String name() { int o = __offset(4); return o != 0 ? __string(o + bb_pos) : null; } - public ByteBuffer nameAsByteBuffer() { return __vector_as_bytebuffer(4, 1); } - public ByteBuffer nameInByteBuffer(ByteBuffer _bb) { return __vector_in_bytebuffer(_bb, 4, 1); } - public int i(int j) { int o = __offset(6); return o != 0 ? bb.getInt(__vector(o) + j * 4) : 0; } - public int iLength() { int o = __offset(6); return o != 0 ? 
__vector_len(o) : 0; } - public ByteBuffer iAsByteBuffer() { return __vector_as_bytebuffer(6, 4); } - public ByteBuffer iInByteBuffer(ByteBuffer _bb) { return __vector_in_bytebuffer(_bb, 6, 4); } - public long l(int j) { int o = __offset(8); return o != 0 ? bb.getLong(__vector(o) + j * 8) : 0; } - public int lLength() { int o = __offset(8); return o != 0 ? __vector_len(o) : 0; } - public ByteBuffer lAsByteBuffer() { return __vector_as_bytebuffer(8, 8); } - public ByteBuffer lInByteBuffer(ByteBuffer _bb) { return __vector_in_bytebuffer(_bb, 8, 8); } - public double d(int j) { int o = __offset(10); return o != 0 ? bb.getDouble(__vector(o) + j * 8) : 0; } - public int dLength() { int o = __offset(10); return o != 0 ? __vector_len(o) : 0; } - public ByteBuffer dAsByteBuffer() { return __vector_as_bytebuffer(10, 8); } - public ByteBuffer dInByteBuffer(ByteBuffer _bb) { return __vector_in_bytebuffer(_bb, 10, 8); } - public FlatArray a(int j) { return a(new FlatArray(), j); } - public FlatArray a(FlatArray obj, int j) { int o = __offset(12); return o != 0 ? obj.__assign(__indirect(__vector(o) + j * 4), bb) : null; } - public int aLength() { int o = __offset(12); return o != 0 ? __vector_len(o) : 0; } - public boolean b(int j) { int o = __offset(14); return o != 0 ? 0!=bb.get(__vector(o) + j * 1) : false; } - public int bLength() { int o = __offset(14); return o != 0 ? __vector_len(o) : 0; } - public ByteBuffer bAsByteBuffer() { return __vector_as_bytebuffer(14, 1); } - public ByteBuffer bInByteBuffer(ByteBuffer _bb) { return __vector_in_bytebuffer(_bb, 14, 1); } - public String s(int j) { int o = __offset(16); return o != 0 ? __string(__vector(o) + j * 4) : null; } - public int sLength() { int o = __offset(16); return o != 0 ? __vector_len(o) : 0; } - public int shape(int j) { int o = __offset(18); return o != 0 ? bb.getInt(__vector(o) + j * 4) : 0; } - public int shapeLength() { int o = __offset(18); return o != 0 ? 
__vector_len(o) : 0; } - public ByteBuffer shapeAsByteBuffer() { return __vector_as_bytebuffer(18, 4); } - public ByteBuffer shapeInByteBuffer(ByteBuffer _bb) { return __vector_in_bytebuffer(_bb, 18, 4); } - - public static int createFlatProperties(FlatBufferBuilder builder, - int nameOffset, - int iOffset, - int lOffset, - int dOffset, - int aOffset, - int bOffset, - int sOffset, - int shapeOffset) { - builder.startObject(8); - FlatProperties.addShape(builder, shapeOffset); - FlatProperties.addS(builder, sOffset); - FlatProperties.addB(builder, bOffset); - FlatProperties.addA(builder, aOffset); - FlatProperties.addD(builder, dOffset); - FlatProperties.addL(builder, lOffset); - FlatProperties.addI(builder, iOffset); - FlatProperties.addName(builder, nameOffset); - return FlatProperties.endFlatProperties(builder); - } - - public static void startFlatProperties(FlatBufferBuilder builder) { builder.startObject(8); } - public static void addName(FlatBufferBuilder builder, int nameOffset) { builder.addOffset(0, nameOffset, 0); } - public static void addI(FlatBufferBuilder builder, int iOffset) { builder.addOffset(1, iOffset, 0); } - public static int createIVector(FlatBufferBuilder builder, int[] data) { builder.startVector(4, data.length, 4); for (int i = data.length - 1; i >= 0; i--) builder.addInt(data[i]); return builder.endVector(); } - public static void startIVector(FlatBufferBuilder builder, int numElems) { builder.startVector(4, numElems, 4); } - public static void addL(FlatBufferBuilder builder, int lOffset) { builder.addOffset(2, lOffset, 0); } - public static int createLVector(FlatBufferBuilder builder, long[] data) { builder.startVector(8, data.length, 8); for (int i = data.length - 1; i >= 0; i--) builder.addLong(data[i]); return builder.endVector(); } - public static void startLVector(FlatBufferBuilder builder, int numElems) { builder.startVector(8, numElems, 8); } - public static void addD(FlatBufferBuilder builder, int dOffset) { builder.addOffset(3, 
dOffset, 0); } - public static int createDVector(FlatBufferBuilder builder, double[] data) { builder.startVector(8, data.length, 8); for (int i = data.length - 1; i >= 0; i--) builder.addDouble(data[i]); return builder.endVector(); } - public static void startDVector(FlatBufferBuilder builder, int numElems) { builder.startVector(8, numElems, 8); } - public static void addA(FlatBufferBuilder builder, int aOffset) { builder.addOffset(4, aOffset, 0); } - public static int createAVector(FlatBufferBuilder builder, int[] data) { builder.startVector(4, data.length, 4); for (int i = data.length - 1; i >= 0; i--) builder.addOffset(data[i]); return builder.endVector(); } - public static void startAVector(FlatBufferBuilder builder, int numElems) { builder.startVector(4, numElems, 4); } - public static void addB(FlatBufferBuilder builder, int bOffset) { builder.addOffset(5, bOffset, 0); } - public static int createBVector(FlatBufferBuilder builder, boolean[] data) { builder.startVector(1, data.length, 1); for (int i = data.length - 1; i >= 0; i--) builder.addBoolean(data[i]); return builder.endVector(); } - public static void startBVector(FlatBufferBuilder builder, int numElems) { builder.startVector(1, numElems, 1); } - public static void addS(FlatBufferBuilder builder, int sOffset) { builder.addOffset(6, sOffset, 0); } - public static int createSVector(FlatBufferBuilder builder, int[] data) { builder.startVector(4, data.length, 4); for (int i = data.length - 1; i >= 0; i--) builder.addOffset(data[i]); return builder.endVector(); } - public static void startSVector(FlatBufferBuilder builder, int numElems) { builder.startVector(4, numElems, 4); } - public static void addShape(FlatBufferBuilder builder, int shapeOffset) { builder.addOffset(7, shapeOffset, 0); } - public static int createShapeVector(FlatBufferBuilder builder, int[] data) { builder.startVector(4, data.length, 4); for (int i = data.length - 1; i >= 0; i--) builder.addInt(data[i]); return builder.endVector(); } - 
public static void startShapeVector(FlatBufferBuilder builder, int numElems) { builder.startVector(4, numElems, 4); } - public static int endFlatProperties(FlatBufferBuilder builder) { - int o = builder.endObject(); - return o; - } - public static void finishFlatPropertiesBuffer(FlatBufferBuilder builder, int offset) { builder.finish(offset); } - public static void finishSizePrefixedFlatPropertiesBuffer(FlatBufferBuilder builder, int offset) { builder.finishSizePrefixed(offset); } -} - diff --git a/libnd4j/include/graph/generated/nd4j/graph/FlatProperties.py b/libnd4j/include/graph/generated/nd4j/graph/FlatProperties.py deleted file mode 100644 index e5a5ad5bf..000000000 --- a/libnd4j/include/graph/generated/nd4j/graph/FlatProperties.py +++ /dev/null @@ -1,203 +0,0 @@ -# /* ****************************************************************************** -# * -# * -# * This program and the accompanying materials are made available under the -# * terms of the Apache License, Version 2.0 which is available at -# * https://www.apache.org/licenses/LICENSE-2.0. -# * -# * See the NOTICE file distributed with this work for additional -# * information regarding copyright ownership. -# * Unless required by applicable law or agreed to in writing, software -# * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT -# * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the -# * License for the specific language governing permissions and limitations -# * under the License. 
-# * -# * SPDX-License-Identifier: Apache-2.0 -# ******************************************************************************/ - -import flatbuffers - -class FlatProperties(object): - __slots__ = ['_tab'] - - @classmethod - def GetRootAsFlatProperties(cls, buf, offset): - n = flatbuffers.encode.Get(flatbuffers.packer.uoffset, buf, offset) - x = FlatProperties() - x.Init(buf, n + offset) - return x - - # FlatProperties - def Init(self, buf, pos): - self._tab = flatbuffers.table.Table(buf, pos) - - # FlatProperties - def Name(self): - o = flatbuffers.number_types.UOffsetTFlags.py_type(self._tab.Offset(4)) - if o != 0: - return self._tab.String(o + self._tab.Pos) - return None - - # FlatProperties - def I(self, j): - o = flatbuffers.number_types.UOffsetTFlags.py_type(self._tab.Offset(6)) - if o != 0: - a = self._tab.Vector(o) - return self._tab.Get(flatbuffers.number_types.Int32Flags, a + flatbuffers.number_types.UOffsetTFlags.py_type(j * 4)) - return 0 - - # FlatProperties - def IAsNumpy(self): - o = flatbuffers.number_types.UOffsetTFlags.py_type(self._tab.Offset(6)) - if o != 0: - return self._tab.GetVectorAsNumpy(flatbuffers.number_types.Int32Flags, o) - return 0 - - # FlatProperties - def ILength(self): - o = flatbuffers.number_types.UOffsetTFlags.py_type(self._tab.Offset(6)) - if o != 0: - return self._tab.VectorLen(o) - return 0 - - # FlatProperties - def L(self, j): - o = flatbuffers.number_types.UOffsetTFlags.py_type(self._tab.Offset(8)) - if o != 0: - a = self._tab.Vector(o) - return self._tab.Get(flatbuffers.number_types.Int64Flags, a + flatbuffers.number_types.UOffsetTFlags.py_type(j * 8)) - return 0 - - # FlatProperties - def LAsNumpy(self): - o = flatbuffers.number_types.UOffsetTFlags.py_type(self._tab.Offset(8)) - if o != 0: - return self._tab.GetVectorAsNumpy(flatbuffers.number_types.Int64Flags, o) - return 0 - - # FlatProperties - def LLength(self): - o = flatbuffers.number_types.UOffsetTFlags.py_type(self._tab.Offset(8)) - if o != 0: - return 
self._tab.VectorLen(o) - return 0 - - # FlatProperties - def D(self, j): - o = flatbuffers.number_types.UOffsetTFlags.py_type(self._tab.Offset(10)) - if o != 0: - a = self._tab.Vector(o) - return self._tab.Get(flatbuffers.number_types.Float64Flags, a + flatbuffers.number_types.UOffsetTFlags.py_type(j * 8)) - return 0 - - # FlatProperties - def DAsNumpy(self): - o = flatbuffers.number_types.UOffsetTFlags.py_type(self._tab.Offset(10)) - if o != 0: - return self._tab.GetVectorAsNumpy(flatbuffers.number_types.Float64Flags, o) - return 0 - - # FlatProperties - def DLength(self): - o = flatbuffers.number_types.UOffsetTFlags.py_type(self._tab.Offset(10)) - if o != 0: - return self._tab.VectorLen(o) - return 0 - - # FlatProperties - def A(self, j): - o = flatbuffers.number_types.UOffsetTFlags.py_type(self._tab.Offset(12)) - if o != 0: - x = self._tab.Vector(o) - x += flatbuffers.number_types.UOffsetTFlags.py_type(j) * 4 - x = self._tab.Indirect(x) - from .FlatArray import FlatArray - obj = FlatArray() - obj.Init(self._tab.Bytes, x) - return obj - return None - - # FlatProperties - def ALength(self): - o = flatbuffers.number_types.UOffsetTFlags.py_type(self._tab.Offset(12)) - if o != 0: - return self._tab.VectorLen(o) - return 0 - - # FlatProperties - def B(self, j): - o = flatbuffers.number_types.UOffsetTFlags.py_type(self._tab.Offset(14)) - if o != 0: - a = self._tab.Vector(o) - return self._tab.Get(flatbuffers.number_types.BoolFlags, a + flatbuffers.number_types.UOffsetTFlags.py_type(j * 1)) - return 0 - - # FlatProperties - def BAsNumpy(self): - o = flatbuffers.number_types.UOffsetTFlags.py_type(self._tab.Offset(14)) - if o != 0: - return self._tab.GetVectorAsNumpy(flatbuffers.number_types.BoolFlags, o) - return 0 - - # FlatProperties - def BLength(self): - o = flatbuffers.number_types.UOffsetTFlags.py_type(self._tab.Offset(14)) - if o != 0: - return self._tab.VectorLen(o) - return 0 - - # FlatProperties - def S(self, j): - o = 
flatbuffers.number_types.UOffsetTFlags.py_type(self._tab.Offset(16)) - if o != 0: - a = self._tab.Vector(o) - return self._tab.String(a + flatbuffers.number_types.UOffsetTFlags.py_type(j * 4)) - return "" - - # FlatProperties - def SLength(self): - o = flatbuffers.number_types.UOffsetTFlags.py_type(self._tab.Offset(16)) - if o != 0: - return self._tab.VectorLen(o) - return 0 - - # FlatProperties - def Shape(self, j): - o = flatbuffers.number_types.UOffsetTFlags.py_type(self._tab.Offset(18)) - if o != 0: - a = self._tab.Vector(o) - return self._tab.Get(flatbuffers.number_types.Int32Flags, a + flatbuffers.number_types.UOffsetTFlags.py_type(j * 4)) - return 0 - - # FlatProperties - def ShapeAsNumpy(self): - o = flatbuffers.number_types.UOffsetTFlags.py_type(self._tab.Offset(18)) - if o != 0: - return self._tab.GetVectorAsNumpy(flatbuffers.number_types.Int32Flags, o) - return 0 - - # FlatProperties - def ShapeLength(self): - o = flatbuffers.number_types.UOffsetTFlags.py_type(self._tab.Offset(18)) - if o != 0: - return self._tab.VectorLen(o) - return 0 - -def FlatPropertiesStart(builder): builder.StartObject(8) -def FlatPropertiesAddName(builder, name): builder.PrependUOffsetTRelativeSlot(0, flatbuffers.number_types.UOffsetTFlags.py_type(name), 0) -def FlatPropertiesAddI(builder, i): builder.PrependUOffsetTRelativeSlot(1, flatbuffers.number_types.UOffsetTFlags.py_type(i), 0) -def FlatPropertiesStartIVector(builder, numElems): return builder.StartVector(4, numElems, 4) -def FlatPropertiesAddL(builder, l): builder.PrependUOffsetTRelativeSlot(2, flatbuffers.number_types.UOffsetTFlags.py_type(l), 0) -def FlatPropertiesStartLVector(builder, numElems): return builder.StartVector(8, numElems, 8) -def FlatPropertiesAddD(builder, d): builder.PrependUOffsetTRelativeSlot(3, flatbuffers.number_types.UOffsetTFlags.py_type(d), 0) -def FlatPropertiesStartDVector(builder, numElems): return builder.StartVector(8, numElems, 8) -def FlatPropertiesAddA(builder, a): 
builder.PrependUOffsetTRelativeSlot(4, flatbuffers.number_types.UOffsetTFlags.py_type(a), 0) -def FlatPropertiesStartAVector(builder, numElems): return builder.StartVector(4, numElems, 4) -def FlatPropertiesAddB(builder, b): builder.PrependUOffsetTRelativeSlot(5, flatbuffers.number_types.UOffsetTFlags.py_type(b), 0) -def FlatPropertiesStartBVector(builder, numElems): return builder.StartVector(1, numElems, 1) -def FlatPropertiesAddS(builder, s): builder.PrependUOffsetTRelativeSlot(6, flatbuffers.number_types.UOffsetTFlags.py_type(s), 0) -def FlatPropertiesStartSVector(builder, numElems): return builder.StartVector(4, numElems, 4) -def FlatPropertiesAddShape(builder, shape): builder.PrependUOffsetTRelativeSlot(7, flatbuffers.number_types.UOffsetTFlags.py_type(shape), 0) -def FlatPropertiesStartShapeVector(builder, numElems): return builder.StartVector(4, numElems, 4) -def FlatPropertiesEnd(builder): return builder.EndObject() diff --git a/libnd4j/include/graph/generated/nd4j/graph/FlatResponse.cs b/libnd4j/include/graph/generated/nd4j/graph/FlatResponse.cs deleted file mode 100644 index 2acd7728d..000000000 --- a/libnd4j/include/graph/generated/nd4j/graph/FlatResponse.cs +++ /dev/null @@ -1,38 +0,0 @@ -// -// automatically generated by the FlatBuffers compiler, do not modify -// - -namespace sd.graph -{ - -using global::System; -using global::FlatBuffers; - -public struct FlatResponse : IFlatbufferObject -{ - private Table __p; - public ByteBuffer ByteBuffer { get { return __p.bb; } } - public static FlatResponse GetRootAsFlatResponse(ByteBuffer _bb) { return GetRootAsFlatResponse(_bb, new FlatResponse()); } - public static FlatResponse GetRootAsFlatResponse(ByteBuffer _bb, FlatResponse obj) { return (obj.__assign(_bb.GetInt(_bb.Position) + _bb.Position, _bb)); } - public void __init(int _i, ByteBuffer _bb) { __p.bb_pos = _i; __p.bb = _bb; } - public FlatResponse __assign(int _i, ByteBuffer _bb) { __init(_i, _bb); return this; } - - public int Status { get { int o = 
__p.__offset(4); return o != 0 ? __p.bb.GetInt(o + __p.bb_pos) : (int)0; } } - - public static Offset CreateFlatResponse(FlatBufferBuilder builder, - int status = 0) { - builder.StartObject(1); - FlatResponse.AddStatus(builder, status); - return FlatResponse.EndFlatResponse(builder); - } - - public static void StartFlatResponse(FlatBufferBuilder builder) { builder.StartObject(1); } - public static void AddStatus(FlatBufferBuilder builder, int status) { builder.AddInt(0, status, 0); } - public static Offset EndFlatResponse(FlatBufferBuilder builder) { - int o = builder.EndObject(); - return new Offset(o); - } -}; - - -} diff --git a/libnd4j/include/graph/generated/nd4j/graph/FlatResponse.java b/libnd4j/include/graph/generated/nd4j/graph/FlatResponse.java deleted file mode 100644 index 2fed88d5a..000000000 --- a/libnd4j/include/graph/generated/nd4j/graph/FlatResponse.java +++ /dev/null @@ -1,33 +0,0 @@ -// automatically generated by the FlatBuffers compiler, do not modify - -package nd4j.graph; - -import java.nio.*; -import java.lang.*; -import java.util.*; -import com.google.flatbuffers.*; - -@SuppressWarnings("unused") -public final class FlatResponse extends Table { - public static FlatResponse getRootAsFlatResponse(ByteBuffer _bb) { return getRootAsFlatResponse(_bb, new FlatResponse()); } - public static FlatResponse getRootAsFlatResponse(ByteBuffer _bb, FlatResponse obj) { _bb.order(ByteOrder.LITTLE_ENDIAN); return (obj.__assign(_bb.getInt(_bb.position()) + _bb.position(), _bb)); } - public void __init(int _i, ByteBuffer _bb) { bb_pos = _i; bb = _bb; } - public FlatResponse __assign(int _i, ByteBuffer _bb) { __init(_i, _bb); return this; } - - public int status() { int o = __offset(4); return o != 0 ? 
bb.getInt(o + bb_pos) : 0; } - - public static int createFlatResponse(FlatBufferBuilder builder, - int status) { - builder.startObject(1); - FlatResponse.addStatus(builder, status); - return FlatResponse.endFlatResponse(builder); - } - - public static void startFlatResponse(FlatBufferBuilder builder) { builder.startObject(1); } - public static void addStatus(FlatBufferBuilder builder, int status) { builder.addInt(0, status, 0); } - public static int endFlatResponse(FlatBufferBuilder builder) { - int o = builder.endObject(); - return o; - } -} - diff --git a/libnd4j/include/graph/generated/nd4j/graph/FlatResponse.py b/libnd4j/include/graph/generated/nd4j/graph/FlatResponse.py deleted file mode 100644 index a9df9b4ee..000000000 --- a/libnd4j/include/graph/generated/nd4j/graph/FlatResponse.py +++ /dev/null @@ -1,44 +0,0 @@ -# /* ****************************************************************************** -# * -# * -# * This program and the accompanying materials are made available under the -# * terms of the Apache License, Version 2.0 which is available at -# * https://www.apache.org/licenses/LICENSE-2.0. -# * -# * See the NOTICE file distributed with this work for additional -# * information regarding copyright ownership. -# * Unless required by applicable law or agreed to in writing, software -# * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT -# * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the -# * License for the specific language governing permissions and limitations -# * under the License. 
-# * -# * SPDX-License-Identifier: Apache-2.0 -# ******************************************************************************/ - -import flatbuffers - -class FlatResponse(object): - __slots__ = ['_tab'] - - @classmethod - def GetRootAsFlatResponse(cls, buf, offset): - n = flatbuffers.encode.Get(flatbuffers.packer.uoffset, buf, offset) - x = FlatResponse() - x.Init(buf, n + offset) - return x - - # FlatResponse - def Init(self, buf, pos): - self._tab = flatbuffers.table.Table(buf, pos) - - # FlatResponse - def Status(self): - o = flatbuffers.number_types.UOffsetTFlags.py_type(self._tab.Offset(4)) - if o != 0: - return self._tab.Get(flatbuffers.number_types.Int32Flags, o + self._tab.Pos) - return 0 - -def FlatResponseStart(builder): builder.StartObject(1) -def FlatResponseAddStatus(builder, status): builder.PrependInt32Slot(0, status, 0) -def FlatResponseEnd(builder): return builder.EndObject() diff --git a/libnd4j/include/graph/generated/nd4j/graph/FlatResult.cs b/libnd4j/include/graph/generated/nd4j/graph/FlatResult.cs deleted file mode 100644 index 20147a93e..000000000 --- a/libnd4j/include/graph/generated/nd4j/graph/FlatResult.cs +++ /dev/null @@ -1,64 +0,0 @@ -// -// automatically generated by the FlatBuffers compiler, do not modify -// - -namespace sd.graph -{ - -using global::System; -using global::FlatBuffers; - -public struct FlatResult : IFlatbufferObject -{ - private Table __p; - public ByteBuffer ByteBuffer { get { return __p.bb; } } - public static FlatResult GetRootAsFlatResult(ByteBuffer _bb) { return GetRootAsFlatResult(_bb, new FlatResult()); } - public static FlatResult GetRootAsFlatResult(ByteBuffer _bb, FlatResult obj) { return (obj.__assign(_bb.GetInt(_bb.Position) + _bb.Position, _bb)); } - public void __init(int _i, ByteBuffer _bb) { __p.bb_pos = _i; __p.bb = _bb; } - public FlatResult __assign(int _i, ByteBuffer _bb) { __init(_i, _bb); return this; } - - public long Id { get { int o = __p.__offset(4); return o != 0 ? 
__p.bb.GetLong(o + __p.bb_pos) : (long)0; } } - public FlatVariable? Variables(int j) { int o = __p.__offset(6); return o != 0 ? (FlatVariable?)(new FlatVariable()).__assign(__p.__indirect(__p.__vector(o) + j * 4), __p.bb) : null; } - public int VariablesLength { get { int o = __p.__offset(6); return o != 0 ? __p.__vector_len(o) : 0; } } - public FlatTiming? Timing(int j) { int o = __p.__offset(8); return o != 0 ? (FlatTiming?)(new FlatTiming()).__assign(__p.__indirect(__p.__vector(o) + j * 4), __p.bb) : null; } - public int TimingLength { get { int o = __p.__offset(8); return o != 0 ? __p.__vector_len(o) : 0; } } - public long FootprintForward { get { int o = __p.__offset(10); return o != 0 ? __p.bb.GetLong(o + __p.bb_pos) : (long)0; } } - public long FootprintBackward { get { int o = __p.__offset(12); return o != 0 ? __p.bb.GetLong(o + __p.bb_pos) : (long)0; } } - - public static Offset CreateFlatResult(FlatBufferBuilder builder, - long id = 0, - VectorOffset variablesOffset = default(VectorOffset), - VectorOffset timingOffset = default(VectorOffset), - long footprintForward = 0, - long footprintBackward = 0) { - builder.StartObject(5); - FlatResult.AddFootprintBackward(builder, footprintBackward); - FlatResult.AddFootprintForward(builder, footprintForward); - FlatResult.AddId(builder, id); - FlatResult.AddTiming(builder, timingOffset); - FlatResult.AddVariables(builder, variablesOffset); - return FlatResult.EndFlatResult(builder); - } - - public static void StartFlatResult(FlatBufferBuilder builder) { builder.StartObject(5); } - public static void AddId(FlatBufferBuilder builder, long id) { builder.AddLong(0, id, 0); } - public static void AddVariables(FlatBufferBuilder builder, VectorOffset variablesOffset) { builder.AddOffset(1, variablesOffset.Value, 0); } - public static VectorOffset CreateVariablesVector(FlatBufferBuilder builder, Offset[] data) { builder.StartVector(4, data.Length, 4); for (int i = data.Length - 1; i >= 0; i--) 
builder.AddOffset(data[i].Value); return builder.EndVector(); } - public static VectorOffset CreateVariablesVectorBlock(FlatBufferBuilder builder, Offset[] data) { builder.StartVector(4, data.Length, 4); builder.Add(data); return builder.EndVector(); } - public static void StartVariablesVector(FlatBufferBuilder builder, int numElems) { builder.StartVector(4, numElems, 4); } - public static void AddTiming(FlatBufferBuilder builder, VectorOffset timingOffset) { builder.AddOffset(2, timingOffset.Value, 0); } - public static VectorOffset CreateTimingVector(FlatBufferBuilder builder, Offset[] data) { builder.StartVector(4, data.Length, 4); for (int i = data.Length - 1; i >= 0; i--) builder.AddOffset(data[i].Value); return builder.EndVector(); } - public static VectorOffset CreateTimingVectorBlock(FlatBufferBuilder builder, Offset[] data) { builder.StartVector(4, data.Length, 4); builder.Add(data); return builder.EndVector(); } - public static void StartTimingVector(FlatBufferBuilder builder, int numElems) { builder.StartVector(4, numElems, 4); } - public static void AddFootprintForward(FlatBufferBuilder builder, long footprintForward) { builder.AddLong(3, footprintForward, 0); } - public static void AddFootprintBackward(FlatBufferBuilder builder, long footprintBackward) { builder.AddLong(4, footprintBackward, 0); } - public static Offset EndFlatResult(FlatBufferBuilder builder) { - int o = builder.EndObject(); - return new Offset(o); - } - public static void FinishFlatResultBuffer(FlatBufferBuilder builder, Offset offset) { builder.Finish(offset.Value); } - public static void FinishSizePrefixedFlatResultBuffer(FlatBufferBuilder builder, Offset offset) { builder.FinishSizePrefixed(offset.Value); } -}; - - -} diff --git a/libnd4j/include/graph/generated/nd4j/graph/FlatResult.java b/libnd4j/include/graph/generated/nd4j/graph/FlatResult.java deleted file mode 100644 index 8424e3ad2..000000000 --- a/libnd4j/include/graph/generated/nd4j/graph/FlatResult.java +++ /dev/null @@ 
-1,59 +0,0 @@ -// automatically generated by the FlatBuffers compiler, do not modify - -package nd4j.graph; - -import java.nio.*; -import java.lang.*; -import java.util.*; -import com.google.flatbuffers.*; - -@SuppressWarnings("unused") -public final class FlatResult extends Table { - public static FlatResult getRootAsFlatResult(ByteBuffer _bb) { return getRootAsFlatResult(_bb, new FlatResult()); } - public static FlatResult getRootAsFlatResult(ByteBuffer _bb, FlatResult obj) { _bb.order(ByteOrder.LITTLE_ENDIAN); return (obj.__assign(_bb.getInt(_bb.position()) + _bb.position(), _bb)); } - public void __init(int _i, ByteBuffer _bb) { bb_pos = _i; bb = _bb; } - public FlatResult __assign(int _i, ByteBuffer _bb) { __init(_i, _bb); return this; } - - public long id() { int o = __offset(4); return o != 0 ? bb.getLong(o + bb_pos) : 0L; } - public FlatVariable variables(int j) { return variables(new FlatVariable(), j); } - public FlatVariable variables(FlatVariable obj, int j) { int o = __offset(6); return o != 0 ? obj.__assign(__indirect(__vector(o) + j * 4), bb) : null; } - public int variablesLength() { int o = __offset(6); return o != 0 ? __vector_len(o) : 0; } - public FlatTiming timing(int j) { return timing(new FlatTiming(), j); } - public FlatTiming timing(FlatTiming obj, int j) { int o = __offset(8); return o != 0 ? obj.__assign(__indirect(__vector(o) + j * 4), bb) : null; } - public int timingLength() { int o = __offset(8); return o != 0 ? __vector_len(o) : 0; } - public long footprintForward() { int o = __offset(10); return o != 0 ? bb.getLong(o + bb_pos) : 0L; } - public long footprintBackward() { int o = __offset(12); return o != 0 ? 
bb.getLong(o + bb_pos) : 0L; } - - public static int createFlatResult(FlatBufferBuilder builder, - long id, - int variablesOffset, - int timingOffset, - long footprintForward, - long footprintBackward) { - builder.startObject(5); - FlatResult.addFootprintBackward(builder, footprintBackward); - FlatResult.addFootprintForward(builder, footprintForward); - FlatResult.addId(builder, id); - FlatResult.addTiming(builder, timingOffset); - FlatResult.addVariables(builder, variablesOffset); - return FlatResult.endFlatResult(builder); - } - - public static void startFlatResult(FlatBufferBuilder builder) { builder.startObject(5); } - public static void addId(FlatBufferBuilder builder, long id) { builder.addLong(0, id, 0L); } - public static void addVariables(FlatBufferBuilder builder, int variablesOffset) { builder.addOffset(1, variablesOffset, 0); } - public static int createVariablesVector(FlatBufferBuilder builder, int[] data) { builder.startVector(4, data.length, 4); for (int i = data.length - 1; i >= 0; i--) builder.addOffset(data[i]); return builder.endVector(); } - public static void startVariablesVector(FlatBufferBuilder builder, int numElems) { builder.startVector(4, numElems, 4); } - public static void addTiming(FlatBufferBuilder builder, int timingOffset) { builder.addOffset(2, timingOffset, 0); } - public static int createTimingVector(FlatBufferBuilder builder, int[] data) { builder.startVector(4, data.length, 4); for (int i = data.length - 1; i >= 0; i--) builder.addOffset(data[i]); return builder.endVector(); } - public static void startTimingVector(FlatBufferBuilder builder, int numElems) { builder.startVector(4, numElems, 4); } - public static void addFootprintForward(FlatBufferBuilder builder, long footprintForward) { builder.addLong(3, footprintForward, 0L); } - public static void addFootprintBackward(FlatBufferBuilder builder, long footprintBackward) { builder.addLong(4, footprintBackward, 0L); } - public static int endFlatResult(FlatBufferBuilder builder) 
{ - int o = builder.endObject(); - return o; - } - public static void finishFlatResultBuffer(FlatBufferBuilder builder, int offset) { builder.finish(offset); } - public static void finishSizePrefixedFlatResultBuffer(FlatBufferBuilder builder, int offset) { builder.finishSizePrefixed(offset); } -} - diff --git a/libnd4j/include/graph/generated/nd4j/graph/FlatResult.py b/libnd4j/include/graph/generated/nd4j/graph/FlatResult.py deleted file mode 100644 index 48c43fe11..000000000 --- a/libnd4j/include/graph/generated/nd4j/graph/FlatResult.py +++ /dev/null @@ -1,104 +0,0 @@ -# /* ****************************************************************************** -# * -# * -# * This program and the accompanying materials are made available under the -# * terms of the Apache License, Version 2.0 which is available at -# * https://www.apache.org/licenses/LICENSE-2.0. -# * -# * See the NOTICE file distributed with this work for additional -# * information regarding copyright ownership. -# * Unless required by applicable law or agreed to in writing, software -# * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT -# * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the -# * License for the specific language governing permissions and limitations -# * under the License. 
-# * -# * SPDX-License-Identifier: Apache-2.0 -# ******************************************************************************/ - -import flatbuffers - -class FlatResult(object): - __slots__ = ['_tab'] - - @classmethod - def GetRootAsFlatResult(cls, buf, offset): - n = flatbuffers.encode.Get(flatbuffers.packer.uoffset, buf, offset) - x = FlatResult() - x.Init(buf, n + offset) - return x - - # FlatResult - def Init(self, buf, pos): - self._tab = flatbuffers.table.Table(buf, pos) - - # FlatResult - def Id(self): - o = flatbuffers.number_types.UOffsetTFlags.py_type(self._tab.Offset(4)) - if o != 0: - return self._tab.Get(flatbuffers.number_types.Int64Flags, o + self._tab.Pos) - return 0 - - # FlatResult - def Variables(self, j): - o = flatbuffers.number_types.UOffsetTFlags.py_type(self._tab.Offset(6)) - if o != 0: - x = self._tab.Vector(o) - x += flatbuffers.number_types.UOffsetTFlags.py_type(j) * 4 - x = self._tab.Indirect(x) - from .FlatVariable import FlatVariable - obj = FlatVariable() - obj.Init(self._tab.Bytes, x) - return obj - return None - - # FlatResult - def VariablesLength(self): - o = flatbuffers.number_types.UOffsetTFlags.py_type(self._tab.Offset(6)) - if o != 0: - return self._tab.VectorLen(o) - return 0 - - # FlatResult - def Timing(self, j): - o = flatbuffers.number_types.UOffsetTFlags.py_type(self._tab.Offset(8)) - if o != 0: - x = self._tab.Vector(o) - x += flatbuffers.number_types.UOffsetTFlags.py_type(j) * 4 - x = self._tab.Indirect(x) - from .FlatTiming import FlatTiming - obj = FlatTiming() - obj.Init(self._tab.Bytes, x) - return obj - return None - - # FlatResult - def TimingLength(self): - o = flatbuffers.number_types.UOffsetTFlags.py_type(self._tab.Offset(8)) - if o != 0: - return self._tab.VectorLen(o) - return 0 - - # FlatResult - def FootprintForward(self): - o = flatbuffers.number_types.UOffsetTFlags.py_type(self._tab.Offset(10)) - if o != 0: - return self._tab.Get(flatbuffers.number_types.Int64Flags, o + self._tab.Pos) - return 0 - - # 
FlatResult - def FootprintBackward(self): - o = flatbuffers.number_types.UOffsetTFlags.py_type(self._tab.Offset(12)) - if o != 0: - return self._tab.Get(flatbuffers.number_types.Int64Flags, o + self._tab.Pos) - return 0 - -def FlatResultStart(builder): builder.StartObject(5) -def FlatResultAddId(builder, id): builder.PrependInt64Slot(0, id, 0) -def FlatResultAddVariables(builder, variables): builder.PrependUOffsetTRelativeSlot(1, flatbuffers.number_types.UOffsetTFlags.py_type(variables), 0) -def FlatResultStartVariablesVector(builder, numElems): return builder.StartVector(4, numElems, 4) -def FlatResultAddTiming(builder, timing): builder.PrependUOffsetTRelativeSlot(2, flatbuffers.number_types.UOffsetTFlags.py_type(timing), 0) -def FlatResultStartTimingVector(builder, numElems): return builder.StartVector(4, numElems, 4) -def FlatResultAddFootprintForward(builder, footprintForward): builder.PrependInt64Slot(3, footprintForward, 0) -def FlatResultAddFootprintBackward(builder, footprintBackward): builder.PrependInt64Slot(4, footprintBackward, 0) -def FlatResultEnd(builder): return builder.EndObject() diff --git a/libnd4j/include/graph/generated/nd4j/graph/FlatTiming.cs b/libnd4j/include/graph/generated/nd4j/graph/FlatTiming.cs deleted file mode 100644 index 16aa8895c..000000000 --- a/libnd4j/include/graph/generated/nd4j/graph/FlatTiming.cs +++ /dev/null @@ -1,52 +0,0 @@ -// -// automatically generated by the FlatBuffers compiler, do not modify -// - -namespace sd.graph -{ - -using global::System; -using global::FlatBuffers; - -public struct FlatTiming : IFlatbufferObject -{ - private Table __p; - public ByteBuffer ByteBuffer { get { return __p.bb; } } - public static FlatTiming GetRootAsFlatTiming(ByteBuffer _bb) { return GetRootAsFlatTiming(_bb, new FlatTiming()); } - public static FlatTiming GetRootAsFlatTiming(ByteBuffer _bb, FlatTiming obj) { return (obj.__assign(_bb.GetInt(_bb.Position) + _bb.Position, _bb)); } - public void __init(int _i, ByteBuffer _bb) { 
__p.bb_pos = _i; __p.bb = _bb; } - public FlatTiming __assign(int _i, ByteBuffer _bb) { __init(_i, _bb); return this; } - - public int Id { get { int o = __p.__offset(4); return o != 0 ? __p.bb.GetInt(o + __p.bb_pos) : (int)0; } } - public string Name { get { int o = __p.__offset(6); return o != 0 ? __p.__string(o + __p.bb_pos) : null; } } -#if ENABLE_SPAN_T - public Span GetNameBytes() { return __p.__vector_as_span(6); } -#else - public ArraySegment? GetNameBytes() { return __p.__vector_as_arraysegment(6); } -#endif - public byte[] GetNameArray() { return __p.__vector_as_array(6); } - public LongPair? Timing { get { int o = __p.__offset(8); return o != 0 ? (LongPair?)(new LongPair()).__assign(__p.__indirect(o + __p.bb_pos), __p.bb) : null; } } - - public static Offset CreateFlatTiming(FlatBufferBuilder builder, - int id = 0, - StringOffset nameOffset = default(StringOffset), - Offset timingOffset = default(Offset)) { - builder.StartObject(3); - FlatTiming.AddTiming(builder, timingOffset); - FlatTiming.AddName(builder, nameOffset); - FlatTiming.AddId(builder, id); - return FlatTiming.EndFlatTiming(builder); - } - - public static void StartFlatTiming(FlatBufferBuilder builder) { builder.StartObject(3); } - public static void AddId(FlatBufferBuilder builder, int id) { builder.AddInt(0, id, 0); } - public static void AddName(FlatBufferBuilder builder, StringOffset nameOffset) { builder.AddOffset(1, nameOffset.Value, 0); } - public static void AddTiming(FlatBufferBuilder builder, Offset timingOffset) { builder.AddOffset(2, timingOffset.Value, 0); } - public static Offset EndFlatTiming(FlatBufferBuilder builder) { - int o = builder.EndObject(); - return new Offset(o); - } -}; - - -} diff --git a/libnd4j/include/graph/generated/nd4j/graph/FlatTiming.java b/libnd4j/include/graph/generated/nd4j/graph/FlatTiming.java deleted file mode 100644 index 926bf6811..000000000 --- a/libnd4j/include/graph/generated/nd4j/graph/FlatTiming.java +++ /dev/null @@ -1,44 +0,0 @@ -// 
automatically generated by the FlatBuffers compiler, do not modify - -package nd4j.graph; - -import java.nio.*; -import java.lang.*; -import java.util.*; -import com.google.flatbuffers.*; - -@SuppressWarnings("unused") -public final class FlatTiming extends Table { - public static FlatTiming getRootAsFlatTiming(ByteBuffer _bb) { return getRootAsFlatTiming(_bb, new FlatTiming()); } - public static FlatTiming getRootAsFlatTiming(ByteBuffer _bb, FlatTiming obj) { _bb.order(ByteOrder.LITTLE_ENDIAN); return (obj.__assign(_bb.getInt(_bb.position()) + _bb.position(), _bb)); } - public void __init(int _i, ByteBuffer _bb) { bb_pos = _i; bb = _bb; } - public FlatTiming __assign(int _i, ByteBuffer _bb) { __init(_i, _bb); return this; } - - public int id() { int o = __offset(4); return o != 0 ? bb.getInt(o + bb_pos) : 0; } - public String name() { int o = __offset(6); return o != 0 ? __string(o + bb_pos) : null; } - public ByteBuffer nameAsByteBuffer() { return __vector_as_bytebuffer(6, 1); } - public ByteBuffer nameInByteBuffer(ByteBuffer _bb) { return __vector_in_bytebuffer(_bb, 6, 1); } - public LongPair timing() { return timing(new LongPair()); } - public LongPair timing(LongPair obj) { int o = __offset(8); return o != 0 ? 
obj.__assign(__indirect(o + bb_pos), bb) : null; } - - public static int createFlatTiming(FlatBufferBuilder builder, - int id, - int nameOffset, - int timingOffset) { - builder.startObject(3); - FlatTiming.addTiming(builder, timingOffset); - FlatTiming.addName(builder, nameOffset); - FlatTiming.addId(builder, id); - return FlatTiming.endFlatTiming(builder); - } - - public static void startFlatTiming(FlatBufferBuilder builder) { builder.startObject(3); } - public static void addId(FlatBufferBuilder builder, int id) { builder.addInt(0, id, 0); } - public static void addName(FlatBufferBuilder builder, int nameOffset) { builder.addOffset(1, nameOffset, 0); } - public static void addTiming(FlatBufferBuilder builder, int timingOffset) { builder.addOffset(2, timingOffset, 0); } - public static int endFlatTiming(FlatBufferBuilder builder) { - int o = builder.endObject(); - return o; - } -} - diff --git a/libnd4j/include/graph/generated/nd4j/graph/FlatTiming.py b/libnd4j/include/graph/generated/nd4j/graph/FlatTiming.py deleted file mode 100644 index 3ecaa2380..000000000 --- a/libnd4j/include/graph/generated/nd4j/graph/FlatTiming.py +++ /dev/null @@ -1,64 +0,0 @@ -# /* ****************************************************************************** -# * -# * -# * This program and the accompanying materials are made available under the -# * terms of the Apache License, Version 2.0 which is available at -# * https://www.apache.org/licenses/LICENSE-2.0. -# * -# * See the NOTICE file distributed with this work for additional -# * information regarding copyright ownership. -# * Unless required by applicable law or agreed to in writing, software -# * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT -# * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the -# * License for the specific language governing permissions and limitations -# * under the License. 
-# * -# * SPDX-License-Identifier: Apache-2.0 -# ******************************************************************************/ - -import flatbuffers - -class FlatTiming(object): - __slots__ = ['_tab'] - - @classmethod - def GetRootAsFlatTiming(cls, buf, offset): - n = flatbuffers.encode.Get(flatbuffers.packer.uoffset, buf, offset) - x = FlatTiming() - x.Init(buf, n + offset) - return x - - # FlatTiming - def Init(self, buf, pos): - self._tab = flatbuffers.table.Table(buf, pos) - - # FlatTiming - def Id(self): - o = flatbuffers.number_types.UOffsetTFlags.py_type(self._tab.Offset(4)) - if o != 0: - return self._tab.Get(flatbuffers.number_types.Int32Flags, o + self._tab.Pos) - return 0 - - # FlatTiming - def Name(self): - o = flatbuffers.number_types.UOffsetTFlags.py_type(self._tab.Offset(6)) - if o != 0: - return self._tab.String(o + self._tab.Pos) - return None - - # FlatTiming - def Timing(self): - o = flatbuffers.number_types.UOffsetTFlags.py_type(self._tab.Offset(8)) - if o != 0: - x = self._tab.Indirect(o + self._tab.Pos) - from .LongPair import LongPair - obj = LongPair() - obj.Init(self._tab.Bytes, x) - return obj - return None - -def FlatTimingStart(builder): builder.StartObject(3) -def FlatTimingAddId(builder, id): builder.PrependInt32Slot(0, id, 0) -def FlatTimingAddName(builder, name): builder.PrependUOffsetTRelativeSlot(1, flatbuffers.number_types.UOffsetTFlags.py_type(name), 0) -def FlatTimingAddTiming(builder, timing): builder.PrependUOffsetTRelativeSlot(2, flatbuffers.number_types.UOffsetTFlags.py_type(timing), 0) -def FlatTimingEnd(builder): return builder.EndObject() diff --git a/libnd4j/include/graph/generated/nd4j/graph/FlatVariable.cs b/libnd4j/include/graph/generated/nd4j/graph/FlatVariable.cs deleted file mode 100644 index 9331965be..000000000 --- a/libnd4j/include/graph/generated/nd4j/graph/FlatVariable.cs +++ /dev/null @@ -1,104 +0,0 @@ -// -// automatically generated by the FlatBuffers compiler, do not modify -// - -namespace sd.graph -{ - 
-using global::System; -using global::FlatBuffers; - -public struct FlatVariable : IFlatbufferObject -{ - private Table __p; - public ByteBuffer ByteBuffer { get { return __p.bb; } } - public static FlatVariable GetRootAsFlatVariable(ByteBuffer _bb) { return GetRootAsFlatVariable(_bb, new FlatVariable()); } - public static FlatVariable GetRootAsFlatVariable(ByteBuffer _bb, FlatVariable obj) { return (obj.__assign(_bb.GetInt(_bb.Position) + _bb.Position, _bb)); } - public void __init(int _i, ByteBuffer _bb) { __p.bb_pos = _i; __p.bb = _bb; } - public FlatVariable __assign(int _i, ByteBuffer _bb) { __init(_i, _bb); return this; } - - public IntPair? Id { get { int o = __p.__offset(4); return o != 0 ? (IntPair?)(new IntPair()).__assign(__p.__indirect(o + __p.bb_pos), __p.bb) : null; } } - public string Name { get { int o = __p.__offset(6); return o != 0 ? __p.__string(o + __p.bb_pos) : null; } } -#if ENABLE_SPAN_T - public Span GetNameBytes() { return __p.__vector_as_span(6); } -#else - public ArraySegment? GetNameBytes() { return __p.__vector_as_arraysegment(6); } -#endif - public byte[] GetNameArray() { return __p.__vector_as_array(6); } - public DType Dtype { get { int o = __p.__offset(8); return o != 0 ? (DType)__p.bb.GetSbyte(o + __p.bb_pos) : DType.INHERIT; } } - public long Shape(int j) { int o = __p.__offset(10); return o != 0 ? __p.bb.GetLong(__p.__vector(o) + j * 8) : (long)0; } - public int ShapeLength { get { int o = __p.__offset(10); return o != 0 ? __p.__vector_len(o) : 0; } } -#if ENABLE_SPAN_T - public Span GetShapeBytes() { return __p.__vector_as_span(10); } -#else - public ArraySegment? GetShapeBytes() { return __p.__vector_as_arraysegment(10); } -#endif - public long[] GetShapeArray() { return __p.__vector_as_array(10); } - public FlatArray? Ndarray { get { int o = __p.__offset(12); return o != 0 ? 
(FlatArray?)(new FlatArray()).__assign(__p.__indirect(o + __p.bb_pos), __p.bb) : null; } } - public int Device { get { int o = __p.__offset(14); return o != 0 ? __p.bb.GetInt(o + __p.bb_pos) : (int)0; } } - public VarType Variabletype { get { int o = __p.__offset(16); return o != 0 ? (VarType)__p.bb.GetSbyte(o + __p.bb_pos) : VarType.VARIABLE; } } - public string ControlDeps(int j) { int o = __p.__offset(18); return o != 0 ? __p.__string(__p.__vector(o) + j * 4) : null; } - public int ControlDepsLength { get { int o = __p.__offset(18); return o != 0 ? __p.__vector_len(o) : 0; } } - public string ControlDepForOp(int j) { int o = __p.__offset(20); return o != 0 ? __p.__string(__p.__vector(o) + j * 4) : null; } - public int ControlDepForOpLength { get { int o = __p.__offset(20); return o != 0 ? __p.__vector_len(o) : 0; } } - public string ControlDepsForVar(int j) { int o = __p.__offset(22); return o != 0 ? __p.__string(__p.__vector(o) + j * 4) : null; } - public int ControlDepsForVarLength { get { int o = __p.__offset(22); return o != 0 ? 
__p.__vector_len(o) : 0; } } - - public static Offset CreateFlatVariable(FlatBufferBuilder builder, - Offset idOffset = default(Offset), - StringOffset nameOffset = default(StringOffset), - DType dtype = DType.INHERIT, - VectorOffset shapeOffset = default(VectorOffset), - Offset ndarrayOffset = default(Offset), - int device = 0, - VarType variabletype = VarType.VARIABLE, - VectorOffset controlDepsOffset = default(VectorOffset), - VectorOffset controlDepForOpOffset = default(VectorOffset), - VectorOffset controlDepsForVarOffset = default(VectorOffset)) { - builder.StartObject(10); - FlatVariable.AddControlDepsForVar(builder, controlDepsForVarOffset); - FlatVariable.AddControlDepForOp(builder, controlDepForOpOffset); - FlatVariable.AddControlDeps(builder, controlDepsOffset); - FlatVariable.AddDevice(builder, device); - FlatVariable.AddNdarray(builder, ndarrayOffset); - FlatVariable.AddShape(builder, shapeOffset); - FlatVariable.AddName(builder, nameOffset); - FlatVariable.AddId(builder, idOffset); - FlatVariable.AddVariabletype(builder, variabletype); - FlatVariable.AddDtype(builder, dtype); - return FlatVariable.EndFlatVariable(builder); - } - - public static void StartFlatVariable(FlatBufferBuilder builder) { builder.StartObject(10); } - public static void AddId(FlatBufferBuilder builder, Offset idOffset) { builder.AddOffset(0, idOffset.Value, 0); } - public static void AddName(FlatBufferBuilder builder, StringOffset nameOffset) { builder.AddOffset(1, nameOffset.Value, 0); } - public static void AddDtype(FlatBufferBuilder builder, DType dtype) { builder.AddSbyte(2, (sbyte)dtype, 0); } - public static void AddShape(FlatBufferBuilder builder, VectorOffset shapeOffset) { builder.AddOffset(3, shapeOffset.Value, 0); } - public static VectorOffset CreateShapeVector(FlatBufferBuilder builder, long[] data) { builder.StartVector(8, data.Length, 8); for (int i = data.Length - 1; i >= 0; i--) builder.AddLong(data[i]); return builder.EndVector(); } - public static VectorOffset 
CreateShapeVectorBlock(FlatBufferBuilder builder, long[] data) { builder.StartVector(8, data.Length, 8); builder.Add(data); return builder.EndVector(); } - public static void StartShapeVector(FlatBufferBuilder builder, int numElems) { builder.StartVector(8, numElems, 8); } - public static void AddNdarray(FlatBufferBuilder builder, Offset ndarrayOffset) { builder.AddOffset(4, ndarrayOffset.Value, 0); } - public static void AddDevice(FlatBufferBuilder builder, int device) { builder.AddInt(5, device, 0); } - public static void AddVariabletype(FlatBufferBuilder builder, VarType variabletype) { builder.AddSbyte(6, (sbyte)variabletype, 0); } - public static void AddControlDeps(FlatBufferBuilder builder, VectorOffset controlDepsOffset) { builder.AddOffset(7, controlDepsOffset.Value, 0); } - public static VectorOffset CreateControlDepsVector(FlatBufferBuilder builder, StringOffset[] data) { builder.StartVector(4, data.Length, 4); for (int i = data.Length - 1; i >= 0; i--) builder.AddOffset(data[i].Value); return builder.EndVector(); } - public static VectorOffset CreateControlDepsVectorBlock(FlatBufferBuilder builder, StringOffset[] data) { builder.StartVector(4, data.Length, 4); builder.Add(data); return builder.EndVector(); } - public static void StartControlDepsVector(FlatBufferBuilder builder, int numElems) { builder.StartVector(4, numElems, 4); } - public static void AddControlDepForOp(FlatBufferBuilder builder, VectorOffset controlDepForOpOffset) { builder.AddOffset(8, controlDepForOpOffset.Value, 0); } - public static VectorOffset CreateControlDepForOpVector(FlatBufferBuilder builder, StringOffset[] data) { builder.StartVector(4, data.Length, 4); for (int i = data.Length - 1; i >= 0; i--) builder.AddOffset(data[i].Value); return builder.EndVector(); } - public static VectorOffset CreateControlDepForOpVectorBlock(FlatBufferBuilder builder, StringOffset[] data) { builder.StartVector(4, data.Length, 4); builder.Add(data); return builder.EndVector(); } - public static 
void StartControlDepForOpVector(FlatBufferBuilder builder, int numElems) { builder.StartVector(4, numElems, 4); } - public static void AddControlDepsForVar(FlatBufferBuilder builder, VectorOffset controlDepsForVarOffset) { builder.AddOffset(9, controlDepsForVarOffset.Value, 0); } - public static VectorOffset CreateControlDepsForVarVector(FlatBufferBuilder builder, StringOffset[] data) { builder.StartVector(4, data.Length, 4); for (int i = data.Length - 1; i >= 0; i--) builder.AddOffset(data[i].Value); return builder.EndVector(); } - public static VectorOffset CreateControlDepsForVarVectorBlock(FlatBufferBuilder builder, StringOffset[] data) { builder.StartVector(4, data.Length, 4); builder.Add(data); return builder.EndVector(); } - public static void StartControlDepsForVarVector(FlatBufferBuilder builder, int numElems) { builder.StartVector(4, numElems, 4); } - public static Offset EndFlatVariable(FlatBufferBuilder builder) { - int o = builder.EndObject(); - return new Offset(o); - } - public static void FinishFlatVariableBuffer(FlatBufferBuilder builder, Offset offset) { builder.Finish(offset.Value); } - public static void FinishSizePrefixedFlatVariableBuffer(FlatBufferBuilder builder, Offset offset) { builder.FinishSizePrefixed(offset.Value); } -}; - - -} diff --git a/libnd4j/include/graph/generated/nd4j/graph/FlatVariable.java b/libnd4j/include/graph/generated/nd4j/graph/FlatVariable.java deleted file mode 100644 index d73c990bb..000000000 --- a/libnd4j/include/graph/generated/nd4j/graph/FlatVariable.java +++ /dev/null @@ -1,89 +0,0 @@ -// automatically generated by the FlatBuffers compiler, do not modify - -package nd4j.graph; - -import java.nio.*; -import java.lang.*; -import java.util.*; -import com.google.flatbuffers.*; - -@SuppressWarnings("unused") -public final class FlatVariable extends Table { - public static FlatVariable getRootAsFlatVariable(ByteBuffer _bb) { return getRootAsFlatVariable(_bb, new FlatVariable()); } - public static FlatVariable 
getRootAsFlatVariable(ByteBuffer _bb, FlatVariable obj) { _bb.order(ByteOrder.LITTLE_ENDIAN); return (obj.__assign(_bb.getInt(_bb.position()) + _bb.position(), _bb)); } - public void __init(int _i, ByteBuffer _bb) { bb_pos = _i; bb = _bb; } - public FlatVariable __assign(int _i, ByteBuffer _bb) { __init(_i, _bb); return this; } - - public IntPair id() { return id(new IntPair()); } - public IntPair id(IntPair obj) { int o = __offset(4); return o != 0 ? obj.__assign(__indirect(o + bb_pos), bb) : null; } - public String name() { int o = __offset(6); return o != 0 ? __string(o + bb_pos) : null; } - public ByteBuffer nameAsByteBuffer() { return __vector_as_bytebuffer(6, 1); } - public ByteBuffer nameInByteBuffer(ByteBuffer _bb) { return __vector_in_bytebuffer(_bb, 6, 1); } - public byte dtype() { int o = __offset(8); return o != 0 ? bb.get(o + bb_pos) : 0; } - public long shape(int j) { int o = __offset(10); return o != 0 ? bb.getLong(__vector(o) + j * 8) : 0; } - public int shapeLength() { int o = __offset(10); return o != 0 ? __vector_len(o) : 0; } - public ByteBuffer shapeAsByteBuffer() { return __vector_as_bytebuffer(10, 8); } - public ByteBuffer shapeInByteBuffer(ByteBuffer _bb) { return __vector_in_bytebuffer(_bb, 10, 8); } - public FlatArray ndarray() { return ndarray(new FlatArray()); } - public FlatArray ndarray(FlatArray obj) { int o = __offset(12); return o != 0 ? obj.__assign(__indirect(o + bb_pos), bb) : null; } - public int device() { int o = __offset(14); return o != 0 ? bb.getInt(o + bb_pos) : 0; } - public byte variabletype() { int o = __offset(16); return o != 0 ? bb.get(o + bb_pos) : 0; } - public String controlDeps(int j) { int o = __offset(18); return o != 0 ? __string(__vector(o) + j * 4) : null; } - public int controlDepsLength() { int o = __offset(18); return o != 0 ? __vector_len(o) : 0; } - public String controlDepForOp(int j) { int o = __offset(20); return o != 0 ? 
__string(__vector(o) + j * 4) : null; } - public int controlDepForOpLength() { int o = __offset(20); return o != 0 ? __vector_len(o) : 0; } - public String controlDepsForVar(int j) { int o = __offset(22); return o != 0 ? __string(__vector(o) + j * 4) : null; } - public int controlDepsForVarLength() { int o = __offset(22); return o != 0 ? __vector_len(o) : 0; } - - public static int createFlatVariable(FlatBufferBuilder builder, - int idOffset, - int nameOffset, - byte dtype, - int shapeOffset, - int ndarrayOffset, - int device, - byte variabletype, - int controlDepsOffset, - int controlDepForOpOffset, - int controlDepsForVarOffset) { - builder.startObject(10); - FlatVariable.addControlDepsForVar(builder, controlDepsForVarOffset); - FlatVariable.addControlDepForOp(builder, controlDepForOpOffset); - FlatVariable.addControlDeps(builder, controlDepsOffset); - FlatVariable.addDevice(builder, device); - FlatVariable.addNdarray(builder, ndarrayOffset); - FlatVariable.addShape(builder, shapeOffset); - FlatVariable.addName(builder, nameOffset); - FlatVariable.addId(builder, idOffset); - FlatVariable.addVariabletype(builder, variabletype); - FlatVariable.addDtype(builder, dtype); - return FlatVariable.endFlatVariable(builder); - } - - public static void startFlatVariable(FlatBufferBuilder builder) { builder.startObject(10); } - public static void addId(FlatBufferBuilder builder, int idOffset) { builder.addOffset(0, idOffset, 0); } - public static void addName(FlatBufferBuilder builder, int nameOffset) { builder.addOffset(1, nameOffset, 0); } - public static void addDtype(FlatBufferBuilder builder, byte dtype) { builder.addByte(2, dtype, 0); } - public static void addShape(FlatBufferBuilder builder, int shapeOffset) { builder.addOffset(3, shapeOffset, 0); } - public static int createShapeVector(FlatBufferBuilder builder, long[] data) { builder.startVector(8, data.length, 8); for (int i = data.length - 1; i >= 0; i--) builder.addLong(data[i]); return builder.endVector(); } - 
public static void startShapeVector(FlatBufferBuilder builder, int numElems) { builder.startVector(8, numElems, 8); } - public static void addNdarray(FlatBufferBuilder builder, int ndarrayOffset) { builder.addOffset(4, ndarrayOffset, 0); } - public static void addDevice(FlatBufferBuilder builder, int device) { builder.addInt(5, device, 0); } - public static void addVariabletype(FlatBufferBuilder builder, byte variabletype) { builder.addByte(6, variabletype, 0); } - public static void addControlDeps(FlatBufferBuilder builder, int controlDepsOffset) { builder.addOffset(7, controlDepsOffset, 0); } - public static int createControlDepsVector(FlatBufferBuilder builder, int[] data) { builder.startVector(4, data.length, 4); for (int i = data.length - 1; i >= 0; i--) builder.addOffset(data[i]); return builder.endVector(); } - public static void startControlDepsVector(FlatBufferBuilder builder, int numElems) { builder.startVector(4, numElems, 4); } - public static void addControlDepForOp(FlatBufferBuilder builder, int controlDepForOpOffset) { builder.addOffset(8, controlDepForOpOffset, 0); } - public static int createControlDepForOpVector(FlatBufferBuilder builder, int[] data) { builder.startVector(4, data.length, 4); for (int i = data.length - 1; i >= 0; i--) builder.addOffset(data[i]); return builder.endVector(); } - public static void startControlDepForOpVector(FlatBufferBuilder builder, int numElems) { builder.startVector(4, numElems, 4); } - public static void addControlDepsForVar(FlatBufferBuilder builder, int controlDepsForVarOffset) { builder.addOffset(9, controlDepsForVarOffset, 0); } - public static int createControlDepsForVarVector(FlatBufferBuilder builder, int[] data) { builder.startVector(4, data.length, 4); for (int i = data.length - 1; i >= 0; i--) builder.addOffset(data[i]); return builder.endVector(); } - public static void startControlDepsForVarVector(FlatBufferBuilder builder, int numElems) { builder.startVector(4, numElems, 4); } - public static int 
endFlatVariable(FlatBufferBuilder builder) { - int o = builder.endObject(); - return o; - } - public static void finishFlatVariableBuffer(FlatBufferBuilder builder, int offset) { builder.finish(offset); } - public static void finishSizePrefixedFlatVariableBuffer(FlatBufferBuilder builder, int offset) { builder.finishSizePrefixed(offset); } -} - diff --git a/libnd4j/include/graph/generated/nd4j/graph/FlatVariable.py b/libnd4j/include/graph/generated/nd4j/graph/FlatVariable.py deleted file mode 100644 index 2c2a42374..000000000 --- a/libnd4j/include/graph/generated/nd4j/graph/FlatVariable.py +++ /dev/null @@ -1,167 +0,0 @@ -# /* ****************************************************************************** -# * -# * -# * This program and the accompanying materials are made available under the -# * terms of the Apache License, Version 2.0 which is available at -# * https://www.apache.org/licenses/LICENSE-2.0. -# * -# * See the NOTICE file distributed with this work for additional -# * information regarding copyright ownership. -# * Unless required by applicable law or agreed to in writing, software -# * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT -# * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the -# * License for the specific language governing permissions and limitations -# * under the License. 
-# * -# * SPDX-License-Identifier: Apache-2.0 -# ******************************************************************************/ - -import flatbuffers - -class FlatVariable(object): - __slots__ = ['_tab'] - - @classmethod - def GetRootAsFlatVariable(cls, buf, offset): - n = flatbuffers.encode.Get(flatbuffers.packer.uoffset, buf, offset) - x = FlatVariable() - x.Init(buf, n + offset) - return x - - # FlatVariable - def Init(self, buf, pos): - self._tab = flatbuffers.table.Table(buf, pos) - - # FlatVariable - def Id(self): - o = flatbuffers.number_types.UOffsetTFlags.py_type(self._tab.Offset(4)) - if o != 0: - x = self._tab.Indirect(o + self._tab.Pos) - from .IntPair import IntPair - obj = IntPair() - obj.Init(self._tab.Bytes, x) - return obj - return None - - # FlatVariable - def Name(self): - o = flatbuffers.number_types.UOffsetTFlags.py_type(self._tab.Offset(6)) - if o != 0: - return self._tab.String(o + self._tab.Pos) - return None - - # FlatVariable - def Dtype(self): - o = flatbuffers.number_types.UOffsetTFlags.py_type(self._tab.Offset(8)) - if o != 0: - return self._tab.Get(flatbuffers.number_types.Int8Flags, o + self._tab.Pos) - return 0 - - # FlatVariable - def Shape(self, j): - o = flatbuffers.number_types.UOffsetTFlags.py_type(self._tab.Offset(10)) - if o != 0: - a = self._tab.Vector(o) - return self._tab.Get(flatbuffers.number_types.Int64Flags, a + flatbuffers.number_types.UOffsetTFlags.py_type(j * 8)) - return 0 - - # FlatVariable - def ShapeAsNumpy(self): - o = flatbuffers.number_types.UOffsetTFlags.py_type(self._tab.Offset(10)) - if o != 0: - return self._tab.GetVectorAsNumpy(flatbuffers.number_types.Int64Flags, o) - return 0 - - # FlatVariable - def ShapeLength(self): - o = flatbuffers.number_types.UOffsetTFlags.py_type(self._tab.Offset(10)) - if o != 0: - return self._tab.VectorLen(o) - return 0 - - # FlatVariable - def Ndarray(self): - o = flatbuffers.number_types.UOffsetTFlags.py_type(self._tab.Offset(12)) - if o != 0: - x = self._tab.Indirect(o + 
self._tab.Pos) - from .FlatArray import FlatArray - obj = FlatArray() - obj.Init(self._tab.Bytes, x) - return obj - return None - - # FlatVariable - def Device(self): - o = flatbuffers.number_types.UOffsetTFlags.py_type(self._tab.Offset(14)) - if o != 0: - return self._tab.Get(flatbuffers.number_types.Int32Flags, o + self._tab.Pos) - return 0 - - # FlatVariable - def Variabletype(self): - o = flatbuffers.number_types.UOffsetTFlags.py_type(self._tab.Offset(16)) - if o != 0: - return self._tab.Get(flatbuffers.number_types.Int8Flags, o + self._tab.Pos) - return 0 - - # FlatVariable - def ControlDeps(self, j): - o = flatbuffers.number_types.UOffsetTFlags.py_type(self._tab.Offset(18)) - if o != 0: - a = self._tab.Vector(o) - return self._tab.String(a + flatbuffers.number_types.UOffsetTFlags.py_type(j * 4)) - return "" - - # FlatVariable - def ControlDepsLength(self): - o = flatbuffers.number_types.UOffsetTFlags.py_type(self._tab.Offset(18)) - if o != 0: - return self._tab.VectorLen(o) - return 0 - - # FlatVariable - def ControlDepForOp(self, j): - o = flatbuffers.number_types.UOffsetTFlags.py_type(self._tab.Offset(20)) - if o != 0: - a = self._tab.Vector(o) - return self._tab.String(a + flatbuffers.number_types.UOffsetTFlags.py_type(j * 4)) - return "" - - # FlatVariable - def ControlDepForOpLength(self): - o = flatbuffers.number_types.UOffsetTFlags.py_type(self._tab.Offset(20)) - if o != 0: - return self._tab.VectorLen(o) - return 0 - - # FlatVariable - def ControlDepsForVar(self, j): - o = flatbuffers.number_types.UOffsetTFlags.py_type(self._tab.Offset(22)) - if o != 0: - a = self._tab.Vector(o) - return self._tab.String(a + flatbuffers.number_types.UOffsetTFlags.py_type(j * 4)) - return "" - - # FlatVariable - def ControlDepsForVarLength(self): - o = flatbuffers.number_types.UOffsetTFlags.py_type(self._tab.Offset(22)) - if o != 0: - return self._tab.VectorLen(o) - return 0 - -def FlatVariableStart(builder): builder.StartObject(10) -def FlatVariableAddId(builder, id): 
builder.PrependUOffsetTRelativeSlot(0, flatbuffers.number_types.UOffsetTFlags.py_type(id), 0) -def FlatVariableAddName(builder, name): builder.PrependUOffsetTRelativeSlot(1, flatbuffers.number_types.UOffsetTFlags.py_type(name), 0) -def FlatVariableAddDtype(builder, dtype): builder.PrependInt8Slot(2, dtype, 0) -def FlatVariableAddShape(builder, shape): builder.PrependUOffsetTRelativeSlot(3, flatbuffers.number_types.UOffsetTFlags.py_type(shape), 0) -def FlatVariableStartShapeVector(builder, numElems): return builder.StartVector(8, numElems, 8) -def FlatVariableAddNdarray(builder, ndarray): builder.PrependUOffsetTRelativeSlot(4, flatbuffers.number_types.UOffsetTFlags.py_type(ndarray), 0) -def FlatVariableAddDevice(builder, device): builder.PrependInt32Slot(5, device, 0) -def FlatVariableAddVariabletype(builder, variabletype): builder.PrependInt8Slot(6, variabletype, 0) -def FlatVariableAddControlDeps(builder, controlDeps): builder.PrependUOffsetTRelativeSlot(7, flatbuffers.number_types.UOffsetTFlags.py_type(controlDeps), 0) -def FlatVariableStartControlDepsVector(builder, numElems): return builder.StartVector(4, numElems, 4) -def FlatVariableAddControlDepForOp(builder, controlDepForOp): builder.PrependUOffsetTRelativeSlot(8, flatbuffers.number_types.UOffsetTFlags.py_type(controlDepForOp), 0) -def FlatVariableStartControlDepForOpVector(builder, numElems): return builder.StartVector(4, numElems, 4) -def FlatVariableAddControlDepsForVar(builder, controlDepsForVar): builder.PrependUOffsetTRelativeSlot(9, flatbuffers.number_types.UOffsetTFlags.py_type(controlDepsForVar), 0) -def FlatVariableStartControlDepsForVarVector(builder, numElems): return builder.StartVector(4, numElems, 4) -def FlatVariableEnd(builder): return builder.EndObject() diff --git a/libnd4j/include/graph/generated/nd4j/graph/FrameIteration.cs b/libnd4j/include/graph/generated/nd4j/graph/FrameIteration.cs deleted file mode 100644 index 6b73ff261..000000000 --- 
a/libnd4j/include/graph/generated/nd4j/graph/FrameIteration.cs +++ /dev/null @@ -1,48 +0,0 @@ -// -// automatically generated by the FlatBuffers compiler, do not modify -// - -namespace sd.graph -{ - -using global::System; -using global::FlatBuffers; - -public struct FrameIteration : IFlatbufferObject -{ - private Table __p; - public ByteBuffer ByteBuffer { get { return __p.bb; } } - public static FrameIteration GetRootAsFrameIteration(ByteBuffer _bb) { return GetRootAsFrameIteration(_bb, new FrameIteration()); } - public static FrameIteration GetRootAsFrameIteration(ByteBuffer _bb, FrameIteration obj) { return (obj.__assign(_bb.GetInt(_bb.Position) + _bb.Position, _bb)); } - public void __init(int _i, ByteBuffer _bb) { __p.bb_pos = _i; __p.bb = _bb; } - public FrameIteration __assign(int _i, ByteBuffer _bb) { __init(_i, _bb); return this; } - - public string Frame { get { int o = __p.__offset(4); return o != 0 ? __p.__string(o + __p.bb_pos) : null; } } -#if ENABLE_SPAN_T - public Span GetFrameBytes() { return __p.__vector_as_span(4); } -#else - public ArraySegment? GetFrameBytes() { return __p.__vector_as_arraysegment(4); } -#endif - public byte[] GetFrameArray() { return __p.__vector_as_array(4); } - public ushort Iteration { get { int o = __p.__offset(6); return o != 0 ? 
__p.bb.GetUshort(o + __p.bb_pos) : (ushort)0; } } - - public static Offset<FrameIteration> CreateFrameIteration(FlatBufferBuilder builder, - StringOffset frameOffset = default(StringOffset), - ushort iteration = 0) { - builder.StartObject(2); - FrameIteration.AddFrame(builder, frameOffset); - FrameIteration.AddIteration(builder, iteration); - return FrameIteration.EndFrameIteration(builder); - } - - public static void StartFrameIteration(FlatBufferBuilder builder) { builder.StartObject(2); } - public static void AddFrame(FlatBufferBuilder builder, StringOffset frameOffset) { builder.AddOffset(0, frameOffset.Value, 0); } - public static void AddIteration(FlatBufferBuilder builder, ushort iteration) { builder.AddUshort(1, iteration, 0); } - public static Offset<FrameIteration> EndFrameIteration(FlatBufferBuilder builder) { - int o = builder.EndObject(); - return new Offset<FrameIteration>(o); - } -}; - - -} diff --git a/libnd4j/include/graph/generated/nd4j/graph/FrameIteration.java b/libnd4j/include/graph/generated/nd4j/graph/FrameIteration.java deleted file mode 100644 index 58690c018..000000000 --- a/libnd4j/include/graph/generated/nd4j/graph/FrameIteration.java +++ /dev/null @@ -1,39 +0,0 @@ -// automatically generated by the FlatBuffers compiler, do not modify - -package nd4j.graph; - -import java.nio.*; -import java.lang.*; -import java.util.*; -import com.google.flatbuffers.*; - -@SuppressWarnings("unused") -public final class FrameIteration extends Table { - public static FrameIteration getRootAsFrameIteration(ByteBuffer _bb) { return getRootAsFrameIteration(_bb, new FrameIteration()); } - public static FrameIteration getRootAsFrameIteration(ByteBuffer _bb, FrameIteration obj) { _bb.order(ByteOrder.LITTLE_ENDIAN); return (obj.__assign(_bb.getInt(_bb.position()) + _bb.position(), _bb)); } - public void __init(int _i, ByteBuffer _bb) { bb_pos = _i; bb = _bb; } - public FrameIteration __assign(int _i, ByteBuffer _bb) { __init(_i, _bb); return this; } - - public String frame() { int o = __offset(4); return o != 
0 ? __string(o + bb_pos) : null; } - public ByteBuffer frameAsByteBuffer() { return __vector_as_bytebuffer(4, 1); } - public ByteBuffer frameInByteBuffer(ByteBuffer _bb) { return __vector_in_bytebuffer(_bb, 4, 1); } - public int iteration() { int o = __offset(6); return o != 0 ? bb.getShort(o + bb_pos) & 0xFFFF : 0; } - - public static int createFrameIteration(FlatBufferBuilder builder, - int frameOffset, - int iteration) { - builder.startObject(2); - FrameIteration.addFrame(builder, frameOffset); - FrameIteration.addIteration(builder, iteration); - return FrameIteration.endFrameIteration(builder); - } - - public static void startFrameIteration(FlatBufferBuilder builder) { builder.startObject(2); } - public static void addFrame(FlatBufferBuilder builder, int frameOffset) { builder.addOffset(0, frameOffset, 0); } - public static void addIteration(FlatBufferBuilder builder, int iteration) { builder.addShort(1, (short)iteration, (short)0); } - public static int endFrameIteration(FlatBufferBuilder builder) { - int o = builder.endObject(); - return o; - } -} - diff --git a/libnd4j/include/graph/generated/nd4j/graph/FrameIteration.py b/libnd4j/include/graph/generated/nd4j/graph/FrameIteration.py deleted file mode 100644 index 7c125181e..000000000 --- a/libnd4j/include/graph/generated/nd4j/graph/FrameIteration.py +++ /dev/null @@ -1,52 +0,0 @@ -# /* ****************************************************************************** -# * -# * -# * This program and the accompanying materials are made available under the -# * terms of the Apache License, Version 2.0 which is available at -# * https://www.apache.org/licenses/LICENSE-2.0. -# * -# * See the NOTICE file distributed with this work for additional -# * information regarding copyright ownership. -# * Unless required by applicable law or agreed to in writing, software -# * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT -# * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. 
See the -# * License for the specific language governing permissions and limitations -# * under the License. -# * -# * SPDX-License-Identifier: Apache-2.0 -# ******************************************************************************/ - -import flatbuffers - -class FrameIteration(object): - __slots__ = ['_tab'] - - @classmethod - def GetRootAsFrameIteration(cls, buf, offset): - n = flatbuffers.encode.Get(flatbuffers.packer.uoffset, buf, offset) - x = FrameIteration() - x.Init(buf, n + offset) - return x - - # FrameIteration - def Init(self, buf, pos): - self._tab = flatbuffers.table.Table(buf, pos) - - # FrameIteration - def Frame(self): - o = flatbuffers.number_types.UOffsetTFlags.py_type(self._tab.Offset(4)) - if o != 0: - return self._tab.String(o + self._tab.Pos) - return None - - # FrameIteration - def Iteration(self): - o = flatbuffers.number_types.UOffsetTFlags.py_type(self._tab.Offset(6)) - if o != 0: - return self._tab.Get(flatbuffers.number_types.Uint16Flags, o + self._tab.Pos) - return 0 - -def FrameIterationStart(builder): builder.StartObject(2) -def FrameIterationAddFrame(builder, frame): builder.PrependUOffsetTRelativeSlot(0, flatbuffers.number_types.UOffsetTFlags.py_type(frame), 0) -def FrameIterationAddIteration(builder, iteration): builder.PrependUint16Slot(1, iteration, 0) -def FrameIterationEnd(builder): return builder.EndObject() diff --git a/libnd4j/include/graph/generated/nd4j/graph/GraphInferenceServerGrpc.java b/libnd4j/include/graph/generated/nd4j/graph/GraphInferenceServerGrpc.java deleted file mode 100644 index cc9f1240d..000000000 --- a/libnd4j/include/graph/generated/nd4j/graph/GraphInferenceServerGrpc.java +++ /dev/null @@ -1,551 +0,0 @@ -//Generated by flatc compiler (version 1.10.0) -//If you make any local changes, they will be lost -//source: graph.fbs - -package nd4j.graph; - -import com.google.flatbuffers.grpc.FlatbuffersUtils; - -import java.nio.ByteBuffer; -import static io.grpc.MethodDescriptor.generateFullMethodName; 
-import static io.grpc.stub.ClientCalls.asyncBidiStreamingCall; -import static io.grpc.stub.ClientCalls.asyncClientStreamingCall; -import static io.grpc.stub.ClientCalls.asyncServerStreamingCall; -import static io.grpc.stub.ClientCalls.asyncUnaryCall; -import static io.grpc.stub.ClientCalls.blockingServerStreamingCall; -import static io.grpc.stub.ClientCalls.blockingUnaryCall; -import static io.grpc.stub.ClientCalls.futureUnaryCall; -import static io.grpc.stub.ServerCalls.asyncBidiStreamingCall; -import static io.grpc.stub.ServerCalls.asyncClientStreamingCall; -import static io.grpc.stub.ServerCalls.asyncServerStreamingCall; -import static io.grpc.stub.ServerCalls.asyncUnaryCall; -import static io.grpc.stub.ServerCalls.asyncUnimplementedStreamingCall; -import static io.grpc.stub.ServerCalls.asyncUnimplementedUnaryCall; - -/** - */ -@javax.annotation.Generated( - value = "by gRPC proto compiler", - comments = "Source: graph.fbs") -public final class GraphInferenceServerGrpc { - - private GraphInferenceServerGrpc() {} - - public static final String SERVICE_NAME = "nd4j.graph.GraphInferenceServer"; - - // Static method descriptors that strictly reflect the proto. - @io.grpc.ExperimentalApi("https://github.com/grpc/grpc-java/issues/1901") - @java.lang.Deprecated // Use {@link #getRegisterGraphMethod()} instead. 
- public static final io.grpc.MethodDescriptor METHOD_REGISTER_GRAPH = getRegisterGraphMethod(); - - private static volatile io.grpc.MethodDescriptor getRegisterGraphMethod; - - private static volatile FlatbuffersUtils.FBExtactor extractorOfFlatGraph; - private static FlatbuffersUtils.FBExtactor getExtractorOfFlatGraph() { - if (extractorOfFlatGraph != null) return extractorOfFlatGraph; - synchronized (GraphInferenceServerGrpc.class) { - if (extractorOfFlatGraph != null) return extractorOfFlatGraph; - extractorOfFlatGraph = new FlatbuffersUtils.FBExtactor() { - public nd4j.graph.FlatGraph extract (ByteBuffer buffer) { - return nd4j.graph.FlatGraph.getRootAsFlatGraph(buffer); - } - }; - return extractorOfFlatGraph; - } - } - - private static volatile FlatbuffersUtils.FBExtactor extractorOfFlatResponse; - private static FlatbuffersUtils.FBExtactor getExtractorOfFlatResponse() { - if (extractorOfFlatResponse != null) return extractorOfFlatResponse; - synchronized (GraphInferenceServerGrpc.class) { - if (extractorOfFlatResponse != null) return extractorOfFlatResponse; - extractorOfFlatResponse = new FlatbuffersUtils.FBExtactor() { - public nd4j.graph.FlatResponse extract (ByteBuffer buffer) { - return nd4j.graph.FlatResponse.getRootAsFlatResponse(buffer); - } - }; - return extractorOfFlatResponse; - } - } - - @io.grpc.ExperimentalApi("https://github.com/grpc/grpc-java/issues/1901") - public static io.grpc.MethodDescriptor getRegisterGraphMethod() { - io.grpc.MethodDescriptor getRegisterGraphMethod; - if ((getRegisterGraphMethod = GraphInferenceServerGrpc.getRegisterGraphMethod) == null) { - synchronized (GraphInferenceServerGrpc.class) { - if ((getRegisterGraphMethod = GraphInferenceServerGrpc.getRegisterGraphMethod) == null) { - GraphInferenceServerGrpc.getRegisterGraphMethod = getRegisterGraphMethod = - io.grpc.MethodDescriptor.newBuilder() - .setType(io.grpc.MethodDescriptor.MethodType.UNARY) - .setFullMethodName(generateFullMethodName( - 
"nd4j.graph.GraphInferenceServer", "RegisterGraph")) - .setSampledToLocalTracing(true) - .setRequestMarshaller(FlatbuffersUtils.marshaller( - nd4j.graph.FlatGraph.class, getExtractorOfFlatGraph())) - .setResponseMarshaller(FlatbuffersUtils.marshaller( - nd4j.graph.FlatResponse.class, getExtractorOfFlatResponse())) - .setSchemaDescriptor(null) - .build(); - } - } - } - return getRegisterGraphMethod; - } - - @io.grpc.ExperimentalApi("https://github.com/grpc/grpc-java/issues/1901") - @java.lang.Deprecated // Use {@link #getForgetGraphMethod()} instead. - public static final io.grpc.MethodDescriptor METHOD_FORGET_GRAPH = getForgetGraphMethod(); - - private static volatile io.grpc.MethodDescriptor getForgetGraphMethod; - - private static volatile FlatbuffersUtils.FBExtactor extractorOfFlatDropRequest; - private static FlatbuffersUtils.FBExtactor getExtractorOfFlatDropRequest() { - if (extractorOfFlatDropRequest != null) return extractorOfFlatDropRequest; - synchronized (GraphInferenceServerGrpc.class) { - if (extractorOfFlatDropRequest != null) return extractorOfFlatDropRequest; - extractorOfFlatDropRequest = new FlatbuffersUtils.FBExtactor() { - public nd4j.graph.FlatDropRequest extract (ByteBuffer buffer) { - return nd4j.graph.FlatDropRequest.getRootAsFlatDropRequest(buffer); - } - }; - return extractorOfFlatDropRequest; - } - } - - @io.grpc.ExperimentalApi("https://github.com/grpc/grpc-java/issues/1901") - public static io.grpc.MethodDescriptor getForgetGraphMethod() { - io.grpc.MethodDescriptor getForgetGraphMethod; - if ((getForgetGraphMethod = GraphInferenceServerGrpc.getForgetGraphMethod) == null) { - synchronized (GraphInferenceServerGrpc.class) { - if ((getForgetGraphMethod = GraphInferenceServerGrpc.getForgetGraphMethod) == null) { - GraphInferenceServerGrpc.getForgetGraphMethod = getForgetGraphMethod = - io.grpc.MethodDescriptor.newBuilder() - .setType(io.grpc.MethodDescriptor.MethodType.UNARY) - .setFullMethodName(generateFullMethodName( - 
"nd4j.graph.GraphInferenceServer", "ForgetGraph")) - .setSampledToLocalTracing(true) - .setRequestMarshaller(FlatbuffersUtils.marshaller( - nd4j.graph.FlatDropRequest.class, getExtractorOfFlatDropRequest())) - .setResponseMarshaller(FlatbuffersUtils.marshaller( - nd4j.graph.FlatResponse.class, getExtractorOfFlatResponse())) - .setSchemaDescriptor(null) - .build(); - } - } - } - return getForgetGraphMethod; - } - - @io.grpc.ExperimentalApi("https://github.com/grpc/grpc-java/issues/1901") - @java.lang.Deprecated // Use {@link #getReplaceGraphMethod()} instead. - public static final io.grpc.MethodDescriptor METHOD_REPLACE_GRAPH = getReplaceGraphMethod(); - - private static volatile io.grpc.MethodDescriptor getReplaceGraphMethod; - - @io.grpc.ExperimentalApi("https://github.com/grpc/grpc-java/issues/1901") - public static io.grpc.MethodDescriptor getReplaceGraphMethod() { - io.grpc.MethodDescriptor getReplaceGraphMethod; - if ((getReplaceGraphMethod = GraphInferenceServerGrpc.getReplaceGraphMethod) == null) { - synchronized (GraphInferenceServerGrpc.class) { - if ((getReplaceGraphMethod = GraphInferenceServerGrpc.getReplaceGraphMethod) == null) { - GraphInferenceServerGrpc.getReplaceGraphMethod = getReplaceGraphMethod = - io.grpc.MethodDescriptor.newBuilder() - .setType(io.grpc.MethodDescriptor.MethodType.UNARY) - .setFullMethodName(generateFullMethodName( - "nd4j.graph.GraphInferenceServer", "ReplaceGraph")) - .setSampledToLocalTracing(true) - .setRequestMarshaller(FlatbuffersUtils.marshaller( - nd4j.graph.FlatGraph.class, getExtractorOfFlatGraph())) - .setResponseMarshaller(FlatbuffersUtils.marshaller( - nd4j.graph.FlatResponse.class, getExtractorOfFlatResponse())) - .setSchemaDescriptor(null) - .build(); - } - } - } - return getReplaceGraphMethod; - } - - @io.grpc.ExperimentalApi("https://github.com/grpc/grpc-java/issues/1901") - @java.lang.Deprecated // Use {@link #getInferenceRequestMethod()} instead. 
- public static final io.grpc.MethodDescriptor METHOD_INFERENCE_REQUEST = getInferenceRequestMethod(); - - private static volatile io.grpc.MethodDescriptor getInferenceRequestMethod; - - private static volatile FlatbuffersUtils.FBExtactor extractorOfFlatInferenceRequest; - private static FlatbuffersUtils.FBExtactor getExtractorOfFlatInferenceRequest() { - if (extractorOfFlatInferenceRequest != null) return extractorOfFlatInferenceRequest; - synchronized (GraphInferenceServerGrpc.class) { - if (extractorOfFlatInferenceRequest != null) return extractorOfFlatInferenceRequest; - extractorOfFlatInferenceRequest = new FlatbuffersUtils.FBExtactor() { - public nd4j.graph.FlatInferenceRequest extract (ByteBuffer buffer) { - return nd4j.graph.FlatInferenceRequest.getRootAsFlatInferenceRequest(buffer); - } - }; - return extractorOfFlatInferenceRequest; - } - } - - private static volatile FlatbuffersUtils.FBExtactor extractorOfFlatResult; - private static FlatbuffersUtils.FBExtactor getExtractorOfFlatResult() { - if (extractorOfFlatResult != null) return extractorOfFlatResult; - synchronized (GraphInferenceServerGrpc.class) { - if (extractorOfFlatResult != null) return extractorOfFlatResult; - extractorOfFlatResult = new FlatbuffersUtils.FBExtactor() { - public nd4j.graph.FlatResult extract (ByteBuffer buffer) { - return nd4j.graph.FlatResult.getRootAsFlatResult(buffer); - } - }; - return extractorOfFlatResult; - } - } - - @io.grpc.ExperimentalApi("https://github.com/grpc/grpc-java/issues/1901") - public static io.grpc.MethodDescriptor getInferenceRequestMethod() { - io.grpc.MethodDescriptor getInferenceRequestMethod; - if ((getInferenceRequestMethod = GraphInferenceServerGrpc.getInferenceRequestMethod) == null) { - synchronized (GraphInferenceServerGrpc.class) { - if ((getInferenceRequestMethod = GraphInferenceServerGrpc.getInferenceRequestMethod) == null) { - GraphInferenceServerGrpc.getInferenceRequestMethod = getInferenceRequestMethod = - 
io.grpc.MethodDescriptor.newBuilder() - .setType(io.grpc.MethodDescriptor.MethodType.UNARY) - .setFullMethodName(generateFullMethodName( - "nd4j.graph.GraphInferenceServer", "InferenceRequest")) - .setSampledToLocalTracing(true) - .setRequestMarshaller(FlatbuffersUtils.marshaller( - nd4j.graph.FlatInferenceRequest.class, getExtractorOfFlatInferenceRequest())) - .setResponseMarshaller(FlatbuffersUtils.marshaller( - nd4j.graph.FlatResult.class, getExtractorOfFlatResult())) - .setSchemaDescriptor(null) - .build(); - } - } - } - return getInferenceRequestMethod; - } - - /** - * Creates a new async stub that supports all call types for the service - */ - public static GraphInferenceServerStub newStub(io.grpc.Channel channel) { - return new GraphInferenceServerStub(channel); - } - - /** - * Creates a new blocking-style stub that supports unary and streaming output calls on the service - */ - public static GraphInferenceServerBlockingStub newBlockingStub( - io.grpc.Channel channel) { - return new GraphInferenceServerBlockingStub(channel); - } - - /** - * Creates a new ListenableFuture-style stub that supports unary calls on the service - */ - public static GraphInferenceServerFutureStub newFutureStub( - io.grpc.Channel channel) { - return new GraphInferenceServerFutureStub(channel); - } - - /** - */ - public static abstract class GraphInferenceServerImplBase implements io.grpc.BindableService { - - /** - */ - public void registerGraph(nd4j.graph.FlatGraph request, - io.grpc.stub.StreamObserver responseObserver) { - asyncUnimplementedUnaryCall(getRegisterGraphMethod(), responseObserver); - } - - /** - */ - public void forgetGraph(nd4j.graph.FlatDropRequest request, - io.grpc.stub.StreamObserver responseObserver) { - asyncUnimplementedUnaryCall(getForgetGraphMethod(), responseObserver); - } - - /** - */ - public void replaceGraph(nd4j.graph.FlatGraph request, - io.grpc.stub.StreamObserver responseObserver) { - asyncUnimplementedUnaryCall(getReplaceGraphMethod(), 
responseObserver); - } - - /** - */ - public void inferenceRequest(nd4j.graph.FlatInferenceRequest request, - io.grpc.stub.StreamObserver responseObserver) { - asyncUnimplementedUnaryCall(getInferenceRequestMethod(), responseObserver); - } - - @java.lang.Override public final io.grpc.ServerServiceDefinition bindService() { - return io.grpc.ServerServiceDefinition.builder(getServiceDescriptor()) - .addMethod( - getRegisterGraphMethod(), - asyncUnaryCall( - new MethodHandlers< - nd4j.graph.FlatGraph, - nd4j.graph.FlatResponse>( - this, METHODID_REGISTER_GRAPH))) - .addMethod( - getForgetGraphMethod(), - asyncUnaryCall( - new MethodHandlers< - nd4j.graph.FlatDropRequest, - nd4j.graph.FlatResponse>( - this, METHODID_FORGET_GRAPH))) - .addMethod( - getReplaceGraphMethod(), - asyncUnaryCall( - new MethodHandlers< - nd4j.graph.FlatGraph, - nd4j.graph.FlatResponse>( - this, METHODID_REPLACE_GRAPH))) - .addMethod( - getInferenceRequestMethod(), - asyncUnaryCall( - new MethodHandlers< - nd4j.graph.FlatInferenceRequest, - nd4j.graph.FlatResult>( - this, METHODID_INFERENCE_REQUEST))) - .build(); - } - } - - /** - */ - public static final class GraphInferenceServerStub extends io.grpc.stub.AbstractStub { - private GraphInferenceServerStub(io.grpc.Channel channel) { - super(channel); - } - - private GraphInferenceServerStub(io.grpc.Channel channel, - io.grpc.CallOptions callOptions) { - super(channel, callOptions); - } - - @java.lang.Override - protected GraphInferenceServerStub build(io.grpc.Channel channel, - io.grpc.CallOptions callOptions) { - return new GraphInferenceServerStub(channel, callOptions); - } - - /** - */ - public void registerGraph(nd4j.graph.FlatGraph request, - io.grpc.stub.StreamObserver responseObserver) { - asyncUnaryCall( - getChannel().newCall(getRegisterGraphMethod(), getCallOptions()), request, responseObserver); - } - - /** - */ - public void forgetGraph(nd4j.graph.FlatDropRequest request, - io.grpc.stub.StreamObserver responseObserver) { - 
asyncUnaryCall( - getChannel().newCall(getForgetGraphMethod(), getCallOptions()), request, responseObserver); - } - - /** - */ - public void replaceGraph(nd4j.graph.FlatGraph request, - io.grpc.stub.StreamObserver responseObserver) { - asyncUnaryCall( - getChannel().newCall(getReplaceGraphMethod(), getCallOptions()), request, responseObserver); - } - - /** - */ - public void inferenceRequest(nd4j.graph.FlatInferenceRequest request, - io.grpc.stub.StreamObserver responseObserver) { - asyncUnaryCall( - getChannel().newCall(getInferenceRequestMethod(), getCallOptions()), request, responseObserver); - } - } - - /** - */ - public static final class GraphInferenceServerBlockingStub extends io.grpc.stub.AbstractStub { - private GraphInferenceServerBlockingStub(io.grpc.Channel channel) { - super(channel); - } - - private GraphInferenceServerBlockingStub(io.grpc.Channel channel, - io.grpc.CallOptions callOptions) { - super(channel, callOptions); - } - - @java.lang.Override - protected GraphInferenceServerBlockingStub build(io.grpc.Channel channel, - io.grpc.CallOptions callOptions) { - return new GraphInferenceServerBlockingStub(channel, callOptions); - } - - /** - */ - public nd4j.graph.FlatResponse registerGraph(nd4j.graph.FlatGraph request) { - return blockingUnaryCall( - getChannel(), getRegisterGraphMethod(), getCallOptions(), request); - } - - /** - */ - public nd4j.graph.FlatResponse forgetGraph(nd4j.graph.FlatDropRequest request) { - return blockingUnaryCall( - getChannel(), getForgetGraphMethod(), getCallOptions(), request); - } - - /** - */ - public nd4j.graph.FlatResponse replaceGraph(nd4j.graph.FlatGraph request) { - return blockingUnaryCall( - getChannel(), getReplaceGraphMethod(), getCallOptions(), request); - } - - /** - */ - public nd4j.graph.FlatResult inferenceRequest(nd4j.graph.FlatInferenceRequest request) { - return blockingUnaryCall( - getChannel(), getInferenceRequestMethod(), getCallOptions(), request); - } - } - - /** - */ - public static final 
class GraphInferenceServerFutureStub extends io.grpc.stub.AbstractStub { - private GraphInferenceServerFutureStub(io.grpc.Channel channel) { - super(channel); - } - - private GraphInferenceServerFutureStub(io.grpc.Channel channel, - io.grpc.CallOptions callOptions) { - super(channel, callOptions); - } - - @java.lang.Override - protected GraphInferenceServerFutureStub build(io.grpc.Channel channel, - io.grpc.CallOptions callOptions) { - return new GraphInferenceServerFutureStub(channel, callOptions); - } - - /** - */ - public com.google.common.util.concurrent.ListenableFuture registerGraph( - nd4j.graph.FlatGraph request) { - return futureUnaryCall( - getChannel().newCall(getRegisterGraphMethod(), getCallOptions()), request); - } - - /** - */ - public com.google.common.util.concurrent.ListenableFuture forgetGraph( - nd4j.graph.FlatDropRequest request) { - return futureUnaryCall( - getChannel().newCall(getForgetGraphMethod(), getCallOptions()), request); - } - - /** - */ - public com.google.common.util.concurrent.ListenableFuture replaceGraph( - nd4j.graph.FlatGraph request) { - return futureUnaryCall( - getChannel().newCall(getReplaceGraphMethod(), getCallOptions()), request); - } - - /** - */ - public com.google.common.util.concurrent.ListenableFuture inferenceRequest( - nd4j.graph.FlatInferenceRequest request) { - return futureUnaryCall( - getChannel().newCall(getInferenceRequestMethod(), getCallOptions()), request); - } - } - - private static final int METHODID_REGISTER_GRAPH = 0; - private static final int METHODID_FORGET_GRAPH = 1; - private static final int METHODID_REPLACE_GRAPH = 2; - private static final int METHODID_INFERENCE_REQUEST = 3; - - private static final class MethodHandlers implements - io.grpc.stub.ServerCalls.UnaryMethod, - io.grpc.stub.ServerCalls.ServerStreamingMethod, - io.grpc.stub.ServerCalls.ClientStreamingMethod, - io.grpc.stub.ServerCalls.BidiStreamingMethod { - private final GraphInferenceServerImplBase serviceImpl; - private final int 
methodId; - - MethodHandlers(GraphInferenceServerImplBase serviceImpl, int methodId) { - this.serviceImpl = serviceImpl; - this.methodId = methodId; - } - - @java.lang.Override - @java.lang.SuppressWarnings("unchecked") - public void invoke(Req request, io.grpc.stub.StreamObserver responseObserver) { - switch (methodId) { - case METHODID_REGISTER_GRAPH: - serviceImpl.registerGraph((nd4j.graph.FlatGraph) request, - (io.grpc.stub.StreamObserver) responseObserver); - break; - case METHODID_FORGET_GRAPH: - serviceImpl.forgetGraph((nd4j.graph.FlatDropRequest) request, - (io.grpc.stub.StreamObserver) responseObserver); - break; - case METHODID_REPLACE_GRAPH: - serviceImpl.replaceGraph((nd4j.graph.FlatGraph) request, - (io.grpc.stub.StreamObserver) responseObserver); - break; - case METHODID_INFERENCE_REQUEST: - serviceImpl.inferenceRequest((nd4j.graph.FlatInferenceRequest) request, - (io.grpc.stub.StreamObserver) responseObserver); - break; - default: - throw new AssertionError(); - } - } - - @java.lang.Override - @java.lang.SuppressWarnings("unchecked") - public io.grpc.stub.StreamObserver invoke( - io.grpc.stub.StreamObserver responseObserver) { - switch (methodId) { - default: - throw new AssertionError(); - } - } - } - - private static volatile io.grpc.ServiceDescriptor serviceDescriptor; - - public static io.grpc.ServiceDescriptor getServiceDescriptor() { - io.grpc.ServiceDescriptor result = serviceDescriptor; - if (result == null) { - synchronized (GraphInferenceServerGrpc.class) { - result = serviceDescriptor; - if (result == null) { - serviceDescriptor = result = io.grpc.ServiceDescriptor.newBuilder(SERVICE_NAME) - .setSchemaDescriptor(null) - .addMethod(getRegisterGraphMethod()) - .addMethod(getForgetGraphMethod()) - .addMethod(getReplaceGraphMethod()) - .addMethod(getInferenceRequestMethod()) - .build(); - } - } - } - return result; - } -} diff --git a/libnd4j/include/graph/generated/nd4j/graph/InputType.cs 
b/libnd4j/include/graph/generated/nd4j/graph/InputType.cs deleted file mode 100644 index 0172b846d..000000000 --- a/libnd4j/include/graph/generated/nd4j/graph/InputType.cs +++ /dev/null @@ -1,18 +0,0 @@ -// -// automatically generated by the FlatBuffers compiler, do not modify -// - -namespace sd.graph -{ - -public enum InputType : sbyte -{ - UNDEFINED = 0, - NUMERIC = 1, - STRINGULAR = 2, - NUMERIC_SET = 3, - STRINGULAR_SET = 4, -}; - - -} diff --git a/libnd4j/include/graph/generated/nd4j/graph/InputType.java b/libnd4j/include/graph/generated/nd4j/graph/InputType.java deleted file mode 100644 index 158d3b803..000000000 --- a/libnd4j/include/graph/generated/nd4j/graph/InputType.java +++ /dev/null @@ -1,17 +0,0 @@ -// automatically generated by the FlatBuffers compiler, do not modify - -package nd4j.graph; - -public final class InputType { - private InputType() { } - public static final byte UNDEFINED = 0; - public static final byte NUMERIC = 1; - public static final byte STRINGULAR = 2; - public static final byte NUMERIC_SET = 3; - public static final byte STRINGULAR_SET = 4; - - public static final String[] names = { "UNDEFINED", "NUMERIC", "STRINGULAR", "NUMERIC_SET", "STRINGULAR_SET", }; - - public static String name(int e) { return names[e]; } -} - diff --git a/libnd4j/include/graph/generated/nd4j/graph/InputType.py b/libnd4j/include/graph/generated/nd4j/graph/InputType.py deleted file mode 100644 index 0be254efe..000000000 --- a/libnd4j/include/graph/generated/nd4j/graph/InputType.py +++ /dev/null @@ -1,25 +0,0 @@ -# /* ****************************************************************************** -# * -# * -# * This program and the accompanying materials are made available under the -# * terms of the Apache License, Version 2.0 which is available at -# * https://www.apache.org/licenses/LICENSE-2.0. -# * -# * See the NOTICE file distributed with this work for additional -# * information regarding copyright ownership. 
-# * Unless required by applicable law or agreed to in writing, software -# * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT -# * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the -# * License for the specific language governing permissions and limitations -# * under the License. -# * -# * SPDX-License-Identifier: Apache-2.0 -# ******************************************************************************/ - -class InputType(object): - UNDEFINED = 0 - NUMERIC = 1 - STRINGULAR = 2 - NUMERIC_SET = 3 - STRINGULAR_SET = 4 - diff --git a/libnd4j/include/graph/generated/nd4j/graph/IntPair.cs b/libnd4j/include/graph/generated/nd4j/graph/IntPair.cs deleted file mode 100644 index 38dad82ed..000000000 --- a/libnd4j/include/graph/generated/nd4j/graph/IntPair.cs +++ /dev/null @@ -1,42 +0,0 @@ -// -// automatically generated by the FlatBuffers compiler, do not modify -// - -namespace sd.graph -{ - -using global::System; -using global::FlatBuffers; - -public struct IntPair : IFlatbufferObject -{ - private Table __p; - public ByteBuffer ByteBuffer { get { return __p.bb; } } - public static IntPair GetRootAsIntPair(ByteBuffer _bb) { return GetRootAsIntPair(_bb, new IntPair()); } - public static IntPair GetRootAsIntPair(ByteBuffer _bb, IntPair obj) { return (obj.__assign(_bb.GetInt(_bb.Position) + _bb.Position, _bb)); } - public void __init(int _i, ByteBuffer _bb) { __p.bb_pos = _i; __p.bb = _bb; } - public IntPair __assign(int _i, ByteBuffer _bb) { __init(_i, _bb); return this; } - - public int First { get { int o = __p.__offset(4); return o != 0 ? __p.bb.GetInt(o + __p.bb_pos) : (int)0; } } - public int Second { get { int o = __p.__offset(6); return o != 0 ? 
__p.bb.GetInt(o + __p.bb_pos) : (int)0; } } - - public static Offset<IntPair> CreateIntPair(FlatBufferBuilder builder, - int first = 0, - int second = 0) { - builder.StartObject(2); - IntPair.AddSecond(builder, second); - IntPair.AddFirst(builder, first); - return IntPair.EndIntPair(builder); - } - - public static void StartIntPair(FlatBufferBuilder builder) { builder.StartObject(2); } - public static void AddFirst(FlatBufferBuilder builder, int first) { builder.AddInt(0, first, 0); } - public static void AddSecond(FlatBufferBuilder builder, int second) { builder.AddInt(1, second, 0); } - public static Offset<IntPair> EndIntPair(FlatBufferBuilder builder) { - int o = builder.EndObject(); - return new Offset<IntPair>(o); - } -}; - - -} diff --git a/libnd4j/include/graph/generated/nd4j/graph/IntPair.java b/libnd4j/include/graph/generated/nd4j/graph/IntPair.java deleted file mode 100644 index c988143e6..000000000 --- a/libnd4j/include/graph/generated/nd4j/graph/IntPair.java +++ /dev/null @@ -1,37 +0,0 @@ -// automatically generated by the FlatBuffers compiler, do not modify - -package nd4j.graph; - -import java.nio.*; -import java.lang.*; -import java.util.*; -import com.google.flatbuffers.*; - -@SuppressWarnings("unused") -public final class IntPair extends Table { - public static IntPair getRootAsIntPair(ByteBuffer _bb) { return getRootAsIntPair(_bb, new IntPair()); } - public static IntPair getRootAsIntPair(ByteBuffer _bb, IntPair obj) { _bb.order(ByteOrder.LITTLE_ENDIAN); return (obj.__assign(_bb.getInt(_bb.position()) + _bb.position(), _bb)); } - public void __init(int _i, ByteBuffer _bb) { bb_pos = _i; bb = _bb; } - public IntPair __assign(int _i, ByteBuffer _bb) { __init(_i, _bb); return this; } - - public int first() { int o = __offset(4); return o != 0 ? bb.getInt(o + bb_pos) : 0; } - public int second() { int o = __offset(6); return o != 0 ? 
bb.getInt(o + bb_pos) : 0; } - - public static int createIntPair(FlatBufferBuilder builder, - int first, - int second) { - builder.startObject(2); - IntPair.addSecond(builder, second); - IntPair.addFirst(builder, first); - return IntPair.endIntPair(builder); - } - - public static void startIntPair(FlatBufferBuilder builder) { builder.startObject(2); } - public static void addFirst(FlatBufferBuilder builder, int first) { builder.addInt(0, first, 0); } - public static void addSecond(FlatBufferBuilder builder, int second) { builder.addInt(1, second, 0); } - public static int endIntPair(FlatBufferBuilder builder) { - int o = builder.endObject(); - return o; - } -} - diff --git a/libnd4j/include/graph/generated/nd4j/graph/IntPair.py b/libnd4j/include/graph/generated/nd4j/graph/IntPair.py deleted file mode 100644 index 0e402cfe5..000000000 --- a/libnd4j/include/graph/generated/nd4j/graph/IntPair.py +++ /dev/null @@ -1,52 +0,0 @@ -# /* ****************************************************************************** -# * -# * -# * This program and the accompanying materials are made available under the -# * terms of the Apache License, Version 2.0 which is available at -# * https://www.apache.org/licenses/LICENSE-2.0. -# * -# * See the NOTICE file distributed with this work for additional -# * information regarding copyright ownership. -# * Unless required by applicable law or agreed to in writing, software -# * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT -# * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the -# * License for the specific language governing permissions and limitations -# * under the License. 
-# * -# * SPDX-License-Identifier: Apache-2.0 -# ******************************************************************************/ - -import flatbuffers - -class IntPair(object): - __slots__ = ['_tab'] - - @classmethod - def GetRootAsIntPair(cls, buf, offset): - n = flatbuffers.encode.Get(flatbuffers.packer.uoffset, buf, offset) - x = IntPair() - x.Init(buf, n + offset) - return x - - # IntPair - def Init(self, buf, pos): - self._tab = flatbuffers.table.Table(buf, pos) - - # IntPair - def First(self): - o = flatbuffers.number_types.UOffsetTFlags.py_type(self._tab.Offset(4)) - if o != 0: - return self._tab.Get(flatbuffers.number_types.Int32Flags, o + self._tab.Pos) - return 0 - - # IntPair - def Second(self): - o = flatbuffers.number_types.UOffsetTFlags.py_type(self._tab.Offset(6)) - if o != 0: - return self._tab.Get(flatbuffers.number_types.Int32Flags, o + self._tab.Pos) - return 0 - -def IntPairStart(builder): builder.StartObject(2) -def IntPairAddFirst(builder, first): builder.PrependInt32Slot(0, first, 0) -def IntPairAddSecond(builder, second): builder.PrependInt32Slot(1, second, 0) -def IntPairEnd(builder): return builder.EndObject() diff --git a/libnd4j/include/graph/generated/nd4j/graph/IntTriple.cs b/libnd4j/include/graph/generated/nd4j/graph/IntTriple.cs deleted file mode 100644 index e1c5ac1a4..000000000 --- a/libnd4j/include/graph/generated/nd4j/graph/IntTriple.cs +++ /dev/null @@ -1,46 +0,0 @@ -// -// automatically generated by the FlatBuffers compiler, do not modify -// - -namespace sd.graph -{ - -using global::System; -using global::FlatBuffers; - -public struct IntTriple : IFlatbufferObject -{ - private Table __p; - public ByteBuffer ByteBuffer { get { return __p.bb; } } - public static IntTriple GetRootAsIntTriple(ByteBuffer _bb) { return GetRootAsIntTriple(_bb, new IntTriple()); } - public static IntTriple GetRootAsIntTriple(ByteBuffer _bb, IntTriple obj) { return (obj.__assign(_bb.GetInt(_bb.Position) + _bb.Position, _bb)); } - public void __init(int 
_i, ByteBuffer _bb) { __p.bb_pos = _i; __p.bb = _bb; } - public IntTriple __assign(int _i, ByteBuffer _bb) { __init(_i, _bb); return this; } - - public int First { get { int o = __p.__offset(4); return o != 0 ? __p.bb.GetInt(o + __p.bb_pos) : (int)0; } } - public int Second { get { int o = __p.__offset(6); return o != 0 ? __p.bb.GetInt(o + __p.bb_pos) : (int)0; } } - public int Third { get { int o = __p.__offset(8); return o != 0 ? __p.bb.GetInt(o + __p.bb_pos) : (int)0; } } - - public static Offset CreateIntTriple(FlatBufferBuilder builder, - int first = 0, - int second = 0, - int third = 0) { - builder.StartObject(3); - IntTriple.AddThird(builder, third); - IntTriple.AddSecond(builder, second); - IntTriple.AddFirst(builder, first); - return IntTriple.EndIntTriple(builder); - } - - public static void StartIntTriple(FlatBufferBuilder builder) { builder.StartObject(3); } - public static void AddFirst(FlatBufferBuilder builder, int first) { builder.AddInt(0, first, 0); } - public static void AddSecond(FlatBufferBuilder builder, int second) { builder.AddInt(1, second, 0); } - public static void AddThird(FlatBufferBuilder builder, int third) { builder.AddInt(2, third, 0); } - public static Offset EndIntTriple(FlatBufferBuilder builder) { - int o = builder.EndObject(); - return new Offset(o); - } -}; - - -} diff --git a/libnd4j/include/graph/generated/nd4j/graph/IntTriple.java b/libnd4j/include/graph/generated/nd4j/graph/IntTriple.java deleted file mode 100644 index 8bc8961c8..000000000 --- a/libnd4j/include/graph/generated/nd4j/graph/IntTriple.java +++ /dev/null @@ -1,41 +0,0 @@ -// automatically generated by the FlatBuffers compiler, do not modify - -package nd4j.graph; - -import java.nio.*; -import java.lang.*; -import java.util.*; -import com.google.flatbuffers.*; - -@SuppressWarnings("unused") -public final class IntTriple extends Table { - public static IntTriple getRootAsIntTriple(ByteBuffer _bb) { return getRootAsIntTriple(_bb, new IntTriple()); } - public 
static IntTriple getRootAsIntTriple(ByteBuffer _bb, IntTriple obj) { _bb.order(ByteOrder.LITTLE_ENDIAN); return (obj.__assign(_bb.getInt(_bb.position()) + _bb.position(), _bb)); } - public void __init(int _i, ByteBuffer _bb) { bb_pos = _i; bb = _bb; } - public IntTriple __assign(int _i, ByteBuffer _bb) { __init(_i, _bb); return this; } - - public int first() { int o = __offset(4); return o != 0 ? bb.getInt(o + bb_pos) : 0; } - public int second() { int o = __offset(6); return o != 0 ? bb.getInt(o + bb_pos) : 0; } - public int third() { int o = __offset(8); return o != 0 ? bb.getInt(o + bb_pos) : 0; } - - public static int createIntTriple(FlatBufferBuilder builder, - int first, - int second, - int third) { - builder.startObject(3); - IntTriple.addThird(builder, third); - IntTriple.addSecond(builder, second); - IntTriple.addFirst(builder, first); - return IntTriple.endIntTriple(builder); - } - - public static void startIntTriple(FlatBufferBuilder builder) { builder.startObject(3); } - public static void addFirst(FlatBufferBuilder builder, int first) { builder.addInt(0, first, 0); } - public static void addSecond(FlatBufferBuilder builder, int second) { builder.addInt(1, second, 0); } - public static void addThird(FlatBufferBuilder builder, int third) { builder.addInt(2, third, 0); } - public static int endIntTriple(FlatBufferBuilder builder) { - int o = builder.endObject(); - return o; - } -} - diff --git a/libnd4j/include/graph/generated/nd4j/graph/IntTriple.py b/libnd4j/include/graph/generated/nd4j/graph/IntTriple.py deleted file mode 100644 index 387164f95..000000000 --- a/libnd4j/include/graph/generated/nd4j/graph/IntTriple.py +++ /dev/null @@ -1,60 +0,0 @@ -# /* ****************************************************************************** -# * -# * -# * This program and the accompanying materials are made available under the -# * terms of the Apache License, Version 2.0 which is available at -# * https://www.apache.org/licenses/LICENSE-2.0. 
-# * -# * See the NOTICE file distributed with this work for additional -# * information regarding copyright ownership. -# * Unless required by applicable law or agreed to in writing, software -# * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT -# * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the -# * License for the specific language governing permissions and limitations -# * under the License. -# * -# * SPDX-License-Identifier: Apache-2.0 -# ******************************************************************************/ - -import flatbuffers - -class IntTriple(object): - __slots__ = ['_tab'] - - @classmethod - def GetRootAsIntTriple(cls, buf, offset): - n = flatbuffers.encode.Get(flatbuffers.packer.uoffset, buf, offset) - x = IntTriple() - x.Init(buf, n + offset) - return x - - # IntTriple - def Init(self, buf, pos): - self._tab = flatbuffers.table.Table(buf, pos) - - # IntTriple - def First(self): - o = flatbuffers.number_types.UOffsetTFlags.py_type(self._tab.Offset(4)) - if o != 0: - return self._tab.Get(flatbuffers.number_types.Int32Flags, o + self._tab.Pos) - return 0 - - # IntTriple - def Second(self): - o = flatbuffers.number_types.UOffsetTFlags.py_type(self._tab.Offset(6)) - if o != 0: - return self._tab.Get(flatbuffers.number_types.Int32Flags, o + self._tab.Pos) - return 0 - - # IntTriple - def Third(self): - o = flatbuffers.number_types.UOffsetTFlags.py_type(self._tab.Offset(8)) - if o != 0: - return self._tab.Get(flatbuffers.number_types.Int32Flags, o + self._tab.Pos) - return 0 - -def IntTripleStart(builder): builder.StartObject(3) -def IntTripleAddFirst(builder, first): builder.PrependInt32Slot(0, first, 0) -def IntTripleAddSecond(builder, second): builder.PrependInt32Slot(1, second, 0) -def IntTripleAddThird(builder, third): builder.PrependInt32Slot(2, third, 0) -def IntTripleEnd(builder): return builder.EndObject() diff --git a/libnd4j/include/graph/generated/nd4j/graph/LongPair.cs 
b/libnd4j/include/graph/generated/nd4j/graph/LongPair.cs deleted file mode 100644 index b28c9bd0b..000000000 --- a/libnd4j/include/graph/generated/nd4j/graph/LongPair.cs +++ /dev/null @@ -1,42 +0,0 @@ -// -// automatically generated by the FlatBuffers compiler, do not modify -// - -namespace sd.graph -{ - -using global::System; -using global::FlatBuffers; - -public struct LongPair : IFlatbufferObject -{ - private Table __p; - public ByteBuffer ByteBuffer { get { return __p.bb; } } - public static LongPair GetRootAsLongPair(ByteBuffer _bb) { return GetRootAsLongPair(_bb, new LongPair()); } - public static LongPair GetRootAsLongPair(ByteBuffer _bb, LongPair obj) { return (obj.__assign(_bb.GetInt(_bb.Position) + _bb.Position, _bb)); } - public void __init(int _i, ByteBuffer _bb) { __p.bb_pos = _i; __p.bb = _bb; } - public LongPair __assign(int _i, ByteBuffer _bb) { __init(_i, _bb); return this; } - - public long First { get { int o = __p.__offset(4); return o != 0 ? __p.bb.GetLong(o + __p.bb_pos) : (long)0; } } - public long Second { get { int o = __p.__offset(6); return o != 0 ? 
__p.bb.GetLong(o + __p.bb_pos) : (long)0; } } - - public static Offset CreateLongPair(FlatBufferBuilder builder, - long first = 0, - long second = 0) { - builder.StartObject(2); - LongPair.AddSecond(builder, second); - LongPair.AddFirst(builder, first); - return LongPair.EndLongPair(builder); - } - - public static void StartLongPair(FlatBufferBuilder builder) { builder.StartObject(2); } - public static void AddFirst(FlatBufferBuilder builder, long first) { builder.AddLong(0, first, 0); } - public static void AddSecond(FlatBufferBuilder builder, long second) { builder.AddLong(1, second, 0); } - public static Offset EndLongPair(FlatBufferBuilder builder) { - int o = builder.EndObject(); - return new Offset(o); - } -}; - - -} diff --git a/libnd4j/include/graph/generated/nd4j/graph/LongPair.java b/libnd4j/include/graph/generated/nd4j/graph/LongPair.java deleted file mode 100644 index e17c019f6..000000000 --- a/libnd4j/include/graph/generated/nd4j/graph/LongPair.java +++ /dev/null @@ -1,37 +0,0 @@ -// automatically generated by the FlatBuffers compiler, do not modify - -package nd4j.graph; - -import java.nio.*; -import java.lang.*; -import java.util.*; -import com.google.flatbuffers.*; - -@SuppressWarnings("unused") -public final class LongPair extends Table { - public static LongPair getRootAsLongPair(ByteBuffer _bb) { return getRootAsLongPair(_bb, new LongPair()); } - public static LongPair getRootAsLongPair(ByteBuffer _bb, LongPair obj) { _bb.order(ByteOrder.LITTLE_ENDIAN); return (obj.__assign(_bb.getInt(_bb.position()) + _bb.position(), _bb)); } - public void __init(int _i, ByteBuffer _bb) { bb_pos = _i; bb = _bb; } - public LongPair __assign(int _i, ByteBuffer _bb) { __init(_i, _bb); return this; } - - public long first() { int o = __offset(4); return o != 0 ? bb.getLong(o + bb_pos) : 0L; } - public long second() { int o = __offset(6); return o != 0 ? 
bb.getLong(o + bb_pos) : 0L; } - - public static int createLongPair(FlatBufferBuilder builder, - long first, - long second) { - builder.startObject(2); - LongPair.addSecond(builder, second); - LongPair.addFirst(builder, first); - return LongPair.endLongPair(builder); - } - - public static void startLongPair(FlatBufferBuilder builder) { builder.startObject(2); } - public static void addFirst(FlatBufferBuilder builder, long first) { builder.addLong(0, first, 0L); } - public static void addSecond(FlatBufferBuilder builder, long second) { builder.addLong(1, second, 0L); } - public static int endLongPair(FlatBufferBuilder builder) { - int o = builder.endObject(); - return o; - } -} - diff --git a/libnd4j/include/graph/generated/nd4j/graph/LongPair.py b/libnd4j/include/graph/generated/nd4j/graph/LongPair.py deleted file mode 100644 index 8163865d5..000000000 --- a/libnd4j/include/graph/generated/nd4j/graph/LongPair.py +++ /dev/null @@ -1,52 +0,0 @@ -# /* ****************************************************************************** -# * -# * -# * This program and the accompanying materials are made available under the -# * terms of the Apache License, Version 2.0 which is available at -# * https://www.apache.org/licenses/LICENSE-2.0. -# * -# * See the NOTICE file distributed with this work for additional -# * information regarding copyright ownership. -# * Unless required by applicable law or agreed to in writing, software -# * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT -# * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the -# * License for the specific language governing permissions and limitations -# * under the License. 
-# * -# * SPDX-License-Identifier: Apache-2.0 -# ******************************************************************************/ - -import flatbuffers - -class LongPair(object): - __slots__ = ['_tab'] - - @classmethod - def GetRootAsLongPair(cls, buf, offset): - n = flatbuffers.encode.Get(flatbuffers.packer.uoffset, buf, offset) - x = LongPair() - x.Init(buf, n + offset) - return x - - # LongPair - def Init(self, buf, pos): - self._tab = flatbuffers.table.Table(buf, pos) - - # LongPair - def First(self): - o = flatbuffers.number_types.UOffsetTFlags.py_type(self._tab.Offset(4)) - if o != 0: - return self._tab.Get(flatbuffers.number_types.Int64Flags, o + self._tab.Pos) - return 0 - - # LongPair - def Second(self): - o = flatbuffers.number_types.UOffsetTFlags.py_type(self._tab.Offset(6)) - if o != 0: - return self._tab.Get(flatbuffers.number_types.Int64Flags, o + self._tab.Pos) - return 0 - -def LongPairStart(builder): builder.StartObject(2) -def LongPairAddFirst(builder, first): builder.PrependInt64Slot(0, first, 0) -def LongPairAddSecond(builder, second): builder.PrependInt64Slot(1, second, 0) -def LongPairEnd(builder): return builder.EndObject() diff --git a/libnd4j/include/graph/generated/nd4j/graph/LongTriple.cs b/libnd4j/include/graph/generated/nd4j/graph/LongTriple.cs deleted file mode 100644 index dc49143c3..000000000 --- a/libnd4j/include/graph/generated/nd4j/graph/LongTriple.cs +++ /dev/null @@ -1,46 +0,0 @@ -// -// automatically generated by the FlatBuffers compiler, do not modify -// - -namespace sd.graph -{ - -using global::System; -using global::FlatBuffers; - -public struct LongTriple : IFlatbufferObject -{ - private Table __p; - public ByteBuffer ByteBuffer { get { return __p.bb; } } - public static LongTriple GetRootAsLongTriple(ByteBuffer _bb) { return GetRootAsLongTriple(_bb, new LongTriple()); } - public static LongTriple GetRootAsLongTriple(ByteBuffer _bb, LongTriple obj) { return (obj.__assign(_bb.GetInt(_bb.Position) + _bb.Position, _bb)); } - 
public void __init(int _i, ByteBuffer _bb) { __p.bb_pos = _i; __p.bb = _bb; } - public LongTriple __assign(int _i, ByteBuffer _bb) { __init(_i, _bb); return this; } - - public long First { get { int o = __p.__offset(4); return o != 0 ? __p.bb.GetLong(o + __p.bb_pos) : (long)0; } } - public long Second { get { int o = __p.__offset(6); return o != 0 ? __p.bb.GetLong(o + __p.bb_pos) : (long)0; } } - public long Third { get { int o = __p.__offset(8); return o != 0 ? __p.bb.GetLong(o + __p.bb_pos) : (long)0; } } - - public static Offset CreateLongTriple(FlatBufferBuilder builder, - long first = 0, - long second = 0, - long third = 0) { - builder.StartObject(3); - LongTriple.AddThird(builder, third); - LongTriple.AddSecond(builder, second); - LongTriple.AddFirst(builder, first); - return LongTriple.EndLongTriple(builder); - } - - public static void StartLongTriple(FlatBufferBuilder builder) { builder.StartObject(3); } - public static void AddFirst(FlatBufferBuilder builder, long first) { builder.AddLong(0, first, 0); } - public static void AddSecond(FlatBufferBuilder builder, long second) { builder.AddLong(1, second, 0); } - public static void AddThird(FlatBufferBuilder builder, long third) { builder.AddLong(2, third, 0); } - public static Offset EndLongTriple(FlatBufferBuilder builder) { - int o = builder.EndObject(); - return new Offset(o); - } -}; - - -} diff --git a/libnd4j/include/graph/generated/nd4j/graph/LongTriple.java b/libnd4j/include/graph/generated/nd4j/graph/LongTriple.java deleted file mode 100644 index c35e27f4c..000000000 --- a/libnd4j/include/graph/generated/nd4j/graph/LongTriple.java +++ /dev/null @@ -1,41 +0,0 @@ -// automatically generated by the FlatBuffers compiler, do not modify - -package nd4j.graph; - -import java.nio.*; -import java.lang.*; -import java.util.*; -import com.google.flatbuffers.*; - -@SuppressWarnings("unused") -public final class LongTriple extends Table { - public static LongTriple getRootAsLongTriple(ByteBuffer _bb) { return 
getRootAsLongTriple(_bb, new LongTriple()); } - public static LongTriple getRootAsLongTriple(ByteBuffer _bb, LongTriple obj) { _bb.order(ByteOrder.LITTLE_ENDIAN); return (obj.__assign(_bb.getInt(_bb.position()) + _bb.position(), _bb)); } - public void __init(int _i, ByteBuffer _bb) { bb_pos = _i; bb = _bb; } - public LongTriple __assign(int _i, ByteBuffer _bb) { __init(_i, _bb); return this; } - - public long first() { int o = __offset(4); return o != 0 ? bb.getLong(o + bb_pos) : 0L; } - public long second() { int o = __offset(6); return o != 0 ? bb.getLong(o + bb_pos) : 0L; } - public long third() { int o = __offset(8); return o != 0 ? bb.getLong(o + bb_pos) : 0L; } - - public static int createLongTriple(FlatBufferBuilder builder, - long first, - long second, - long third) { - builder.startObject(3); - LongTriple.addThird(builder, third); - LongTriple.addSecond(builder, second); - LongTriple.addFirst(builder, first); - return LongTriple.endLongTriple(builder); - } - - public static void startLongTriple(FlatBufferBuilder builder) { builder.startObject(3); } - public static void addFirst(FlatBufferBuilder builder, long first) { builder.addLong(0, first, 0L); } - public static void addSecond(FlatBufferBuilder builder, long second) { builder.addLong(1, second, 0L); } - public static void addThird(FlatBufferBuilder builder, long third) { builder.addLong(2, third, 0L); } - public static int endLongTriple(FlatBufferBuilder builder) { - int o = builder.endObject(); - return o; - } -} - diff --git a/libnd4j/include/graph/generated/nd4j/graph/LongTriple.py b/libnd4j/include/graph/generated/nd4j/graph/LongTriple.py deleted file mode 100644 index 18261a9cc..000000000 --- a/libnd4j/include/graph/generated/nd4j/graph/LongTriple.py +++ /dev/null @@ -1,60 +0,0 @@ -# /* ****************************************************************************** -# * -# * -# * This program and the accompanying materials are made available under the -# * terms of the Apache License, Version 2.0 
which is available at -# * https://www.apache.org/licenses/LICENSE-2.0. -# * -# * See the NOTICE file distributed with this work for additional -# * information regarding copyright ownership. -# * Unless required by applicable law or agreed to in writing, software -# * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT -# * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the -# * License for the specific language governing permissions and limitations -# * under the License. -# * -# * SPDX-License-Identifier: Apache-2.0 -# ******************************************************************************/ - -import flatbuffers - -class LongTriple(object): - __slots__ = ['_tab'] - - @classmethod - def GetRootAsLongTriple(cls, buf, offset): - n = flatbuffers.encode.Get(flatbuffers.packer.uoffset, buf, offset) - x = LongTriple() - x.Init(buf, n + offset) - return x - - # LongTriple - def Init(self, buf, pos): - self._tab = flatbuffers.table.Table(buf, pos) - - # LongTriple - def First(self): - o = flatbuffers.number_types.UOffsetTFlags.py_type(self._tab.Offset(4)) - if o != 0: - return self._tab.Get(flatbuffers.number_types.Int64Flags, o + self._tab.Pos) - return 0 - - # LongTriple - def Second(self): - o = flatbuffers.number_types.UOffsetTFlags.py_type(self._tab.Offset(6)) - if o != 0: - return self._tab.Get(flatbuffers.number_types.Int64Flags, o + self._tab.Pos) - return 0 - - # LongTriple - def Third(self): - o = flatbuffers.number_types.UOffsetTFlags.py_type(self._tab.Offset(8)) - if o != 0: - return self._tab.Get(flatbuffers.number_types.Int64Flags, o + self._tab.Pos) - return 0 - -def LongTripleStart(builder): builder.StartObject(3) -def LongTripleAddFirst(builder, first): builder.PrependInt64Slot(0, first, 0) -def LongTripleAddSecond(builder, second): builder.PrependInt64Slot(1, second, 0) -def LongTripleAddThird(builder, third): builder.PrependInt64Slot(2, third, 0) -def LongTripleEnd(builder): return builder.EndObject() 
diff --git a/libnd4j/include/graph/generated/nd4j/graph/OpClass.cs b/libnd4j/include/graph/generated/nd4j/graph/OpClass.cs deleted file mode 100644 index 45ac68a3a..000000000 --- a/libnd4j/include/graph/generated/nd4j/graph/OpClass.cs +++ /dev/null @@ -1,19 +0,0 @@ -// -// automatically generated by the FlatBuffers compiler, do not modify -// - -namespace sd.graph -{ - -public enum OpClass : sbyte -{ - TRANSFORM = 0, - REDUCTION = 1, - MULTIPLICATOR = 2, - GRAPH = 3, - CONDITIONAL = 4, - LOOP = 5, -}; - - -} diff --git a/libnd4j/include/graph/generated/nd4j/graph/OpClass.java b/libnd4j/include/graph/generated/nd4j/graph/OpClass.java deleted file mode 100644 index 996009041..000000000 --- a/libnd4j/include/graph/generated/nd4j/graph/OpClass.java +++ /dev/null @@ -1,18 +0,0 @@ -// automatically generated by the FlatBuffers compiler, do not modify - -package nd4j.graph; - -public final class OpClass { - private OpClass() { } - public static final byte TRANSFORM = 0; - public static final byte REDUCTION = 1; - public static final byte MULTIPLICATOR = 2; - public static final byte GRAPH = 3; - public static final byte CONDITIONAL = 4; - public static final byte LOOP = 5; - - public static final String[] names = { "TRANSFORM", "REDUCTION", "MULTIPLICATOR", "GRAPH", "CONDITIONAL", "LOOP", }; - - public static String name(int e) { return names[e]; } -} - diff --git a/libnd4j/include/graph/generated/nd4j/graph/OpClass.py b/libnd4j/include/graph/generated/nd4j/graph/OpClass.py deleted file mode 100644 index 7556a5209..000000000 --- a/libnd4j/include/graph/generated/nd4j/graph/OpClass.py +++ /dev/null @@ -1,26 +0,0 @@ -# /* ****************************************************************************** -# * -# * -# * This program and the accompanying materials are made available under the -# * terms of the Apache License, Version 2.0 which is available at -# * https://www.apache.org/licenses/LICENSE-2.0. 
-# * -# * See the NOTICE file distributed with this work for additional -# * information regarding copyright ownership. -# * Unless required by applicable law or agreed to in writing, software -# * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT -# * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the -# * License for the specific language governing permissions and limitations -# * under the License. -# * -# * SPDX-License-Identifier: Apache-2.0 -# ******************************************************************************/ - -class OpClass(object): - TRANSFORM = 0 - REDUCTION = 1 - MULTIPLICATOR = 2 - GRAPH = 3 - CONDITIONAL = 4 - LOOP = 5 - diff --git a/libnd4j/include/graph/generated/nd4j/graph/OpType.cs b/libnd4j/include/graph/generated/nd4j/graph/OpType.cs deleted file mode 100644 index 2b9d27bce..000000000 --- a/libnd4j/include/graph/generated/nd4j/graph/OpType.cs +++ /dev/null @@ -1,39 +0,0 @@ -// -// automatically generated by the FlatBuffers compiler, do not modify -// - -namespace sd.graph -{ - -public enum OpType : sbyte -{ - TRANSFORM_FLOAT = 0, - TRANSFORM_SAME = 1, - TRANSFORM_BOOL = 2, - TRANSFORM_STRICT = 3, - TRANSFORM_ANY = 4, - REDUCE_FLOAT = 5, - REDUCE_SAME = 6, - REDUCE_LONG = 7, - REDUCE_BOOL = 8, - INDEX_REDUCE = 9, - SCALAR = 10, - SCALAR_BOOL = 11, - BROADCAST = 12, - BROADCAST_BOOL = 13, - PAIRWISE = 14, - PAIRWISE_BOOL = 15, - REDUCE_3 = 16, - SUMMARYSTATS = 17, - SHAPE = 18, - AGGREGATION = 19, - RANDOM = 20, - CUSTOM = 21, - GRAPH = 22, - VARIABLE = 40, - BOOLEAN = 60, - LOGIC = 119, -}; - - -} diff --git a/libnd4j/include/graph/generated/nd4j/graph/OpType.java b/libnd4j/include/graph/generated/nd4j/graph/OpType.java deleted file mode 100644 index bac8509d8..000000000 --- a/libnd4j/include/graph/generated/nd4j/graph/OpType.java +++ /dev/null @@ -1,38 +0,0 @@ -// automatically generated by the FlatBuffers compiler, do not modify - -package nd4j.graph; - -public final class OpType { - 
private OpType() { } - public static final byte TRANSFORM_FLOAT = 0; - public static final byte TRANSFORM_SAME = 1; - public static final byte TRANSFORM_BOOL = 2; - public static final byte TRANSFORM_STRICT = 3; - public static final byte TRANSFORM_ANY = 4; - public static final byte REDUCE_FLOAT = 5; - public static final byte REDUCE_SAME = 6; - public static final byte REDUCE_LONG = 7; - public static final byte REDUCE_BOOL = 8; - public static final byte INDEX_REDUCE = 9; - public static final byte SCALAR = 10; - public static final byte SCALAR_BOOL = 11; - public static final byte BROADCAST = 12; - public static final byte BROADCAST_BOOL = 13; - public static final byte PAIRWISE = 14; - public static final byte PAIRWISE_BOOL = 15; - public static final byte REDUCE_3 = 16; - public static final byte SUMMARYSTATS = 17; - public static final byte SHAPE = 18; - public static final byte AGGREGATION = 19; - public static final byte RANDOM = 20; - public static final byte CUSTOM = 21; - public static final byte GRAPH = 22; - public static final byte VARIABLE = 40; - public static final byte BOOLEAN = 60; - public static final byte LOGIC = 119; - - public static final String[] names = { "TRANSFORM_FLOAT", "TRANSFORM_SAME", "TRANSFORM_BOOL", "TRANSFORM_STRICT", "TRANSFORM_ANY", "REDUCE_FLOAT", "REDUCE_SAME", "REDUCE_LONG", "REDUCE_BOOL", "INDEX_REDUCE", "SCALAR", "SCALAR_BOOL", "BROADCAST", "BROADCAST_BOOL", "PAIRWISE", "PAIRWISE_BOOL", "REDUCE_3", "SUMMARYSTATS", "SHAPE", "AGGREGATION", "RANDOM", "CUSTOM", "GRAPH", "", "", "", "", "", "", "", "", "", "", "", "", "", "", "", "", "", "VARIABLE", "", "", "", "", "", "", "", "", "", "", "", "", "", "", "", "", "", "", "", "BOOLEAN", "", "", "", "", "", "", "", "", "", "", "", "", "", "", "", "", "", "", "", "", "", "", "", "", "", "", "", "", "", "", "", "", "", "", "", "", "", "", "", "", "", "", "", "", "", "", "", "", "", "", "", "", "", "", "", "", "", "", "LOGIC", }; - - public static String name(int e) { return 
names[e]; } -} - diff --git a/libnd4j/include/graph/generated/nd4j/graph/OpType.py b/libnd4j/include/graph/generated/nd4j/graph/OpType.py deleted file mode 100644 index dba740a45..000000000 --- a/libnd4j/include/graph/generated/nd4j/graph/OpType.py +++ /dev/null @@ -1,46 +0,0 @@ -# /* ****************************************************************************** -# * -# * -# * This program and the accompanying materials are made available under the -# * terms of the Apache License, Version 2.0 which is available at -# * https://www.apache.org/licenses/LICENSE-2.0. -# * -# * See the NOTICE file distributed with this work for additional -# * information regarding copyright ownership. -# * Unless required by applicable law or agreed to in writing, software -# * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT -# * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the -# * License for the specific language governing permissions and limitations -# * under the License. 
-# * -# * SPDX-License-Identifier: Apache-2.0 -# ******************************************************************************/ - -class OpType(object): - TRANSFORM_FLOAT = 0 - TRANSFORM_SAME = 1 - TRANSFORM_BOOL = 2 - TRANSFORM_STRICT = 3 - TRANSFORM_ANY = 4 - REDUCE_FLOAT = 5 - REDUCE_SAME = 6 - REDUCE_LONG = 7 - REDUCE_BOOL = 8 - INDEX_REDUCE = 9 - SCALAR = 10 - SCALAR_BOOL = 11 - BROADCAST = 12 - BROADCAST_BOOL = 13 - PAIRWISE = 14 - PAIRWISE_BOOL = 15 - REDUCE_3 = 16 - SUMMARYSTATS = 17 - SHAPE = 18 - AGGREGATION = 19 - RANDOM = 20 - CUSTOM = 21 - GRAPH = 22 - VARIABLE = 40 - BOOLEAN = 60 - LOGIC = 119 - diff --git a/libnd4j/include/graph/generated/nd4j/graph/OutputMode.cs b/libnd4j/include/graph/generated/nd4j/graph/OutputMode.cs deleted file mode 100644 index 4cd916dbe..000000000 --- a/libnd4j/include/graph/generated/nd4j/graph/OutputMode.cs +++ /dev/null @@ -1,18 +0,0 @@ -// -// automatically generated by the FlatBuffers compiler, do not modify -// - -namespace sd.graph -{ - -public enum OutputMode : sbyte -{ - IMPLICIT = 0, - EXPLICIT = 1, - EXPLICIT_AND_IMPLICIT = 2, - VARIABLE_SPACE = 3, - OPTIMIZED = 4, -}; - - -} diff --git a/libnd4j/include/graph/generated/nd4j/graph/OutputMode.java b/libnd4j/include/graph/generated/nd4j/graph/OutputMode.java deleted file mode 100644 index 9413825e2..000000000 --- a/libnd4j/include/graph/generated/nd4j/graph/OutputMode.java +++ /dev/null @@ -1,17 +0,0 @@ -// automatically generated by the FlatBuffers compiler, do not modify - -package nd4j.graph; - -public final class OutputMode { - private OutputMode() { } - public static final byte IMPLICIT = 0; - public static final byte EXPLICIT = 1; - public static final byte EXPLICIT_AND_IMPLICIT = 2; - public static final byte VARIABLE_SPACE = 3; - public static final byte OPTIMIZED = 4; - - public static final String[] names = { "IMPLICIT", "EXPLICIT", "EXPLICIT_AND_IMPLICIT", "VARIABLE_SPACE", "OPTIMIZED", }; - - public static String name(int e) { return names[e]; } -} - 
diff --git a/libnd4j/include/graph/generated/nd4j/graph/OutputMode.py b/libnd4j/include/graph/generated/nd4j/graph/OutputMode.py deleted file mode 100644 index 9fc133956..000000000 --- a/libnd4j/include/graph/generated/nd4j/graph/OutputMode.py +++ /dev/null @@ -1,25 +0,0 @@ -# /* ****************************************************************************** -# * -# * -# * This program and the accompanying materials are made available under the -# * terms of the Apache License, Version 2.0 which is available at -# * https://www.apache.org/licenses/LICENSE-2.0. -# * -# * See the NOTICE file distributed with this work for additional -# * information regarding copyright ownership. -# * Unless required by applicable law or agreed to in writing, software -# * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT -# * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the -# * License for the specific language governing permissions and limitations -# * under the License. 
-# * -# * SPDX-License-Identifier: Apache-2.0 -# ******************************************************************************/ - -class OutputMode(object): - IMPLICIT = 0 - EXPLICIT = 1 - EXPLICIT_AND_IMPLICIT = 2 - VARIABLE_SPACE = 3 - OPTIMIZED = 4 - diff --git a/libnd4j/include/graph/generated/nd4j/graph/ProfilingMode.cs b/libnd4j/include/graph/generated/nd4j/graph/ProfilingMode.cs deleted file mode 100644 index 1d9b90ca3..000000000 --- a/libnd4j/include/graph/generated/nd4j/graph/ProfilingMode.cs +++ /dev/null @@ -1,17 +0,0 @@ -// -// automatically generated by the FlatBuffers compiler, do not modify -// - -namespace sd.graph -{ - -public enum ProfilingMode : sbyte -{ - NONE = 0, - NAN_PANIC = 1, - INF_PANIC = 2, - ANY_PANIC = 3, -}; - - -} diff --git a/libnd4j/include/graph/generated/nd4j/graph/ProfilingMode.java b/libnd4j/include/graph/generated/nd4j/graph/ProfilingMode.java deleted file mode 100644 index 34e3e320f..000000000 --- a/libnd4j/include/graph/generated/nd4j/graph/ProfilingMode.java +++ /dev/null @@ -1,16 +0,0 @@ -// automatically generated by the FlatBuffers compiler, do not modify - -package nd4j.graph; - -public final class ProfilingMode { - private ProfilingMode() { } - public static final byte NONE = 0; - public static final byte NAN_PANIC = 1; - public static final byte INF_PANIC = 2; - public static final byte ANY_PANIC = 3; - - public static final String[] names = { "NONE", "NAN_PANIC", "INF_PANIC", "ANY_PANIC", }; - - public static String name(int e) { return names[e]; } -} - diff --git a/libnd4j/include/graph/generated/nd4j/graph/ProfilingMode.py b/libnd4j/include/graph/generated/nd4j/graph/ProfilingMode.py deleted file mode 100644 index bfa3bcab0..000000000 --- a/libnd4j/include/graph/generated/nd4j/graph/ProfilingMode.py +++ /dev/null @@ -1,24 +0,0 @@ -# /* ****************************************************************************** -# * -# * -# * This program and the accompanying materials are made available under the -# * terms of 
the Apache License, Version 2.0 which is available at -# * https://www.apache.org/licenses/LICENSE-2.0. -# * -# * See the NOTICE file distributed with this work for additional -# * information regarding copyright ownership. -# * Unless required by applicable law or agreed to in writing, software -# * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT -# * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the -# * License for the specific language governing permissions and limitations -# * under the License. -# * -# * SPDX-License-Identifier: Apache-2.0 -# ******************************************************************************/ - -class ProfilingMode(object): - NONE = 0 - NAN_PANIC = 1 - INF_PANIC = 2 - ANY_PANIC = 3 - diff --git a/libnd4j/include/graph/generated/nd4j/graph/UIAddName.cs b/libnd4j/include/graph/generated/nd4j/graph/UIAddName.cs deleted file mode 100644 index cc2f6fb85..000000000 --- a/libnd4j/include/graph/generated/nd4j/graph/UIAddName.cs +++ /dev/null @@ -1,48 +0,0 @@ -// -// automatically generated by the FlatBuffers compiler, do not modify -// - -namespace sd.graph -{ - -using global::System; -using global::FlatBuffers; - -public struct UIAddName : IFlatbufferObject -{ - private Table __p; - public ByteBuffer ByteBuffer { get { return __p.bb; } } - public static UIAddName GetRootAsUIAddName(ByteBuffer _bb) { return GetRootAsUIAddName(_bb, new UIAddName()); } - public static UIAddName GetRootAsUIAddName(ByteBuffer _bb, UIAddName obj) { return (obj.__assign(_bb.GetInt(_bb.Position) + _bb.Position, _bb)); } - public void __init(int _i, ByteBuffer _bb) { __p.bb_pos = _i; __p.bb = _bb; } - public UIAddName __assign(int _i, ByteBuffer _bb) { __init(_i, _bb); return this; } - - public int NameIdx { get { int o = __p.__offset(4); return o != 0 ? __p.bb.GetInt(o + __p.bb_pos) : (int)0; } } - public string Name { get { int o = __p.__offset(6); return o != 0 ? 
__p.__string(o + __p.bb_pos) : null; } } -#if ENABLE_SPAN_T - public Span GetNameBytes() { return __p.__vector_as_span(6); } -#else - public ArraySegment? GetNameBytes() { return __p.__vector_as_arraysegment(6); } -#endif - public byte[] GetNameArray() { return __p.__vector_as_array(6); } - - public static Offset CreateUIAddName(FlatBufferBuilder builder, - int nameIdx = 0, - StringOffset nameOffset = default(StringOffset)) { - builder.StartObject(2); - UIAddName.AddName(builder, nameOffset); - UIAddName.AddNameIdx(builder, nameIdx); - return UIAddName.EndUIAddName(builder); - } - - public static void StartUIAddName(FlatBufferBuilder builder) { builder.StartObject(2); } - public static void AddNameIdx(FlatBufferBuilder builder, int nameIdx) { builder.AddInt(0, nameIdx, 0); } - public static void AddName(FlatBufferBuilder builder, StringOffset nameOffset) { builder.AddOffset(1, nameOffset.Value, 0); } - public static Offset EndUIAddName(FlatBufferBuilder builder) { - int o = builder.EndObject(); - return new Offset(o); - } -}; - - -} diff --git a/libnd4j/include/graph/generated/nd4j/graph/UIAddName.java b/libnd4j/include/graph/generated/nd4j/graph/UIAddName.java deleted file mode 100644 index 9caf5f0d7..000000000 --- a/libnd4j/include/graph/generated/nd4j/graph/UIAddName.java +++ /dev/null @@ -1,39 +0,0 @@ -// automatically generated by the FlatBuffers compiler, do not modify - -package nd4j.graph; - -import java.nio.*; -import java.lang.*; -import java.util.*; -import com.google.flatbuffers.*; - -@SuppressWarnings("unused") -public final class UIAddName extends Table { - public static UIAddName getRootAsUIAddName(ByteBuffer _bb) { return getRootAsUIAddName(_bb, new UIAddName()); } - public static UIAddName getRootAsUIAddName(ByteBuffer _bb, UIAddName obj) { _bb.order(ByteOrder.LITTLE_ENDIAN); return (obj.__assign(_bb.getInt(_bb.position()) + _bb.position(), _bb)); } - public void __init(int _i, ByteBuffer _bb) { bb_pos = _i; bb = _bb; } - public UIAddName 
__assign(int _i, ByteBuffer _bb) { __init(_i, _bb); return this; } - - public int nameIdx() { int o = __offset(4); return o != 0 ? bb.getInt(o + bb_pos) : 0; } - public String name() { int o = __offset(6); return o != 0 ? __string(o + bb_pos) : null; } - public ByteBuffer nameAsByteBuffer() { return __vector_as_bytebuffer(6, 1); } - public ByteBuffer nameInByteBuffer(ByteBuffer _bb) { return __vector_in_bytebuffer(_bb, 6, 1); } - - public static int createUIAddName(FlatBufferBuilder builder, - int nameIdx, - int nameOffset) { - builder.startObject(2); - UIAddName.addName(builder, nameOffset); - UIAddName.addNameIdx(builder, nameIdx); - return UIAddName.endUIAddName(builder); - } - - public static void startUIAddName(FlatBufferBuilder builder) { builder.startObject(2); } - public static void addNameIdx(FlatBufferBuilder builder, int nameIdx) { builder.addInt(0, nameIdx, 0); } - public static void addName(FlatBufferBuilder builder, int nameOffset) { builder.addOffset(1, nameOffset, 0); } - public static int endUIAddName(FlatBufferBuilder builder) { - int o = builder.endObject(); - return o; - } -} - diff --git a/libnd4j/include/graph/generated/nd4j/graph/UIAddName.py b/libnd4j/include/graph/generated/nd4j/graph/UIAddName.py deleted file mode 100644 index 3b8be72d7..000000000 --- a/libnd4j/include/graph/generated/nd4j/graph/UIAddName.py +++ /dev/null @@ -1,52 +0,0 @@ -# /* ****************************************************************************** -# * -# * -# * This program and the accompanying materials are made available under the -# * terms of the Apache License, Version 2.0 which is available at -# * https://www.apache.org/licenses/LICENSE-2.0. -# * -# * See the NOTICE file distributed with this work for additional -# * information regarding copyright ownership. 
-# * Unless required by applicable law or agreed to in writing, software -# * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT -# * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the -# * License for the specific language governing permissions and limitations -# * under the License. -# * -# * SPDX-License-Identifier: Apache-2.0 -# ******************************************************************************/ - -import flatbuffers - -class UIAddName(object): - __slots__ = ['_tab'] - - @classmethod - def GetRootAsUIAddName(cls, buf, offset): - n = flatbuffers.encode.Get(flatbuffers.packer.uoffset, buf, offset) - x = UIAddName() - x.Init(buf, n + offset) - return x - - # UIAddName - def Init(self, buf, pos): - self._tab = flatbuffers.table.Table(buf, pos) - - # UIAddName - def NameIdx(self): - o = flatbuffers.number_types.UOffsetTFlags.py_type(self._tab.Offset(4)) - if o != 0: - return self._tab.Get(flatbuffers.number_types.Int32Flags, o + self._tab.Pos) - return 0 - - # UIAddName - def Name(self): - o = flatbuffers.number_types.UOffsetTFlags.py_type(self._tab.Offset(6)) - if o != 0: - return self._tab.String(o + self._tab.Pos) - return None - -def UIAddNameStart(builder): builder.StartObject(2) -def UIAddNameAddNameIdx(builder, nameIdx): builder.PrependInt32Slot(0, nameIdx, 0) -def UIAddNameAddName(builder, name): builder.PrependUOffsetTRelativeSlot(1, flatbuffers.number_types.UOffsetTFlags.py_type(name), 0) -def UIAddNameEnd(builder): return builder.EndObject() diff --git a/libnd4j/include/graph/generated/nd4j/graph/UIEvent.cs b/libnd4j/include/graph/generated/nd4j/graph/UIEvent.cs deleted file mode 100644 index a8775d67d..000000000 --- a/libnd4j/include/graph/generated/nd4j/graph/UIEvent.cs +++ /dev/null @@ -1,70 +0,0 @@ -// -// automatically generated by the FlatBuffers compiler, do not modify -// - -namespace sd.graph -{ - -using global::System; -using global::FlatBuffers; - -public struct UIEvent : 
IFlatbufferObject -{ - private Table __p; - public ByteBuffer ByteBuffer { get { return __p.bb; } } - public static UIEvent GetRootAsUIEvent(ByteBuffer _bb) { return GetRootAsUIEvent(_bb, new UIEvent()); } - public static UIEvent GetRootAsUIEvent(ByteBuffer _bb, UIEvent obj) { return (obj.__assign(_bb.GetInt(_bb.Position) + _bb.Position, _bb)); } - public void __init(int _i, ByteBuffer _bb) { __p.bb_pos = _i; __p.bb = _bb; } - public UIEvent __assign(int _i, ByteBuffer _bb) { __init(_i, _bb); return this; } - - public UIEventType EventType { get { int o = __p.__offset(4); return o != 0 ? (UIEventType)__p.bb.GetSbyte(o + __p.bb_pos) : UIEventType.ADD_NAME; } } - public UIEventSubtype EventSubType { get { int o = __p.__offset(6); return o != 0 ? (UIEventSubtype)__p.bb.GetSbyte(o + __p.bb_pos) : UIEventSubtype.NONE; } } - public int NameIdx { get { int o = __p.__offset(8); return o != 0 ? __p.bb.GetInt(o + __p.bb_pos) : (int)0; } } - public long Timestamp { get { int o = __p.__offset(10); return o != 0 ? __p.bb.GetLong(o + __p.bb_pos) : (long)0; } } - public int Iteration { get { int o = __p.__offset(12); return o != 0 ? __p.bb.GetInt(o + __p.bb_pos) : (int)0; } } - public int Epoch { get { int o = __p.__offset(14); return o != 0 ? __p.bb.GetInt(o + __p.bb_pos) : (int)0; } } - public short VariableId { get { int o = __p.__offset(16); return o != 0 ? __p.bb.GetShort(o + __p.bb_pos) : (short)0; } } - public FrameIteration? FrameIter { get { int o = __p.__offset(18); return o != 0 ? (FrameIteration?)(new FrameIteration()).__assign(__p.__indirect(o + __p.bb_pos), __p.bb) : null; } } - public ushort Plugin { get { int o = __p.__offset(20); return o != 0 ? 
__p.bb.GetUshort(o + __p.bb_pos) : (ushort)0; } } - - public static Offset CreateUIEvent(FlatBufferBuilder builder, - UIEventType eventType = UIEventType.ADD_NAME, - UIEventSubtype eventSubType = UIEventSubtype.NONE, - int nameIdx = 0, - long timestamp = 0, - int iteration = 0, - int epoch = 0, - short variableId = 0, - Offset frameIterOffset = default(Offset), - ushort plugin = 0) { - builder.StartObject(9); - UIEvent.AddTimestamp(builder, timestamp); - UIEvent.AddFrameIter(builder, frameIterOffset); - UIEvent.AddEpoch(builder, epoch); - UIEvent.AddIteration(builder, iteration); - UIEvent.AddNameIdx(builder, nameIdx); - UIEvent.AddPlugin(builder, plugin); - UIEvent.AddVariableId(builder, variableId); - UIEvent.AddEventSubType(builder, eventSubType); - UIEvent.AddEventType(builder, eventType); - return UIEvent.EndUIEvent(builder); - } - - public static void StartUIEvent(FlatBufferBuilder builder) { builder.StartObject(9); } - public static void AddEventType(FlatBufferBuilder builder, UIEventType eventType) { builder.AddSbyte(0, (sbyte)eventType, 0); } - public static void AddEventSubType(FlatBufferBuilder builder, UIEventSubtype eventSubType) { builder.AddSbyte(1, (sbyte)eventSubType, 0); } - public static void AddNameIdx(FlatBufferBuilder builder, int nameIdx) { builder.AddInt(2, nameIdx, 0); } - public static void AddTimestamp(FlatBufferBuilder builder, long timestamp) { builder.AddLong(3, timestamp, 0); } - public static void AddIteration(FlatBufferBuilder builder, int iteration) { builder.AddInt(4, iteration, 0); } - public static void AddEpoch(FlatBufferBuilder builder, int epoch) { builder.AddInt(5, epoch, 0); } - public static void AddVariableId(FlatBufferBuilder builder, short variableId) { builder.AddShort(6, variableId, 0); } - public static void AddFrameIter(FlatBufferBuilder builder, Offset frameIterOffset) { builder.AddOffset(7, frameIterOffset.Value, 0); } - public static void AddPlugin(FlatBufferBuilder builder, ushort plugin) { builder.AddUshort(8, 
plugin, 0); } - public static Offset EndUIEvent(FlatBufferBuilder builder) { - int o = builder.EndObject(); - return new Offset(o); - } -}; - - -} diff --git a/libnd4j/include/graph/generated/nd4j/graph/UIEvent.java b/libnd4j/include/graph/generated/nd4j/graph/UIEvent.java deleted file mode 100644 index e586e8967..000000000 --- a/libnd4j/include/graph/generated/nd4j/graph/UIEvent.java +++ /dev/null @@ -1,66 +0,0 @@ -// automatically generated by the FlatBuffers compiler, do not modify - -package nd4j.graph; - -import java.nio.*; -import java.lang.*; -import java.util.*; -import com.google.flatbuffers.*; - -@SuppressWarnings("unused") -public final class UIEvent extends Table { - public static UIEvent getRootAsUIEvent(ByteBuffer _bb) { return getRootAsUIEvent(_bb, new UIEvent()); } - public static UIEvent getRootAsUIEvent(ByteBuffer _bb, UIEvent obj) { _bb.order(ByteOrder.LITTLE_ENDIAN); return (obj.__assign(_bb.getInt(_bb.position()) + _bb.position(), _bb)); } - public void __init(int _i, ByteBuffer _bb) { bb_pos = _i; bb = _bb; } - public UIEvent __assign(int _i, ByteBuffer _bb) { __init(_i, _bb); return this; } - - public byte eventType() { int o = __offset(4); return o != 0 ? bb.get(o + bb_pos) : 0; } - public byte eventSubType() { int o = __offset(6); return o != 0 ? bb.get(o + bb_pos) : 0; } - public int nameIdx() { int o = __offset(8); return o != 0 ? bb.getInt(o + bb_pos) : 0; } - public long timestamp() { int o = __offset(10); return o != 0 ? bb.getLong(o + bb_pos) : 0L; } - public int iteration() { int o = __offset(12); return o != 0 ? bb.getInt(o + bb_pos) : 0; } - public int epoch() { int o = __offset(14); return o != 0 ? bb.getInt(o + bb_pos) : 0; } - public short variableId() { int o = __offset(16); return o != 0 ? bb.getShort(o + bb_pos) : 0; } - public FrameIteration frameIter() { return frameIter(new FrameIteration()); } - public FrameIteration frameIter(FrameIteration obj) { int o = __offset(18); return o != 0 ? 
obj.__assign(__indirect(o + bb_pos), bb) : null; } - public int plugin() { int o = __offset(20); return o != 0 ? bb.getShort(o + bb_pos) & 0xFFFF : 0; } - - public static int createUIEvent(FlatBufferBuilder builder, - byte eventType, - byte eventSubType, - int nameIdx, - long timestamp, - int iteration, - int epoch, - short variableId, - int frameIterOffset, - int plugin) { - builder.startObject(9); - UIEvent.addTimestamp(builder, timestamp); - UIEvent.addFrameIter(builder, frameIterOffset); - UIEvent.addEpoch(builder, epoch); - UIEvent.addIteration(builder, iteration); - UIEvent.addNameIdx(builder, nameIdx); - UIEvent.addPlugin(builder, plugin); - UIEvent.addVariableId(builder, variableId); - UIEvent.addEventSubType(builder, eventSubType); - UIEvent.addEventType(builder, eventType); - return UIEvent.endUIEvent(builder); - } - - public static void startUIEvent(FlatBufferBuilder builder) { builder.startObject(9); } - public static void addEventType(FlatBufferBuilder builder, byte eventType) { builder.addByte(0, eventType, 0); } - public static void addEventSubType(FlatBufferBuilder builder, byte eventSubType) { builder.addByte(1, eventSubType, 0); } - public static void addNameIdx(FlatBufferBuilder builder, int nameIdx) { builder.addInt(2, nameIdx, 0); } - public static void addTimestamp(FlatBufferBuilder builder, long timestamp) { builder.addLong(3, timestamp, 0L); } - public static void addIteration(FlatBufferBuilder builder, int iteration) { builder.addInt(4, iteration, 0); } - public static void addEpoch(FlatBufferBuilder builder, int epoch) { builder.addInt(5, epoch, 0); } - public static void addVariableId(FlatBufferBuilder builder, short variableId) { builder.addShort(6, variableId, 0); } - public static void addFrameIter(FlatBufferBuilder builder, int frameIterOffset) { builder.addOffset(7, frameIterOffset, 0); } - public static void addPlugin(FlatBufferBuilder builder, int plugin) { builder.addShort(8, (short)plugin, (short)0); } - public static int 
endUIEvent(FlatBufferBuilder builder) { - int o = builder.endObject(); - return o; - } -} - diff --git a/libnd4j/include/graph/generated/nd4j/graph/UIEvent.py b/libnd4j/include/graph/generated/nd4j/graph/UIEvent.py deleted file mode 100644 index f52bdf6cf..000000000 --- a/libnd4j/include/graph/generated/nd4j/graph/UIEvent.py +++ /dev/null @@ -1,112 +0,0 @@ -# /* ****************************************************************************** -# * -# * -# * This program and the accompanying materials are made available under the -# * terms of the Apache License, Version 2.0 which is available at -# * https://www.apache.org/licenses/LICENSE-2.0. -# * -# * See the NOTICE file distributed with this work for additional -# * information regarding copyright ownership. -# * Unless required by applicable law or agreed to in writing, software -# * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT -# * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the -# * License for the specific language governing permissions and limitations -# * under the License. 
-# * -# * SPDX-License-Identifier: Apache-2.0 -# ******************************************************************************/ - -import flatbuffers - -class UIEvent(object): - __slots__ = ['_tab'] - - @classmethod - def GetRootAsUIEvent(cls, buf, offset): - n = flatbuffers.encode.Get(flatbuffers.packer.uoffset, buf, offset) - x = UIEvent() - x.Init(buf, n + offset) - return x - - # UIEvent - def Init(self, buf, pos): - self._tab = flatbuffers.table.Table(buf, pos) - - # UIEvent - def EventType(self): - o = flatbuffers.number_types.UOffsetTFlags.py_type(self._tab.Offset(4)) - if o != 0: - return self._tab.Get(flatbuffers.number_types.Int8Flags, o + self._tab.Pos) - return 0 - - # UIEvent - def EventSubType(self): - o = flatbuffers.number_types.UOffsetTFlags.py_type(self._tab.Offset(6)) - if o != 0: - return self._tab.Get(flatbuffers.number_types.Int8Flags, o + self._tab.Pos) - return 0 - - # UIEvent - def NameIdx(self): - o = flatbuffers.number_types.UOffsetTFlags.py_type(self._tab.Offset(8)) - if o != 0: - return self._tab.Get(flatbuffers.number_types.Int32Flags, o + self._tab.Pos) - return 0 - - # UIEvent - def Timestamp(self): - o = flatbuffers.number_types.UOffsetTFlags.py_type(self._tab.Offset(10)) - if o != 0: - return self._tab.Get(flatbuffers.number_types.Int64Flags, o + self._tab.Pos) - return 0 - - # UIEvent - def Iteration(self): - o = flatbuffers.number_types.UOffsetTFlags.py_type(self._tab.Offset(12)) - if o != 0: - return self._tab.Get(flatbuffers.number_types.Int32Flags, o + self._tab.Pos) - return 0 - - # UIEvent - def Epoch(self): - o = flatbuffers.number_types.UOffsetTFlags.py_type(self._tab.Offset(14)) - if o != 0: - return self._tab.Get(flatbuffers.number_types.Int32Flags, o + self._tab.Pos) - return 0 - - # UIEvent - def VariableId(self): - o = flatbuffers.number_types.UOffsetTFlags.py_type(self._tab.Offset(16)) - if o != 0: - return self._tab.Get(flatbuffers.number_types.Int16Flags, o + self._tab.Pos) - return 0 - - # UIEvent - def 
FrameIter(self): - o = flatbuffers.number_types.UOffsetTFlags.py_type(self._tab.Offset(18)) - if o != 0: - x = self._tab.Indirect(o + self._tab.Pos) - from .FrameIteration import FrameIteration - obj = FrameIteration() - obj.Init(self._tab.Bytes, x) - return obj - return None - - # UIEvent - def Plugin(self): - o = flatbuffers.number_types.UOffsetTFlags.py_type(self._tab.Offset(20)) - if o != 0: - return self._tab.Get(flatbuffers.number_types.Uint16Flags, o + self._tab.Pos) - return 0 - -def UIEventStart(builder): builder.StartObject(9) -def UIEventAddEventType(builder, eventType): builder.PrependInt8Slot(0, eventType, 0) -def UIEventAddEventSubType(builder, eventSubType): builder.PrependInt8Slot(1, eventSubType, 0) -def UIEventAddNameIdx(builder, nameIdx): builder.PrependInt32Slot(2, nameIdx, 0) -def UIEventAddTimestamp(builder, timestamp): builder.PrependInt64Slot(3, timestamp, 0) -def UIEventAddIteration(builder, iteration): builder.PrependInt32Slot(4, iteration, 0) -def UIEventAddEpoch(builder, epoch): builder.PrependInt32Slot(5, epoch, 0) -def UIEventAddVariableId(builder, variableId): builder.PrependInt16Slot(6, variableId, 0) -def UIEventAddFrameIter(builder, frameIter): builder.PrependUOffsetTRelativeSlot(7, flatbuffers.number_types.UOffsetTFlags.py_type(frameIter), 0) -def UIEventAddPlugin(builder, plugin): builder.PrependUint16Slot(8, plugin, 0) -def UIEventEnd(builder): return builder.EndObject() diff --git a/libnd4j/include/graph/generated/nd4j/graph/UIEventSubtype.cs b/libnd4j/include/graph/generated/nd4j/graph/UIEventSubtype.cs deleted file mode 100644 index d4da48a25..000000000 --- a/libnd4j/include/graph/generated/nd4j/graph/UIEventSubtype.cs +++ /dev/null @@ -1,23 +0,0 @@ -// -// automatically generated by the FlatBuffers compiler, do not modify -// - -namespace sd.graph -{ - -public enum UIEventSubtype : sbyte -{ - NONE = 0, - EVALUATION = 1, - LOSS = 2, - LEARNING_RATE = 3, - TUNING_METRIC = 4, - PERFORMANCE = 5, - PROFILING = 6, - FEATURE_LABEL 
= 7, - PREDICTION = 8, - USER_CUSTOM = 9, -}; - - -} diff --git a/libnd4j/include/graph/generated/nd4j/graph/UIEventSubtype.java b/libnd4j/include/graph/generated/nd4j/graph/UIEventSubtype.java deleted file mode 100644 index 98a9c5951..000000000 --- a/libnd4j/include/graph/generated/nd4j/graph/UIEventSubtype.java +++ /dev/null @@ -1,22 +0,0 @@ -// automatically generated by the FlatBuffers compiler, do not modify - -package nd4j.graph; - -public final class UIEventSubtype { - private UIEventSubtype() { } - public static final byte NONE = 0; - public static final byte EVALUATION = 1; - public static final byte LOSS = 2; - public static final byte LEARNING_RATE = 3; - public static final byte TUNING_METRIC = 4; - public static final byte PERFORMANCE = 5; - public static final byte PROFILING = 6; - public static final byte FEATURE_LABEL = 7; - public static final byte PREDICTION = 8; - public static final byte USER_CUSTOM = 9; - - public static final String[] names = { "NONE", "EVALUATION", "LOSS", "LEARNING_RATE", "TUNING_METRIC", "PERFORMANCE", "PROFILING", "FEATURE_LABEL", "PREDICTION", "USER_CUSTOM", }; - - public static String name(int e) { return names[e]; } -} - diff --git a/libnd4j/include/graph/generated/nd4j/graph/UIEventSubtype.py b/libnd4j/include/graph/generated/nd4j/graph/UIEventSubtype.py deleted file mode 100644 index b1f6e9c79..000000000 --- a/libnd4j/include/graph/generated/nd4j/graph/UIEventSubtype.py +++ /dev/null @@ -1,30 +0,0 @@ -# /* ****************************************************************************** -# * -# * -# * This program and the accompanying materials are made available under the -# * terms of the Apache License, Version 2.0 which is available at -# * https://www.apache.org/licenses/LICENSE-2.0. -# * -# * See the NOTICE file distributed with this work for additional -# * information regarding copyright ownership. 
-# * Unless required by applicable law or agreed to in writing, software -# * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT -# * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the -# * License for the specific language governing permissions and limitations -# * under the License. -# * -# * SPDX-License-Identifier: Apache-2.0 -# ******************************************************************************/ - -class UIEventSubtype(object): - NONE = 0 - EVALUATION = 1 - LOSS = 2 - LEARNING_RATE = 3 - TUNING_METRIC = 4 - PERFORMANCE = 5 - PROFILING = 6 - FEATURE_LABEL = 7 - PREDICTION = 8 - USER_CUSTOM = 9 - diff --git a/libnd4j/include/graph/generated/nd4j/graph/UIEventType.cs b/libnd4j/include/graph/generated/nd4j/graph/UIEventType.cs deleted file mode 100644 index 32427dac2..000000000 --- a/libnd4j/include/graph/generated/nd4j/graph/UIEventType.cs +++ /dev/null @@ -1,22 +0,0 @@ -// -// automatically generated by the FlatBuffers compiler, do not modify -// - -namespace sd.graph -{ - -public enum UIEventType : sbyte -{ - ADD_NAME = 0, - SCALAR = 1, - ARRAY = 2, - ARRAY_LIST = 3, - HISTOGRAM = 4, - IMAGE = 5, - SUMMARY_STATISTICS = 6, - OP_TIMING = 7, - HARDWARE_STATE = 8, -}; - - -} diff --git a/libnd4j/include/graph/generated/nd4j/graph/UIEventType.java b/libnd4j/include/graph/generated/nd4j/graph/UIEventType.java deleted file mode 100644 index 0c5f38c2b..000000000 --- a/libnd4j/include/graph/generated/nd4j/graph/UIEventType.java +++ /dev/null @@ -1,21 +0,0 @@ -// automatically generated by the FlatBuffers compiler, do not modify - -package nd4j.graph; - -public final class UIEventType { - private UIEventType() { } - public static final byte ADD_NAME = 0; - public static final byte SCALAR = 1; - public static final byte ARRAY = 2; - public static final byte ARRAY_LIST = 3; - public static final byte HISTOGRAM = 4; - public static final byte IMAGE = 5; - public static final byte SUMMARY_STATISTICS = 6; - public static 
final byte OP_TIMING = 7; - public static final byte HARDWARE_STATE = 8; - - public static final String[] names = { "ADD_NAME", "SCALAR", "ARRAY", "ARRAY_LIST", "HISTOGRAM", "IMAGE", "SUMMARY_STATISTICS", "OP_TIMING", "HARDWARE_STATE", }; - - public static String name(int e) { return names[e]; } -} - diff --git a/libnd4j/include/graph/generated/nd4j/graph/UIEventType.py b/libnd4j/include/graph/generated/nd4j/graph/UIEventType.py deleted file mode 100644 index 63a665a09..000000000 --- a/libnd4j/include/graph/generated/nd4j/graph/UIEventType.py +++ /dev/null @@ -1,29 +0,0 @@ -# /* ****************************************************************************** -# * -# * -# * This program and the accompanying materials are made available under the -# * terms of the Apache License, Version 2.0 which is available at -# * https://www.apache.org/licenses/LICENSE-2.0. -# * -# * See the NOTICE file distributed with this work for additional -# * information regarding copyright ownership. -# * Unless required by applicable law or agreed to in writing, software -# * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT -# * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the -# * License for the specific language governing permissions and limitations -# * under the License. 
-# * -# * SPDX-License-Identifier: Apache-2.0 -# ******************************************************************************/ - -class UIEventType(object): - ADD_NAME = 0 - SCALAR = 1 - ARRAY = 2 - ARRAY_LIST = 3 - HISTOGRAM = 4 - IMAGE = 5 - SUMMARY_STATISTICS = 6 - OP_TIMING = 7 - HARDWARE_STATE = 8 - diff --git a/libnd4j/include/graph/generated/nd4j/graph/UIGraphStructure.cs b/libnd4j/include/graph/generated/nd4j/graph/UIGraphStructure.cs deleted file mode 100644 index ad8e16ab9..000000000 --- a/libnd4j/include/graph/generated/nd4j/graph/UIGraphStructure.cs +++ /dev/null @@ -1,74 +0,0 @@ -// -// automatically generated by the FlatBuffers compiler, do not modify -// - -namespace sd.graph -{ - -using global::System; -using global::FlatBuffers; - -public struct UIGraphStructure : IFlatbufferObject -{ - private Table __p; - public ByteBuffer ByteBuffer { get { return __p.bb; } } - public static UIGraphStructure GetRootAsUIGraphStructure(ByteBuffer _bb) { return GetRootAsUIGraphStructure(_bb, new UIGraphStructure()); } - public static UIGraphStructure GetRootAsUIGraphStructure(ByteBuffer _bb, UIGraphStructure obj) { return (obj.__assign(_bb.GetInt(_bb.Position) + _bb.Position, _bb)); } - public void __init(int _i, ByteBuffer _bb) { __p.bb_pos = _i; __p.bb = _bb; } - public UIGraphStructure __assign(int _i, ByteBuffer _bb) { __init(_i, _bb); return this; } - - public string Inputs(int j) { int o = __p.__offset(4); return o != 0 ? __p.__string(__p.__vector(o) + j * 4) : null; } - public int InputsLength { get { int o = __p.__offset(4); return o != 0 ? __p.__vector_len(o) : 0; } } - public IntPair? InputsPair(int j) { int o = __p.__offset(6); return o != 0 ? (IntPair?)(new IntPair()).__assign(__p.__indirect(__p.__vector(o) + j * 4), __p.bb) : null; } - public int InputsPairLength { get { int o = __p.__offset(6); return o != 0 ? __p.__vector_len(o) : 0; } } - public string Outputs(int j) { int o = __p.__offset(8); return o != 0 ? 
__p.__string(__p.__vector(o) + j * 4) : null; } - public int OutputsLength { get { int o = __p.__offset(8); return o != 0 ? __p.__vector_len(o) : 0; } } - public UIVariable? Variables(int j) { int o = __p.__offset(10); return o != 0 ? (UIVariable?)(new UIVariable()).__assign(__p.__indirect(__p.__vector(o) + j * 4), __p.bb) : null; } - public int VariablesLength { get { int o = __p.__offset(10); return o != 0 ? __p.__vector_len(o) : 0; } } - public UIOp? Ops(int j) { int o = __p.__offset(12); return o != 0 ? (UIOp?)(new UIOp()).__assign(__p.__indirect(__p.__vector(o) + j * 4), __p.bb) : null; } - public int OpsLength { get { int o = __p.__offset(12); return o != 0 ? __p.__vector_len(o) : 0; } } - - public static Offset CreateUIGraphStructure(FlatBufferBuilder builder, - VectorOffset inputsOffset = default(VectorOffset), - VectorOffset inputsPairOffset = default(VectorOffset), - VectorOffset outputsOffset = default(VectorOffset), - VectorOffset variablesOffset = default(VectorOffset), - VectorOffset opsOffset = default(VectorOffset)) { - builder.StartObject(5); - UIGraphStructure.AddOps(builder, opsOffset); - UIGraphStructure.AddVariables(builder, variablesOffset); - UIGraphStructure.AddOutputs(builder, outputsOffset); - UIGraphStructure.AddInputsPair(builder, inputsPairOffset); - UIGraphStructure.AddInputs(builder, inputsOffset); - return UIGraphStructure.EndUIGraphStructure(builder); - } - - public static void StartUIGraphStructure(FlatBufferBuilder builder) { builder.StartObject(5); } - public static void AddInputs(FlatBufferBuilder builder, VectorOffset inputsOffset) { builder.AddOffset(0, inputsOffset.Value, 0); } - public static VectorOffset CreateInputsVector(FlatBufferBuilder builder, StringOffset[] data) { builder.StartVector(4, data.Length, 4); for (int i = data.Length - 1; i >= 0; i--) builder.AddOffset(data[i].Value); return builder.EndVector(); } - public static VectorOffset CreateInputsVectorBlock(FlatBufferBuilder builder, StringOffset[] data) { 
builder.StartVector(4, data.Length, 4); builder.Add(data); return builder.EndVector(); } - public static void StartInputsVector(FlatBufferBuilder builder, int numElems) { builder.StartVector(4, numElems, 4); } - public static void AddInputsPair(FlatBufferBuilder builder, VectorOffset inputsPairOffset) { builder.AddOffset(1, inputsPairOffset.Value, 0); } - public static VectorOffset CreateInputsPairVector(FlatBufferBuilder builder, Offset[] data) { builder.StartVector(4, data.Length, 4); for (int i = data.Length - 1; i >= 0; i--) builder.AddOffset(data[i].Value); return builder.EndVector(); } - public static VectorOffset CreateInputsPairVectorBlock(FlatBufferBuilder builder, Offset[] data) { builder.StartVector(4, data.Length, 4); builder.Add(data); return builder.EndVector(); } - public static void StartInputsPairVector(FlatBufferBuilder builder, int numElems) { builder.StartVector(4, numElems, 4); } - public static void AddOutputs(FlatBufferBuilder builder, VectorOffset outputsOffset) { builder.AddOffset(2, outputsOffset.Value, 0); } - public static VectorOffset CreateOutputsVector(FlatBufferBuilder builder, StringOffset[] data) { builder.StartVector(4, data.Length, 4); for (int i = data.Length - 1; i >= 0; i--) builder.AddOffset(data[i].Value); return builder.EndVector(); } - public static VectorOffset CreateOutputsVectorBlock(FlatBufferBuilder builder, StringOffset[] data) { builder.StartVector(4, data.Length, 4); builder.Add(data); return builder.EndVector(); } - public static void StartOutputsVector(FlatBufferBuilder builder, int numElems) { builder.StartVector(4, numElems, 4); } - public static void AddVariables(FlatBufferBuilder builder, VectorOffset variablesOffset) { builder.AddOffset(3, variablesOffset.Value, 0); } - public static VectorOffset CreateVariablesVector(FlatBufferBuilder builder, Offset[] data) { builder.StartVector(4, data.Length, 4); for (int i = data.Length - 1; i >= 0; i--) builder.AddOffset(data[i].Value); return builder.EndVector(); } - 
public static VectorOffset CreateVariablesVectorBlock(FlatBufferBuilder builder, Offset[] data) { builder.StartVector(4, data.Length, 4); builder.Add(data); return builder.EndVector(); } - public static void StartVariablesVector(FlatBufferBuilder builder, int numElems) { builder.StartVector(4, numElems, 4); } - public static void AddOps(FlatBufferBuilder builder, VectorOffset opsOffset) { builder.AddOffset(4, opsOffset.Value, 0); } - public static VectorOffset CreateOpsVector(FlatBufferBuilder builder, Offset[] data) { builder.StartVector(4, data.Length, 4); for (int i = data.Length - 1; i >= 0; i--) builder.AddOffset(data[i].Value); return builder.EndVector(); } - public static VectorOffset CreateOpsVectorBlock(FlatBufferBuilder builder, Offset[] data) { builder.StartVector(4, data.Length, 4); builder.Add(data); return builder.EndVector(); } - public static void StartOpsVector(FlatBufferBuilder builder, int numElems) { builder.StartVector(4, numElems, 4); } - public static Offset EndUIGraphStructure(FlatBufferBuilder builder) { - int o = builder.EndObject(); - return new Offset(o); - } -}; - - -} diff --git a/libnd4j/include/graph/generated/nd4j/graph/UIGraphStructure.java b/libnd4j/include/graph/generated/nd4j/graph/UIGraphStructure.java deleted file mode 100644 index 65ff2d58b..000000000 --- a/libnd4j/include/graph/generated/nd4j/graph/UIGraphStructure.java +++ /dev/null @@ -1,67 +0,0 @@ -// automatically generated by the FlatBuffers compiler, do not modify - -package nd4j.graph; - -import java.nio.*; -import java.lang.*; -import java.util.*; -import com.google.flatbuffers.*; - -@SuppressWarnings("unused") -public final class UIGraphStructure extends Table { - public static UIGraphStructure getRootAsUIGraphStructure(ByteBuffer _bb) { return getRootAsUIGraphStructure(_bb, new UIGraphStructure()); } - public static UIGraphStructure getRootAsUIGraphStructure(ByteBuffer _bb, UIGraphStructure obj) { _bb.order(ByteOrder.LITTLE_ENDIAN); return 
(obj.__assign(_bb.getInt(_bb.position()) + _bb.position(), _bb)); } - public void __init(int _i, ByteBuffer _bb) { bb_pos = _i; bb = _bb; } - public UIGraphStructure __assign(int _i, ByteBuffer _bb) { __init(_i, _bb); return this; } - - public String inputs(int j) { int o = __offset(4); return o != 0 ? __string(__vector(o) + j * 4) : null; } - public int inputsLength() { int o = __offset(4); return o != 0 ? __vector_len(o) : 0; } - public IntPair inputsPair(int j) { return inputsPair(new IntPair(), j); } - public IntPair inputsPair(IntPair obj, int j) { int o = __offset(6); return o != 0 ? obj.__assign(__indirect(__vector(o) + j * 4), bb) : null; } - public int inputsPairLength() { int o = __offset(6); return o != 0 ? __vector_len(o) : 0; } - public String outputs(int j) { int o = __offset(8); return o != 0 ? __string(__vector(o) + j * 4) : null; } - public int outputsLength() { int o = __offset(8); return o != 0 ? __vector_len(o) : 0; } - public UIVariable variables(int j) { return variables(new UIVariable(), j); } - public UIVariable variables(UIVariable obj, int j) { int o = __offset(10); return o != 0 ? obj.__assign(__indirect(__vector(o) + j * 4), bb) : null; } - public int variablesLength() { int o = __offset(10); return o != 0 ? __vector_len(o) : 0; } - public UIOp ops(int j) { return ops(new UIOp(), j); } - public UIOp ops(UIOp obj, int j) { int o = __offset(12); return o != 0 ? obj.__assign(__indirect(__vector(o) + j * 4), bb) : null; } - public int opsLength() { int o = __offset(12); return o != 0 ? 
__vector_len(o) : 0; } - - public static int createUIGraphStructure(FlatBufferBuilder builder, - int inputsOffset, - int inputsPairOffset, - int outputsOffset, - int variablesOffset, - int opsOffset) { - builder.startObject(5); - UIGraphStructure.addOps(builder, opsOffset); - UIGraphStructure.addVariables(builder, variablesOffset); - UIGraphStructure.addOutputs(builder, outputsOffset); - UIGraphStructure.addInputsPair(builder, inputsPairOffset); - UIGraphStructure.addInputs(builder, inputsOffset); - return UIGraphStructure.endUIGraphStructure(builder); - } - - public static void startUIGraphStructure(FlatBufferBuilder builder) { builder.startObject(5); } - public static void addInputs(FlatBufferBuilder builder, int inputsOffset) { builder.addOffset(0, inputsOffset, 0); } - public static int createInputsVector(FlatBufferBuilder builder, int[] data) { builder.startVector(4, data.length, 4); for (int i = data.length - 1; i >= 0; i--) builder.addOffset(data[i]); return builder.endVector(); } - public static void startInputsVector(FlatBufferBuilder builder, int numElems) { builder.startVector(4, numElems, 4); } - public static void addInputsPair(FlatBufferBuilder builder, int inputsPairOffset) { builder.addOffset(1, inputsPairOffset, 0); } - public static int createInputsPairVector(FlatBufferBuilder builder, int[] data) { builder.startVector(4, data.length, 4); for (int i = data.length - 1; i >= 0; i--) builder.addOffset(data[i]); return builder.endVector(); } - public static void startInputsPairVector(FlatBufferBuilder builder, int numElems) { builder.startVector(4, numElems, 4); } - public static void addOutputs(FlatBufferBuilder builder, int outputsOffset) { builder.addOffset(2, outputsOffset, 0); } - public static int createOutputsVector(FlatBufferBuilder builder, int[] data) { builder.startVector(4, data.length, 4); for (int i = data.length - 1; i >= 0; i--) builder.addOffset(data[i]); return builder.endVector(); } - public static void 
startOutputsVector(FlatBufferBuilder builder, int numElems) { builder.startVector(4, numElems, 4); } - public static void addVariables(FlatBufferBuilder builder, int variablesOffset) { builder.addOffset(3, variablesOffset, 0); } - public static int createVariablesVector(FlatBufferBuilder builder, int[] data) { builder.startVector(4, data.length, 4); for (int i = data.length - 1; i >= 0; i--) builder.addOffset(data[i]); return builder.endVector(); } - public static void startVariablesVector(FlatBufferBuilder builder, int numElems) { builder.startVector(4, numElems, 4); } - public static void addOps(FlatBufferBuilder builder, int opsOffset) { builder.addOffset(4, opsOffset, 0); } - public static int createOpsVector(FlatBufferBuilder builder, int[] data) { builder.startVector(4, data.length, 4); for (int i = data.length - 1; i >= 0; i--) builder.addOffset(data[i]); return builder.endVector(); } - public static void startOpsVector(FlatBufferBuilder builder, int numElems) { builder.startVector(4, numElems, 4); } - public static int endUIGraphStructure(FlatBufferBuilder builder) { - int o = builder.endObject(); - return o; - } -} - diff --git a/libnd4j/include/graph/generated/nd4j/graph/UIGraphStructure.py b/libnd4j/include/graph/generated/nd4j/graph/UIGraphStructure.py deleted file mode 100644 index b7cf79edf..000000000 --- a/libnd4j/include/graph/generated/nd4j/graph/UIGraphStructure.py +++ /dev/null @@ -1,136 +0,0 @@ -# /* ****************************************************************************** -# * -# * -# * This program and the accompanying materials are made available under the -# * terms of the Apache License, Version 2.0 which is available at -# * https://www.apache.org/licenses/LICENSE-2.0. -# * -# * See the NOTICE file distributed with this work for additional -# * information regarding copyright ownership. 
-# * Unless required by applicable law or agreed to in writing, software -# * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT -# * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the -# * License for the specific language governing permissions and limitations -# * under the License. -# * -# * SPDX-License-Identifier: Apache-2.0 -# ******************************************************************************/ - -import flatbuffers - -class UIGraphStructure(object): - __slots__ = ['_tab'] - - @classmethod - def GetRootAsUIGraphStructure(cls, buf, offset): - n = flatbuffers.encode.Get(flatbuffers.packer.uoffset, buf, offset) - x = UIGraphStructure() - x.Init(buf, n + offset) - return x - - # UIGraphStructure - def Init(self, buf, pos): - self._tab = flatbuffers.table.Table(buf, pos) - - # UIGraphStructure - def Inputs(self, j): - o = flatbuffers.number_types.UOffsetTFlags.py_type(self._tab.Offset(4)) - if o != 0: - a = self._tab.Vector(o) - return self._tab.String(a + flatbuffers.number_types.UOffsetTFlags.py_type(j * 4)) - return "" - - # UIGraphStructure - def InputsLength(self): - o = flatbuffers.number_types.UOffsetTFlags.py_type(self._tab.Offset(4)) - if o != 0: - return self._tab.VectorLen(o) - return 0 - - # UIGraphStructure - def InputsPair(self, j): - o = flatbuffers.number_types.UOffsetTFlags.py_type(self._tab.Offset(6)) - if o != 0: - x = self._tab.Vector(o) - x += flatbuffers.number_types.UOffsetTFlags.py_type(j) * 4 - x = self._tab.Indirect(x) - from .IntPair import IntPair - obj = IntPair() - obj.Init(self._tab.Bytes, x) - return obj - return None - - # UIGraphStructure - def InputsPairLength(self): - o = flatbuffers.number_types.UOffsetTFlags.py_type(self._tab.Offset(6)) - if o != 0: - return self._tab.VectorLen(o) - return 0 - - # UIGraphStructure - def Outputs(self, j): - o = flatbuffers.number_types.UOffsetTFlags.py_type(self._tab.Offset(8)) - if o != 0: - a = self._tab.Vector(o) - return 
self._tab.String(a + flatbuffers.number_types.UOffsetTFlags.py_type(j * 4)) - return "" - - # UIGraphStructure - def OutputsLength(self): - o = flatbuffers.number_types.UOffsetTFlags.py_type(self._tab.Offset(8)) - if o != 0: - return self._tab.VectorLen(o) - return 0 - - # UIGraphStructure - def Variables(self, j): - o = flatbuffers.number_types.UOffsetTFlags.py_type(self._tab.Offset(10)) - if o != 0: - x = self._tab.Vector(o) - x += flatbuffers.number_types.UOffsetTFlags.py_type(j) * 4 - x = self._tab.Indirect(x) - from .UIVariable import UIVariable - obj = UIVariable() - obj.Init(self._tab.Bytes, x) - return obj - return None - - # UIGraphStructure - def VariablesLength(self): - o = flatbuffers.number_types.UOffsetTFlags.py_type(self._tab.Offset(10)) - if o != 0: - return self._tab.VectorLen(o) - return 0 - - # UIGraphStructure - def Ops(self, j): - o = flatbuffers.number_types.UOffsetTFlags.py_type(self._tab.Offset(12)) - if o != 0: - x = self._tab.Vector(o) - x += flatbuffers.number_types.UOffsetTFlags.py_type(j) * 4 - x = self._tab.Indirect(x) - from .UIOp import UIOp - obj = UIOp() - obj.Init(self._tab.Bytes, x) - return obj - return None - - # UIGraphStructure - def OpsLength(self): - o = flatbuffers.number_types.UOffsetTFlags.py_type(self._tab.Offset(12)) - if o != 0: - return self._tab.VectorLen(o) - return 0 - -def UIGraphStructureStart(builder): builder.StartObject(5) -def UIGraphStructureAddInputs(builder, inputs): builder.PrependUOffsetTRelativeSlot(0, flatbuffers.number_types.UOffsetTFlags.py_type(inputs), 0) -def UIGraphStructureStartInputsVector(builder, numElems): return builder.StartVector(4, numElems, 4) -def UIGraphStructureAddInputsPair(builder, inputsPair): builder.PrependUOffsetTRelativeSlot(1, flatbuffers.number_types.UOffsetTFlags.py_type(inputsPair), 0) -def UIGraphStructureStartInputsPairVector(builder, numElems): return builder.StartVector(4, numElems, 4) -def UIGraphStructureAddOutputs(builder, outputs): 
builder.PrependUOffsetTRelativeSlot(2, flatbuffers.number_types.UOffsetTFlags.py_type(outputs), 0) -def UIGraphStructureStartOutputsVector(builder, numElems): return builder.StartVector(4, numElems, 4) -def UIGraphStructureAddVariables(builder, variables): builder.PrependUOffsetTRelativeSlot(3, flatbuffers.number_types.UOffsetTFlags.py_type(variables), 0) -def UIGraphStructureStartVariablesVector(builder, numElems): return builder.StartVector(4, numElems, 4) -def UIGraphStructureAddOps(builder, ops): builder.PrependUOffsetTRelativeSlot(4, flatbuffers.number_types.UOffsetTFlags.py_type(ops), 0) -def UIGraphStructureStartOpsVector(builder, numElems): return builder.StartVector(4, numElems, 4) -def UIGraphStructureEnd(builder): return builder.EndObject() diff --git a/libnd4j/include/graph/generated/nd4j/graph/UIHardwareState.cs b/libnd4j/include/graph/generated/nd4j/graph/UIHardwareState.cs deleted file mode 100644 index a3444a2ea..000000000 --- a/libnd4j/include/graph/generated/nd4j/graph/UIHardwareState.cs +++ /dev/null @@ -1,52 +0,0 @@ -// -// automatically generated by the FlatBuffers compiler, do not modify -// - -namespace sd.graph -{ - -using global::System; -using global::FlatBuffers; - -public struct UIHardwareState : IFlatbufferObject -{ - private Table __p; - public ByteBuffer ByteBuffer { get { return __p.bb; } } - public static UIHardwareState GetRootAsUIHardwareState(ByteBuffer _bb) { return GetRootAsUIHardwareState(_bb, new UIHardwareState()); } - public static UIHardwareState GetRootAsUIHardwareState(ByteBuffer _bb, UIHardwareState obj) { return (obj.__assign(_bb.GetInt(_bb.Position) + _bb.Position, _bb)); } - public void __init(int _i, ByteBuffer _bb) { __p.bb_pos = _i; __p.bb = _bb; } - public UIHardwareState __assign(int _i, ByteBuffer _bb) { __init(_i, _bb); return this; } - - public long GpuMemory(int j) { int o = __p.__offset(4); return o != 0 ? 
__p.bb.GetLong(__p.__vector(o) + j * 8) : (long)0; } - public int GpuMemoryLength { get { int o = __p.__offset(4); return o != 0 ? __p.__vector_len(o) : 0; } } -#if ENABLE_SPAN_T - public Span GetGpuMemoryBytes() { return __p.__vector_as_span(4); } -#else - public ArraySegment? GetGpuMemoryBytes() { return __p.__vector_as_arraysegment(4); } -#endif - public long[] GetGpuMemoryArray() { return __p.__vector_as_array(4); } - public long HostMemory { get { int o = __p.__offset(6); return o != 0 ? __p.bb.GetLong(o + __p.bb_pos) : (long)0; } } - - public static Offset CreateUIHardwareState(FlatBufferBuilder builder, - VectorOffset gpuMemoryOffset = default(VectorOffset), - long hostMemory = 0) { - builder.StartObject(2); - UIHardwareState.AddHostMemory(builder, hostMemory); - UIHardwareState.AddGpuMemory(builder, gpuMemoryOffset); - return UIHardwareState.EndUIHardwareState(builder); - } - - public static void StartUIHardwareState(FlatBufferBuilder builder) { builder.StartObject(2); } - public static void AddGpuMemory(FlatBufferBuilder builder, VectorOffset gpuMemoryOffset) { builder.AddOffset(0, gpuMemoryOffset.Value, 0); } - public static VectorOffset CreateGpuMemoryVector(FlatBufferBuilder builder, long[] data) { builder.StartVector(8, data.Length, 8); for (int i = data.Length - 1; i >= 0; i--) builder.AddLong(data[i]); return builder.EndVector(); } - public static VectorOffset CreateGpuMemoryVectorBlock(FlatBufferBuilder builder, long[] data) { builder.StartVector(8, data.Length, 8); builder.Add(data); return builder.EndVector(); } - public static void StartGpuMemoryVector(FlatBufferBuilder builder, int numElems) { builder.StartVector(8, numElems, 8); } - public static void AddHostMemory(FlatBufferBuilder builder, long hostMemory) { builder.AddLong(1, hostMemory, 0); } - public static Offset EndUIHardwareState(FlatBufferBuilder builder) { - int o = builder.EndObject(); - return new Offset(o); - } -}; - - -} diff --git 
a/libnd4j/include/graph/generated/nd4j/graph/UIHardwareState.java b/libnd4j/include/graph/generated/nd4j/graph/UIHardwareState.java deleted file mode 100644 index 469f6357d..000000000 --- a/libnd4j/include/graph/generated/nd4j/graph/UIHardwareState.java +++ /dev/null @@ -1,42 +0,0 @@ -// automatically generated by the FlatBuffers compiler, do not modify - -package nd4j.graph; - -import java.nio.*; -import java.lang.*; -import java.util.*; -import com.google.flatbuffers.*; - -@SuppressWarnings("unused") -public final class UIHardwareState extends Table { - public static UIHardwareState getRootAsUIHardwareState(ByteBuffer _bb) { return getRootAsUIHardwareState(_bb, new UIHardwareState()); } - public static UIHardwareState getRootAsUIHardwareState(ByteBuffer _bb, UIHardwareState obj) { _bb.order(ByteOrder.LITTLE_ENDIAN); return (obj.__assign(_bb.getInt(_bb.position()) + _bb.position(), _bb)); } - public void __init(int _i, ByteBuffer _bb) { bb_pos = _i; bb = _bb; } - public UIHardwareState __assign(int _i, ByteBuffer _bb) { __init(_i, _bb); return this; } - - public long gpuMemory(int j) { int o = __offset(4); return o != 0 ? bb.getLong(__vector(o) + j * 8) : 0; } - public int gpuMemoryLength() { int o = __offset(4); return o != 0 ? __vector_len(o) : 0; } - public ByteBuffer gpuMemoryAsByteBuffer() { return __vector_as_bytebuffer(4, 8); } - public ByteBuffer gpuMemoryInByteBuffer(ByteBuffer _bb) { return __vector_in_bytebuffer(_bb, 4, 8); } - public long hostMemory() { int o = __offset(6); return o != 0 ? 
bb.getLong(o + bb_pos) : 0L; } - - public static int createUIHardwareState(FlatBufferBuilder builder, - int gpuMemoryOffset, - long hostMemory) { - builder.startObject(2); - UIHardwareState.addHostMemory(builder, hostMemory); - UIHardwareState.addGpuMemory(builder, gpuMemoryOffset); - return UIHardwareState.endUIHardwareState(builder); - } - - public static void startUIHardwareState(FlatBufferBuilder builder) { builder.startObject(2); } - public static void addGpuMemory(FlatBufferBuilder builder, int gpuMemoryOffset) { builder.addOffset(0, gpuMemoryOffset, 0); } - public static int createGpuMemoryVector(FlatBufferBuilder builder, long[] data) { builder.startVector(8, data.length, 8); for (int i = data.length - 1; i >= 0; i--) builder.addLong(data[i]); return builder.endVector(); } - public static void startGpuMemoryVector(FlatBufferBuilder builder, int numElems) { builder.startVector(8, numElems, 8); } - public static void addHostMemory(FlatBufferBuilder builder, long hostMemory) { builder.addLong(1, hostMemory, 0L); } - public static int endUIHardwareState(FlatBufferBuilder builder) { - int o = builder.endObject(); - return o; - } -} - diff --git a/libnd4j/include/graph/generated/nd4j/graph/UIHardwareState.py b/libnd4j/include/graph/generated/nd4j/graph/UIHardwareState.py deleted file mode 100644 index 82d20a520..000000000 --- a/libnd4j/include/graph/generated/nd4j/graph/UIHardwareState.py +++ /dev/null @@ -1,68 +0,0 @@ -# /* ****************************************************************************** -# * -# * -# * This program and the accompanying materials are made available under the -# * terms of the Apache License, Version 2.0 which is available at -# * https://www.apache.org/licenses/LICENSE-2.0. -# * -# * See the NOTICE file distributed with this work for additional -# * information regarding copyright ownership. 
-# * Unless required by applicable law or agreed to in writing, software -# * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT -# * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the -# * License for the specific language governing permissions and limitations -# * under the License. -# * -# * SPDX-License-Identifier: Apache-2.0 -# ******************************************************************************/ - -import flatbuffers - -class UIHardwareState(object): - __slots__ = ['_tab'] - - @classmethod - def GetRootAsUIHardwareState(cls, buf, offset): - n = flatbuffers.encode.Get(flatbuffers.packer.uoffset, buf, offset) - x = UIHardwareState() - x.Init(buf, n + offset) - return x - - # UIHardwareState - def Init(self, buf, pos): - self._tab = flatbuffers.table.Table(buf, pos) - - # UIHardwareState - def GpuMemory(self, j): - o = flatbuffers.number_types.UOffsetTFlags.py_type(self._tab.Offset(4)) - if o != 0: - a = self._tab.Vector(o) - return self._tab.Get(flatbuffers.number_types.Int64Flags, a + flatbuffers.number_types.UOffsetTFlags.py_type(j * 8)) - return 0 - - # UIHardwareState - def GpuMemoryAsNumpy(self): - o = flatbuffers.number_types.UOffsetTFlags.py_type(self._tab.Offset(4)) - if o != 0: - return self._tab.GetVectorAsNumpy(flatbuffers.number_types.Int64Flags, o) - return 0 - - # UIHardwareState - def GpuMemoryLength(self): - o = flatbuffers.number_types.UOffsetTFlags.py_type(self._tab.Offset(4)) - if o != 0: - return self._tab.VectorLen(o) - return 0 - - # UIHardwareState - def HostMemory(self): - o = flatbuffers.number_types.UOffsetTFlags.py_type(self._tab.Offset(6)) - if o != 0: - return self._tab.Get(flatbuffers.number_types.Int64Flags, o + self._tab.Pos) - return 0 - -def UIHardwareStateStart(builder): builder.StartObject(2) -def UIHardwareStateAddGpuMemory(builder, gpuMemory): builder.PrependUOffsetTRelativeSlot(0, flatbuffers.number_types.UOffsetTFlags.py_type(gpuMemory), 0) -def 
UIHardwareStateStartGpuMemoryVector(builder, numElems): return builder.StartVector(8, numElems, 8) -def UIHardwareStateAddHostMemory(builder, hostMemory): builder.PrependInt64Slot(1, hostMemory, 0) -def UIHardwareStateEnd(builder): return builder.EndObject() diff --git a/libnd4j/include/graph/generated/nd4j/graph/UIHistogram.cs b/libnd4j/include/graph/generated/nd4j/graph/UIHistogram.cs deleted file mode 100644 index 1801bef68..000000000 --- a/libnd4j/include/graph/generated/nd4j/graph/UIHistogram.cs +++ /dev/null @@ -1,58 +0,0 @@ -// -// automatically generated by the FlatBuffers compiler, do not modify -// - -namespace sd.graph -{ - -using global::System; -using global::FlatBuffers; - -public struct UIHistogram : IFlatbufferObject -{ - private Table __p; - public ByteBuffer ByteBuffer { get { return __p.bb; } } - public static UIHistogram GetRootAsUIHistogram(ByteBuffer _bb) { return GetRootAsUIHistogram(_bb, new UIHistogram()); } - public static UIHistogram GetRootAsUIHistogram(ByteBuffer _bb, UIHistogram obj) { return (obj.__assign(_bb.GetInt(_bb.Position) + _bb.Position, _bb)); } - public void __init(int _i, ByteBuffer _bb) { __p.bb_pos = _i; __p.bb = _bb; } - public UIHistogram __assign(int _i, ByteBuffer _bb) { __init(_i, _bb); return this; } - - public UIHistogramType Type { get { int o = __p.__offset(4); return o != 0 ? (UIHistogramType)__p.bb.GetSbyte(o + __p.bb_pos) : UIHistogramType.DISCRETE; } } - public uint Numbins { get { int o = __p.__offset(6); return o != 0 ? __p.bb.GetUint(o + __p.bb_pos) : (uint)0; } } - public FlatArray? Binranges { get { int o = __p.__offset(8); return o != 0 ? (FlatArray?)(new FlatArray()).__assign(__p.__indirect(o + __p.bb_pos), __p.bb) : null; } } - public FlatArray? Y { get { int o = __p.__offset(10); return o != 0 ? (FlatArray?)(new FlatArray()).__assign(__p.__indirect(o + __p.bb_pos), __p.bb) : null; } } - public string Binlabels(int j) { int o = __p.__offset(12); return o != 0 ? 
__p.__string(__p.__vector(o) + j * 4) : null; } - public int BinlabelsLength { get { int o = __p.__offset(12); return o != 0 ? __p.__vector_len(o) : 0; } } - - public static Offset CreateUIHistogram(FlatBufferBuilder builder, - UIHistogramType type = UIHistogramType.DISCRETE, - uint numbins = 0, - Offset binrangesOffset = default(Offset), - Offset yOffset = default(Offset), - VectorOffset binlabelsOffset = default(VectorOffset)) { - builder.StartObject(5); - UIHistogram.AddBinlabels(builder, binlabelsOffset); - UIHistogram.AddY(builder, yOffset); - UIHistogram.AddBinranges(builder, binrangesOffset); - UIHistogram.AddNumbins(builder, numbins); - UIHistogram.AddType(builder, type); - return UIHistogram.EndUIHistogram(builder); - } - - public static void StartUIHistogram(FlatBufferBuilder builder) { builder.StartObject(5); } - public static void AddType(FlatBufferBuilder builder, UIHistogramType type) { builder.AddSbyte(0, (sbyte)type, 0); } - public static void AddNumbins(FlatBufferBuilder builder, uint numbins) { builder.AddUint(1, numbins, 0); } - public static void AddBinranges(FlatBufferBuilder builder, Offset binrangesOffset) { builder.AddOffset(2, binrangesOffset.Value, 0); } - public static void AddY(FlatBufferBuilder builder, Offset yOffset) { builder.AddOffset(3, yOffset.Value, 0); } - public static void AddBinlabels(FlatBufferBuilder builder, VectorOffset binlabelsOffset) { builder.AddOffset(4, binlabelsOffset.Value, 0); } - public static VectorOffset CreateBinlabelsVector(FlatBufferBuilder builder, StringOffset[] data) { builder.StartVector(4, data.Length, 4); for (int i = data.Length - 1; i >= 0; i--) builder.AddOffset(data[i].Value); return builder.EndVector(); } - public static VectorOffset CreateBinlabelsVectorBlock(FlatBufferBuilder builder, StringOffset[] data) { builder.StartVector(4, data.Length, 4); builder.Add(data); return builder.EndVector(); } - public static void StartBinlabelsVector(FlatBufferBuilder builder, int numElems) { 
builder.StartVector(4, numElems, 4); } - public static Offset EndUIHistogram(FlatBufferBuilder builder) { - int o = builder.EndObject(); - return new Offset(o); - } -}; - - -} diff --git a/libnd4j/include/graph/generated/nd4j/graph/UIHistogram.java b/libnd4j/include/graph/generated/nd4j/graph/UIHistogram.java deleted file mode 100644 index eea513be4..000000000 --- a/libnd4j/include/graph/generated/nd4j/graph/UIHistogram.java +++ /dev/null @@ -1,54 +0,0 @@ -// automatically generated by the FlatBuffers compiler, do not modify - -package nd4j.graph; - -import java.nio.*; -import java.lang.*; -import java.util.*; -import com.google.flatbuffers.*; - -@SuppressWarnings("unused") -public final class UIHistogram extends Table { - public static UIHistogram getRootAsUIHistogram(ByteBuffer _bb) { return getRootAsUIHistogram(_bb, new UIHistogram()); } - public static UIHistogram getRootAsUIHistogram(ByteBuffer _bb, UIHistogram obj) { _bb.order(ByteOrder.LITTLE_ENDIAN); return (obj.__assign(_bb.getInt(_bb.position()) + _bb.position(), _bb)); } - public void __init(int _i, ByteBuffer _bb) { bb_pos = _i; bb = _bb; } - public UIHistogram __assign(int _i, ByteBuffer _bb) { __init(_i, _bb); return this; } - - public byte type() { int o = __offset(4); return o != 0 ? bb.get(o + bb_pos) : 0; } - public long numbins() { int o = __offset(6); return o != 0 ? (long)bb.getInt(o + bb_pos) & 0xFFFFFFFFL : 0L; } - public FlatArray binranges() { return binranges(new FlatArray()); } - public FlatArray binranges(FlatArray obj) { int o = __offset(8); return o != 0 ? obj.__assign(__indirect(o + bb_pos), bb) : null; } - public FlatArray y() { return y(new FlatArray()); } - public FlatArray y(FlatArray obj) { int o = __offset(10); return o != 0 ? obj.__assign(__indirect(o + bb_pos), bb) : null; } - public String binlabels(int j) { int o = __offset(12); return o != 0 ? __string(__vector(o) + j * 4) : null; } - public int binlabelsLength() { int o = __offset(12); return o != 0 ? 
__vector_len(o) : 0; } - - public static int createUIHistogram(FlatBufferBuilder builder, - byte type, - long numbins, - int binrangesOffset, - int yOffset, - int binlabelsOffset) { - builder.startObject(5); - UIHistogram.addBinlabels(builder, binlabelsOffset); - UIHistogram.addY(builder, yOffset); - UIHistogram.addBinranges(builder, binrangesOffset); - UIHistogram.addNumbins(builder, numbins); - UIHistogram.addType(builder, type); - return UIHistogram.endUIHistogram(builder); - } - - public static void startUIHistogram(FlatBufferBuilder builder) { builder.startObject(5); } - public static void addType(FlatBufferBuilder builder, byte type) { builder.addByte(0, type, 0); } - public static void addNumbins(FlatBufferBuilder builder, long numbins) { builder.addInt(1, (int)numbins, (int)0L); } - public static void addBinranges(FlatBufferBuilder builder, int binrangesOffset) { builder.addOffset(2, binrangesOffset, 0); } - public static void addY(FlatBufferBuilder builder, int yOffset) { builder.addOffset(3, yOffset, 0); } - public static void addBinlabels(FlatBufferBuilder builder, int binlabelsOffset) { builder.addOffset(4, binlabelsOffset, 0); } - public static int createBinlabelsVector(FlatBufferBuilder builder, int[] data) { builder.startVector(4, data.length, 4); for (int i = data.length - 1; i >= 0; i--) builder.addOffset(data[i]); return builder.endVector(); } - public static void startBinlabelsVector(FlatBufferBuilder builder, int numElems) { builder.startVector(4, numElems, 4); } - public static int endUIHistogram(FlatBufferBuilder builder) { - int o = builder.endObject(); - return o; - } -} - diff --git a/libnd4j/include/graph/generated/nd4j/graph/UIHistogram.py b/libnd4j/include/graph/generated/nd4j/graph/UIHistogram.py deleted file mode 100644 index 12a626396..000000000 --- a/libnd4j/include/graph/generated/nd4j/graph/UIHistogram.py +++ /dev/null @@ -1,93 +0,0 @@ -# /* ****************************************************************************** -# * -# * -# 
* This program and the accompanying materials are made available under the -# * terms of the Apache License, Version 2.0 which is available at -# * https://www.apache.org/licenses/LICENSE-2.0. -# * -# * See the NOTICE file distributed with this work for additional -# * information regarding copyright ownership. -# * Unless required by applicable law or agreed to in writing, software -# * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT -# * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the -# * License for the specific language governing permissions and limitations -# * under the License. -# * -# * SPDX-License-Identifier: Apache-2.0 -# ******************************************************************************/ - -import flatbuffers - -class UIHistogram(object): - __slots__ = ['_tab'] - - @classmethod - def GetRootAsUIHistogram(cls, buf, offset): - n = flatbuffers.encode.Get(flatbuffers.packer.uoffset, buf, offset) - x = UIHistogram() - x.Init(buf, n + offset) - return x - - # UIHistogram - def Init(self, buf, pos): - self._tab = flatbuffers.table.Table(buf, pos) - - # UIHistogram - def Type(self): - o = flatbuffers.number_types.UOffsetTFlags.py_type(self._tab.Offset(4)) - if o != 0: - return self._tab.Get(flatbuffers.number_types.Int8Flags, o + self._tab.Pos) - return 0 - - # UIHistogram - def Numbins(self): - o = flatbuffers.number_types.UOffsetTFlags.py_type(self._tab.Offset(6)) - if o != 0: - return self._tab.Get(flatbuffers.number_types.Uint32Flags, o + self._tab.Pos) - return 0 - - # UIHistogram - def Binranges(self): - o = flatbuffers.number_types.UOffsetTFlags.py_type(self._tab.Offset(8)) - if o != 0: - x = self._tab.Indirect(o + self._tab.Pos) - from .FlatArray import FlatArray - obj = FlatArray() - obj.Init(self._tab.Bytes, x) - return obj - return None - - # UIHistogram - def Y(self): - o = flatbuffers.number_types.UOffsetTFlags.py_type(self._tab.Offset(10)) - if o != 0: - x = self._tab.Indirect(o + 
self._tab.Pos) - from .FlatArray import FlatArray - obj = FlatArray() - obj.Init(self._tab.Bytes, x) - return obj - return None - - # UIHistogram - def Binlabels(self, j): - o = flatbuffers.number_types.UOffsetTFlags.py_type(self._tab.Offset(12)) - if o != 0: - a = self._tab.Vector(o) - return self._tab.String(a + flatbuffers.number_types.UOffsetTFlags.py_type(j * 4)) - return "" - - # UIHistogram - def BinlabelsLength(self): - o = flatbuffers.number_types.UOffsetTFlags.py_type(self._tab.Offset(12)) - if o != 0: - return self._tab.VectorLen(o) - return 0 - -def UIHistogramStart(builder): builder.StartObject(5) -def UIHistogramAddType(builder, type): builder.PrependInt8Slot(0, type, 0) -def UIHistogramAddNumbins(builder, numbins): builder.PrependUint32Slot(1, numbins, 0) -def UIHistogramAddBinranges(builder, binranges): builder.PrependUOffsetTRelativeSlot(2, flatbuffers.number_types.UOffsetTFlags.py_type(binranges), 0) -def UIHistogramAddY(builder, y): builder.PrependUOffsetTRelativeSlot(3, flatbuffers.number_types.UOffsetTFlags.py_type(y), 0) -def UIHistogramAddBinlabels(builder, binlabels): builder.PrependUOffsetTRelativeSlot(4, flatbuffers.number_types.UOffsetTFlags.py_type(binlabels), 0) -def UIHistogramStartBinlabelsVector(builder, numElems): return builder.StartVector(4, numElems, 4) -def UIHistogramEnd(builder): return builder.EndObject() diff --git a/libnd4j/include/graph/generated/nd4j/graph/UIHistogramType.cs b/libnd4j/include/graph/generated/nd4j/graph/UIHistogramType.cs deleted file mode 100644 index 3b8c3218b..000000000 --- a/libnd4j/include/graph/generated/nd4j/graph/UIHistogramType.cs +++ /dev/null @@ -1,16 +0,0 @@ -// -// automatically generated by the FlatBuffers compiler, do not modify -// - -namespace sd.graph -{ - -public enum UIHistogramType : sbyte -{ - DISCRETE = 0, - EQUAL_SPACING = 1, - CUSTOM = 2, -}; - - -} diff --git a/libnd4j/include/graph/generated/nd4j/graph/UIHistogramType.java 
b/libnd4j/include/graph/generated/nd4j/graph/UIHistogramType.java deleted file mode 100644 index eed543ca5..000000000 --- a/libnd4j/include/graph/generated/nd4j/graph/UIHistogramType.java +++ /dev/null @@ -1,15 +0,0 @@ -// automatically generated by the FlatBuffers compiler, do not modify - -package nd4j.graph; - -public final class UIHistogramType { - private UIHistogramType() { } - public static final byte DISCRETE = 0; - public static final byte EQUAL_SPACING = 1; - public static final byte CUSTOM = 2; - - public static final String[] names = { "DISCRETE", "EQUAL_SPACING", "CUSTOM", }; - - public static String name(int e) { return names[e]; } -} - diff --git a/libnd4j/include/graph/generated/nd4j/graph/UIHistogramType.py b/libnd4j/include/graph/generated/nd4j/graph/UIHistogramType.py deleted file mode 100644 index 751e9aa21..000000000 --- a/libnd4j/include/graph/generated/nd4j/graph/UIHistogramType.py +++ /dev/null @@ -1,23 +0,0 @@ -# /* ****************************************************************************** -# * -# * -# * This program and the accompanying materials are made available under the -# * terms of the Apache License, Version 2.0 which is available at -# * https://www.apache.org/licenses/LICENSE-2.0. -# * -# * See the NOTICE file distributed with this work for additional -# * information regarding copyright ownership. -# * Unless required by applicable law or agreed to in writing, software -# * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT -# * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the -# * License for the specific language governing permissions and limitations -# * under the License. 
-# * -# * SPDX-License-Identifier: Apache-2.0 -# ******************************************************************************/ - -class UIHistogramType(object): - DISCRETE = 0 - EQUAL_SPACING = 1 - CUSTOM = 2 - diff --git a/libnd4j/include/graph/generated/nd4j/graph/UIInfoType.cs b/libnd4j/include/graph/generated/nd4j/graph/UIInfoType.cs deleted file mode 100644 index 676d60530..000000000 --- a/libnd4j/include/graph/generated/nd4j/graph/UIInfoType.cs +++ /dev/null @@ -1,16 +0,0 @@ -// -// automatically generated by the FlatBuffers compiler, do not modify -// - -namespace sd.graph -{ - -public enum UIInfoType : sbyte -{ - GRAPH_STRUCTURE = 0, - SYTEM_INFO = 1, - START_EVENTS = 2, -}; - - -} diff --git a/libnd4j/include/graph/generated/nd4j/graph/UIInfoType.java b/libnd4j/include/graph/generated/nd4j/graph/UIInfoType.java deleted file mode 100644 index a2792912c..000000000 --- a/libnd4j/include/graph/generated/nd4j/graph/UIInfoType.java +++ /dev/null @@ -1,15 +0,0 @@ -// automatically generated by the FlatBuffers compiler, do not modify - -package nd4j.graph; - -public final class UIInfoType { - private UIInfoType() { } - public static final byte GRAPH_STRUCTURE = 0; - public static final byte SYTEM_INFO = 1; - public static final byte START_EVENTS = 2; - - public static final String[] names = { "GRAPH_STRUCTURE", "SYTEM_INFO", "START_EVENTS", }; - - public static String name(int e) { return names[e]; } -} - diff --git a/libnd4j/include/graph/generated/nd4j/graph/UIInfoType.py b/libnd4j/include/graph/generated/nd4j/graph/UIInfoType.py deleted file mode 100644 index ce2541de7..000000000 --- a/libnd4j/include/graph/generated/nd4j/graph/UIInfoType.py +++ /dev/null @@ -1,23 +0,0 @@ -# /* ****************************************************************************** -# * -# * -# * This program and the accompanying materials are made available under the -# * terms of the Apache License, Version 2.0 which is available at -# * https://www.apache.org/licenses/LICENSE-2.0. 
-# * -# * See the NOTICE file distributed with this work for additional -# * information regarding copyright ownership. -# * Unless required by applicable law or agreed to in writing, software -# * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT -# * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the -# * License for the specific language governing permissions and limitations -# * under the License. -# * -# * SPDX-License-Identifier: Apache-2.0 -# ******************************************************************************/ - -class UIInfoType(object): - GRAPH_STRUCTURE = 0 - SYTEM_INFO = 1 - START_EVENTS = 2 - diff --git a/libnd4j/include/graph/generated/nd4j/graph/UIOp.cs b/libnd4j/include/graph/generated/nd4j/graph/UIOp.cs deleted file mode 100644 index f8b3b4fbd..000000000 --- a/libnd4j/include/graph/generated/nd4j/graph/UIOp.cs +++ /dev/null @@ -1,88 +0,0 @@ -// -// automatically generated by the FlatBuffers compiler, do not modify -// - -namespace sd.graph -{ - -using global::System; -using global::FlatBuffers; - -public struct UIOp : IFlatbufferObject -{ - private Table __p; - public ByteBuffer ByteBuffer { get { return __p.bb; } } - public static UIOp GetRootAsUIOp(ByteBuffer _bb) { return GetRootAsUIOp(_bb, new UIOp()); } - public static UIOp GetRootAsUIOp(ByteBuffer _bb, UIOp obj) { return (obj.__assign(_bb.GetInt(_bb.Position) + _bb.Position, _bb)); } - public void __init(int _i, ByteBuffer _bb) { __p.bb_pos = _i; __p.bb = _bb; } - public UIOp __assign(int _i, ByteBuffer _bb) { __init(_i, _bb); return this; } - - public string Name { get { int o = __p.__offset(4); return o != 0 ? __p.__string(o + __p.bb_pos) : null; } } -#if ENABLE_SPAN_T - public Span GetNameBytes() { return __p.__vector_as_span(4); } -#else - public ArraySegment? 
GetNameBytes() { return __p.__vector_as_arraysegment(4); } -#endif - public byte[] GetNameArray() { return __p.__vector_as_array(4); } - public string OpName { get { int o = __p.__offset(6); return o != 0 ? __p.__string(o + __p.bb_pos) : null; } } -#if ENABLE_SPAN_T - public Span GetOpNameBytes() { return __p.__vector_as_span(6); } -#else - public ArraySegment? GetOpNameBytes() { return __p.__vector_as_arraysegment(6); } -#endif - public byte[] GetOpNameArray() { return __p.__vector_as_array(6); } - public string Inputs(int j) { int o = __p.__offset(8); return o != 0 ? __p.__string(__p.__vector(o) + j * 4) : null; } - public int InputsLength { get { int o = __p.__offset(8); return o != 0 ? __p.__vector_len(o) : 0; } } - public string Outputs(int j) { int o = __p.__offset(10); return o != 0 ? __p.__string(__p.__vector(o) + j * 4) : null; } - public int OutputsLength { get { int o = __p.__offset(10); return o != 0 ? __p.__vector_len(o) : 0; } } - public string ControlDeps(int j) { int o = __p.__offset(12); return o != 0 ? __p.__string(__p.__vector(o) + j * 4) : null; } - public int ControlDepsLength { get { int o = __p.__offset(12); return o != 0 ? __p.__vector_len(o) : 0; } } - public string UiLabelExtra { get { int o = __p.__offset(14); return o != 0 ? __p.__string(o + __p.bb_pos) : null; } } -#if ENABLE_SPAN_T - public Span GetUiLabelExtraBytes() { return __p.__vector_as_span(14); } -#else - public ArraySegment? 
GetUiLabelExtraBytes() { return __p.__vector_as_arraysegment(14); } -#endif - public byte[] GetUiLabelExtraArray() { return __p.__vector_as_array(14); } - - public static Offset CreateUIOp(FlatBufferBuilder builder, - StringOffset nameOffset = default(StringOffset), - StringOffset opNameOffset = default(StringOffset), - VectorOffset inputsOffset = default(VectorOffset), - VectorOffset outputsOffset = default(VectorOffset), - VectorOffset controlDepsOffset = default(VectorOffset), - StringOffset uiLabelExtraOffset = default(StringOffset)) { - builder.StartObject(6); - UIOp.AddUiLabelExtra(builder, uiLabelExtraOffset); - UIOp.AddControlDeps(builder, controlDepsOffset); - UIOp.AddOutputs(builder, outputsOffset); - UIOp.AddInputs(builder, inputsOffset); - UIOp.AddOpName(builder, opNameOffset); - UIOp.AddName(builder, nameOffset); - return UIOp.EndUIOp(builder); - } - - public static void StartUIOp(FlatBufferBuilder builder) { builder.StartObject(6); } - public static void AddName(FlatBufferBuilder builder, StringOffset nameOffset) { builder.AddOffset(0, nameOffset.Value, 0); } - public static void AddOpName(FlatBufferBuilder builder, StringOffset opNameOffset) { builder.AddOffset(1, opNameOffset.Value, 0); } - public static void AddInputs(FlatBufferBuilder builder, VectorOffset inputsOffset) { builder.AddOffset(2, inputsOffset.Value, 0); } - public static VectorOffset CreateInputsVector(FlatBufferBuilder builder, StringOffset[] data) { builder.StartVector(4, data.Length, 4); for (int i = data.Length - 1; i >= 0; i--) builder.AddOffset(data[i].Value); return builder.EndVector(); } - public static VectorOffset CreateInputsVectorBlock(FlatBufferBuilder builder, StringOffset[] data) { builder.StartVector(4, data.Length, 4); builder.Add(data); return builder.EndVector(); } - public static void StartInputsVector(FlatBufferBuilder builder, int numElems) { builder.StartVector(4, numElems, 4); } - public static void AddOutputs(FlatBufferBuilder builder, VectorOffset 
outputsOffset) { builder.AddOffset(3, outputsOffset.Value, 0); } - public static VectorOffset CreateOutputsVector(FlatBufferBuilder builder, StringOffset[] data) { builder.StartVector(4, data.Length, 4); for (int i = data.Length - 1; i >= 0; i--) builder.AddOffset(data[i].Value); return builder.EndVector(); } - public static VectorOffset CreateOutputsVectorBlock(FlatBufferBuilder builder, StringOffset[] data) { builder.StartVector(4, data.Length, 4); builder.Add(data); return builder.EndVector(); } - public static void StartOutputsVector(FlatBufferBuilder builder, int numElems) { builder.StartVector(4, numElems, 4); } - public static void AddControlDeps(FlatBufferBuilder builder, VectorOffset controlDepsOffset) { builder.AddOffset(4, controlDepsOffset.Value, 0); } - public static VectorOffset CreateControlDepsVector(FlatBufferBuilder builder, StringOffset[] data) { builder.StartVector(4, data.Length, 4); for (int i = data.Length - 1; i >= 0; i--) builder.AddOffset(data[i].Value); return builder.EndVector(); } - public static VectorOffset CreateControlDepsVectorBlock(FlatBufferBuilder builder, StringOffset[] data) { builder.StartVector(4, data.Length, 4); builder.Add(data); return builder.EndVector(); } - public static void StartControlDepsVector(FlatBufferBuilder builder, int numElems) { builder.StartVector(4, numElems, 4); } - public static void AddUiLabelExtra(FlatBufferBuilder builder, StringOffset uiLabelExtraOffset) { builder.AddOffset(5, uiLabelExtraOffset.Value, 0); } - public static Offset EndUIOp(FlatBufferBuilder builder) { - int o = builder.EndObject(); - return new Offset(o); - } -}; - - -} diff --git a/libnd4j/include/graph/generated/nd4j/graph/UIOp.java b/libnd4j/include/graph/generated/nd4j/graph/UIOp.java deleted file mode 100644 index 5ca33cda5..000000000 --- a/libnd4j/include/graph/generated/nd4j/graph/UIOp.java +++ /dev/null @@ -1,68 +0,0 @@ -// automatically generated by the FlatBuffers compiler, do not modify - -package nd4j.graph; - -import 
java.nio.*; -import java.lang.*; -import java.util.*; -import com.google.flatbuffers.*; - -@SuppressWarnings("unused") -public final class UIOp extends Table { - public static UIOp getRootAsUIOp(ByteBuffer _bb) { return getRootAsUIOp(_bb, new UIOp()); } - public static UIOp getRootAsUIOp(ByteBuffer _bb, UIOp obj) { _bb.order(ByteOrder.LITTLE_ENDIAN); return (obj.__assign(_bb.getInt(_bb.position()) + _bb.position(), _bb)); } - public void __init(int _i, ByteBuffer _bb) { bb_pos = _i; bb = _bb; } - public UIOp __assign(int _i, ByteBuffer _bb) { __init(_i, _bb); return this; } - - public String name() { int o = __offset(4); return o != 0 ? __string(o + bb_pos) : null; } - public ByteBuffer nameAsByteBuffer() { return __vector_as_bytebuffer(4, 1); } - public ByteBuffer nameInByteBuffer(ByteBuffer _bb) { return __vector_in_bytebuffer(_bb, 4, 1); } - public String opName() { int o = __offset(6); return o != 0 ? __string(o + bb_pos) : null; } - public ByteBuffer opNameAsByteBuffer() { return __vector_as_bytebuffer(6, 1); } - public ByteBuffer opNameInByteBuffer(ByteBuffer _bb) { return __vector_in_bytebuffer(_bb, 6, 1); } - public String inputs(int j) { int o = __offset(8); return o != 0 ? __string(__vector(o) + j * 4) : null; } - public int inputsLength() { int o = __offset(8); return o != 0 ? __vector_len(o) : 0; } - public String outputs(int j) { int o = __offset(10); return o != 0 ? __string(__vector(o) + j * 4) : null; } - public int outputsLength() { int o = __offset(10); return o != 0 ? __vector_len(o) : 0; } - public String controlDeps(int j) { int o = __offset(12); return o != 0 ? __string(__vector(o) + j * 4) : null; } - public int controlDepsLength() { int o = __offset(12); return o != 0 ? __vector_len(o) : 0; } - public String uiLabelExtra() { int o = __offset(14); return o != 0 ? 
__string(o + bb_pos) : null; } - public ByteBuffer uiLabelExtraAsByteBuffer() { return __vector_as_bytebuffer(14, 1); } - public ByteBuffer uiLabelExtraInByteBuffer(ByteBuffer _bb) { return __vector_in_bytebuffer(_bb, 14, 1); } - - public static int createUIOp(FlatBufferBuilder builder, - int nameOffset, - int opNameOffset, - int inputsOffset, - int outputsOffset, - int controlDepsOffset, - int uiLabelExtraOffset) { - builder.startObject(6); - UIOp.addUiLabelExtra(builder, uiLabelExtraOffset); - UIOp.addControlDeps(builder, controlDepsOffset); - UIOp.addOutputs(builder, outputsOffset); - UIOp.addInputs(builder, inputsOffset); - UIOp.addOpName(builder, opNameOffset); - UIOp.addName(builder, nameOffset); - return UIOp.endUIOp(builder); - } - - public static void startUIOp(FlatBufferBuilder builder) { builder.startObject(6); } - public static void addName(FlatBufferBuilder builder, int nameOffset) { builder.addOffset(0, nameOffset, 0); } - public static void addOpName(FlatBufferBuilder builder, int opNameOffset) { builder.addOffset(1, opNameOffset, 0); } - public static void addInputs(FlatBufferBuilder builder, int inputsOffset) { builder.addOffset(2, inputsOffset, 0); } - public static int createInputsVector(FlatBufferBuilder builder, int[] data) { builder.startVector(4, data.length, 4); for (int i = data.length - 1; i >= 0; i--) builder.addOffset(data[i]); return builder.endVector(); } - public static void startInputsVector(FlatBufferBuilder builder, int numElems) { builder.startVector(4, numElems, 4); } - public static void addOutputs(FlatBufferBuilder builder, int outputsOffset) { builder.addOffset(3, outputsOffset, 0); } - public static int createOutputsVector(FlatBufferBuilder builder, int[] data) { builder.startVector(4, data.length, 4); for (int i = data.length - 1; i >= 0; i--) builder.addOffset(data[i]); return builder.endVector(); } - public static void startOutputsVector(FlatBufferBuilder builder, int numElems) { builder.startVector(4, numElems, 4); } - 
public static void addControlDeps(FlatBufferBuilder builder, int controlDepsOffset) { builder.addOffset(4, controlDepsOffset, 0); } - public static int createControlDepsVector(FlatBufferBuilder builder, int[] data) { builder.startVector(4, data.length, 4); for (int i = data.length - 1; i >= 0; i--) builder.addOffset(data[i]); return builder.endVector(); } - public static void startControlDepsVector(FlatBufferBuilder builder, int numElems) { builder.startVector(4, numElems, 4); } - public static void addUiLabelExtra(FlatBufferBuilder builder, int uiLabelExtraOffset) { builder.addOffset(5, uiLabelExtraOffset, 0); } - public static int endUIOp(FlatBufferBuilder builder) { - int o = builder.endObject(); - return o; - } -} - diff --git a/libnd4j/include/graph/generated/nd4j/graph/UIOp.py b/libnd4j/include/graph/generated/nd4j/graph/UIOp.py deleted file mode 100644 index 843441d2d..000000000 --- a/libnd4j/include/graph/generated/nd4j/graph/UIOp.py +++ /dev/null @@ -1,111 +0,0 @@ -# /* ****************************************************************************** -# * -# * -# * This program and the accompanying materials are made available under the -# * terms of the Apache License, Version 2.0 which is available at -# * https://www.apache.org/licenses/LICENSE-2.0. -# * -# * See the NOTICE file distributed with this work for additional -# * information regarding copyright ownership. -# * Unless required by applicable law or agreed to in writing, software -# * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT -# * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the -# * License for the specific language governing permissions and limitations -# * under the License. 
-# * -# * SPDX-License-Identifier: Apache-2.0 -# ******************************************************************************/ - -import flatbuffers - -class UIOp(object): - __slots__ = ['_tab'] - - @classmethod - def GetRootAsUIOp(cls, buf, offset): - n = flatbuffers.encode.Get(flatbuffers.packer.uoffset, buf, offset) - x = UIOp() - x.Init(buf, n + offset) - return x - - # UIOp - def Init(self, buf, pos): - self._tab = flatbuffers.table.Table(buf, pos) - - # UIOp - def Name(self): - o = flatbuffers.number_types.UOffsetTFlags.py_type(self._tab.Offset(4)) - if o != 0: - return self._tab.String(o + self._tab.Pos) - return None - - # UIOp - def OpName(self): - o = flatbuffers.number_types.UOffsetTFlags.py_type(self._tab.Offset(6)) - if o != 0: - return self._tab.String(o + self._tab.Pos) - return None - - # UIOp - def Inputs(self, j): - o = flatbuffers.number_types.UOffsetTFlags.py_type(self._tab.Offset(8)) - if o != 0: - a = self._tab.Vector(o) - return self._tab.String(a + flatbuffers.number_types.UOffsetTFlags.py_type(j * 4)) - return "" - - # UIOp - def InputsLength(self): - o = flatbuffers.number_types.UOffsetTFlags.py_type(self._tab.Offset(8)) - if o != 0: - return self._tab.VectorLen(o) - return 0 - - # UIOp - def Outputs(self, j): - o = flatbuffers.number_types.UOffsetTFlags.py_type(self._tab.Offset(10)) - if o != 0: - a = self._tab.Vector(o) - return self._tab.String(a + flatbuffers.number_types.UOffsetTFlags.py_type(j * 4)) - return "" - - # UIOp - def OutputsLength(self): - o = flatbuffers.number_types.UOffsetTFlags.py_type(self._tab.Offset(10)) - if o != 0: - return self._tab.VectorLen(o) - return 0 - - # UIOp - def ControlDeps(self, j): - o = flatbuffers.number_types.UOffsetTFlags.py_type(self._tab.Offset(12)) - if o != 0: - a = self._tab.Vector(o) - return self._tab.String(a + flatbuffers.number_types.UOffsetTFlags.py_type(j * 4)) - return "" - - # UIOp - def ControlDepsLength(self): - o = 
flatbuffers.number_types.UOffsetTFlags.py_type(self._tab.Offset(12)) - if o != 0: - return self._tab.VectorLen(o) - return 0 - - # UIOp - def UiLabelExtra(self): - o = flatbuffers.number_types.UOffsetTFlags.py_type(self._tab.Offset(14)) - if o != 0: - return self._tab.String(o + self._tab.Pos) - return None - -def UIOpStart(builder): builder.StartObject(6) -def UIOpAddName(builder, name): builder.PrependUOffsetTRelativeSlot(0, flatbuffers.number_types.UOffsetTFlags.py_type(name), 0) -def UIOpAddOpName(builder, opName): builder.PrependUOffsetTRelativeSlot(1, flatbuffers.number_types.UOffsetTFlags.py_type(opName), 0) -def UIOpAddInputs(builder, inputs): builder.PrependUOffsetTRelativeSlot(2, flatbuffers.number_types.UOffsetTFlags.py_type(inputs), 0) -def UIOpStartInputsVector(builder, numElems): return builder.StartVector(4, numElems, 4) -def UIOpAddOutputs(builder, outputs): builder.PrependUOffsetTRelativeSlot(3, flatbuffers.number_types.UOffsetTFlags.py_type(outputs), 0) -def UIOpStartOutputsVector(builder, numElems): return builder.StartVector(4, numElems, 4) -def UIOpAddControlDeps(builder, controlDeps): builder.PrependUOffsetTRelativeSlot(4, flatbuffers.number_types.UOffsetTFlags.py_type(controlDeps), 0) -def UIOpStartControlDepsVector(builder, numElems): return builder.StartVector(4, numElems, 4) -def UIOpAddUiLabelExtra(builder, uiLabelExtra): builder.PrependUOffsetTRelativeSlot(5, flatbuffers.number_types.UOffsetTFlags.py_type(uiLabelExtra), 0) -def UIOpEnd(builder): return builder.EndObject() diff --git a/libnd4j/include/graph/generated/nd4j/graph/UIStaticInfoRecord.cs b/libnd4j/include/graph/generated/nd4j/graph/UIStaticInfoRecord.cs deleted file mode 100644 index 410a3c37b..000000000 --- a/libnd4j/include/graph/generated/nd4j/graph/UIStaticInfoRecord.cs +++ /dev/null @@ -1,38 +0,0 @@ -// -// automatically generated by the FlatBuffers compiler, do not modify -// - -namespace sd.graph -{ - -using global::System; -using global::FlatBuffers; - -public struct 
UIStaticInfoRecord : IFlatbufferObject -{ - private Table __p; - public ByteBuffer ByteBuffer { get { return __p.bb; } } - public static UIStaticInfoRecord GetRootAsUIStaticInfoRecord(ByteBuffer _bb) { return GetRootAsUIStaticInfoRecord(_bb, new UIStaticInfoRecord()); } - public static UIStaticInfoRecord GetRootAsUIStaticInfoRecord(ByteBuffer _bb, UIStaticInfoRecord obj) { return (obj.__assign(_bb.GetInt(_bb.Position) + _bb.Position, _bb)); } - public void __init(int _i, ByteBuffer _bb) { __p.bb_pos = _i; __p.bb = _bb; } - public UIStaticInfoRecord __assign(int _i, ByteBuffer _bb) { __init(_i, _bb); return this; } - - public UIInfoType InfoType { get { int o = __p.__offset(4); return o != 0 ? (UIInfoType)__p.bb.GetSbyte(o + __p.bb_pos) : UIInfoType.GRAPH_STRUCTURE; } } - - public static Offset CreateUIStaticInfoRecord(FlatBufferBuilder builder, - UIInfoType infoType = UIInfoType.GRAPH_STRUCTURE) { - builder.StartObject(1); - UIStaticInfoRecord.AddInfoType(builder, infoType); - return UIStaticInfoRecord.EndUIStaticInfoRecord(builder); - } - - public static void StartUIStaticInfoRecord(FlatBufferBuilder builder) { builder.StartObject(1); } - public static void AddInfoType(FlatBufferBuilder builder, UIInfoType infoType) { builder.AddSbyte(0, (sbyte)infoType, 0); } - public static Offset EndUIStaticInfoRecord(FlatBufferBuilder builder) { - int o = builder.EndObject(); - return new Offset(o); - } -}; - - -} diff --git a/libnd4j/include/graph/generated/nd4j/graph/UIStaticInfoRecord.java b/libnd4j/include/graph/generated/nd4j/graph/UIStaticInfoRecord.java deleted file mode 100644 index 45dc5a961..000000000 --- a/libnd4j/include/graph/generated/nd4j/graph/UIStaticInfoRecord.java +++ /dev/null @@ -1,33 +0,0 @@ -// automatically generated by the FlatBuffers compiler, do not modify - -package nd4j.graph; - -import java.nio.*; -import java.lang.*; -import java.util.*; -import com.google.flatbuffers.*; - -@SuppressWarnings("unused") -public final class UIStaticInfoRecord 
extends Table { - public static UIStaticInfoRecord getRootAsUIStaticInfoRecord(ByteBuffer _bb) { return getRootAsUIStaticInfoRecord(_bb, new UIStaticInfoRecord()); } - public static UIStaticInfoRecord getRootAsUIStaticInfoRecord(ByteBuffer _bb, UIStaticInfoRecord obj) { _bb.order(ByteOrder.LITTLE_ENDIAN); return (obj.__assign(_bb.getInt(_bb.position()) + _bb.position(), _bb)); } - public void __init(int _i, ByteBuffer _bb) { bb_pos = _i; bb = _bb; } - public UIStaticInfoRecord __assign(int _i, ByteBuffer _bb) { __init(_i, _bb); return this; } - - public byte infoType() { int o = __offset(4); return o != 0 ? bb.get(o + bb_pos) : 0; } - - public static int createUIStaticInfoRecord(FlatBufferBuilder builder, - byte infoType) { - builder.startObject(1); - UIStaticInfoRecord.addInfoType(builder, infoType); - return UIStaticInfoRecord.endUIStaticInfoRecord(builder); - } - - public static void startUIStaticInfoRecord(FlatBufferBuilder builder) { builder.startObject(1); } - public static void addInfoType(FlatBufferBuilder builder, byte infoType) { builder.addByte(0, infoType, 0); } - public static int endUIStaticInfoRecord(FlatBufferBuilder builder) { - int o = builder.endObject(); - return o; - } -} - diff --git a/libnd4j/include/graph/generated/nd4j/graph/UIStaticInfoRecord.py b/libnd4j/include/graph/generated/nd4j/graph/UIStaticInfoRecord.py deleted file mode 100644 index 4b0fceead..000000000 --- a/libnd4j/include/graph/generated/nd4j/graph/UIStaticInfoRecord.py +++ /dev/null @@ -1,44 +0,0 @@ -# /* ****************************************************************************** -# * -# * -# * This program and the accompanying materials are made available under the -# * terms of the Apache License, Version 2.0 which is available at -# * https://www.apache.org/licenses/LICENSE-2.0. -# * -# * See the NOTICE file distributed with this work for additional -# * information regarding copyright ownership. 
-# * Unless required by applicable law or agreed to in writing, software -# * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT -# * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the -# * License for the specific language governing permissions and limitations -# * under the License. -# * -# * SPDX-License-Identifier: Apache-2.0 -# ******************************************************************************/ - -import flatbuffers - -class UIStaticInfoRecord(object): - __slots__ = ['_tab'] - - @classmethod - def GetRootAsUIStaticInfoRecord(cls, buf, offset): - n = flatbuffers.encode.Get(flatbuffers.packer.uoffset, buf, offset) - x = UIStaticInfoRecord() - x.Init(buf, n + offset) - return x - - # UIStaticInfoRecord - def Init(self, buf, pos): - self._tab = flatbuffers.table.Table(buf, pos) - - # UIStaticInfoRecord - def InfoType(self): - o = flatbuffers.number_types.UOffsetTFlags.py_type(self._tab.Offset(4)) - if o != 0: - return self._tab.Get(flatbuffers.number_types.Int8Flags, o + self._tab.Pos) - return 0 - -def UIStaticInfoRecordStart(builder): builder.StartObject(1) -def UIStaticInfoRecordAddInfoType(builder, infoType): builder.PrependInt8Slot(0, infoType, 0) -def UIStaticInfoRecordEnd(builder): return builder.EndObject() diff --git a/libnd4j/include/graph/generated/nd4j/graph/UISummaryStatistics.cs b/libnd4j/include/graph/generated/nd4j/graph/UISummaryStatistics.cs deleted file mode 100644 index 0f63d2d7c..000000000 --- a/libnd4j/include/graph/generated/nd4j/graph/UISummaryStatistics.cs +++ /dev/null @@ -1,74 +0,0 @@ -// -// automatically generated by the FlatBuffers compiler, do not modify -// - -namespace sd.graph -{ - -using global::System; -using global::FlatBuffers; - -public struct UISummaryStatistics : IFlatbufferObject -{ - private Table __p; - public ByteBuffer ByteBuffer { get { return __p.bb; } } - public static UISummaryStatistics GetRootAsUISummaryStatistics(ByteBuffer _bb) { return 
GetRootAsUISummaryStatistics(_bb, new UISummaryStatistics()); } - public static UISummaryStatistics GetRootAsUISummaryStatistics(ByteBuffer _bb, UISummaryStatistics obj) { return (obj.__assign(_bb.GetInt(_bb.Position) + _bb.Position, _bb)); } - public void __init(int _i, ByteBuffer _bb) { __p.bb_pos = _i; __p.bb = _bb; } - public UISummaryStatistics __assign(int _i, ByteBuffer _bb) { __init(_i, _bb); return this; } - - public uint Bitmask { get { int o = __p.__offset(4); return o != 0 ? __p.bb.GetUint(o + __p.bb_pos) : (uint)0; } } - public FlatArray? Min { get { int o = __p.__offset(6); return o != 0 ? (FlatArray?)(new FlatArray()).__assign(__p.__indirect(o + __p.bb_pos), __p.bb) : null; } } - public FlatArray? Max { get { int o = __p.__offset(8); return o != 0 ? (FlatArray?)(new FlatArray()).__assign(__p.__indirect(o + __p.bb_pos), __p.bb) : null; } } - public double Mean { get { int o = __p.__offset(10); return o != 0 ? __p.bb.GetDouble(o + __p.bb_pos) : (double)0.0; } } - public double Stdev { get { int o = __p.__offset(12); return o != 0 ? __p.bb.GetDouble(o + __p.bb_pos) : (double)0.0; } } - public long Countzero { get { int o = __p.__offset(14); return o != 0 ? __p.bb.GetLong(o + __p.bb_pos) : (long)0; } } - public long Countpositive { get { int o = __p.__offset(16); return o != 0 ? __p.bb.GetLong(o + __p.bb_pos) : (long)0; } } - public long Countnegative { get { int o = __p.__offset(18); return o != 0 ? __p.bb.GetLong(o + __p.bb_pos) : (long)0; } } - public long Countnan { get { int o = __p.__offset(20); return o != 0 ? __p.bb.GetLong(o + __p.bb_pos) : (long)0; } } - public long Countinf { get { int o = __p.__offset(22); return o != 0 ? 
__p.bb.GetLong(o + __p.bb_pos) : (long)0; } } - - public static Offset CreateUISummaryStatistics(FlatBufferBuilder builder, - uint bitmask = 0, - Offset minOffset = default(Offset), - Offset maxOffset = default(Offset), - double mean = 0.0, - double stdev = 0.0, - long countzero = 0, - long countpositive = 0, - long countnegative = 0, - long countnan = 0, - long countinf = 0) { - builder.StartObject(10); - UISummaryStatistics.AddCountinf(builder, countinf); - UISummaryStatistics.AddCountnan(builder, countnan); - UISummaryStatistics.AddCountnegative(builder, countnegative); - UISummaryStatistics.AddCountpositive(builder, countpositive); - UISummaryStatistics.AddCountzero(builder, countzero); - UISummaryStatistics.AddStdev(builder, stdev); - UISummaryStatistics.AddMean(builder, mean); - UISummaryStatistics.AddMax(builder, maxOffset); - UISummaryStatistics.AddMin(builder, minOffset); - UISummaryStatistics.AddBitmask(builder, bitmask); - return UISummaryStatistics.EndUISummaryStatistics(builder); - } - - public static void StartUISummaryStatistics(FlatBufferBuilder builder) { builder.StartObject(10); } - public static void AddBitmask(FlatBufferBuilder builder, uint bitmask) { builder.AddUint(0, bitmask, 0); } - public static void AddMin(FlatBufferBuilder builder, Offset minOffset) { builder.AddOffset(1, minOffset.Value, 0); } - public static void AddMax(FlatBufferBuilder builder, Offset maxOffset) { builder.AddOffset(2, maxOffset.Value, 0); } - public static void AddMean(FlatBufferBuilder builder, double mean) { builder.AddDouble(3, mean, 0.0); } - public static void AddStdev(FlatBufferBuilder builder, double stdev) { builder.AddDouble(4, stdev, 0.0); } - public static void AddCountzero(FlatBufferBuilder builder, long countzero) { builder.AddLong(5, countzero, 0); } - public static void AddCountpositive(FlatBufferBuilder builder, long countpositive) { builder.AddLong(6, countpositive, 0); } - public static void AddCountnegative(FlatBufferBuilder builder, long 
countnegative) { builder.AddLong(7, countnegative, 0); } - public static void AddCountnan(FlatBufferBuilder builder, long countnan) { builder.AddLong(8, countnan, 0); } - public static void AddCountinf(FlatBufferBuilder builder, long countinf) { builder.AddLong(9, countinf, 0); } - public static Offset EndUISummaryStatistics(FlatBufferBuilder builder) { - int o = builder.EndObject(); - return new Offset(o); - } -}; - - -} diff --git a/libnd4j/include/graph/generated/nd4j/graph/UISummaryStatistics.java b/libnd4j/include/graph/generated/nd4j/graph/UISummaryStatistics.java deleted file mode 100644 index ddc9d776a..000000000 --- a/libnd4j/include/graph/generated/nd4j/graph/UISummaryStatistics.java +++ /dev/null @@ -1,71 +0,0 @@ -// automatically generated by the FlatBuffers compiler, do not modify - -package nd4j.graph; - -import java.nio.*; -import java.lang.*; -import java.util.*; -import com.google.flatbuffers.*; - -@SuppressWarnings("unused") -public final class UISummaryStatistics extends Table { - public static UISummaryStatistics getRootAsUISummaryStatistics(ByteBuffer _bb) { return getRootAsUISummaryStatistics(_bb, new UISummaryStatistics()); } - public static UISummaryStatistics getRootAsUISummaryStatistics(ByteBuffer _bb, UISummaryStatistics obj) { _bb.order(ByteOrder.LITTLE_ENDIAN); return (obj.__assign(_bb.getInt(_bb.position()) + _bb.position(), _bb)); } - public void __init(int _i, ByteBuffer _bb) { bb_pos = _i; bb = _bb; } - public UISummaryStatistics __assign(int _i, ByteBuffer _bb) { __init(_i, _bb); return this; } - - public long bitmask() { int o = __offset(4); return o != 0 ? (long)bb.getInt(o + bb_pos) & 0xFFFFFFFFL : 0L; } - public FlatArray min() { return min(new FlatArray()); } - public FlatArray min(FlatArray obj) { int o = __offset(6); return o != 0 ? obj.__assign(__indirect(o + bb_pos), bb) : null; } - public FlatArray max() { return max(new FlatArray()); } - public FlatArray max(FlatArray obj) { int o = __offset(8); return o != 0 ? 
obj.__assign(__indirect(o + bb_pos), bb) : null; } - public double mean() { int o = __offset(10); return o != 0 ? bb.getDouble(o + bb_pos) : 0.0; } - public double stdev() { int o = __offset(12); return o != 0 ? bb.getDouble(o + bb_pos) : 0.0; } - public long countzero() { int o = __offset(14); return o != 0 ? bb.getLong(o + bb_pos) : 0L; } - public long countpositive() { int o = __offset(16); return o != 0 ? bb.getLong(o + bb_pos) : 0L; } - public long countnegative() { int o = __offset(18); return o != 0 ? bb.getLong(o + bb_pos) : 0L; } - public long countnan() { int o = __offset(20); return o != 0 ? bb.getLong(o + bb_pos) : 0L; } - public long countinf() { int o = __offset(22); return o != 0 ? bb.getLong(o + bb_pos) : 0L; } - - public static int createUISummaryStatistics(FlatBufferBuilder builder, - long bitmask, - int minOffset, - int maxOffset, - double mean, - double stdev, - long countzero, - long countpositive, - long countnegative, - long countnan, - long countinf) { - builder.startObject(10); - UISummaryStatistics.addCountinf(builder, countinf); - UISummaryStatistics.addCountnan(builder, countnan); - UISummaryStatistics.addCountnegative(builder, countnegative); - UISummaryStatistics.addCountpositive(builder, countpositive); - UISummaryStatistics.addCountzero(builder, countzero); - UISummaryStatistics.addStdev(builder, stdev); - UISummaryStatistics.addMean(builder, mean); - UISummaryStatistics.addMax(builder, maxOffset); - UISummaryStatistics.addMin(builder, minOffset); - UISummaryStatistics.addBitmask(builder, bitmask); - return UISummaryStatistics.endUISummaryStatistics(builder); - } - - public static void startUISummaryStatistics(FlatBufferBuilder builder) { builder.startObject(10); } - public static void addBitmask(FlatBufferBuilder builder, long bitmask) { builder.addInt(0, (int)bitmask, (int)0L); } - public static void addMin(FlatBufferBuilder builder, int minOffset) { builder.addOffset(1, minOffset, 0); } - public static void 
addMax(FlatBufferBuilder builder, int maxOffset) { builder.addOffset(2, maxOffset, 0); } - public static void addMean(FlatBufferBuilder builder, double mean) { builder.addDouble(3, mean, 0.0); } - public static void addStdev(FlatBufferBuilder builder, double stdev) { builder.addDouble(4, stdev, 0.0); } - public static void addCountzero(FlatBufferBuilder builder, long countzero) { builder.addLong(5, countzero, 0L); } - public static void addCountpositive(FlatBufferBuilder builder, long countpositive) { builder.addLong(6, countpositive, 0L); } - public static void addCountnegative(FlatBufferBuilder builder, long countnegative) { builder.addLong(7, countnegative, 0L); } - public static void addCountnan(FlatBufferBuilder builder, long countnan) { builder.addLong(8, countnan, 0L); } - public static void addCountinf(FlatBufferBuilder builder, long countinf) { builder.addLong(9, countinf, 0L); } - public static int endUISummaryStatistics(FlatBufferBuilder builder) { - int o = builder.endObject(); - return o; - } -} - diff --git a/libnd4j/include/graph/generated/nd4j/graph/UISummaryStatistics.py b/libnd4j/include/graph/generated/nd4j/graph/UISummaryStatistics.py deleted file mode 100644 index a4511829a..000000000 --- a/libnd4j/include/graph/generated/nd4j/graph/UISummaryStatistics.py +++ /dev/null @@ -1,124 +0,0 @@ -# /* ****************************************************************************** -# * -# * -# * This program and the accompanying materials are made available under the -# * terms of the Apache License, Version 2.0 which is available at -# * https://www.apache.org/licenses/LICENSE-2.0. -# * -# * See the NOTICE file distributed with this work for additional -# * information regarding copyright ownership. -# * Unless required by applicable law or agreed to in writing, software -# * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT -# * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. 
See the -# * License for the specific language governing permissions and limitations -# * under the License. -# * -# * SPDX-License-Identifier: Apache-2.0 -# ******************************************************************************/ - -import flatbuffers - -class UISummaryStatistics(object): - __slots__ = ['_tab'] - - @classmethod - def GetRootAsUISummaryStatistics(cls, buf, offset): - n = flatbuffers.encode.Get(flatbuffers.packer.uoffset, buf, offset) - x = UISummaryStatistics() - x.Init(buf, n + offset) - return x - - # UISummaryStatistics - def Init(self, buf, pos): - self._tab = flatbuffers.table.Table(buf, pos) - - # UISummaryStatistics - def Bitmask(self): - o = flatbuffers.number_types.UOffsetTFlags.py_type(self._tab.Offset(4)) - if o != 0: - return self._tab.Get(flatbuffers.number_types.Uint32Flags, o + self._tab.Pos) - return 0 - - # UISummaryStatistics - def Min(self): - o = flatbuffers.number_types.UOffsetTFlags.py_type(self._tab.Offset(6)) - if o != 0: - x = self._tab.Indirect(o + self._tab.Pos) - from .FlatArray import FlatArray - obj = FlatArray() - obj.Init(self._tab.Bytes, x) - return obj - return None - - # UISummaryStatistics - def Max(self): - o = flatbuffers.number_types.UOffsetTFlags.py_type(self._tab.Offset(8)) - if o != 0: - x = self._tab.Indirect(o + self._tab.Pos) - from .FlatArray import FlatArray - obj = FlatArray() - obj.Init(self._tab.Bytes, x) - return obj - return None - - # UISummaryStatistics - def Mean(self): - o = flatbuffers.number_types.UOffsetTFlags.py_type(self._tab.Offset(10)) - if o != 0: - return self._tab.Get(flatbuffers.number_types.Float64Flags, o + self._tab.Pos) - return 0.0 - - # UISummaryStatistics - def Stdev(self): - o = flatbuffers.number_types.UOffsetTFlags.py_type(self._tab.Offset(12)) - if o != 0: - return self._tab.Get(flatbuffers.number_types.Float64Flags, o + self._tab.Pos) - return 0.0 - - # UISummaryStatistics - def Countzero(self): - o = 
flatbuffers.number_types.UOffsetTFlags.py_type(self._tab.Offset(14)) - if o != 0: - return self._tab.Get(flatbuffers.number_types.Int64Flags, o + self._tab.Pos) - return 0 - - # UISummaryStatistics - def Countpositive(self): - o = flatbuffers.number_types.UOffsetTFlags.py_type(self._tab.Offset(16)) - if o != 0: - return self._tab.Get(flatbuffers.number_types.Int64Flags, o + self._tab.Pos) - return 0 - - # UISummaryStatistics - def Countnegative(self): - o = flatbuffers.number_types.UOffsetTFlags.py_type(self._tab.Offset(18)) - if o != 0: - return self._tab.Get(flatbuffers.number_types.Int64Flags, o + self._tab.Pos) - return 0 - - # UISummaryStatistics - def Countnan(self): - o = flatbuffers.number_types.UOffsetTFlags.py_type(self._tab.Offset(20)) - if o != 0: - return self._tab.Get(flatbuffers.number_types.Int64Flags, o + self._tab.Pos) - return 0 - - # UISummaryStatistics - def Countinf(self): - o = flatbuffers.number_types.UOffsetTFlags.py_type(self._tab.Offset(22)) - if o != 0: - return self._tab.Get(flatbuffers.number_types.Int64Flags, o + self._tab.Pos) - return 0 - -def UISummaryStatisticsStart(builder): builder.StartObject(10) -def UISummaryStatisticsAddBitmask(builder, bitmask): builder.PrependUint32Slot(0, bitmask, 0) -def UISummaryStatisticsAddMin(builder, min): builder.PrependUOffsetTRelativeSlot(1, flatbuffers.number_types.UOffsetTFlags.py_type(min), 0) -def UISummaryStatisticsAddMax(builder, max): builder.PrependUOffsetTRelativeSlot(2, flatbuffers.number_types.UOffsetTFlags.py_type(max), 0) -def UISummaryStatisticsAddMean(builder, mean): builder.PrependFloat64Slot(3, mean, 0.0) -def UISummaryStatisticsAddStdev(builder, stdev): builder.PrependFloat64Slot(4, stdev, 0.0) -def UISummaryStatisticsAddCountzero(builder, countzero): builder.PrependInt64Slot(5, countzero, 0) -def UISummaryStatisticsAddCountpositive(builder, countpositive): builder.PrependInt64Slot(6, countpositive, 0) -def UISummaryStatisticsAddCountnegative(builder, countnegative): 
builder.PrependInt64Slot(7, countnegative, 0) -def UISummaryStatisticsAddCountnan(builder, countnan): builder.PrependInt64Slot(8, countnan, 0) -def UISummaryStatisticsAddCountinf(builder, countinf): builder.PrependInt64Slot(9, countinf, 0) -def UISummaryStatisticsEnd(builder): return builder.EndObject() diff --git a/libnd4j/include/graph/generated/nd4j/graph/UISystemInfo.cs b/libnd4j/include/graph/generated/nd4j/graph/UISystemInfo.cs deleted file mode 100644 index 6adbcf98e..000000000 --- a/libnd4j/include/graph/generated/nd4j/graph/UISystemInfo.cs +++ /dev/null @@ -1,38 +0,0 @@ -// -// automatically generated by the FlatBuffers compiler, do not modify -// - -namespace sd.graph -{ - -using global::System; -using global::FlatBuffers; - -public struct UISystemInfo : IFlatbufferObject -{ - private Table __p; - public ByteBuffer ByteBuffer { get { return __p.bb; } } - public static UISystemInfo GetRootAsUISystemInfo(ByteBuffer _bb) { return GetRootAsUISystemInfo(_bb, new UISystemInfo()); } - public static UISystemInfo GetRootAsUISystemInfo(ByteBuffer _bb, UISystemInfo obj) { return (obj.__assign(_bb.GetInt(_bb.Position) + _bb.Position, _bb)); } - public void __init(int _i, ByteBuffer _bb) { __p.bb_pos = _i; __p.bb = _bb; } - public UISystemInfo __assign(int _i, ByteBuffer _bb) { __init(_i, _bb); return this; } - - public int PhysicalCores { get { int o = __p.__offset(4); return o != 0 ? 
__p.bb.GetInt(o + __p.bb_pos) : (int)0; } } - - public static Offset CreateUISystemInfo(FlatBufferBuilder builder, - int physicalCores = 0) { - builder.StartObject(1); - UISystemInfo.AddPhysicalCores(builder, physicalCores); - return UISystemInfo.EndUISystemInfo(builder); - } - - public static void StartUISystemInfo(FlatBufferBuilder builder) { builder.StartObject(1); } - public static void AddPhysicalCores(FlatBufferBuilder builder, int physicalCores) { builder.AddInt(0, physicalCores, 0); } - public static Offset EndUISystemInfo(FlatBufferBuilder builder) { - int o = builder.EndObject(); - return new Offset(o); - } -}; - - -} diff --git a/libnd4j/include/graph/generated/nd4j/graph/UISystemInfo.java b/libnd4j/include/graph/generated/nd4j/graph/UISystemInfo.java deleted file mode 100644 index 4ff62ab98..000000000 --- a/libnd4j/include/graph/generated/nd4j/graph/UISystemInfo.java +++ /dev/null @@ -1,33 +0,0 @@ -// automatically generated by the FlatBuffers compiler, do not modify - -package nd4j.graph; - -import java.nio.*; -import java.lang.*; -import java.util.*; -import com.google.flatbuffers.*; - -@SuppressWarnings("unused") -public final class UISystemInfo extends Table { - public static UISystemInfo getRootAsUISystemInfo(ByteBuffer _bb) { return getRootAsUISystemInfo(_bb, new UISystemInfo()); } - public static UISystemInfo getRootAsUISystemInfo(ByteBuffer _bb, UISystemInfo obj) { _bb.order(ByteOrder.LITTLE_ENDIAN); return (obj.__assign(_bb.getInt(_bb.position()) + _bb.position(), _bb)); } - public void __init(int _i, ByteBuffer _bb) { bb_pos = _i; bb = _bb; } - public UISystemInfo __assign(int _i, ByteBuffer _bb) { __init(_i, _bb); return this; } - - public int physicalCores() { int o = __offset(4); return o != 0 ? 
bb.getInt(o + bb_pos) : 0; } - - public static int createUISystemInfo(FlatBufferBuilder builder, - int physicalCores) { - builder.startObject(1); - UISystemInfo.addPhysicalCores(builder, physicalCores); - return UISystemInfo.endUISystemInfo(builder); - } - - public static void startUISystemInfo(FlatBufferBuilder builder) { builder.startObject(1); } - public static void addPhysicalCores(FlatBufferBuilder builder, int physicalCores) { builder.addInt(0, physicalCores, 0); } - public static int endUISystemInfo(FlatBufferBuilder builder) { - int o = builder.endObject(); - return o; - } -} - diff --git a/libnd4j/include/graph/generated/nd4j/graph/UISystemInfo.py b/libnd4j/include/graph/generated/nd4j/graph/UISystemInfo.py deleted file mode 100644 index dc920fc78..000000000 --- a/libnd4j/include/graph/generated/nd4j/graph/UISystemInfo.py +++ /dev/null @@ -1,44 +0,0 @@ -# /* ****************************************************************************** -# * -# * -# * This program and the accompanying materials are made available under the -# * terms of the Apache License, Version 2.0 which is available at -# * https://www.apache.org/licenses/LICENSE-2.0. -# * -# * See the NOTICE file distributed with this work for additional -# * information regarding copyright ownership. -# * Unless required by applicable law or agreed to in writing, software -# * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT -# * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the -# * License for the specific language governing permissions and limitations -# * under the License. 
-# * -# * SPDX-License-Identifier: Apache-2.0 -# ******************************************************************************/ - -import flatbuffers - -class UISystemInfo(object): - __slots__ = ['_tab'] - - @classmethod - def GetRootAsUISystemInfo(cls, buf, offset): - n = flatbuffers.encode.Get(flatbuffers.packer.uoffset, buf, offset) - x = UISystemInfo() - x.Init(buf, n + offset) - return x - - # UISystemInfo - def Init(self, buf, pos): - self._tab = flatbuffers.table.Table(buf, pos) - - # UISystemInfo - def PhysicalCores(self): - o = flatbuffers.number_types.UOffsetTFlags.py_type(self._tab.Offset(4)) - if o != 0: - return self._tab.Get(flatbuffers.number_types.Int32Flags, o + self._tab.Pos) - return 0 - -def UISystemInfoStart(builder): builder.StartObject(1) -def UISystemInfoAddPhysicalCores(builder, physicalCores): builder.PrependInt32Slot(0, physicalCores, 0) -def UISystemInfoEnd(builder): return builder.EndObject() diff --git a/libnd4j/include/graph/generated/nd4j/graph/UIVariable.cs b/libnd4j/include/graph/generated/nd4j/graph/UIVariable.cs deleted file mode 100644 index 1f8d14971..000000000 --- a/libnd4j/include/graph/generated/nd4j/graph/UIVariable.cs +++ /dev/null @@ -1,136 +0,0 @@ -// -// automatically generated by the FlatBuffers compiler, do not modify -// - -namespace sd.graph -{ - -using global::System; -using global::FlatBuffers; - -public struct UIVariable : IFlatbufferObject -{ - private Table __p; - public ByteBuffer ByteBuffer { get { return __p.bb; } } - public static UIVariable GetRootAsUIVariable(ByteBuffer _bb) { return GetRootAsUIVariable(_bb, new UIVariable()); } - public static UIVariable GetRootAsUIVariable(ByteBuffer _bb, UIVariable obj) { return (obj.__assign(_bb.GetInt(_bb.Position) + _bb.Position, _bb)); } - public void __init(int _i, ByteBuffer _bb) { __p.bb_pos = _i; __p.bb = _bb; } - public UIVariable __assign(int _i, ByteBuffer _bb) { __init(_i, _bb); return this; } - - public IntPair? 
Id { get { int o = __p.__offset(4); return o != 0 ? (IntPair?)(new IntPair()).__assign(__p.__indirect(o + __p.bb_pos), __p.bb) : null; } } - public string Name { get { int o = __p.__offset(6); return o != 0 ? __p.__string(o + __p.bb_pos) : null; } } -#if ENABLE_SPAN_T - public Span GetNameBytes() { return __p.__vector_as_span(6); } -#else - public ArraySegment? GetNameBytes() { return __p.__vector_as_arraysegment(6); } -#endif - public byte[] GetNameArray() { return __p.__vector_as_array(6); } - public VarType Type { get { int o = __p.__offset(8); return o != 0 ? (VarType)__p.bb.GetSbyte(o + __p.bb_pos) : VarType.VARIABLE; } } - public DType Datatype { get { int o = __p.__offset(10); return o != 0 ? (DType)__p.bb.GetSbyte(o + __p.bb_pos) : DType.INHERIT; } } - public long Shape(int j) { int o = __p.__offset(12); return o != 0 ? __p.bb.GetLong(__p.__vector(o) + j * 8) : (long)0; } - public int ShapeLength { get { int o = __p.__offset(12); return o != 0 ? __p.__vector_len(o) : 0; } } -#if ENABLE_SPAN_T - public Span GetShapeBytes() { return __p.__vector_as_span(12); } -#else - public ArraySegment? GetShapeBytes() { return __p.__vector_as_arraysegment(12); } -#endif - public long[] GetShapeArray() { return __p.__vector_as_array(12); } - public string ControlDeps(int j) { int o = __p.__offset(14); return o != 0 ? __p.__string(__p.__vector(o) + j * 4) : null; } - public int ControlDepsLength { get { int o = __p.__offset(14); return o != 0 ? __p.__vector_len(o) : 0; } } - public string OutputOfOp { get { int o = __p.__offset(16); return o != 0 ? __p.__string(o + __p.bb_pos) : null; } } -#if ENABLE_SPAN_T - public Span GetOutputOfOpBytes() { return __p.__vector_as_span(16); } -#else - public ArraySegment? GetOutputOfOpBytes() { return __p.__vector_as_arraysegment(16); } -#endif - public byte[] GetOutputOfOpArray() { return __p.__vector_as_array(16); } - public string InputsForOp(int j) { int o = __p.__offset(18); return o != 0 ? 
__p.__string(__p.__vector(o) + j * 4) : null; } - public int InputsForOpLength { get { int o = __p.__offset(18); return o != 0 ? __p.__vector_len(o) : 0; } } - public string ControlDepsForOp(int j) { int o = __p.__offset(20); return o != 0 ? __p.__string(__p.__vector(o) + j * 4) : null; } - public int ControlDepsForOpLength { get { int o = __p.__offset(20); return o != 0 ? __p.__vector_len(o) : 0; } } - public string ControlDepsForVar(int j) { int o = __p.__offset(22); return o != 0 ? __p.__string(__p.__vector(o) + j * 4) : null; } - public int ControlDepsForVarLength { get { int o = __p.__offset(22); return o != 0 ? __p.__vector_len(o) : 0; } } - public string GradientVariable { get { int o = __p.__offset(24); return o != 0 ? __p.__string(o + __p.bb_pos) : null; } } -#if ENABLE_SPAN_T - public Span GetGradientVariableBytes() { return __p.__vector_as_span(24); } -#else - public ArraySegment? GetGradientVariableBytes() { return __p.__vector_as_arraysegment(24); } -#endif - public byte[] GetGradientVariableArray() { return __p.__vector_as_array(24); } - public string UiLabelExtra { get { int o = __p.__offset(26); return o != 0 ? __p.__string(o + __p.bb_pos) : null; } } -#if ENABLE_SPAN_T - public Span GetUiLabelExtraBytes() { return __p.__vector_as_span(26); } -#else - public ArraySegment? GetUiLabelExtraBytes() { return __p.__vector_as_arraysegment(26); } -#endif - public byte[] GetUiLabelExtraArray() { return __p.__vector_as_array(26); } - public FlatArray? ConstantValue { get { int o = __p.__offset(28); return o != 0 ? 
(FlatArray?)(new FlatArray()).__assign(__p.__indirect(o + __p.bb_pos), __p.bb) : null; } } - - public static Offset CreateUIVariable(FlatBufferBuilder builder, - Offset idOffset = default(Offset), - StringOffset nameOffset = default(StringOffset), - VarType type = VarType.VARIABLE, - DType datatype = DType.INHERIT, - VectorOffset shapeOffset = default(VectorOffset), - VectorOffset controlDepsOffset = default(VectorOffset), - StringOffset outputOfOpOffset = default(StringOffset), - VectorOffset inputsForOpOffset = default(VectorOffset), - VectorOffset controlDepsForOpOffset = default(VectorOffset), - VectorOffset controlDepsForVarOffset = default(VectorOffset), - StringOffset gradientVariableOffset = default(StringOffset), - StringOffset uiLabelExtraOffset = default(StringOffset), - Offset constantValueOffset = default(Offset)) { - builder.StartObject(13); - UIVariable.AddConstantValue(builder, constantValueOffset); - UIVariable.AddUiLabelExtra(builder, uiLabelExtraOffset); - UIVariable.AddGradientVariable(builder, gradientVariableOffset); - UIVariable.AddControlDepsForVar(builder, controlDepsForVarOffset); - UIVariable.AddControlDepsForOp(builder, controlDepsForOpOffset); - UIVariable.AddInputsForOp(builder, inputsForOpOffset); - UIVariable.AddOutputOfOp(builder, outputOfOpOffset); - UIVariable.AddControlDeps(builder, controlDepsOffset); - UIVariable.AddShape(builder, shapeOffset); - UIVariable.AddName(builder, nameOffset); - UIVariable.AddId(builder, idOffset); - UIVariable.AddDatatype(builder, datatype); - UIVariable.AddType(builder, type); - return UIVariable.EndUIVariable(builder); - } - - public static void StartUIVariable(FlatBufferBuilder builder) { builder.StartObject(13); } - public static void AddId(FlatBufferBuilder builder, Offset idOffset) { builder.AddOffset(0, idOffset.Value, 0); } - public static void AddName(FlatBufferBuilder builder, StringOffset nameOffset) { builder.AddOffset(1, nameOffset.Value, 0); } - public static void 
AddType(FlatBufferBuilder builder, VarType type) { builder.AddSbyte(2, (sbyte)type, 0); } - public static void AddDatatype(FlatBufferBuilder builder, DType datatype) { builder.AddSbyte(3, (sbyte)datatype, 0); } - public static void AddShape(FlatBufferBuilder builder, VectorOffset shapeOffset) { builder.AddOffset(4, shapeOffset.Value, 0); } - public static VectorOffset CreateShapeVector(FlatBufferBuilder builder, long[] data) { builder.StartVector(8, data.Length, 8); for (int i = data.Length - 1; i >= 0; i--) builder.AddLong(data[i]); return builder.EndVector(); } - public static VectorOffset CreateShapeVectorBlock(FlatBufferBuilder builder, long[] data) { builder.StartVector(8, data.Length, 8); builder.Add(data); return builder.EndVector(); } - public static void StartShapeVector(FlatBufferBuilder builder, int numElems) { builder.StartVector(8, numElems, 8); } - public static void AddControlDeps(FlatBufferBuilder builder, VectorOffset controlDepsOffset) { builder.AddOffset(5, controlDepsOffset.Value, 0); } - public static VectorOffset CreateControlDepsVector(FlatBufferBuilder builder, StringOffset[] data) { builder.StartVector(4, data.Length, 4); for (int i = data.Length - 1; i >= 0; i--) builder.AddOffset(data[i].Value); return builder.EndVector(); } - public static VectorOffset CreateControlDepsVectorBlock(FlatBufferBuilder builder, StringOffset[] data) { builder.StartVector(4, data.Length, 4); builder.Add(data); return builder.EndVector(); } - public static void StartControlDepsVector(FlatBufferBuilder builder, int numElems) { builder.StartVector(4, numElems, 4); } - public static void AddOutputOfOp(FlatBufferBuilder builder, StringOffset outputOfOpOffset) { builder.AddOffset(6, outputOfOpOffset.Value, 0); } - public static void AddInputsForOp(FlatBufferBuilder builder, VectorOffset inputsForOpOffset) { builder.AddOffset(7, inputsForOpOffset.Value, 0); } - public static VectorOffset CreateInputsForOpVector(FlatBufferBuilder builder, StringOffset[] data) { 
builder.StartVector(4, data.Length, 4); for (int i = data.Length - 1; i >= 0; i--) builder.AddOffset(data[i].Value); return builder.EndVector(); } - public static VectorOffset CreateInputsForOpVectorBlock(FlatBufferBuilder builder, StringOffset[] data) { builder.StartVector(4, data.Length, 4); builder.Add(data); return builder.EndVector(); } - public static void StartInputsForOpVector(FlatBufferBuilder builder, int numElems) { builder.StartVector(4, numElems, 4); } - public static void AddControlDepsForOp(FlatBufferBuilder builder, VectorOffset controlDepsForOpOffset) { builder.AddOffset(8, controlDepsForOpOffset.Value, 0); } - public static VectorOffset CreateControlDepsForOpVector(FlatBufferBuilder builder, StringOffset[] data) { builder.StartVector(4, data.Length, 4); for (int i = data.Length - 1; i >= 0; i--) builder.AddOffset(data[i].Value); return builder.EndVector(); } - public static VectorOffset CreateControlDepsForOpVectorBlock(FlatBufferBuilder builder, StringOffset[] data) { builder.StartVector(4, data.Length, 4); builder.Add(data); return builder.EndVector(); } - public static void StartControlDepsForOpVector(FlatBufferBuilder builder, int numElems) { builder.StartVector(4, numElems, 4); } - public static void AddControlDepsForVar(FlatBufferBuilder builder, VectorOffset controlDepsForVarOffset) { builder.AddOffset(9, controlDepsForVarOffset.Value, 0); } - public static VectorOffset CreateControlDepsForVarVector(FlatBufferBuilder builder, StringOffset[] data) { builder.StartVector(4, data.Length, 4); for (int i = data.Length - 1; i >= 0; i--) builder.AddOffset(data[i].Value); return builder.EndVector(); } - public static VectorOffset CreateControlDepsForVarVectorBlock(FlatBufferBuilder builder, StringOffset[] data) { builder.StartVector(4, data.Length, 4); builder.Add(data); return builder.EndVector(); } - public static void StartControlDepsForVarVector(FlatBufferBuilder builder, int numElems) { builder.StartVector(4, numElems, 4); } - public static 
void AddGradientVariable(FlatBufferBuilder builder, StringOffset gradientVariableOffset) { builder.AddOffset(10, gradientVariableOffset.Value, 0); } - public static void AddUiLabelExtra(FlatBufferBuilder builder, StringOffset uiLabelExtraOffset) { builder.AddOffset(11, uiLabelExtraOffset.Value, 0); } - public static void AddConstantValue(FlatBufferBuilder builder, Offset constantValueOffset) { builder.AddOffset(12, constantValueOffset.Value, 0); } - public static Offset EndUIVariable(FlatBufferBuilder builder) { - int o = builder.EndObject(); - return new Offset(o); - } -}; - - -} diff --git a/libnd4j/include/graph/generated/nd4j/graph/UIVariable.java b/libnd4j/include/graph/generated/nd4j/graph/UIVariable.java deleted file mode 100644 index 97ffb8c24..000000000 --- a/libnd4j/include/graph/generated/nd4j/graph/UIVariable.java +++ /dev/null @@ -1,108 +0,0 @@ -// automatically generated by the FlatBuffers compiler, do not modify - -package nd4j.graph; - -import java.nio.*; -import java.lang.*; -import java.util.*; -import com.google.flatbuffers.*; - -@SuppressWarnings("unused") -public final class UIVariable extends Table { - public static UIVariable getRootAsUIVariable(ByteBuffer _bb) { return getRootAsUIVariable(_bb, new UIVariable()); } - public static UIVariable getRootAsUIVariable(ByteBuffer _bb, UIVariable obj) { _bb.order(ByteOrder.LITTLE_ENDIAN); return (obj.__assign(_bb.getInt(_bb.position()) + _bb.position(), _bb)); } - public void __init(int _i, ByteBuffer _bb) { bb_pos = _i; bb = _bb; } - public UIVariable __assign(int _i, ByteBuffer _bb) { __init(_i, _bb); return this; } - - public IntPair id() { return id(new IntPair()); } - public IntPair id(IntPair obj) { int o = __offset(4); return o != 0 ? obj.__assign(__indirect(o + bb_pos), bb) : null; } - public String name() { int o = __offset(6); return o != 0 ? 
__string(o + bb_pos) : null; } - public ByteBuffer nameAsByteBuffer() { return __vector_as_bytebuffer(6, 1); } - public ByteBuffer nameInByteBuffer(ByteBuffer _bb) { return __vector_in_bytebuffer(_bb, 6, 1); } - public byte type() { int o = __offset(8); return o != 0 ? bb.get(o + bb_pos) : 0; } - public byte datatype() { int o = __offset(10); return o != 0 ? bb.get(o + bb_pos) : 0; } - public long shape(int j) { int o = __offset(12); return o != 0 ? bb.getLong(__vector(o) + j * 8) : 0; } - public int shapeLength() { int o = __offset(12); return o != 0 ? __vector_len(o) : 0; } - public ByteBuffer shapeAsByteBuffer() { return __vector_as_bytebuffer(12, 8); } - public ByteBuffer shapeInByteBuffer(ByteBuffer _bb) { return __vector_in_bytebuffer(_bb, 12, 8); } - public String controlDeps(int j) { int o = __offset(14); return o != 0 ? __string(__vector(o) + j * 4) : null; } - public int controlDepsLength() { int o = __offset(14); return o != 0 ? __vector_len(o) : 0; } - public String outputOfOp() { int o = __offset(16); return o != 0 ? __string(o + bb_pos) : null; } - public ByteBuffer outputOfOpAsByteBuffer() { return __vector_as_bytebuffer(16, 1); } - public ByteBuffer outputOfOpInByteBuffer(ByteBuffer _bb) { return __vector_in_bytebuffer(_bb, 16, 1); } - public String inputsForOp(int j) { int o = __offset(18); return o != 0 ? __string(__vector(o) + j * 4) : null; } - public int inputsForOpLength() { int o = __offset(18); return o != 0 ? __vector_len(o) : 0; } - public String controlDepsForOp(int j) { int o = __offset(20); return o != 0 ? __string(__vector(o) + j * 4) : null; } - public int controlDepsForOpLength() { int o = __offset(20); return o != 0 ? __vector_len(o) : 0; } - public String controlDepsForVar(int j) { int o = __offset(22); return o != 0 ? __string(__vector(o) + j * 4) : null; } - public int controlDepsForVarLength() { int o = __offset(22); return o != 0 ? __vector_len(o) : 0; } - public String gradientVariable() { int o = __offset(24); return o != 0 ? 
__string(o + bb_pos) : null; } - public ByteBuffer gradientVariableAsByteBuffer() { return __vector_as_bytebuffer(24, 1); } - public ByteBuffer gradientVariableInByteBuffer(ByteBuffer _bb) { return __vector_in_bytebuffer(_bb, 24, 1); } - public String uiLabelExtra() { int o = __offset(26); return o != 0 ? __string(o + bb_pos) : null; } - public ByteBuffer uiLabelExtraAsByteBuffer() { return __vector_as_bytebuffer(26, 1); } - public ByteBuffer uiLabelExtraInByteBuffer(ByteBuffer _bb) { return __vector_in_bytebuffer(_bb, 26, 1); } - public FlatArray constantValue() { return constantValue(new FlatArray()); } - public FlatArray constantValue(FlatArray obj) { int o = __offset(28); return o != 0 ? obj.__assign(__indirect(o + bb_pos), bb) : null; } - - public static int createUIVariable(FlatBufferBuilder builder, - int idOffset, - int nameOffset, - byte type, - byte datatype, - int shapeOffset, - int controlDepsOffset, - int outputOfOpOffset, - int inputsForOpOffset, - int controlDepsForOpOffset, - int controlDepsForVarOffset, - int gradientVariableOffset, - int uiLabelExtraOffset, - int constantValueOffset) { - builder.startObject(13); - UIVariable.addConstantValue(builder, constantValueOffset); - UIVariable.addUiLabelExtra(builder, uiLabelExtraOffset); - UIVariable.addGradientVariable(builder, gradientVariableOffset); - UIVariable.addControlDepsForVar(builder, controlDepsForVarOffset); - UIVariable.addControlDepsForOp(builder, controlDepsForOpOffset); - UIVariable.addInputsForOp(builder, inputsForOpOffset); - UIVariable.addOutputOfOp(builder, outputOfOpOffset); - UIVariable.addControlDeps(builder, controlDepsOffset); - UIVariable.addShape(builder, shapeOffset); - UIVariable.addName(builder, nameOffset); - UIVariable.addId(builder, idOffset); - UIVariable.addDatatype(builder, datatype); - UIVariable.addType(builder, type); - return UIVariable.endUIVariable(builder); - } - - public static void startUIVariable(FlatBufferBuilder builder) { builder.startObject(13); } - 
public static void addId(FlatBufferBuilder builder, int idOffset) { builder.addOffset(0, idOffset, 0); } - public static void addName(FlatBufferBuilder builder, int nameOffset) { builder.addOffset(1, nameOffset, 0); } - public static void addType(FlatBufferBuilder builder, byte type) { builder.addByte(2, type, 0); } - public static void addDatatype(FlatBufferBuilder builder, byte datatype) { builder.addByte(3, datatype, 0); } - public static void addShape(FlatBufferBuilder builder, int shapeOffset) { builder.addOffset(4, shapeOffset, 0); } - public static int createShapeVector(FlatBufferBuilder builder, long[] data) { builder.startVector(8, data.length, 8); for (int i = data.length - 1; i >= 0; i--) builder.addLong(data[i]); return builder.endVector(); } - public static void startShapeVector(FlatBufferBuilder builder, int numElems) { builder.startVector(8, numElems, 8); } - public static void addControlDeps(FlatBufferBuilder builder, int controlDepsOffset) { builder.addOffset(5, controlDepsOffset, 0); } - public static int createControlDepsVector(FlatBufferBuilder builder, int[] data) { builder.startVector(4, data.length, 4); for (int i = data.length - 1; i >= 0; i--) builder.addOffset(data[i]); return builder.endVector(); } - public static void startControlDepsVector(FlatBufferBuilder builder, int numElems) { builder.startVector(4, numElems, 4); } - public static void addOutputOfOp(FlatBufferBuilder builder, int outputOfOpOffset) { builder.addOffset(6, outputOfOpOffset, 0); } - public static void addInputsForOp(FlatBufferBuilder builder, int inputsForOpOffset) { builder.addOffset(7, inputsForOpOffset, 0); } - public static int createInputsForOpVector(FlatBufferBuilder builder, int[] data) { builder.startVector(4, data.length, 4); for (int i = data.length - 1; i >= 0; i--) builder.addOffset(data[i]); return builder.endVector(); } - public static void startInputsForOpVector(FlatBufferBuilder builder, int numElems) { builder.startVector(4, numElems, 4); } - public 
static void addControlDepsForOp(FlatBufferBuilder builder, int controlDepsForOpOffset) { builder.addOffset(8, controlDepsForOpOffset, 0); } - public static int createControlDepsForOpVector(FlatBufferBuilder builder, int[] data) { builder.startVector(4, data.length, 4); for (int i = data.length - 1; i >= 0; i--) builder.addOffset(data[i]); return builder.endVector(); } - public static void startControlDepsForOpVector(FlatBufferBuilder builder, int numElems) { builder.startVector(4, numElems, 4); } - public static void addControlDepsForVar(FlatBufferBuilder builder, int controlDepsForVarOffset) { builder.addOffset(9, controlDepsForVarOffset, 0); } - public static int createControlDepsForVarVector(FlatBufferBuilder builder, int[] data) { builder.startVector(4, data.length, 4); for (int i = data.length - 1; i >= 0; i--) builder.addOffset(data[i]); return builder.endVector(); } - public static void startControlDepsForVarVector(FlatBufferBuilder builder, int numElems) { builder.startVector(4, numElems, 4); } - public static void addGradientVariable(FlatBufferBuilder builder, int gradientVariableOffset) { builder.addOffset(10, gradientVariableOffset, 0); } - public static void addUiLabelExtra(FlatBufferBuilder builder, int uiLabelExtraOffset) { builder.addOffset(11, uiLabelExtraOffset, 0); } - public static void addConstantValue(FlatBufferBuilder builder, int constantValueOffset) { builder.addOffset(12, constantValueOffset, 0); } - public static int endUIVariable(FlatBufferBuilder builder) { - int o = builder.endObject(); - return o; - } -} - diff --git a/libnd4j/include/graph/generated/nd4j/graph/UIVariable.py b/libnd4j/include/graph/generated/nd4j/graph/UIVariable.py deleted file mode 100644 index 44bafdaf0..000000000 --- a/libnd4j/include/graph/generated/nd4j/graph/UIVariable.py +++ /dev/null @@ -1,200 +0,0 @@ -# /* ****************************************************************************** -# * -# * -# * This program and the accompanying materials are made 
available under the -# * terms of the Apache License, Version 2.0 which is available at -# * https://www.apache.org/licenses/LICENSE-2.0. -# * -# * See the NOTICE file distributed with this work for additional -# * information regarding copyright ownership. -# * Unless required by applicable law or agreed to in writing, software -# * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT -# * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the -# * License for the specific language governing permissions and limitations -# * under the License. -# * -# * SPDX-License-Identifier: Apache-2.0 -# ******************************************************************************/ - -import flatbuffers - -class UIVariable(object): - __slots__ = ['_tab'] - - @classmethod - def GetRootAsUIVariable(cls, buf, offset): - n = flatbuffers.encode.Get(flatbuffers.packer.uoffset, buf, offset) - x = UIVariable() - x.Init(buf, n + offset) - return x - - # UIVariable - def Init(self, buf, pos): - self._tab = flatbuffers.table.Table(buf, pos) - - # UIVariable - def Id(self): - o = flatbuffers.number_types.UOffsetTFlags.py_type(self._tab.Offset(4)) - if o != 0: - x = self._tab.Indirect(o + self._tab.Pos) - from .IntPair import IntPair - obj = IntPair() - obj.Init(self._tab.Bytes, x) - return obj - return None - - # UIVariable - def Name(self): - o = flatbuffers.number_types.UOffsetTFlags.py_type(self._tab.Offset(6)) - if o != 0: - return self._tab.String(o + self._tab.Pos) - return None - - # UIVariable - def Type(self): - o = flatbuffers.number_types.UOffsetTFlags.py_type(self._tab.Offset(8)) - if o != 0: - return self._tab.Get(flatbuffers.number_types.Int8Flags, o + self._tab.Pos) - return 0 - - # UIVariable - def Datatype(self): - o = flatbuffers.number_types.UOffsetTFlags.py_type(self._tab.Offset(10)) - if o != 0: - return self._tab.Get(flatbuffers.number_types.Int8Flags, o + self._tab.Pos) - return 0 - - # UIVariable - def Shape(self, j): - o = 
flatbuffers.number_types.UOffsetTFlags.py_type(self._tab.Offset(12)) - if o != 0: - a = self._tab.Vector(o) - return self._tab.Get(flatbuffers.number_types.Int64Flags, a + flatbuffers.number_types.UOffsetTFlags.py_type(j * 8)) - return 0 - - # UIVariable - def ShapeAsNumpy(self): - o = flatbuffers.number_types.UOffsetTFlags.py_type(self._tab.Offset(12)) - if o != 0: - return self._tab.GetVectorAsNumpy(flatbuffers.number_types.Int64Flags, o) - return 0 - - # UIVariable - def ShapeLength(self): - o = flatbuffers.number_types.UOffsetTFlags.py_type(self._tab.Offset(12)) - if o != 0: - return self._tab.VectorLen(o) - return 0 - - # UIVariable - def ControlDeps(self, j): - o = flatbuffers.number_types.UOffsetTFlags.py_type(self._tab.Offset(14)) - if o != 0: - a = self._tab.Vector(o) - return self._tab.String(a + flatbuffers.number_types.UOffsetTFlags.py_type(j * 4)) - return "" - - # UIVariable - def ControlDepsLength(self): - o = flatbuffers.number_types.UOffsetTFlags.py_type(self._tab.Offset(14)) - if o != 0: - return self._tab.VectorLen(o) - return 0 - - # UIVariable - def OutputOfOp(self): - o = flatbuffers.number_types.UOffsetTFlags.py_type(self._tab.Offset(16)) - if o != 0: - return self._tab.String(o + self._tab.Pos) - return None - - # UIVariable - def InputsForOp(self, j): - o = flatbuffers.number_types.UOffsetTFlags.py_type(self._tab.Offset(18)) - if o != 0: - a = self._tab.Vector(o) - return self._tab.String(a + flatbuffers.number_types.UOffsetTFlags.py_type(j * 4)) - return "" - - # UIVariable - def InputsForOpLength(self): - o = flatbuffers.number_types.UOffsetTFlags.py_type(self._tab.Offset(18)) - if o != 0: - return self._tab.VectorLen(o) - return 0 - - # UIVariable - def ControlDepsForOp(self, j): - o = flatbuffers.number_types.UOffsetTFlags.py_type(self._tab.Offset(20)) - if o != 0: - a = self._tab.Vector(o) - return self._tab.String(a + flatbuffers.number_types.UOffsetTFlags.py_type(j * 4)) - return "" - - # UIVariable - def 
ControlDepsForOpLength(self): - o = flatbuffers.number_types.UOffsetTFlags.py_type(self._tab.Offset(20)) - if o != 0: - return self._tab.VectorLen(o) - return 0 - - # UIVariable - def ControlDepsForVar(self, j): - o = flatbuffers.number_types.UOffsetTFlags.py_type(self._tab.Offset(22)) - if o != 0: - a = self._tab.Vector(o) - return self._tab.String(a + flatbuffers.number_types.UOffsetTFlags.py_type(j * 4)) - return "" - - # UIVariable - def ControlDepsForVarLength(self): - o = flatbuffers.number_types.UOffsetTFlags.py_type(self._tab.Offset(22)) - if o != 0: - return self._tab.VectorLen(o) - return 0 - - # UIVariable - def GradientVariable(self): - o = flatbuffers.number_types.UOffsetTFlags.py_type(self._tab.Offset(24)) - if o != 0: - return self._tab.String(o + self._tab.Pos) - return None - - # UIVariable - def UiLabelExtra(self): - o = flatbuffers.number_types.UOffsetTFlags.py_type(self._tab.Offset(26)) - if o != 0: - return self._tab.String(o + self._tab.Pos) - return None - - # UIVariable - def ConstantValue(self): - o = flatbuffers.number_types.UOffsetTFlags.py_type(self._tab.Offset(28)) - if o != 0: - x = self._tab.Indirect(o + self._tab.Pos) - from .FlatArray import FlatArray - obj = FlatArray() - obj.Init(self._tab.Bytes, x) - return obj - return None - -def UIVariableStart(builder): builder.StartObject(13) -def UIVariableAddId(builder, id): builder.PrependUOffsetTRelativeSlot(0, flatbuffers.number_types.UOffsetTFlags.py_type(id), 0) -def UIVariableAddName(builder, name): builder.PrependUOffsetTRelativeSlot(1, flatbuffers.number_types.UOffsetTFlags.py_type(name), 0) -def UIVariableAddType(builder, type): builder.PrependInt8Slot(2, type, 0) -def UIVariableAddDatatype(builder, datatype): builder.PrependInt8Slot(3, datatype, 0) -def UIVariableAddShape(builder, shape): builder.PrependUOffsetTRelativeSlot(4, flatbuffers.number_types.UOffsetTFlags.py_type(shape), 0) -def UIVariableStartShapeVector(builder, numElems): return builder.StartVector(8, numElems, 8) 
-def UIVariableAddControlDeps(builder, controlDeps): builder.PrependUOffsetTRelativeSlot(5, flatbuffers.number_types.UOffsetTFlags.py_type(controlDeps), 0) -def UIVariableStartControlDepsVector(builder, numElems): return builder.StartVector(4, numElems, 4) -def UIVariableAddOutputOfOp(builder, outputOfOp): builder.PrependUOffsetTRelativeSlot(6, flatbuffers.number_types.UOffsetTFlags.py_type(outputOfOp), 0) -def UIVariableAddInputsForOp(builder, inputsForOp): builder.PrependUOffsetTRelativeSlot(7, flatbuffers.number_types.UOffsetTFlags.py_type(inputsForOp), 0) -def UIVariableStartInputsForOpVector(builder, numElems): return builder.StartVector(4, numElems, 4) -def UIVariableAddControlDepsForOp(builder, controlDepsForOp): builder.PrependUOffsetTRelativeSlot(8, flatbuffers.number_types.UOffsetTFlags.py_type(controlDepsForOp), 0) -def UIVariableStartControlDepsForOpVector(builder, numElems): return builder.StartVector(4, numElems, 4) -def UIVariableAddControlDepsForVar(builder, controlDepsForVar): builder.PrependUOffsetTRelativeSlot(9, flatbuffers.number_types.UOffsetTFlags.py_type(controlDepsForVar), 0) -def UIVariableStartControlDepsForVarVector(builder, numElems): return builder.StartVector(4, numElems, 4) -def UIVariableAddGradientVariable(builder, gradientVariable): builder.PrependUOffsetTRelativeSlot(10, flatbuffers.number_types.UOffsetTFlags.py_type(gradientVariable), 0) -def UIVariableAddUiLabelExtra(builder, uiLabelExtra): builder.PrependUOffsetTRelativeSlot(11, flatbuffers.number_types.UOffsetTFlags.py_type(uiLabelExtra), 0) -def UIVariableAddConstantValue(builder, constantValue): builder.PrependUOffsetTRelativeSlot(12, flatbuffers.number_types.UOffsetTFlags.py_type(constantValue), 0) -def UIVariableEnd(builder): return builder.EndObject() diff --git a/libnd4j/include/graph/generated/nd4j/graph/UpdaterState.cs b/libnd4j/include/graph/generated/nd4j/graph/UpdaterState.cs deleted file mode 100644 index 8cb6e07f0..000000000 --- 
a/libnd4j/include/graph/generated/nd4j/graph/UpdaterState.cs +++ /dev/null @@ -1,60 +0,0 @@ -// -// automatically generated by the FlatBuffers compiler, do not modify -// - -namespace sd.graph -{ - -using global::System; -using global::FlatBuffers; - -public struct UpdaterState : IFlatbufferObject -{ - private Table __p; - public ByteBuffer ByteBuffer { get { return __p.bb; } } - public static UpdaterState GetRootAsUpdaterState(ByteBuffer _bb) { return GetRootAsUpdaterState(_bb, new UpdaterState()); } - public static UpdaterState GetRootAsUpdaterState(ByteBuffer _bb, UpdaterState obj) { return (obj.__assign(_bb.GetInt(_bb.Position) + _bb.Position, _bb)); } - public void __init(int _i, ByteBuffer _bb) { __p.bb_pos = _i; __p.bb = _bb; } - public UpdaterState __assign(int _i, ByteBuffer _bb) { __init(_i, _bb); return this; } - - public string ParamName { get { int o = __p.__offset(4); return o != 0 ? __p.__string(o + __p.bb_pos) : null; } } -#if ENABLE_SPAN_T - public Span GetParamNameBytes() { return __p.__vector_as_span(4); } -#else - public ArraySegment? GetParamNameBytes() { return __p.__vector_as_arraysegment(4); } -#endif - public byte[] GetParamNameArray() { return __p.__vector_as_array(4); } - public string UpdaterStateKeys(int j) { int o = __p.__offset(6); return o != 0 ? __p.__string(__p.__vector(o) + j * 4) : null; } - public int UpdaterStateKeysLength { get { int o = __p.__offset(6); return o != 0 ? __p.__vector_len(o) : 0; } } - public FlatArray? UpdaterStateValues(int j) { int o = __p.__offset(8); return o != 0 ? (FlatArray?)(new FlatArray()).__assign(__p.__indirect(__p.__vector(o) + j * 4), __p.bb) : null; } - public int UpdaterStateValuesLength { get { int o = __p.__offset(8); return o != 0 ? 
__p.__vector_len(o) : 0; } } - - public static Offset CreateUpdaterState(FlatBufferBuilder builder, - StringOffset paramNameOffset = default(StringOffset), - VectorOffset updaterStateKeysOffset = default(VectorOffset), - VectorOffset updaterStateValuesOffset = default(VectorOffset)) { - builder.StartObject(3); - UpdaterState.AddUpdaterStateValues(builder, updaterStateValuesOffset); - UpdaterState.AddUpdaterStateKeys(builder, updaterStateKeysOffset); - UpdaterState.AddParamName(builder, paramNameOffset); - return UpdaterState.EndUpdaterState(builder); - } - - public static void StartUpdaterState(FlatBufferBuilder builder) { builder.StartObject(3); } - public static void AddParamName(FlatBufferBuilder builder, StringOffset paramNameOffset) { builder.AddOffset(0, paramNameOffset.Value, 0); } - public static void AddUpdaterStateKeys(FlatBufferBuilder builder, VectorOffset updaterStateKeysOffset) { builder.AddOffset(1, updaterStateKeysOffset.Value, 0); } - public static VectorOffset CreateUpdaterStateKeysVector(FlatBufferBuilder builder, StringOffset[] data) { builder.StartVector(4, data.Length, 4); for (int i = data.Length - 1; i >= 0; i--) builder.AddOffset(data[i].Value); return builder.EndVector(); } - public static VectorOffset CreateUpdaterStateKeysVectorBlock(FlatBufferBuilder builder, StringOffset[] data) { builder.StartVector(4, data.Length, 4); builder.Add(data); return builder.EndVector(); } - public static void StartUpdaterStateKeysVector(FlatBufferBuilder builder, int numElems) { builder.StartVector(4, numElems, 4); } - public static void AddUpdaterStateValues(FlatBufferBuilder builder, VectorOffset updaterStateValuesOffset) { builder.AddOffset(2, updaterStateValuesOffset.Value, 0); } - public static VectorOffset CreateUpdaterStateValuesVector(FlatBufferBuilder builder, Offset[] data) { builder.StartVector(4, data.Length, 4); for (int i = data.Length - 1; i >= 0; i--) builder.AddOffset(data[i].Value); return builder.EndVector(); } - public static 
VectorOffset CreateUpdaterStateValuesVectorBlock(FlatBufferBuilder builder, Offset[] data) { builder.StartVector(4, data.Length, 4); builder.Add(data); return builder.EndVector(); } - public static void StartUpdaterStateValuesVector(FlatBufferBuilder builder, int numElems) { builder.StartVector(4, numElems, 4); } - public static Offset EndUpdaterState(FlatBufferBuilder builder) { - int o = builder.EndObject(); - return new Offset(o); - } -}; - - -} diff --git a/libnd4j/include/graph/generated/nd4j/graph/UpdaterState.java b/libnd4j/include/graph/generated/nd4j/graph/UpdaterState.java deleted file mode 100644 index 76868354c..000000000 --- a/libnd4j/include/graph/generated/nd4j/graph/UpdaterState.java +++ /dev/null @@ -1,50 +0,0 @@ -// automatically generated by the FlatBuffers compiler, do not modify - -package nd4j.graph; - -import java.nio.*; -import java.lang.*; -import java.util.*; -import com.google.flatbuffers.*; - -@SuppressWarnings("unused") -public final class UpdaterState extends Table { - public static UpdaterState getRootAsUpdaterState(ByteBuffer _bb) { return getRootAsUpdaterState(_bb, new UpdaterState()); } - public static UpdaterState getRootAsUpdaterState(ByteBuffer _bb, UpdaterState obj) { _bb.order(ByteOrder.LITTLE_ENDIAN); return (obj.__assign(_bb.getInt(_bb.position()) + _bb.position(), _bb)); } - public void __init(int _i, ByteBuffer _bb) { bb_pos = _i; bb = _bb; } - public UpdaterState __assign(int _i, ByteBuffer _bb) { __init(_i, _bb); return this; } - - public String paramName() { int o = __offset(4); return o != 0 ? __string(o + bb_pos) : null; } - public ByteBuffer paramNameAsByteBuffer() { return __vector_as_bytebuffer(4, 1); } - public ByteBuffer paramNameInByteBuffer(ByteBuffer _bb) { return __vector_in_bytebuffer(_bb, 4, 1); } - public String updaterStateKeys(int j) { int o = __offset(6); return o != 0 ? __string(__vector(o) + j * 4) : null; } - public int updaterStateKeysLength() { int o = __offset(6); return o != 0 ? 
__vector_len(o) : 0; } - public FlatArray updaterStateValues(int j) { return updaterStateValues(new FlatArray(), j); } - public FlatArray updaterStateValues(FlatArray obj, int j) { int o = __offset(8); return o != 0 ? obj.__assign(__indirect(__vector(o) + j * 4), bb) : null; } - public int updaterStateValuesLength() { int o = __offset(8); return o != 0 ? __vector_len(o) : 0; } - - public static int createUpdaterState(FlatBufferBuilder builder, - int paramNameOffset, - int updaterStateKeysOffset, - int updaterStateValuesOffset) { - builder.startObject(3); - UpdaterState.addUpdaterStateValues(builder, updaterStateValuesOffset); - UpdaterState.addUpdaterStateKeys(builder, updaterStateKeysOffset); - UpdaterState.addParamName(builder, paramNameOffset); - return UpdaterState.endUpdaterState(builder); - } - - public static void startUpdaterState(FlatBufferBuilder builder) { builder.startObject(3); } - public static void addParamName(FlatBufferBuilder builder, int paramNameOffset) { builder.addOffset(0, paramNameOffset, 0); } - public static void addUpdaterStateKeys(FlatBufferBuilder builder, int updaterStateKeysOffset) { builder.addOffset(1, updaterStateKeysOffset, 0); } - public static int createUpdaterStateKeysVector(FlatBufferBuilder builder, int[] data) { builder.startVector(4, data.length, 4); for (int i = data.length - 1; i >= 0; i--) builder.addOffset(data[i]); return builder.endVector(); } - public static void startUpdaterStateKeysVector(FlatBufferBuilder builder, int numElems) { builder.startVector(4, numElems, 4); } - public static void addUpdaterStateValues(FlatBufferBuilder builder, int updaterStateValuesOffset) { builder.addOffset(2, updaterStateValuesOffset, 0); } - public static int createUpdaterStateValuesVector(FlatBufferBuilder builder, int[] data) { builder.startVector(4, data.length, 4); for (int i = data.length - 1; i >= 0; i--) builder.addOffset(data[i]); return builder.endVector(); } - public static void 
startUpdaterStateValuesVector(FlatBufferBuilder builder, int numElems) { builder.startVector(4, numElems, 4); } - public static int endUpdaterState(FlatBufferBuilder builder) { - int o = builder.endObject(); - return o; - } -} - diff --git a/libnd4j/include/graph/generated/nd4j/graph/UpdaterState.py b/libnd4j/include/graph/generated/nd4j/graph/UpdaterState.py deleted file mode 100644 index 3765d8f85..000000000 --- a/libnd4j/include/graph/generated/nd4j/graph/UpdaterState.py +++ /dev/null @@ -1,83 +0,0 @@ -# /* ****************************************************************************** -# * -# * -# * This program and the accompanying materials are made available under the -# * terms of the Apache License, Version 2.0 which is available at -# * https://www.apache.org/licenses/LICENSE-2.0. -# * -# * See the NOTICE file distributed with this work for additional -# * information regarding copyright ownership. -# * Unless required by applicable law or agreed to in writing, software -# * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT -# * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the -# * License for the specific language governing permissions and limitations -# * under the License. 
-# * -# * SPDX-License-Identifier: Apache-2.0 -# ******************************************************************************/ - -import flatbuffers - -class UpdaterState(object): - __slots__ = ['_tab'] - - @classmethod - def GetRootAsUpdaterState(cls, buf, offset): - n = flatbuffers.encode.Get(flatbuffers.packer.uoffset, buf, offset) - x = UpdaterState() - x.Init(buf, n + offset) - return x - - # UpdaterState - def Init(self, buf, pos): - self._tab = flatbuffers.table.Table(buf, pos) - - # UpdaterState - def ParamName(self): - o = flatbuffers.number_types.UOffsetTFlags.py_type(self._tab.Offset(4)) - if o != 0: - return self._tab.String(o + self._tab.Pos) - return None - - # UpdaterState - def UpdaterStateKeys(self, j): - o = flatbuffers.number_types.UOffsetTFlags.py_type(self._tab.Offset(6)) - if o != 0: - a = self._tab.Vector(o) - return self._tab.String(a + flatbuffers.number_types.UOffsetTFlags.py_type(j * 4)) - return "" - - # UpdaterState - def UpdaterStateKeysLength(self): - o = flatbuffers.number_types.UOffsetTFlags.py_type(self._tab.Offset(6)) - if o != 0: - return self._tab.VectorLen(o) - return 0 - - # UpdaterState - def UpdaterStateValues(self, j): - o = flatbuffers.number_types.UOffsetTFlags.py_type(self._tab.Offset(8)) - if o != 0: - x = self._tab.Vector(o) - x += flatbuffers.number_types.UOffsetTFlags.py_type(j) * 4 - x = self._tab.Indirect(x) - from .FlatArray import FlatArray - obj = FlatArray() - obj.Init(self._tab.Bytes, x) - return obj - return None - - # UpdaterState - def UpdaterStateValuesLength(self): - o = flatbuffers.number_types.UOffsetTFlags.py_type(self._tab.Offset(8)) - if o != 0: - return self._tab.VectorLen(o) - return 0 - -def UpdaterStateStart(builder): builder.StartObject(3) -def UpdaterStateAddParamName(builder, paramName): builder.PrependUOffsetTRelativeSlot(0, flatbuffers.number_types.UOffsetTFlags.py_type(paramName), 0) -def UpdaterStateAddUpdaterStateKeys(builder, updaterStateKeys): builder.PrependUOffsetTRelativeSlot(1, 
flatbuffers.number_types.UOffsetTFlags.py_type(updaterStateKeys), 0) -def UpdaterStateStartUpdaterStateKeysVector(builder, numElems): return builder.StartVector(4, numElems, 4) -def UpdaterStateAddUpdaterStateValues(builder, updaterStateValues): builder.PrependUOffsetTRelativeSlot(2, flatbuffers.number_types.UOffsetTFlags.py_type(updaterStateValues), 0) -def UpdaterStateStartUpdaterStateValuesVector(builder, numElems): return builder.StartVector(4, numElems, 4) -def UpdaterStateEnd(builder): return builder.EndObject() diff --git a/libnd4j/include/graph/generated/nd4j/graph/VarType.cs b/libnd4j/include/graph/generated/nd4j/graph/VarType.cs deleted file mode 100644 index 4649bfef6..000000000 --- a/libnd4j/include/graph/generated/nd4j/graph/VarType.cs +++ /dev/null @@ -1,17 +0,0 @@ -// -// automatically generated by the FlatBuffers compiler, do not modify -// - -namespace sd.graph -{ - -public enum VarType : sbyte -{ - VARIABLE = 0, - CONSTANT = 1, - ARRAY = 2, - PLACEHOLDER = 3, -}; - - -} diff --git a/libnd4j/include/graph/generated/nd4j/graph/VarType.java b/libnd4j/include/graph/generated/nd4j/graph/VarType.java deleted file mode 100644 index 14937cd76..000000000 --- a/libnd4j/include/graph/generated/nd4j/graph/VarType.java +++ /dev/null @@ -1,16 +0,0 @@ -// automatically generated by the FlatBuffers compiler, do not modify - -package nd4j.graph; - -public final class VarType { - private VarType() { } - public static final byte VARIABLE = 0; - public static final byte CONSTANT = 1; - public static final byte ARRAY = 2; - public static final byte PLACEHOLDER = 3; - - public static final String[] names = { "VARIABLE", "CONSTANT", "ARRAY", "PLACEHOLDER", }; - - public static String name(int e) { return names[e]; } -} - diff --git a/libnd4j/include/graph/generated/nd4j/graph/VarType.py b/libnd4j/include/graph/generated/nd4j/graph/VarType.py deleted file mode 100644 index 84dc68522..000000000 --- a/libnd4j/include/graph/generated/nd4j/graph/VarType.py +++ /dev/null @@ 
-1,24 +0,0 @@ -# /* ****************************************************************************** -# * -# * -# * This program and the accompanying materials are made available under the -# * terms of the Apache License, Version 2.0 which is available at -# * https://www.apache.org/licenses/LICENSE-2.0. -# * -# * See the NOTICE file distributed with this work for additional -# * information regarding copyright ownership. -# * Unless required by applicable law or agreed to in writing, software -# * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT -# * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the -# * License for the specific language governing permissions and limitations -# * under the License. -# * -# * SPDX-License-Identifier: Apache-2.0 -# ******************************************************************************/ - -class VarType(object): - VARIABLE = 0 - CONSTANT = 1 - ARRAY = 2 - PLACEHOLDER = 3 - diff --git a/libnd4j/include/graph/generated/nd4j/graph/__init__.py b/libnd4j/include/graph/generated/nd4j/graph/__init__.py deleted file mode 100644 index ecf2a1c25..000000000 --- a/libnd4j/include/graph/generated/nd4j/graph/__init__.py +++ /dev/null @@ -1,18 +0,0 @@ -# /* ****************************************************************************** -# * -# * -# * This program and the accompanying materials are made available under the -# * terms of the Apache License, Version 2.0 which is available at -# * https://www.apache.org/licenses/LICENSE-2.0. -# * -# * See the NOTICE file distributed with this work for additional -# * information regarding copyright ownership. -# * Unless required by applicable law or agreed to in writing, software -# * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT -# * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the -# * License for the specific language governing permissions and limitations -# * under the License. 
-# * -# * SPDX-License-Identifier: Apache-2.0 -# ******************************************************************************/ - diff --git a/libnd4j/include/graph/generated/node_generated.h b/libnd4j/include/graph/generated/node_generated.h deleted file mode 100644 index a39f2490c..000000000 --- a/libnd4j/include/graph/generated/node_generated.h +++ /dev/null @@ -1,387 +0,0 @@ -// automatically generated by the FlatBuffers compiler, do not modify - - -#ifndef FLATBUFFERS_GENERATED_NODE_ND4J_GRAPH_H_ -#define FLATBUFFERS_GENERATED_NODE_ND4J_GRAPH_H_ - -#include "flatbuffers/flatbuffers.h" - -#include "array_generated.h" -#include "properties_generated.h" -#include "utils_generated.h" - -namespace sd { -namespace graph { - -struct FlatNode; - -struct FlatNode FLATBUFFERS_FINAL_CLASS : private flatbuffers::Table { - enum { - VT_ID = 4, - VT_NAME = 6, - VT_OPTYPE = 8, - VT_OPNUM = 10, - VT_PROPERTIES = 12, - VT_INPUT = 14, - VT_INPUTPAIRED = 16, - VT_OUTPUT = 18, - VT_EXTRAPARAMS = 20, - VT_EXTRAINTEGER = 22, - VT_EXTRABOOLS = 24, - VT_DIMENSIONS = 26, - VT_DEVICE = 28, - VT_SCOPE_ID = 30, - VT_SCOPE_NAME = 32, - VT_OUTPUTNAMES = 34, - VT_OPNAME = 36, - VT_OUTPUTTYPES = 38, - VT_SCALAR = 40, - VT_CONTROLDEPS = 42, - VT_VARCONTROLDEPS = 44, - VT_CONTROLDEPFOR = 46, - VT_EXTRATYPES = 48 - }; - int32_t id() const { - return GetField(VT_ID, 0); - } - const flatbuffers::String *name() const { - return GetPointer(VT_NAME); - } - OpType opType() const { - return static_cast(GetField(VT_OPTYPE, 0)); - } - int64_t opNum() const { - return GetField(VT_OPNUM, 0); - } - const flatbuffers::Vector> *properties() const { - return GetPointer> *>(VT_PROPERTIES); - } - const flatbuffers::Vector *input() const { - return GetPointer *>(VT_INPUT); - } - const flatbuffers::Vector> *inputPaired() const { - return GetPointer> *>(VT_INPUTPAIRED); - } - const flatbuffers::Vector *output() const { - return GetPointer *>(VT_OUTPUT); - } - const flatbuffers::Vector *extraParams() const { - 
return GetPointer *>(VT_EXTRAPARAMS); - } - const flatbuffers::Vector *extraInteger() const { - return GetPointer *>(VT_EXTRAINTEGER); - } - const flatbuffers::Vector *extraBools() const { - return GetPointer *>(VT_EXTRABOOLS); - } - const flatbuffers::Vector *dimensions() const { - return GetPointer *>(VT_DIMENSIONS); - } - int32_t device() const { - return GetField(VT_DEVICE, 0); - } - int32_t scope_id() const { - return GetField(VT_SCOPE_ID, 0); - } - const flatbuffers::String *scope_name() const { - return GetPointer(VT_SCOPE_NAME); - } - const flatbuffers::Vector> *outputNames() const { - return GetPointer> *>(VT_OUTPUTNAMES); - } - const flatbuffers::String *opName() const { - return GetPointer(VT_OPNAME); - } - const flatbuffers::Vector *outputTypes() const { - return GetPointer *>(VT_OUTPUTTYPES); - } - const FlatArray *scalar() const { - return GetPointer(VT_SCALAR); - } - const flatbuffers::Vector> *controlDeps() const { - return GetPointer> *>(VT_CONTROLDEPS); - } - const flatbuffers::Vector> *varControlDeps() const { - return GetPointer> *>(VT_VARCONTROLDEPS); - } - const flatbuffers::Vector> *controlDepFor() const { - return GetPointer> *>(VT_CONTROLDEPFOR); - } - const flatbuffers::Vector *extraTypes() const { - return GetPointer *>(VT_EXTRATYPES); - } - bool Verify(flatbuffers::Verifier &verifier) const { - return VerifyTableStart(verifier) && - VerifyField(verifier, VT_ID) && - VerifyOffset(verifier, VT_NAME) && - verifier.VerifyString(name()) && - VerifyField(verifier, VT_OPTYPE) && - VerifyField(verifier, VT_OPNUM) && - VerifyOffset(verifier, VT_PROPERTIES) && - verifier.VerifyVector(properties()) && - verifier.VerifyVectorOfTables(properties()) && - VerifyOffset(verifier, VT_INPUT) && - verifier.VerifyVector(input()) && - VerifyOffset(verifier, VT_INPUTPAIRED) && - verifier.VerifyVector(inputPaired()) && - verifier.VerifyVectorOfTables(inputPaired()) && - VerifyOffset(verifier, VT_OUTPUT) && - verifier.VerifyVector(output()) && - 
VerifyOffset(verifier, VT_EXTRAPARAMS) && - verifier.VerifyVector(extraParams()) && - VerifyOffset(verifier, VT_EXTRAINTEGER) && - verifier.VerifyVector(extraInteger()) && - VerifyOffset(verifier, VT_EXTRABOOLS) && - verifier.VerifyVector(extraBools()) && - VerifyOffset(verifier, VT_DIMENSIONS) && - verifier.VerifyVector(dimensions()) && - VerifyField(verifier, VT_DEVICE) && - VerifyField(verifier, VT_SCOPE_ID) && - VerifyOffset(verifier, VT_SCOPE_NAME) && - verifier.VerifyString(scope_name()) && - VerifyOffset(verifier, VT_OUTPUTNAMES) && - verifier.VerifyVector(outputNames()) && - verifier.VerifyVectorOfStrings(outputNames()) && - VerifyOffset(verifier, VT_OPNAME) && - verifier.VerifyString(opName()) && - VerifyOffset(verifier, VT_OUTPUTTYPES) && - verifier.VerifyVector(outputTypes()) && - VerifyOffset(verifier, VT_SCALAR) && - verifier.VerifyTable(scalar()) && - VerifyOffset(verifier, VT_CONTROLDEPS) && - verifier.VerifyVector(controlDeps()) && - verifier.VerifyVectorOfStrings(controlDeps()) && - VerifyOffset(verifier, VT_VARCONTROLDEPS) && - verifier.VerifyVector(varControlDeps()) && - verifier.VerifyVectorOfStrings(varControlDeps()) && - VerifyOffset(verifier, VT_CONTROLDEPFOR) && - verifier.VerifyVector(controlDepFor()) && - verifier.VerifyVectorOfStrings(controlDepFor()) && - VerifyOffset(verifier, VT_EXTRATYPES) && - verifier.VerifyVector(extraTypes()) && - verifier.EndTable(); - } -}; - -struct FlatNodeBuilder { - flatbuffers::FlatBufferBuilder &fbb_; - flatbuffers::uoffset_t start_; - void add_id(int32_t id) { - fbb_.AddElement(FlatNode::VT_ID, id, 0); - } - void add_name(flatbuffers::Offset name) { - fbb_.AddOffset(FlatNode::VT_NAME, name); - } - void add_opType(OpType opType) { - fbb_.AddElement(FlatNode::VT_OPTYPE, static_cast(opType), 0); - } - void add_opNum(int64_t opNum) { - fbb_.AddElement(FlatNode::VT_OPNUM, opNum, 0); - } - void add_properties(flatbuffers::Offset>> properties) { - fbb_.AddOffset(FlatNode::VT_PROPERTIES, properties); - } - void 
add_input(flatbuffers::Offset> input) { - fbb_.AddOffset(FlatNode::VT_INPUT, input); - } - void add_inputPaired(flatbuffers::Offset>> inputPaired) { - fbb_.AddOffset(FlatNode::VT_INPUTPAIRED, inputPaired); - } - void add_output(flatbuffers::Offset> output) { - fbb_.AddOffset(FlatNode::VT_OUTPUT, output); - } - void add_extraParams(flatbuffers::Offset> extraParams) { - fbb_.AddOffset(FlatNode::VT_EXTRAPARAMS, extraParams); - } - void add_extraInteger(flatbuffers::Offset> extraInteger) { - fbb_.AddOffset(FlatNode::VT_EXTRAINTEGER, extraInteger); - } - void add_extraBools(flatbuffers::Offset> extraBools) { - fbb_.AddOffset(FlatNode::VT_EXTRABOOLS, extraBools); - } - void add_dimensions(flatbuffers::Offset> dimensions) { - fbb_.AddOffset(FlatNode::VT_DIMENSIONS, dimensions); - } - void add_device(int32_t device) { - fbb_.AddElement(FlatNode::VT_DEVICE, device, 0); - } - void add_scope_id(int32_t scope_id) { - fbb_.AddElement(FlatNode::VT_SCOPE_ID, scope_id, 0); - } - void add_scope_name(flatbuffers::Offset scope_name) { - fbb_.AddOffset(FlatNode::VT_SCOPE_NAME, scope_name); - } - void add_outputNames(flatbuffers::Offset>> outputNames) { - fbb_.AddOffset(FlatNode::VT_OUTPUTNAMES, outputNames); - } - void add_opName(flatbuffers::Offset opName) { - fbb_.AddOffset(FlatNode::VT_OPNAME, opName); - } - void add_outputTypes(flatbuffers::Offset> outputTypes) { - fbb_.AddOffset(FlatNode::VT_OUTPUTTYPES, outputTypes); - } - void add_scalar(flatbuffers::Offset scalar) { - fbb_.AddOffset(FlatNode::VT_SCALAR, scalar); - } - void add_controlDeps(flatbuffers::Offset>> controlDeps) { - fbb_.AddOffset(FlatNode::VT_CONTROLDEPS, controlDeps); - } - void add_varControlDeps(flatbuffers::Offset>> varControlDeps) { - fbb_.AddOffset(FlatNode::VT_VARCONTROLDEPS, varControlDeps); - } - void add_controlDepFor(flatbuffers::Offset>> controlDepFor) { - fbb_.AddOffset(FlatNode::VT_CONTROLDEPFOR, controlDepFor); - } - void add_extraTypes(flatbuffers::Offset> extraTypes) { - 
fbb_.AddOffset(FlatNode::VT_EXTRATYPES, extraTypes);
-  }
-  explicit FlatNodeBuilder(flatbuffers::FlatBufferBuilder &_fbb)
-        : fbb_(_fbb) {
-    start_ = fbb_.StartTable();
-  }
-  FlatNodeBuilder &operator=(const FlatNodeBuilder &);
-  flatbuffers::Offset<FlatNode> Finish() {
-    const auto end = fbb_.EndTable(start_);
-    auto o = flatbuffers::Offset<FlatNode>(end);
-    return o;
-  }
-};
-
-inline flatbuffers::Offset<FlatNode> CreateFlatNode(
-    flatbuffers::FlatBufferBuilder &_fbb,
-    int32_t id = 0,
-    flatbuffers::Offset<flatbuffers::String> name = 0,
-    OpType opType = OpType_TRANSFORM_FLOAT,
-    int64_t opNum = 0,
-    flatbuffers::Offset<flatbuffers::Vector<flatbuffers::Offset<FlatProperties>>> properties = 0,
-    flatbuffers::Offset<flatbuffers::Vector<int32_t>> input = 0,
-    flatbuffers::Offset<flatbuffers::Vector<flatbuffers::Offset<IntPair>>> inputPaired = 0,
-    flatbuffers::Offset<flatbuffers::Vector<int32_t>> output = 0,
-    flatbuffers::Offset<flatbuffers::Vector<double>> extraParams = 0,
-    flatbuffers::Offset<flatbuffers::Vector<int64_t>> extraInteger = 0,
-    flatbuffers::Offset<flatbuffers::Vector<uint8_t>> extraBools = 0,
-    flatbuffers::Offset<flatbuffers::Vector<int32_t>> dimensions = 0,
-    int32_t device = 0,
-    int32_t scope_id = 0,
-    flatbuffers::Offset<flatbuffers::String> scope_name = 0,
-    flatbuffers::Offset<flatbuffers::Vector<flatbuffers::Offset<flatbuffers::String>>> outputNames = 0,
-    flatbuffers::Offset<flatbuffers::String> opName = 0,
-    flatbuffers::Offset<flatbuffers::Vector<int8_t>> outputTypes = 0,
-    flatbuffers::Offset<FlatArray> scalar = 0,
-    flatbuffers::Offset<flatbuffers::Vector<flatbuffers::Offset<flatbuffers::String>>> controlDeps = 0,
-    flatbuffers::Offset<flatbuffers::Vector<flatbuffers::Offset<flatbuffers::String>>> varControlDeps = 0,
-    flatbuffers::Offset<flatbuffers::Vector<flatbuffers::Offset<flatbuffers::String>>> controlDepFor = 0,
-    flatbuffers::Offset<flatbuffers::Vector<int8_t>> extraTypes = 0) {
-  FlatNodeBuilder builder_(_fbb);
-  builder_.add_opNum(opNum);
-  builder_.add_extraTypes(extraTypes);
-  builder_.add_controlDepFor(controlDepFor);
-  builder_.add_varControlDeps(varControlDeps);
-  builder_.add_controlDeps(controlDeps);
-  builder_.add_scalar(scalar);
-  builder_.add_outputTypes(outputTypes);
-  builder_.add_opName(opName);
-  builder_.add_outputNames(outputNames);
-  builder_.add_scope_name(scope_name);
-  builder_.add_scope_id(scope_id);
-  builder_.add_device(device);
-  builder_.add_dimensions(dimensions);
-  builder_.add_extraBools(extraBools);
-  builder_.add_extraInteger(extraInteger);
-  builder_.add_extraParams(extraParams);
-  builder_.add_output(output);
-  builder_.add_inputPaired(inputPaired);
-  builder_.add_input(input);
-  builder_.add_properties(properties);
-  builder_.add_name(name);
-  builder_.add_id(id);
-  builder_.add_opType(opType);
-  return builder_.Finish();
-}
-
-inline flatbuffers::Offset<FlatNode> CreateFlatNodeDirect(
-    flatbuffers::FlatBufferBuilder &_fbb,
-    int32_t id = 0,
-    const char *name = nullptr,
-    OpType opType = OpType_TRANSFORM_FLOAT,
-    int64_t opNum = 0,
-    const std::vector<flatbuffers::Offset<FlatProperties>> *properties = nullptr,
-    const std::vector<int32_t> *input = nullptr,
-    const std::vector<flatbuffers::Offset<IntPair>> *inputPaired = nullptr,
-    const std::vector<int32_t> *output = nullptr,
-    const std::vector<double> *extraParams = nullptr,
-    const std::vector<int64_t> *extraInteger = nullptr,
-    const std::vector<uint8_t> *extraBools = nullptr,
-    const std::vector<int32_t> *dimensions = nullptr,
-    int32_t device = 0,
-    int32_t scope_id = 0,
-    const char *scope_name = nullptr,
-    const std::vector<flatbuffers::Offset<flatbuffers::String>> *outputNames = nullptr,
-    const char *opName = nullptr,
-    const std::vector<int8_t> *outputTypes = nullptr,
-    flatbuffers::Offset<FlatArray> scalar = 0,
-    const std::vector<flatbuffers::Offset<flatbuffers::String>> *controlDeps = nullptr,
-    const std::vector<flatbuffers::Offset<flatbuffers::String>> *varControlDeps = nullptr,
-    const std::vector<flatbuffers::Offset<flatbuffers::String>> *controlDepFor = nullptr,
-    const std::vector<int8_t> *extraTypes = nullptr) {
-  return sd::graph::CreateFlatNode(
-      _fbb,
-      id,
-      name ? _fbb.CreateString(name) : 0,
-      opType,
-      opNum,
-      properties ? _fbb.CreateVector<flatbuffers::Offset<FlatProperties>>(*properties) : 0,
-      input ? _fbb.CreateVector<int32_t>(*input) : 0,
-      inputPaired ? _fbb.CreateVector<flatbuffers::Offset<IntPair>>(*inputPaired) : 0,
-      output ? _fbb.CreateVector<int32_t>(*output) : 0,
-      extraParams ? _fbb.CreateVector<double>(*extraParams) : 0,
-      extraInteger ? _fbb.CreateVector<int64_t>(*extraInteger) : 0,
-      extraBools ? _fbb.CreateVector<uint8_t>(*extraBools) : 0,
-      dimensions ? _fbb.CreateVector<int32_t>(*dimensions) : 0,
-      device,
-      scope_id,
-      scope_name ? _fbb.CreateString(scope_name) : 0,
-      outputNames ? _fbb.CreateVector<flatbuffers::Offset<flatbuffers::String>>(*outputNames) : 0,
-      opName ? _fbb.CreateString(opName) : 0,
-      outputTypes ? _fbb.CreateVector<int8_t>(*outputTypes) : 0,
-      scalar,
-      controlDeps ? _fbb.CreateVector<flatbuffers::Offset<flatbuffers::String>>(*controlDeps) : 0,
-      varControlDeps ? _fbb.CreateVector<flatbuffers::Offset<flatbuffers::String>>(*varControlDeps) : 0,
-      controlDepFor ? _fbb.CreateVector<flatbuffers::Offset<flatbuffers::String>>(*controlDepFor) : 0,
-      extraTypes ? _fbb.CreateVector<int8_t>(*extraTypes) : 0);
-}
-
-inline const sd::graph::FlatNode *GetFlatNode(const void *buf) {
-  return flatbuffers::GetRoot<sd::graph::FlatNode>(buf);
-}
-
-inline const sd::graph::FlatNode *GetSizePrefixedFlatNode(const void *buf) {
-  return flatbuffers::GetSizePrefixedRoot<sd::graph::FlatNode>(buf);
-}
-
-inline bool VerifyFlatNodeBuffer(
-    flatbuffers::Verifier &verifier) {
-  return verifier.VerifyBuffer<sd::graph::FlatNode>(nullptr);
-}
-
-inline bool VerifySizePrefixedFlatNodeBuffer(
-    flatbuffers::Verifier &verifier) {
-  return verifier.VerifySizePrefixedBuffer<sd::graph::FlatNode>(nullptr);
-}
-
-inline void FinishFlatNodeBuffer(
-    flatbuffers::FlatBufferBuilder &fbb,
-    flatbuffers::Offset<sd::graph::FlatNode> root) {
-  fbb.Finish(root);
-}
-
-inline void FinishSizePrefixedFlatNodeBuffer(
-    flatbuffers::FlatBufferBuilder &fbb,
-    flatbuffers::Offset<sd::graph::FlatNode> root) {
-  fbb.FinishSizePrefixed(root);
-}
-
-} // namespace graph
-} // namespace sd
-
-#endif // FLATBUFFERS_GENERATED_NODE_ND4J_GRAPH_H_
diff --git a/libnd4j/include/graph/generated/node_generated.js b/libnd4j/include/graph/generated/node_generated.js
deleted file mode 100644
index a750b3752..000000000
--- a/libnd4j/include/graph/generated/node_generated.js
+++ /dev/null
@@ -1,947 +0,0 @@
-/*
- * ******************************************************************************
- * *
- * *
- * * This program and the accompanying materials are made available under the
- * * terms of the Apache License, Version 2.0 which is available at
- * * https://www.apache.org/licenses/LICENSE-2.0.
- * *
- * * See the NOTICE file distributed with this work for additional
- * * information regarding copyright ownership.
- * * Unless required by applicable law or agreed to in writing, software
- * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
- * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the - * * License for the specific language governing permissions and limitations - * * under the License. - * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ - -/** - * @const - * @namespace - */ -var nd4j = nd4j || {}; - -/** - * @const - * @namespace - */ -nd4j.graph = nd4j.graph || {}; - -/** - * @constructor - */ -nd4j.graph.FlatNode = function() { - /** - * @type {flatbuffers.ByteBuffer} - */ - this.bb = null; - - /** - * @type {number} - */ - this.bb_pos = 0; -}; - -/** - * @param {number} i - * @param {flatbuffers.ByteBuffer} bb - * @returns {nd4j.graph.FlatNode} - */ -nd4j.graph.FlatNode.prototype.__init = function(i, bb) { - this.bb_pos = i; - this.bb = bb; - return this; -}; - -/** - * @param {flatbuffers.ByteBuffer} bb - * @param {nd4j.graph.FlatNode=} obj - * @returns {nd4j.graph.FlatNode} - */ -nd4j.graph.FlatNode.getRootAsFlatNode = function(bb, obj) { - return (obj || new nd4j.graph.FlatNode).__init(bb.readInt32(bb.position()) + bb.position(), bb); -}; - -/** - * @returns {number} - */ -nd4j.graph.FlatNode.prototype.id = function() { - var offset = this.bb.__offset(this.bb_pos, 4); - return offset ? this.bb.readInt32(this.bb_pos + offset) : 0; -}; - -/** - * @param {flatbuffers.Encoding=} optionalEncoding - * @returns {string|Uint8Array|null} - */ -nd4j.graph.FlatNode.prototype.name = function(optionalEncoding) { - var offset = this.bb.__offset(this.bb_pos, 6); - return offset ? this.bb.__string(this.bb_pos + offset, optionalEncoding) : null; -}; - -/** - * @returns {nd4j.graph.OpType} - */ -nd4j.graph.FlatNode.prototype.opType = function() { - var offset = this.bb.__offset(this.bb_pos, 8); - return offset ? 
/** @type {nd4j.graph.OpType} */ (this.bb.readInt8(this.bb_pos + offset)) : nd4j.graph.OpType.TRANSFORM_FLOAT; -}; - -/** - * @returns {flatbuffers.Long} - */ -nd4j.graph.FlatNode.prototype.opNum = function() { - var offset = this.bb.__offset(this.bb_pos, 10); - return offset ? this.bb.readInt64(this.bb_pos + offset) : this.bb.createLong(0, 0); -}; - -/** - * @param {number} index - * @param {nd4j.graph.FlatProperties=} obj - * @returns {nd4j.graph.FlatProperties} - */ -nd4j.graph.FlatNode.prototype.properties = function(index, obj) { - var offset = this.bb.__offset(this.bb_pos, 12); - return offset ? (obj || new nd4j.graph.FlatProperties).__init(this.bb.__indirect(this.bb.__vector(this.bb_pos + offset) + index * 4), this.bb) : null; -}; - -/** - * @returns {number} - */ -nd4j.graph.FlatNode.prototype.propertiesLength = function() { - var offset = this.bb.__offset(this.bb_pos, 12); - return offset ? this.bb.__vector_len(this.bb_pos + offset) : 0; -}; - -/** - * @param {number} index - * @returns {number} - */ -nd4j.graph.FlatNode.prototype.input = function(index) { - var offset = this.bb.__offset(this.bb_pos, 14); - return offset ? this.bb.readInt32(this.bb.__vector(this.bb_pos + offset) + index * 4) : 0; -}; - -/** - * @returns {number} - */ -nd4j.graph.FlatNode.prototype.inputLength = function() { - var offset = this.bb.__offset(this.bb_pos, 14); - return offset ? this.bb.__vector_len(this.bb_pos + offset) : 0; -}; - -/** - * @returns {Int32Array} - */ -nd4j.graph.FlatNode.prototype.inputArray = function() { - var offset = this.bb.__offset(this.bb_pos, 14); - return offset ? 
new Int32Array(this.bb.bytes().buffer, this.bb.bytes().byteOffset + this.bb.__vector(this.bb_pos + offset), this.bb.__vector_len(this.bb_pos + offset)) : null; -}; - -/** - * @param {number} index - * @param {nd4j.graph.IntPair=} obj - * @returns {nd4j.graph.IntPair} - */ -nd4j.graph.FlatNode.prototype.inputPaired = function(index, obj) { - var offset = this.bb.__offset(this.bb_pos, 16); - return offset ? (obj || new nd4j.graph.IntPair).__init(this.bb.__indirect(this.bb.__vector(this.bb_pos + offset) + index * 4), this.bb) : null; -}; - -/** - * @returns {number} - */ -nd4j.graph.FlatNode.prototype.inputPairedLength = function() { - var offset = this.bb.__offset(this.bb_pos, 16); - return offset ? this.bb.__vector_len(this.bb_pos + offset) : 0; -}; - -/** - * @param {number} index - * @returns {number} - */ -nd4j.graph.FlatNode.prototype.output = function(index) { - var offset = this.bb.__offset(this.bb_pos, 18); - return offset ? this.bb.readInt32(this.bb.__vector(this.bb_pos + offset) + index * 4) : 0; -}; - -/** - * @returns {number} - */ -nd4j.graph.FlatNode.prototype.outputLength = function() { - var offset = this.bb.__offset(this.bb_pos, 18); - return offset ? this.bb.__vector_len(this.bb_pos + offset) : 0; -}; - -/** - * @returns {Int32Array} - */ -nd4j.graph.FlatNode.prototype.outputArray = function() { - var offset = this.bb.__offset(this.bb_pos, 18); - return offset ? new Int32Array(this.bb.bytes().buffer, this.bb.bytes().byteOffset + this.bb.__vector(this.bb_pos + offset), this.bb.__vector_len(this.bb_pos + offset)) : null; -}; - -/** - * @param {number} index - * @returns {number} - */ -nd4j.graph.FlatNode.prototype.extraParams = function(index) { - var offset = this.bb.__offset(this.bb_pos, 20); - return offset ? 
this.bb.readFloat64(this.bb.__vector(this.bb_pos + offset) + index * 8) : 0; -}; - -/** - * @returns {number} - */ -nd4j.graph.FlatNode.prototype.extraParamsLength = function() { - var offset = this.bb.__offset(this.bb_pos, 20); - return offset ? this.bb.__vector_len(this.bb_pos + offset) : 0; -}; - -/** - * @returns {Float64Array} - */ -nd4j.graph.FlatNode.prototype.extraParamsArray = function() { - var offset = this.bb.__offset(this.bb_pos, 20); - return offset ? new Float64Array(this.bb.bytes().buffer, this.bb.bytes().byteOffset + this.bb.__vector(this.bb_pos + offset), this.bb.__vector_len(this.bb_pos + offset)) : null; -}; - -/** - * @param {number} index - * @returns {flatbuffers.Long} - */ -nd4j.graph.FlatNode.prototype.extraInteger = function(index) { - var offset = this.bb.__offset(this.bb_pos, 22); - return offset ? this.bb.readInt64(this.bb.__vector(this.bb_pos + offset) + index * 8) : this.bb.createLong(0, 0); -}; - -/** - * @returns {number} - */ -nd4j.graph.FlatNode.prototype.extraIntegerLength = function() { - var offset = this.bb.__offset(this.bb_pos, 22); - return offset ? this.bb.__vector_len(this.bb_pos + offset) : 0; -}; - -/** - * @param {number} index - * @returns {boolean} - */ -nd4j.graph.FlatNode.prototype.extraBools = function(index) { - var offset = this.bb.__offset(this.bb_pos, 24); - return offset ? !!this.bb.readInt8(this.bb.__vector(this.bb_pos + offset) + index) : false; -}; - -/** - * @returns {number} - */ -nd4j.graph.FlatNode.prototype.extraBoolsLength = function() { - var offset = this.bb.__offset(this.bb_pos, 24); - return offset ? this.bb.__vector_len(this.bb_pos + offset) : 0; -}; - -/** - * @returns {Int8Array} - */ -nd4j.graph.FlatNode.prototype.extraBoolsArray = function() { - var offset = this.bb.__offset(this.bb_pos, 24); - return offset ? 
new Int8Array(this.bb.bytes().buffer, this.bb.bytes().byteOffset + this.bb.__vector(this.bb_pos + offset), this.bb.__vector_len(this.bb_pos + offset)) : null; -}; - -/** - * @param {number} index - * @returns {number} - */ -nd4j.graph.FlatNode.prototype.dimensions = function(index) { - var offset = this.bb.__offset(this.bb_pos, 26); - return offset ? this.bb.readInt32(this.bb.__vector(this.bb_pos + offset) + index * 4) : 0; -}; - -/** - * @returns {number} - */ -nd4j.graph.FlatNode.prototype.dimensionsLength = function() { - var offset = this.bb.__offset(this.bb_pos, 26); - return offset ? this.bb.__vector_len(this.bb_pos + offset) : 0; -}; - -/** - * @returns {Int32Array} - */ -nd4j.graph.FlatNode.prototype.dimensionsArray = function() { - var offset = this.bb.__offset(this.bb_pos, 26); - return offset ? new Int32Array(this.bb.bytes().buffer, this.bb.bytes().byteOffset + this.bb.__vector(this.bb_pos + offset), this.bb.__vector_len(this.bb_pos + offset)) : null; -}; - -/** - * @returns {number} - */ -nd4j.graph.FlatNode.prototype.device = function() { - var offset = this.bb.__offset(this.bb_pos, 28); - return offset ? this.bb.readInt32(this.bb_pos + offset) : 0; -}; - -/** - * @returns {number} - */ -nd4j.graph.FlatNode.prototype.scopeId = function() { - var offset = this.bb.__offset(this.bb_pos, 30); - return offset ? this.bb.readInt32(this.bb_pos + offset) : 0; -}; - -/** - * @param {flatbuffers.Encoding=} optionalEncoding - * @returns {string|Uint8Array|null} - */ -nd4j.graph.FlatNode.prototype.scopeName = function(optionalEncoding) { - var offset = this.bb.__offset(this.bb_pos, 32); - return offset ? this.bb.__string(this.bb_pos + offset, optionalEncoding) : null; -}; - -/** - * @param {number} index - * @param {flatbuffers.Encoding=} optionalEncoding - * @returns {string|Uint8Array} - */ -nd4j.graph.FlatNode.prototype.outputNames = function(index, optionalEncoding) { - var offset = this.bb.__offset(this.bb_pos, 34); - return offset ? 
this.bb.__string(this.bb.__vector(this.bb_pos + offset) + index * 4, optionalEncoding) : null; -}; - -/** - * @returns {number} - */ -nd4j.graph.FlatNode.prototype.outputNamesLength = function() { - var offset = this.bb.__offset(this.bb_pos, 34); - return offset ? this.bb.__vector_len(this.bb_pos + offset) : 0; -}; - -/** - * @param {flatbuffers.Encoding=} optionalEncoding - * @returns {string|Uint8Array|null} - */ -nd4j.graph.FlatNode.prototype.opName = function(optionalEncoding) { - var offset = this.bb.__offset(this.bb_pos, 36); - return offset ? this.bb.__string(this.bb_pos + offset, optionalEncoding) : null; -}; - -/** - * @param {number} index - * @returns {nd4j.graph.DType} - */ -nd4j.graph.FlatNode.prototype.outputTypes = function(index) { - var offset = this.bb.__offset(this.bb_pos, 38); - return offset ? /** @type {nd4j.graph.DType} */ (this.bb.readInt8(this.bb.__vector(this.bb_pos + offset) + index)) : /** @type {nd4j.graph.DType} */ (0); -}; - -/** - * @returns {number} - */ -nd4j.graph.FlatNode.prototype.outputTypesLength = function() { - var offset = this.bb.__offset(this.bb_pos, 38); - return offset ? this.bb.__vector_len(this.bb_pos + offset) : 0; -}; - -/** - * @returns {Int8Array} - */ -nd4j.graph.FlatNode.prototype.outputTypesArray = function() { - var offset = this.bb.__offset(this.bb_pos, 38); - return offset ? new Int8Array(this.bb.bytes().buffer, this.bb.bytes().byteOffset + this.bb.__vector(this.bb_pos + offset), this.bb.__vector_len(this.bb_pos + offset)) : null; -}; - -/** - * @param {nd4j.graph.FlatArray=} obj - * @returns {nd4j.graph.FlatArray|null} - */ -nd4j.graph.FlatNode.prototype.scalar = function(obj) { - var offset = this.bb.__offset(this.bb_pos, 40); - return offset ? 
(obj || new nd4j.graph.FlatArray).__init(this.bb.__indirect(this.bb_pos + offset), this.bb) : null; -}; - -/** - * @param {number} index - * @param {flatbuffers.Encoding=} optionalEncoding - * @returns {string|Uint8Array} - */ -nd4j.graph.FlatNode.prototype.controlDeps = function(index, optionalEncoding) { - var offset = this.bb.__offset(this.bb_pos, 42); - return offset ? this.bb.__string(this.bb.__vector(this.bb_pos + offset) + index * 4, optionalEncoding) : null; -}; - -/** - * @returns {number} - */ -nd4j.graph.FlatNode.prototype.controlDepsLength = function() { - var offset = this.bb.__offset(this.bb_pos, 42); - return offset ? this.bb.__vector_len(this.bb_pos + offset) : 0; -}; - -/** - * @param {number} index - * @param {flatbuffers.Encoding=} optionalEncoding - * @returns {string|Uint8Array} - */ -nd4j.graph.FlatNode.prototype.varControlDeps = function(index, optionalEncoding) { - var offset = this.bb.__offset(this.bb_pos, 44); - return offset ? this.bb.__string(this.bb.__vector(this.bb_pos + offset) + index * 4, optionalEncoding) : null; -}; - -/** - * @returns {number} - */ -nd4j.graph.FlatNode.prototype.varControlDepsLength = function() { - var offset = this.bb.__offset(this.bb_pos, 44); - return offset ? this.bb.__vector_len(this.bb_pos + offset) : 0; -}; - -/** - * @param {number} index - * @param {flatbuffers.Encoding=} optionalEncoding - * @returns {string|Uint8Array} - */ -nd4j.graph.FlatNode.prototype.controlDepFor = function(index, optionalEncoding) { - var offset = this.bb.__offset(this.bb_pos, 46); - return offset ? this.bb.__string(this.bb.__vector(this.bb_pos + offset) + index * 4, optionalEncoding) : null; -}; - -/** - * @returns {number} - */ -nd4j.graph.FlatNode.prototype.controlDepForLength = function() { - var offset = this.bb.__offset(this.bb_pos, 46); - return offset ? 
this.bb.__vector_len(this.bb_pos + offset) : 0; -}; - -/** - * @param {number} index - * @returns {nd4j.graph.DType} - */ -nd4j.graph.FlatNode.prototype.extraTypes = function(index) { - var offset = this.bb.__offset(this.bb_pos, 48); - return offset ? /** @type {nd4j.graph.DType} */ (this.bb.readInt8(this.bb.__vector(this.bb_pos + offset) + index)) : /** @type {nd4j.graph.DType} */ (0); -}; - -/** - * @returns {number} - */ -nd4j.graph.FlatNode.prototype.extraTypesLength = function() { - var offset = this.bb.__offset(this.bb_pos, 48); - return offset ? this.bb.__vector_len(this.bb_pos + offset) : 0; -}; - -/** - * @returns {Int8Array} - */ -nd4j.graph.FlatNode.prototype.extraTypesArray = function() { - var offset = this.bb.__offset(this.bb_pos, 48); - return offset ? new Int8Array(this.bb.bytes().buffer, this.bb.bytes().byteOffset + this.bb.__vector(this.bb_pos + offset), this.bb.__vector_len(this.bb_pos + offset)) : null; -}; - -/** - * @param {flatbuffers.Builder} builder - */ -nd4j.graph.FlatNode.startFlatNode = function(builder) { - builder.startObject(23); -}; - -/** - * @param {flatbuffers.Builder} builder - * @param {number} id - */ -nd4j.graph.FlatNode.addId = function(builder, id) { - builder.addFieldInt32(0, id, 0); -}; - -/** - * @param {flatbuffers.Builder} builder - * @param {flatbuffers.Offset} nameOffset - */ -nd4j.graph.FlatNode.addName = function(builder, nameOffset) { - builder.addFieldOffset(1, nameOffset, 0); -}; - -/** - * @param {flatbuffers.Builder} builder - * @param {nd4j.graph.OpType} opType - */ -nd4j.graph.FlatNode.addOpType = function(builder, opType) { - builder.addFieldInt8(2, opType, nd4j.graph.OpType.TRANSFORM_FLOAT); -}; - -/** - * @param {flatbuffers.Builder} builder - * @param {flatbuffers.Long} opNum - */ -nd4j.graph.FlatNode.addOpNum = function(builder, opNum) { - builder.addFieldInt64(3, opNum, builder.createLong(0, 0)); -}; - -/** - * @param {flatbuffers.Builder} builder - * @param {flatbuffers.Offset} propertiesOffset - */ 
-nd4j.graph.FlatNode.addProperties = function(builder, propertiesOffset) { - builder.addFieldOffset(4, propertiesOffset, 0); -}; - -/** - * @param {flatbuffers.Builder} builder - * @param {Array.} data - * @returns {flatbuffers.Offset} - */ -nd4j.graph.FlatNode.createPropertiesVector = function(builder, data) { - builder.startVector(4, data.length, 4); - for (var i = data.length - 1; i >= 0; i--) { - builder.addOffset(data[i]); - } - return builder.endVector(); -}; - -/** - * @param {flatbuffers.Builder} builder - * @param {number} numElems - */ -nd4j.graph.FlatNode.startPropertiesVector = function(builder, numElems) { - builder.startVector(4, numElems, 4); -}; - -/** - * @param {flatbuffers.Builder} builder - * @param {flatbuffers.Offset} inputOffset - */ -nd4j.graph.FlatNode.addInput = function(builder, inputOffset) { - builder.addFieldOffset(5, inputOffset, 0); -}; - -/** - * @param {flatbuffers.Builder} builder - * @param {Array.} data - * @returns {flatbuffers.Offset} - */ -nd4j.graph.FlatNode.createInputVector = function(builder, data) { - builder.startVector(4, data.length, 4); - for (var i = data.length - 1; i >= 0; i--) { - builder.addInt32(data[i]); - } - return builder.endVector(); -}; - -/** - * @param {flatbuffers.Builder} builder - * @param {number} numElems - */ -nd4j.graph.FlatNode.startInputVector = function(builder, numElems) { - builder.startVector(4, numElems, 4); -}; - -/** - * @param {flatbuffers.Builder} builder - * @param {flatbuffers.Offset} inputPairedOffset - */ -nd4j.graph.FlatNode.addInputPaired = function(builder, inputPairedOffset) { - builder.addFieldOffset(6, inputPairedOffset, 0); -}; - -/** - * @param {flatbuffers.Builder} builder - * @param {Array.} data - * @returns {flatbuffers.Offset} - */ -nd4j.graph.FlatNode.createInputPairedVector = function(builder, data) { - builder.startVector(4, data.length, 4); - for (var i = data.length - 1; i >= 0; i--) { - builder.addOffset(data[i]); - } - return builder.endVector(); -}; - -/** - * 
@param {flatbuffers.Builder} builder - * @param {number} numElems - */ -nd4j.graph.FlatNode.startInputPairedVector = function(builder, numElems) { - builder.startVector(4, numElems, 4); -}; - -/** - * @param {flatbuffers.Builder} builder - * @param {flatbuffers.Offset} outputOffset - */ -nd4j.graph.FlatNode.addOutput = function(builder, outputOffset) { - builder.addFieldOffset(7, outputOffset, 0); -}; - -/** - * @param {flatbuffers.Builder} builder - * @param {Array.} data - * @returns {flatbuffers.Offset} - */ -nd4j.graph.FlatNode.createOutputVector = function(builder, data) { - builder.startVector(4, data.length, 4); - for (var i = data.length - 1; i >= 0; i--) { - builder.addInt32(data[i]); - } - return builder.endVector(); -}; - -/** - * @param {flatbuffers.Builder} builder - * @param {number} numElems - */ -nd4j.graph.FlatNode.startOutputVector = function(builder, numElems) { - builder.startVector(4, numElems, 4); -}; - -/** - * @param {flatbuffers.Builder} builder - * @param {flatbuffers.Offset} extraParamsOffset - */ -nd4j.graph.FlatNode.addExtraParams = function(builder, extraParamsOffset) { - builder.addFieldOffset(8, extraParamsOffset, 0); -}; - -/** - * @param {flatbuffers.Builder} builder - * @param {Array.} data - * @returns {flatbuffers.Offset} - */ -nd4j.graph.FlatNode.createExtraParamsVector = function(builder, data) { - builder.startVector(8, data.length, 8); - for (var i = data.length - 1; i >= 0; i--) { - builder.addFloat64(data[i]); - } - return builder.endVector(); -}; - -/** - * @param {flatbuffers.Builder} builder - * @param {number} numElems - */ -nd4j.graph.FlatNode.startExtraParamsVector = function(builder, numElems) { - builder.startVector(8, numElems, 8); -}; - -/** - * @param {flatbuffers.Builder} builder - * @param {flatbuffers.Offset} extraIntegerOffset - */ -nd4j.graph.FlatNode.addExtraInteger = function(builder, extraIntegerOffset) { - builder.addFieldOffset(9, extraIntegerOffset, 0); -}; - -/** - * @param {flatbuffers.Builder} 
builder - * @param {Array.} data - * @returns {flatbuffers.Offset} - */ -nd4j.graph.FlatNode.createExtraIntegerVector = function(builder, data) { - builder.startVector(8, data.length, 8); - for (var i = data.length - 1; i >= 0; i--) { - builder.addInt64(data[i]); - } - return builder.endVector(); -}; - -/** - * @param {flatbuffers.Builder} builder - * @param {number} numElems - */ -nd4j.graph.FlatNode.startExtraIntegerVector = function(builder, numElems) { - builder.startVector(8, numElems, 8); -}; - -/** - * @param {flatbuffers.Builder} builder - * @param {flatbuffers.Offset} extraBoolsOffset - */ -nd4j.graph.FlatNode.addExtraBools = function(builder, extraBoolsOffset) { - builder.addFieldOffset(10, extraBoolsOffset, 0); -}; - -/** - * @param {flatbuffers.Builder} builder - * @param {Array.} data - * @returns {flatbuffers.Offset} - */ -nd4j.graph.FlatNode.createExtraBoolsVector = function(builder, data) { - builder.startVector(1, data.length, 1); - for (var i = data.length - 1; i >= 0; i--) { - builder.addInt8(+data[i]); - } - return builder.endVector(); -}; - -/** - * @param {flatbuffers.Builder} builder - * @param {number} numElems - */ -nd4j.graph.FlatNode.startExtraBoolsVector = function(builder, numElems) { - builder.startVector(1, numElems, 1); -}; - -/** - * @param {flatbuffers.Builder} builder - * @param {flatbuffers.Offset} dimensionsOffset - */ -nd4j.graph.FlatNode.addDimensions = function(builder, dimensionsOffset) { - builder.addFieldOffset(11, dimensionsOffset, 0); -}; - -/** - * @param {flatbuffers.Builder} builder - * @param {Array.} data - * @returns {flatbuffers.Offset} - */ -nd4j.graph.FlatNode.createDimensionsVector = function(builder, data) { - builder.startVector(4, data.length, 4); - for (var i = data.length - 1; i >= 0; i--) { - builder.addInt32(data[i]); - } - return builder.endVector(); -}; - -/** - * @param {flatbuffers.Builder} builder - * @param {number} numElems - */ -nd4j.graph.FlatNode.startDimensionsVector = function(builder, 
numElems) { - builder.startVector(4, numElems, 4); -}; - -/** - * @param {flatbuffers.Builder} builder - * @param {number} device - */ -nd4j.graph.FlatNode.addDevice = function(builder, device) { - builder.addFieldInt32(12, device, 0); -}; - -/** - * @param {flatbuffers.Builder} builder - * @param {number} scopeId - */ -nd4j.graph.FlatNode.addScopeId = function(builder, scopeId) { - builder.addFieldInt32(13, scopeId, 0); -}; - -/** - * @param {flatbuffers.Builder} builder - * @param {flatbuffers.Offset} scopeNameOffset - */ -nd4j.graph.FlatNode.addScopeName = function(builder, scopeNameOffset) { - builder.addFieldOffset(14, scopeNameOffset, 0); -}; - -/** - * @param {flatbuffers.Builder} builder - * @param {flatbuffers.Offset} outputNamesOffset - */ -nd4j.graph.FlatNode.addOutputNames = function(builder, outputNamesOffset) { - builder.addFieldOffset(15, outputNamesOffset, 0); -}; - -/** - * @param {flatbuffers.Builder} builder - * @param {Array.} data - * @returns {flatbuffers.Offset} - */ -nd4j.graph.FlatNode.createOutputNamesVector = function(builder, data) { - builder.startVector(4, data.length, 4); - for (var i = data.length - 1; i >= 0; i--) { - builder.addOffset(data[i]); - } - return builder.endVector(); -}; - -/** - * @param {flatbuffers.Builder} builder - * @param {number} numElems - */ -nd4j.graph.FlatNode.startOutputNamesVector = function(builder, numElems) { - builder.startVector(4, numElems, 4); -}; - -/** - * @param {flatbuffers.Builder} builder - * @param {flatbuffers.Offset} opNameOffset - */ -nd4j.graph.FlatNode.addOpName = function(builder, opNameOffset) { - builder.addFieldOffset(16, opNameOffset, 0); -}; - -/** - * @param {flatbuffers.Builder} builder - * @param {flatbuffers.Offset} outputTypesOffset - */ -nd4j.graph.FlatNode.addOutputTypes = function(builder, outputTypesOffset) { - builder.addFieldOffset(17, outputTypesOffset, 0); -}; - -/** - * @param {flatbuffers.Builder} builder - * @param {Array.} data - * @returns {flatbuffers.Offset} - */ 
-nd4j.graph.FlatNode.createOutputTypesVector = function(builder, data) { - builder.startVector(1, data.length, 1); - for (var i = data.length - 1; i >= 0; i--) { - builder.addInt8(data[i]); - } - return builder.endVector(); -}; - -/** - * @param {flatbuffers.Builder} builder - * @param {number} numElems - */ -nd4j.graph.FlatNode.startOutputTypesVector = function(builder, numElems) { - builder.startVector(1, numElems, 1); -}; - -/** - * @param {flatbuffers.Builder} builder - * @param {flatbuffers.Offset} scalarOffset - */ -nd4j.graph.FlatNode.addScalar = function(builder, scalarOffset) { - builder.addFieldOffset(18, scalarOffset, 0); -}; - -/** - * @param {flatbuffers.Builder} builder - * @param {flatbuffers.Offset} controlDepsOffset - */ -nd4j.graph.FlatNode.addControlDeps = function(builder, controlDepsOffset) { - builder.addFieldOffset(19, controlDepsOffset, 0); -}; - -/** - * @param {flatbuffers.Builder} builder - * @param {Array.} data - * @returns {flatbuffers.Offset} - */ -nd4j.graph.FlatNode.createControlDepsVector = function(builder, data) { - builder.startVector(4, data.length, 4); - for (var i = data.length - 1; i >= 0; i--) { - builder.addOffset(data[i]); - } - return builder.endVector(); -}; - -/** - * @param {flatbuffers.Builder} builder - * @param {number} numElems - */ -nd4j.graph.FlatNode.startControlDepsVector = function(builder, numElems) { - builder.startVector(4, numElems, 4); -}; - -/** - * @param {flatbuffers.Builder} builder - * @param {flatbuffers.Offset} varControlDepsOffset - */ -nd4j.graph.FlatNode.addVarControlDeps = function(builder, varControlDepsOffset) { - builder.addFieldOffset(20, varControlDepsOffset, 0); -}; - -/** - * @param {flatbuffers.Builder} builder - * @param {Array.} data - * @returns {flatbuffers.Offset} - */ -nd4j.graph.FlatNode.createVarControlDepsVector = function(builder, data) { - builder.startVector(4, data.length, 4); - for (var i = data.length - 1; i >= 0; i--) { - builder.addOffset(data[i]); - } - return 
builder.endVector(); -}; - -/** - * @param {flatbuffers.Builder} builder - * @param {number} numElems - */ -nd4j.graph.FlatNode.startVarControlDepsVector = function(builder, numElems) { - builder.startVector(4, numElems, 4); -}; - -/** - * @param {flatbuffers.Builder} builder - * @param {flatbuffers.Offset} controlDepForOffset - */ -nd4j.graph.FlatNode.addControlDepFor = function(builder, controlDepForOffset) { - builder.addFieldOffset(21, controlDepForOffset, 0); -}; - -/** - * @param {flatbuffers.Builder} builder - * @param {Array.} data - * @returns {flatbuffers.Offset} - */ -nd4j.graph.FlatNode.createControlDepForVector = function(builder, data) { - builder.startVector(4, data.length, 4); - for (var i = data.length - 1; i >= 0; i--) { - builder.addOffset(data[i]); - } - return builder.endVector(); -}; - -/** - * @param {flatbuffers.Builder} builder - * @param {number} numElems - */ -nd4j.graph.FlatNode.startControlDepForVector = function(builder, numElems) { - builder.startVector(4, numElems, 4); -}; - -/** - * @param {flatbuffers.Builder} builder - * @param {flatbuffers.Offset} extraTypesOffset - */ -nd4j.graph.FlatNode.addExtraTypes = function(builder, extraTypesOffset) { - builder.addFieldOffset(22, extraTypesOffset, 0); -}; - -/** - * @param {flatbuffers.Builder} builder - * @param {Array.} data - * @returns {flatbuffers.Offset} - */ -nd4j.graph.FlatNode.createExtraTypesVector = function(builder, data) { - builder.startVector(1, data.length, 1); - for (var i = data.length - 1; i >= 0; i--) { - builder.addInt8(data[i]); - } - return builder.endVector(); -}; - -/** - * @param {flatbuffers.Builder} builder - * @param {number} numElems - */ -nd4j.graph.FlatNode.startExtraTypesVector = function(builder, numElems) { - builder.startVector(1, numElems, 1); -}; - -/** - * @param {flatbuffers.Builder} builder - * @returns {flatbuffers.Offset} - */ -nd4j.graph.FlatNode.endFlatNode = function(builder) { - var offset = builder.endObject(); - return offset; -}; - -/** - 
* @param {flatbuffers.Builder} builder
- * @param {flatbuffers.Offset} offset
- */
-nd4j.graph.FlatNode.finishFlatNodeBuffer = function(builder, offset) {
-  builder.finish(offset);
-};
-
-// Exports for Node.js and RequireJS
-this.nd4j = nd4j;
diff --git a/libnd4j/include/graph/generated/properties_generated.h b/libnd4j/include/graph/generated/properties_generated.h
deleted file mode 100644
index 34138fe86..000000000
--- a/libnd4j/include/graph/generated/properties_generated.h
+++ /dev/null
@@ -1,191 +0,0 @@
-// automatically generated by the FlatBuffers compiler, do not modify
-
-
-#ifndef FLATBUFFERS_GENERATED_PROPERTIES_ND4J_GRAPH_H_
-#define FLATBUFFERS_GENERATED_PROPERTIES_ND4J_GRAPH_H_
-
-#include "flatbuffers/flatbuffers.h"
-
-#include "array_generated.h"
-
-namespace sd {
-namespace graph {
-
-struct FlatProperties;
-
-struct FlatProperties FLATBUFFERS_FINAL_CLASS : private flatbuffers::Table {
-  enum {
-    VT_NAME = 4,
-    VT_I = 6,
-    VT_L = 8,
-    VT_D = 10,
-    VT_A = 12,
-    VT_B = 14,
-    VT_S = 16,
-    VT_SHAPE = 18
-  };
-  const flatbuffers::String *name() const {
-    return GetPointer<const flatbuffers::String *>(VT_NAME);
-  }
-  const flatbuffers::Vector<int32_t> *i() const {
-    return GetPointer<const flatbuffers::Vector<int32_t> *>(VT_I);
-  }
-  const flatbuffers::Vector<int64_t> *l() const {
-    return GetPointer<const flatbuffers::Vector<int64_t> *>(VT_L);
-  }
-  const flatbuffers::Vector<double> *d() const {
-    return GetPointer<const flatbuffers::Vector<double> *>(VT_D);
-  }
-  const flatbuffers::Vector<flatbuffers::Offset<FlatArray>> *a() const {
-    return GetPointer<const flatbuffers::Vector<flatbuffers::Offset<FlatArray>> *>(VT_A);
-  }
-  const flatbuffers::Vector<uint8_t> *b() const {
-    return GetPointer<const flatbuffers::Vector<uint8_t> *>(VT_B);
-  }
-  const flatbuffers::Vector<flatbuffers::Offset<flatbuffers::String>> *s() const {
-    return GetPointer<const flatbuffers::Vector<flatbuffers::Offset<flatbuffers::String>> *>(VT_S);
-  }
-  const flatbuffers::Vector<int32_t> *shape() const {
-    return GetPointer<const flatbuffers::Vector<int32_t> *>(VT_SHAPE);
-  }
-  bool Verify(flatbuffers::Verifier &verifier) const {
-    return VerifyTableStart(verifier) &&
-           VerifyOffset(verifier, VT_NAME) &&
-           verifier.VerifyString(name()) &&
-           VerifyOffset(verifier, VT_I) &&
-           verifier.VerifyVector(i()) &&
-           VerifyOffset(verifier, VT_L) &&
-           verifier.VerifyVector(l()) &&
-           VerifyOffset(verifier, VT_D) &&
-           verifier.VerifyVector(d()) &&
-           VerifyOffset(verifier, VT_A) &&
-           verifier.VerifyVector(a()) &&
-           verifier.VerifyVectorOfTables(a()) &&
-           VerifyOffset(verifier, VT_B) &&
-           verifier.VerifyVector(b()) &&
-           VerifyOffset(verifier, VT_S) &&
-           verifier.VerifyVector(s()) &&
-           verifier.VerifyVectorOfStrings(s()) &&
-           VerifyOffset(verifier, VT_SHAPE) &&
-           verifier.VerifyVector(shape()) &&
-           verifier.EndTable();
-  }
-};
-
-struct FlatPropertiesBuilder {
-  flatbuffers::FlatBufferBuilder &fbb_;
-  flatbuffers::uoffset_t start_;
-  void add_name(flatbuffers::Offset<flatbuffers::String> name) {
-    fbb_.AddOffset(FlatProperties::VT_NAME, name);
-  }
-  void add_i(flatbuffers::Offset<flatbuffers::Vector<int32_t>> i) {
-    fbb_.AddOffset(FlatProperties::VT_I, i);
-  }
-  void add_l(flatbuffers::Offset<flatbuffers::Vector<int64_t>> l) {
-    fbb_.AddOffset(FlatProperties::VT_L, l);
-  }
-  void add_d(flatbuffers::Offset<flatbuffers::Vector<double>> d) {
-    fbb_.AddOffset(FlatProperties::VT_D, d);
-  }
-  void add_a(flatbuffers::Offset<flatbuffers::Vector<flatbuffers::Offset<FlatArray>>> a) {
-    fbb_.AddOffset(FlatProperties::VT_A, a);
-  }
-  void add_b(flatbuffers::Offset<flatbuffers::Vector<uint8_t>> b) {
-    fbb_.AddOffset(FlatProperties::VT_B, b);
-  }
-  void add_s(flatbuffers::Offset<flatbuffers::Vector<flatbuffers::Offset<flatbuffers::String>>> s) {
-    fbb_.AddOffset(FlatProperties::VT_S, s);
-  }
-  void add_shape(flatbuffers::Offset<flatbuffers::Vector<int32_t>> shape) {
-    fbb_.AddOffset(FlatProperties::VT_SHAPE, shape);
-  }
-  explicit FlatPropertiesBuilder(flatbuffers::FlatBufferBuilder &_fbb)
-        : fbb_(_fbb) {
-    start_ = fbb_.StartTable();
-  }
-  FlatPropertiesBuilder &operator=(const FlatPropertiesBuilder &);
-  flatbuffers::Offset<FlatProperties> Finish() {
-    const auto end = fbb_.EndTable(start_);
-    auto o = flatbuffers::Offset<FlatProperties>(end);
-    return o;
-  }
-};
-
-inline flatbuffers::Offset<FlatProperties> CreateFlatProperties(
-    flatbuffers::FlatBufferBuilder &_fbb,
-    flatbuffers::Offset<flatbuffers::String> name = 0,
-    flatbuffers::Offset<flatbuffers::Vector<int32_t>> i = 0,
-    flatbuffers::Offset<flatbuffers::Vector<int64_t>> l = 0,
-    flatbuffers::Offset<flatbuffers::Vector<double>> d = 0,
-    flatbuffers::Offset<flatbuffers::Vector<flatbuffers::Offset<FlatArray>>> a = 0,
-    flatbuffers::Offset<flatbuffers::Vector<uint8_t>> b = 0,
-    flatbuffers::Offset<flatbuffers::Vector<flatbuffers::Offset<flatbuffers::String>>> s = 0,
-    flatbuffers::Offset<flatbuffers::Vector<int32_t>> shape = 0) {
-  FlatPropertiesBuilder builder_(_fbb);
-  builder_.add_shape(shape);
-  builder_.add_s(s);
-  builder_.add_b(b);
-  builder_.add_a(a);
-  builder_.add_d(d);
-  builder_.add_l(l);
-  builder_.add_i(i);
-  builder_.add_name(name);
-  return builder_.Finish();
-}
-
-inline flatbuffers::Offset<FlatProperties> CreateFlatPropertiesDirect(
-    flatbuffers::FlatBufferBuilder &_fbb,
-    const char *name = nullptr,
-    const std::vector<int32_t> *i = nullptr,
-    const std::vector<int64_t> *l = nullptr,
-    const std::vector<double> *d = nullptr,
-    const std::vector<flatbuffers::Offset<FlatArray>> *a = nullptr,
-    const std::vector<uint8_t> *b = nullptr,
-    const std::vector<flatbuffers::Offset<flatbuffers::String>> *s = nullptr,
-    const std::vector<int32_t> *shape = nullptr) {
-  return sd::graph::CreateFlatProperties(
-      _fbb,
-      name ? _fbb.CreateString(name) : 0,
-      i ? _fbb.CreateVector<int32_t>(*i) : 0,
-      l ? _fbb.CreateVector<int64_t>(*l) : 0,
-      d ? _fbb.CreateVector<double>(*d) : 0,
-      a ? _fbb.CreateVector<flatbuffers::Offset<FlatArray>>(*a) : 0,
-      b ? _fbb.CreateVector<uint8_t>(*b) : 0,
-      s ? _fbb.CreateVector<flatbuffers::Offset<flatbuffers::String>>(*s) : 0,
-      shape ? _fbb.CreateVector<int32_t>(*shape) : 0);
-}
-
-inline const sd::graph::FlatProperties *GetFlatProperties(const void *buf) {
-  return flatbuffers::GetRoot<sd::graph::FlatProperties>(buf);
-}
-
-inline const sd::graph::FlatProperties *GetSizePrefixedFlatProperties(const void *buf) {
-  return flatbuffers::GetSizePrefixedRoot<sd::graph::FlatProperties>(buf);
-}
-
-inline bool VerifyFlatPropertiesBuffer(
-    flatbuffers::Verifier &verifier) {
-  return verifier.VerifyBuffer<sd::graph::FlatProperties>(nullptr);
-}
-
-inline bool VerifySizePrefixedFlatPropertiesBuffer(
-    flatbuffers::Verifier &verifier) {
-  return verifier.VerifySizePrefixedBuffer<sd::graph::FlatProperties>(nullptr);
-}
-
-inline void FinishFlatPropertiesBuffer(
-    flatbuffers::FlatBufferBuilder &fbb,
-    flatbuffers::Offset<sd::graph::FlatProperties> root) {
-  fbb.Finish(root);
-}
-
-inline void FinishSizePrefixedFlatPropertiesBuffer(
-    flatbuffers::FlatBufferBuilder &fbb,
-    flatbuffers::Offset<sd::graph::FlatProperties> root) {
-  fbb.FinishSizePrefixed(root);
-}
-
-} // namespace graph
-} // namespace sd
-
-#endif // FLATBUFFERS_GENERATED_PROPERTIES_ND4J_GRAPH_H_
diff --git a/libnd4j/include/graph/generated/properties_generated.js b/libnd4j/include/graph/generated/properties_generated.js
deleted file mode 100644
index e3e1426d4..000000000
---
a/libnd4j/include/graph/generated/properties_generated.js +++ /dev/null @@ -1,466 +0,0 @@ -/* - * ****************************************************************************** - * * - * * - * * This program and the accompanying materials are made available under the - * * terms of the Apache License, Version 2.0 which is available at - * * https://www.apache.org/licenses/LICENSE-2.0. - * * - * * See the NOTICE file distributed with this work for additional - * * information regarding copyright ownership. - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * * License for the specific language governing permissions and limitations - * * under the License. - * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ - -/** - * @const - * @namespace - */ -var nd4j = nd4j || {}; - -/** - * @const - * @namespace - */ -nd4j.graph = nd4j.graph || {}; - -/** - * @constructor - */ -nd4j.graph.FlatProperties = function() { - /** - * @type {flatbuffers.ByteBuffer} - */ - this.bb = null; - - /** - * @type {number} - */ - this.bb_pos = 0; -}; - -/** - * @param {number} i - * @param {flatbuffers.ByteBuffer} bb - * @returns {nd4j.graph.FlatProperties} - */ -nd4j.graph.FlatProperties.prototype.__init = function(i, bb) { - this.bb_pos = i; - this.bb = bb; - return this; -}; - -/** - * @param {flatbuffers.ByteBuffer} bb - * @param {nd4j.graph.FlatProperties=} obj - * @returns {nd4j.graph.FlatProperties} - */ -nd4j.graph.FlatProperties.getRootAsFlatProperties = function(bb, obj) { - return (obj || new nd4j.graph.FlatProperties).__init(bb.readInt32(bb.position()) + bb.position(), bb); -}; - -/** - * @param {flatbuffers.Encoding=} optionalEncoding - * @returns {string|Uint8Array|null} - */ -nd4j.graph.FlatProperties.prototype.name = 
function(optionalEncoding) { - var offset = this.bb.__offset(this.bb_pos, 4); - return offset ? this.bb.__string(this.bb_pos + offset, optionalEncoding) : null; -}; - -/** - * @param {number} index - * @returns {number} - */ -nd4j.graph.FlatProperties.prototype.i = function(index) { - var offset = this.bb.__offset(this.bb_pos, 6); - return offset ? this.bb.readInt32(this.bb.__vector(this.bb_pos + offset) + index * 4) : 0; -}; - -/** - * @returns {number} - */ -nd4j.graph.FlatProperties.prototype.iLength = function() { - var offset = this.bb.__offset(this.bb_pos, 6); - return offset ? this.bb.__vector_len(this.bb_pos + offset) : 0; -}; - -/** - * @returns {Int32Array} - */ -nd4j.graph.FlatProperties.prototype.iArray = function() { - var offset = this.bb.__offset(this.bb_pos, 6); - return offset ? new Int32Array(this.bb.bytes().buffer, this.bb.bytes().byteOffset + this.bb.__vector(this.bb_pos + offset), this.bb.__vector_len(this.bb_pos + offset)) : null; -}; - -/** - * @param {number} index - * @returns {flatbuffers.Long} - */ -nd4j.graph.FlatProperties.prototype.l = function(index) { - var offset = this.bb.__offset(this.bb_pos, 8); - return offset ? this.bb.readInt64(this.bb.__vector(this.bb_pos + offset) + index * 8) : this.bb.createLong(0, 0); -}; - -/** - * @returns {number} - */ -nd4j.graph.FlatProperties.prototype.lLength = function() { - var offset = this.bb.__offset(this.bb_pos, 8); - return offset ? this.bb.__vector_len(this.bb_pos + offset) : 0; -}; - -/** - * @param {number} index - * @returns {number} - */ -nd4j.graph.FlatProperties.prototype.d = function(index) { - var offset = this.bb.__offset(this.bb_pos, 10); - return offset ? this.bb.readFloat64(this.bb.__vector(this.bb_pos + offset) + index * 8) : 0; -}; - -/** - * @returns {number} - */ -nd4j.graph.FlatProperties.prototype.dLength = function() { - var offset = this.bb.__offset(this.bb_pos, 10); - return offset ? 
this.bb.__vector_len(this.bb_pos + offset) : 0; -}; - -/** - * @returns {Float64Array} - */ -nd4j.graph.FlatProperties.prototype.dArray = function() { - var offset = this.bb.__offset(this.bb_pos, 10); - return offset ? new Float64Array(this.bb.bytes().buffer, this.bb.bytes().byteOffset + this.bb.__vector(this.bb_pos + offset), this.bb.__vector_len(this.bb_pos + offset)) : null; -}; - -/** - * @param {number} index - * @param {nd4j.graph.FlatArray=} obj - * @returns {nd4j.graph.FlatArray} - */ -nd4j.graph.FlatProperties.prototype.a = function(index, obj) { - var offset = this.bb.__offset(this.bb_pos, 12); - return offset ? (obj || new nd4j.graph.FlatArray).__init(this.bb.__indirect(this.bb.__vector(this.bb_pos + offset) + index * 4), this.bb) : null; -}; - -/** - * @returns {number} - */ -nd4j.graph.FlatProperties.prototype.aLength = function() { - var offset = this.bb.__offset(this.bb_pos, 12); - return offset ? this.bb.__vector_len(this.bb_pos + offset) : 0; -}; - -/** - * @param {number} index - * @returns {boolean} - */ -nd4j.graph.FlatProperties.prototype.b = function(index) { - var offset = this.bb.__offset(this.bb_pos, 14); - return offset ? !!this.bb.readInt8(this.bb.__vector(this.bb_pos + offset) + index) : false; -}; - -/** - * @returns {number} - */ -nd4j.graph.FlatProperties.prototype.bLength = function() { - var offset = this.bb.__offset(this.bb_pos, 14); - return offset ? this.bb.__vector_len(this.bb_pos + offset) : 0; -}; - -/** - * @returns {Int8Array} - */ -nd4j.graph.FlatProperties.prototype.bArray = function() { - var offset = this.bb.__offset(this.bb_pos, 14); - return offset ? 
new Int8Array(this.bb.bytes().buffer, this.bb.bytes().byteOffset + this.bb.__vector(this.bb_pos + offset), this.bb.__vector_len(this.bb_pos + offset)) : null; -}; - -/** - * @param {number} index - * @param {flatbuffers.Encoding=} optionalEncoding - * @returns {string|Uint8Array} - */ -nd4j.graph.FlatProperties.prototype.s = function(index, optionalEncoding) { - var offset = this.bb.__offset(this.bb_pos, 16); - return offset ? this.bb.__string(this.bb.__vector(this.bb_pos + offset) + index * 4, optionalEncoding) : null; -}; - -/** - * @returns {number} - */ -nd4j.graph.FlatProperties.prototype.sLength = function() { - var offset = this.bb.__offset(this.bb_pos, 16); - return offset ? this.bb.__vector_len(this.bb_pos + offset) : 0; -}; - -/** - * @param {number} index - * @returns {number} - */ -nd4j.graph.FlatProperties.prototype.shape = function(index) { - var offset = this.bb.__offset(this.bb_pos, 18); - return offset ? this.bb.readInt32(this.bb.__vector(this.bb_pos + offset) + index * 4) : 0; -}; - -/** - * @returns {number} - */ -nd4j.graph.FlatProperties.prototype.shapeLength = function() { - var offset = this.bb.__offset(this.bb_pos, 18); - return offset ? this.bb.__vector_len(this.bb_pos + offset) : 0; -}; - -/** - * @returns {Int32Array} - */ -nd4j.graph.FlatProperties.prototype.shapeArray = function() { - var offset = this.bb.__offset(this.bb_pos, 18); - return offset ? 
new Int32Array(this.bb.bytes().buffer, this.bb.bytes().byteOffset + this.bb.__vector(this.bb_pos + offset), this.bb.__vector_len(this.bb_pos + offset)) : null; -}; - -/** - * @param {flatbuffers.Builder} builder - */ -nd4j.graph.FlatProperties.startFlatProperties = function(builder) { - builder.startObject(8); -}; - -/** - * @param {flatbuffers.Builder} builder - * @param {flatbuffers.Offset} nameOffset - */ -nd4j.graph.FlatProperties.addName = function(builder, nameOffset) { - builder.addFieldOffset(0, nameOffset, 0); -}; - -/** - * @param {flatbuffers.Builder} builder - * @param {flatbuffers.Offset} iOffset - */ -nd4j.graph.FlatProperties.addI = function(builder, iOffset) { - builder.addFieldOffset(1, iOffset, 0); -}; - -/** - * @param {flatbuffers.Builder} builder - * @param {Array.<number>} data - * @returns {flatbuffers.Offset} - */ -nd4j.graph.FlatProperties.createIVector = function(builder, data) { - builder.startVector(4, data.length, 4); - for (var i = data.length - 1; i >= 0; i--) { - builder.addInt32(data[i]); - } - return builder.endVector(); -}; - -/** - * @param {flatbuffers.Builder} builder - * @param {number} numElems - */ -nd4j.graph.FlatProperties.startIVector = function(builder, numElems) { - builder.startVector(4, numElems, 4); -}; - -/** - * @param {flatbuffers.Builder} builder - * @param {flatbuffers.Offset} lOffset - */ -nd4j.graph.FlatProperties.addL = function(builder, lOffset) { - builder.addFieldOffset(2, lOffset, 0); -}; - -/** - * @param {flatbuffers.Builder} builder - * @param {Array.<flatbuffers.Long>} data - * @returns {flatbuffers.Offset} - */ -nd4j.graph.FlatProperties.createLVector = function(builder, data) { - builder.startVector(8, data.length, 8); - for (var i = data.length - 1; i >= 0; i--) { - builder.addInt64(data[i]); - } - return builder.endVector(); -}; - -/** - * @param {flatbuffers.Builder} builder - * @param {number} numElems - */ -nd4j.graph.FlatProperties.startLVector = function(builder, numElems) { - builder.startVector(8, numElems, 8); -}; - -/** - * @param {flatbuffers.Builder} builder - * @param {flatbuffers.Offset} dOffset - */ -nd4j.graph.FlatProperties.addD = function(builder, dOffset) { - builder.addFieldOffset(3, dOffset, 0); -}; - -/** - * @param {flatbuffers.Builder} builder - * @param {Array.<number>} data - * @returns {flatbuffers.Offset} - */ -nd4j.graph.FlatProperties.createDVector = function(builder, data) { - builder.startVector(8, data.length, 8); - for (var i = data.length - 1; i >= 0; i--) { - builder.addFloat64(data[i]); - } - return builder.endVector(); -}; - -/** - * @param {flatbuffers.Builder} builder - * @param {number} numElems - */ -nd4j.graph.FlatProperties.startDVector = function(builder, numElems) { - builder.startVector(8, numElems, 8); -}; - -/** - * @param {flatbuffers.Builder} builder - * @param {flatbuffers.Offset} aOffset - */ -nd4j.graph.FlatProperties.addA = function(builder, aOffset) { - builder.addFieldOffset(4, aOffset, 0); -}; - -/** - * @param {flatbuffers.Builder} builder - * @param {Array.<flatbuffers.Offset>} data - * @returns {flatbuffers.Offset} - */ -nd4j.graph.FlatProperties.createAVector = function(builder, data) { - builder.startVector(4, data.length, 4); - for (var i = data.length - 1; i >= 0; i--) { - builder.addOffset(data[i]); - } - return builder.endVector(); -}; - -/** - * @param {flatbuffers.Builder} builder - * @param {number} numElems - */ -nd4j.graph.FlatProperties.startAVector = function(builder, numElems) { - builder.startVector(4, numElems, 4); -}; - -/** - * @param {flatbuffers.Builder} builder - * @param {flatbuffers.Offset} bOffset - */ -nd4j.graph.FlatProperties.addB = function(builder, bOffset) { - builder.addFieldOffset(5, bOffset, 0); -}; - -/** - * @param {flatbuffers.Builder} builder - * @param {Array.<boolean>} data - * @returns {flatbuffers.Offset} - */ -nd4j.graph.FlatProperties.createBVector = function(builder, data) { - builder.startVector(1, data.length, 1); - for (var i = data.length - 1; i >= 0; i--) { - builder.addInt8(+data[i]); - } - return builder.endVector(); -}; - -/** - * @param {flatbuffers.Builder} builder - * @param {number} numElems - */ -nd4j.graph.FlatProperties.startBVector = function(builder, numElems) { - builder.startVector(1, numElems, 1); -}; - -/** - * @param {flatbuffers.Builder} builder - * @param {flatbuffers.Offset} sOffset - */ -nd4j.graph.FlatProperties.addS = function(builder, sOffset) { - builder.addFieldOffset(6, sOffset, 0); -}; - -/** - * @param {flatbuffers.Builder} builder - * @param {Array.<flatbuffers.Offset>} data - * @returns {flatbuffers.Offset} - */ -nd4j.graph.FlatProperties.createSVector = function(builder, data) { - builder.startVector(4, data.length, 4); - for (var i = data.length - 1; i >= 0; i--) { - builder.addOffset(data[i]); - } - return builder.endVector(); -}; - -/** - * @param {flatbuffers.Builder} builder - * @param {number} numElems - */ -nd4j.graph.FlatProperties.startSVector = function(builder, numElems) { - builder.startVector(4, numElems, 4); -}; - -/** - * @param {flatbuffers.Builder} builder - * @param {flatbuffers.Offset} shapeOffset - */ -nd4j.graph.FlatProperties.addShape = function(builder, shapeOffset) { - builder.addFieldOffset(7, shapeOffset, 0); -}; - -/** - * @param {flatbuffers.Builder} builder - * @param {Array.<number>} data - * @returns {flatbuffers.Offset} - */ -nd4j.graph.FlatProperties.createShapeVector = function(builder, data) { - builder.startVector(4, data.length, 4); - for (var i = data.length - 1; i >= 0; i--) { - builder.addInt32(data[i]); - } - return builder.endVector(); -}; - -/** - * @param {flatbuffers.Builder} builder - * @param {number} numElems - */ -nd4j.graph.FlatProperties.startShapeVector = function(builder, numElems) { - builder.startVector(4, numElems, 4); -}; - -/** - * @param {flatbuffers.Builder} builder - * @returns {flatbuffers.Offset} - */ -nd4j.graph.FlatProperties.endFlatProperties = function(builder) { - var offset = builder.endObject(); - return offset; -}; - -/** - * @param {flatbuffers.Builder} builder - * @param
{flatbuffers.Offset} offset - */ -nd4j.graph.FlatProperties.finishFlatPropertiesBuffer = function(builder, offset) { - builder.finish(offset); -}; - -// Exports for Node.js and RequireJS -this.nd4j = nd4j; diff --git a/libnd4j/include/graph/generated/request_generated.h b/libnd4j/include/graph/generated/request_generated.h deleted file mode 100644 index 00c782311..000000000 --- a/libnd4j/include/graph/generated/request_generated.h +++ /dev/null @@ -1,127 +0,0 @@ -// automatically generated by the FlatBuffers compiler, do not modify - - -#ifndef FLATBUFFERS_GENERATED_REQUEST_ND4J_GRAPH_H_ -#define FLATBUFFERS_GENERATED_REQUEST_ND4J_GRAPH_H_ - -#include "flatbuffers/flatbuffers.h" - -#include "array_generated.h" -#include "config_generated.h" -#include "utils_generated.h" -#include "variable_generated.h" - -namespace sd { -namespace graph { - -struct FlatInferenceRequest; - -struct FlatInferenceRequest FLATBUFFERS_FINAL_CLASS : private flatbuffers::Table { - enum { - VT_ID = 4, - VT_VARIABLES = 6, - VT_CONFIGURATION = 8 - }; - int64_t id() const { - return GetField<int64_t>(VT_ID, 0); - } - const flatbuffers::Vector<flatbuffers::Offset<FlatVariable>> *variables() const { - return GetPointer<const flatbuffers::Vector<flatbuffers::Offset<FlatVariable>> *>(VT_VARIABLES); - } - const FlatConfiguration *configuration() const { - return GetPointer<const FlatConfiguration *>(VT_CONFIGURATION); - } - bool Verify(flatbuffers::Verifier &verifier) const { - return VerifyTableStart(verifier) && - VerifyField<int64_t>(verifier, VT_ID) && - VerifyOffset(verifier, VT_VARIABLES) && - verifier.VerifyVector(variables()) && - verifier.VerifyVectorOfTables(variables()) && - VerifyOffset(verifier, VT_CONFIGURATION) && - verifier.VerifyTable(configuration()) && - verifier.EndTable(); - } -}; - -struct FlatInferenceRequestBuilder { - flatbuffers::FlatBufferBuilder &fbb_; - flatbuffers::uoffset_t start_; - void add_id(int64_t id) { - fbb_.AddElement<int64_t>(FlatInferenceRequest::VT_ID, id, 0); - } - void add_variables(flatbuffers::Offset<flatbuffers::Vector<flatbuffers::Offset<FlatVariable>>> variables) { - fbb_.AddOffset(FlatInferenceRequest::VT_VARIABLES, variables); - } - void add_configuration(flatbuffers::Offset<FlatConfiguration> configuration) { - fbb_.AddOffset(FlatInferenceRequest::VT_CONFIGURATION, configuration); - } - explicit FlatInferenceRequestBuilder(flatbuffers::FlatBufferBuilder &_fbb) - : fbb_(_fbb) { - start_ = fbb_.StartTable(); - } - FlatInferenceRequestBuilder &operator=(const FlatInferenceRequestBuilder &); - flatbuffers::Offset<FlatInferenceRequest> Finish() { - const auto end = fbb_.EndTable(start_); - auto o = flatbuffers::Offset<FlatInferenceRequest>(end); - return o; - } -}; - -inline flatbuffers::Offset<FlatInferenceRequest> CreateFlatInferenceRequest( - flatbuffers::FlatBufferBuilder &_fbb, - int64_t id = 0, - flatbuffers::Offset<flatbuffers::Vector<flatbuffers::Offset<FlatVariable>>> variables = 0, - flatbuffers::Offset<FlatConfiguration> configuration = 0) { - FlatInferenceRequestBuilder builder_(_fbb); - builder_.add_id(id); - builder_.add_configuration(configuration); - builder_.add_variables(variables); - return builder_.Finish(); -} - -inline flatbuffers::Offset<FlatInferenceRequest> CreateFlatInferenceRequestDirect( - flatbuffers::FlatBufferBuilder &_fbb, - int64_t id = 0, - const std::vector<flatbuffers::Offset<FlatVariable>> *variables = nullptr, - flatbuffers::Offset<FlatConfiguration> configuration = 0) { - return sd::graph::CreateFlatInferenceRequest( - _fbb, - id, - variables ? _fbb.CreateVector<flatbuffers::Offset<FlatVariable>>(*variables) : 0, - configuration); -} - -inline const sd::graph::FlatInferenceRequest *GetFlatInferenceRequest(const void *buf) { - return flatbuffers::GetRoot<sd::graph::FlatInferenceRequest>(buf); -} - -inline const sd::graph::FlatInferenceRequest *GetSizePrefixedFlatInferenceRequest(const void *buf) { - return flatbuffers::GetSizePrefixedRoot<sd::graph::FlatInferenceRequest>(buf); -} - -inline bool VerifyFlatInferenceRequestBuffer( - flatbuffers::Verifier &verifier) { - return verifier.VerifyBuffer<sd::graph::FlatInferenceRequest>(nullptr); -} - -inline bool VerifySizePrefixedFlatInferenceRequestBuffer( - flatbuffers::Verifier &verifier) { - return verifier.VerifySizePrefixedBuffer<sd::graph::FlatInferenceRequest>(nullptr); -} - -inline void FinishFlatInferenceRequestBuffer( - flatbuffers::FlatBufferBuilder &fbb, - flatbuffers::Offset<sd::graph::FlatInferenceRequest> root) { - fbb.Finish(root); -} - -inline void FinishSizePrefixedFlatInferenceRequestBuffer( - flatbuffers::FlatBufferBuilder &fbb, - flatbuffers::Offset<sd::graph::FlatInferenceRequest> root) { - fbb.FinishSizePrefixed(root); -} - -} // namespace graph -} // namespace sd - -#endif // FLATBUFFERS_GENERATED_REQUEST_ND4J_GRAPH_H_ diff --git a/libnd4j/include/graph/generated/request_generated.js b/libnd4j/include/graph/generated/request_generated.js deleted file mode 100644 index 98930a58f..000000000 --- a/libnd4j/include/graph/generated/request_generated.js +++ /dev/null @@ -1,173 +0,0 @@ -/* - * ****************************************************************************** - * * - * * - * * This program and the accompanying materials are made available under the - * * terms of the Apache License, Version 2.0 which is available at - * * https://www.apache.org/licenses/LICENSE-2.0. - * * - * * See the NOTICE file distributed with this work for additional - * * information regarding copyright ownership. - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the - * * License for the specific language governing permissions and limitations - * * under the License. - * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ - -/** - * @const - * @namespace - */ -var nd4j = nd4j || {}; - -/** - * @const - * @namespace - */ -nd4j.graph = nd4j.graph || {}; - -/** - * @constructor - */ -nd4j.graph.FlatInferenceRequest = function() { - /** - * @type {flatbuffers.ByteBuffer} - */ - this.bb = null; - - /** - * @type {number} - */ - this.bb_pos = 0; -}; - -/** - * @param {number} i - * @param {flatbuffers.ByteBuffer} bb - * @returns {nd4j.graph.FlatInferenceRequest} - */ -nd4j.graph.FlatInferenceRequest.prototype.__init = function(i, bb) { - this.bb_pos = i; - this.bb = bb; - return this; -}; - -/** - * @param {flatbuffers.ByteBuffer} bb - * @param {nd4j.graph.FlatInferenceRequest=} obj - * @returns {nd4j.graph.FlatInferenceRequest} - */ -nd4j.graph.FlatInferenceRequest.getRootAsFlatInferenceRequest = function(bb, obj) { - return (obj || new nd4j.graph.FlatInferenceRequest).__init(bb.readInt32(bb.position()) + bb.position(), bb); -}; - -/** - * @returns {flatbuffers.Long} - */ -nd4j.graph.FlatInferenceRequest.prototype.id = function() { - var offset = this.bb.__offset(this.bb_pos, 4); - return offset ? this.bb.readInt64(this.bb_pos + offset) : this.bb.createLong(0, 0); -}; - -/** - * @param {number} index - * @param {nd4j.graph.FlatVariable=} obj - * @returns {nd4j.graph.FlatVariable} - */ -nd4j.graph.FlatInferenceRequest.prototype.variables = function(index, obj) { - var offset = this.bb.__offset(this.bb_pos, 6); - return offset ? (obj || new nd4j.graph.FlatVariable).__init(this.bb.__indirect(this.bb.__vector(this.bb_pos + offset) + index * 4), this.bb) : null; -}; - -/** - * @returns {number} - */ -nd4j.graph.FlatInferenceRequest.prototype.variablesLength = function() { - var offset = this.bb.__offset(this.bb_pos, 6); - return offset ? 
this.bb.__vector_len(this.bb_pos + offset) : 0; -}; - -/** - * @param {nd4j.graph.FlatConfiguration=} obj - * @returns {nd4j.graph.FlatConfiguration|null} - */ -nd4j.graph.FlatInferenceRequest.prototype.configuration = function(obj) { - var offset = this.bb.__offset(this.bb_pos, 8); - return offset ? (obj || new nd4j.graph.FlatConfiguration).__init(this.bb.__indirect(this.bb_pos + offset), this.bb) : null; -}; - -/** - * @param {flatbuffers.Builder} builder - */ -nd4j.graph.FlatInferenceRequest.startFlatInferenceRequest = function(builder) { - builder.startObject(3); -}; - -/** - * @param {flatbuffers.Builder} builder - * @param {flatbuffers.Long} id - */ -nd4j.graph.FlatInferenceRequest.addId = function(builder, id) { - builder.addFieldInt64(0, id, builder.createLong(0, 0)); -}; - -/** - * @param {flatbuffers.Builder} builder - * @param {flatbuffers.Offset} variablesOffset - */ -nd4j.graph.FlatInferenceRequest.addVariables = function(builder, variablesOffset) { - builder.addFieldOffset(1, variablesOffset, 0); -}; - -/** - * @param {flatbuffers.Builder} builder - * @param {Array.<flatbuffers.Offset>} data - * @returns {flatbuffers.Offset} - */ -nd4j.graph.FlatInferenceRequest.createVariablesVector = function(builder, data) { - builder.startVector(4, data.length, 4); - for (var i = data.length - 1; i >= 0; i--) { - builder.addOffset(data[i]); - } - return builder.endVector(); -}; - -/** - * @param {flatbuffers.Builder} builder - * @param {number} numElems - */ -nd4j.graph.FlatInferenceRequest.startVariablesVector = function(builder, numElems) { - builder.startVector(4, numElems, 4); -}; - -/** - * @param {flatbuffers.Builder} builder - * @param {flatbuffers.Offset} configurationOffset - */ -nd4j.graph.FlatInferenceRequest.addConfiguration = function(builder, configurationOffset) { - builder.addFieldOffset(2, configurationOffset, 0); -}; - -/** - * @param {flatbuffers.Builder} builder - * @returns {flatbuffers.Offset} - */ -nd4j.graph.FlatInferenceRequest.endFlatInferenceRequest =
function(builder) { - var offset = builder.endObject(); - return offset; -}; - -/** - * @param {flatbuffers.Builder} builder - * @param {flatbuffers.Offset} offset - */ -nd4j.graph.FlatInferenceRequest.finishFlatInferenceRequestBuffer = function(builder, offset) { - builder.finish(offset); -}; - -// Exports for Node.js and RequireJS -this.nd4j = nd4j; diff --git a/libnd4j/include/graph/generated/result_generated.h b/libnd4j/include/graph/generated/result_generated.h deleted file mode 100644 index 04c458a9f..000000000 --- a/libnd4j/include/graph/generated/result_generated.h +++ /dev/null @@ -1,229 +0,0 @@ -// automatically generated by the FlatBuffers compiler, do not modify - - -#ifndef FLATBUFFERS_GENERATED_RESULT_ND4J_GRAPH_H_ -#define FLATBUFFERS_GENERATED_RESULT_ND4J_GRAPH_H_ - -#include "flatbuffers/flatbuffers.h" - -#include "array_generated.h" -#include "node_generated.h" -#include "properties_generated.h" -#include "utils_generated.h" -#include "variable_generated.h" - -namespace sd { -namespace graph { - -struct FlatTiming; - -struct FlatResult; - -struct FlatTiming FLATBUFFERS_FINAL_CLASS : private flatbuffers::Table { - enum { - VT_ID = 4, - VT_NAME = 6, - VT_TIMING = 8 - }; - int32_t id() const { - return GetField<int32_t>(VT_ID, 0); - } - const flatbuffers::String *name() const { - return GetPointer<const flatbuffers::String *>(VT_NAME); - } - const LongPair *timing() const { - return GetPointer<const LongPair *>(VT_TIMING); - } - bool Verify(flatbuffers::Verifier &verifier) const { - return VerifyTableStart(verifier) && - VerifyField<int32_t>(verifier, VT_ID) && - VerifyOffset(verifier, VT_NAME) && - verifier.VerifyString(name()) && - VerifyOffset(verifier, VT_TIMING) && - verifier.VerifyTable(timing()) && - verifier.EndTable(); - } -}; - -struct FlatTimingBuilder { - flatbuffers::FlatBufferBuilder &fbb_; - flatbuffers::uoffset_t start_; - void add_id(int32_t id) { - fbb_.AddElement<int32_t>(FlatTiming::VT_ID, id, 0); - } - void add_name(flatbuffers::Offset<flatbuffers::String> name) { - fbb_.AddOffset(FlatTiming::VT_NAME, name); - } - void add_timing(flatbuffers::Offset<LongPair> timing) { - fbb_.AddOffset(FlatTiming::VT_TIMING, timing); - } - explicit FlatTimingBuilder(flatbuffers::FlatBufferBuilder &_fbb) - : fbb_(_fbb) { - start_ = fbb_.StartTable(); - } - FlatTimingBuilder &operator=(const FlatTimingBuilder &); - flatbuffers::Offset<FlatTiming> Finish() { - const auto end = fbb_.EndTable(start_); - auto o = flatbuffers::Offset<FlatTiming>(end); - return o; - } -}; - -inline flatbuffers::Offset<FlatTiming> CreateFlatTiming( - flatbuffers::FlatBufferBuilder &_fbb, - int32_t id = 0, - flatbuffers::Offset<flatbuffers::String> name = 0, - flatbuffers::Offset<LongPair> timing = 0) { - FlatTimingBuilder builder_(_fbb); - builder_.add_timing(timing); - builder_.add_name(name); - builder_.add_id(id); - return builder_.Finish(); -} - -inline flatbuffers::Offset<FlatTiming> CreateFlatTimingDirect( - flatbuffers::FlatBufferBuilder &_fbb, - int32_t id = 0, - const char *name = nullptr, - flatbuffers::Offset<LongPair> timing = 0) { - return sd::graph::CreateFlatTiming( - _fbb, - id, - name ? _fbb.CreateString(name) : 0, - timing); -} - -struct FlatResult FLATBUFFERS_FINAL_CLASS : private flatbuffers::Table { - enum { - VT_ID = 4, - VT_VARIABLES = 6, - VT_TIMING = 8, - VT_FOOTPRINTFORWARD = 10, - VT_FOOTPRINTBACKWARD = 12 - }; - int64_t id() const { - return GetField<int64_t>(VT_ID, 0); - } - const flatbuffers::Vector<flatbuffers::Offset<FlatVariable>> *variables() const { - return GetPointer<const flatbuffers::Vector<flatbuffers::Offset<FlatVariable>> *>(VT_VARIABLES); - } - const flatbuffers::Vector<flatbuffers::Offset<FlatTiming>> *timing() const { - return GetPointer<const flatbuffers::Vector<flatbuffers::Offset<FlatTiming>> *>(VT_TIMING); - } - int64_t footprintForward() const { - return GetField<int64_t>(VT_FOOTPRINTFORWARD, 0); - } - int64_t footprintBackward() const { - return GetField<int64_t>(VT_FOOTPRINTBACKWARD, 0); - } - bool Verify(flatbuffers::Verifier &verifier) const { - return VerifyTableStart(verifier) && - VerifyField<int64_t>(verifier, VT_ID) && - VerifyOffset(verifier, VT_VARIABLES) && - verifier.VerifyVector(variables()) && - verifier.VerifyVectorOfTables(variables()) && - VerifyOffset(verifier, VT_TIMING) && - verifier.VerifyVector(timing()) && - verifier.VerifyVectorOfTables(timing()) && - VerifyField<int64_t>(verifier, VT_FOOTPRINTFORWARD) && - VerifyField<int64_t>(verifier, VT_FOOTPRINTBACKWARD) && - verifier.EndTable(); - } -}; - -struct FlatResultBuilder { - flatbuffers::FlatBufferBuilder &fbb_; - flatbuffers::uoffset_t start_; - void add_id(int64_t id) { - fbb_.AddElement<int64_t>(FlatResult::VT_ID, id, 0); - } - void add_variables(flatbuffers::Offset<flatbuffers::Vector<flatbuffers::Offset<FlatVariable>>> variables) { - fbb_.AddOffset(FlatResult::VT_VARIABLES, variables); - } - void add_timing(flatbuffers::Offset<flatbuffers::Vector<flatbuffers::Offset<FlatTiming>>> timing) { - fbb_.AddOffset(FlatResult::VT_TIMING, timing); - } - void add_footprintForward(int64_t footprintForward) { - fbb_.AddElement<int64_t>(FlatResult::VT_FOOTPRINTFORWARD, footprintForward, 0); - } - void add_footprintBackward(int64_t footprintBackward) { - fbb_.AddElement<int64_t>(FlatResult::VT_FOOTPRINTBACKWARD, footprintBackward, 0); - } - explicit FlatResultBuilder(flatbuffers::FlatBufferBuilder &_fbb) - : fbb_(_fbb) { - start_ = fbb_.StartTable(); - } - FlatResultBuilder &operator=(const FlatResultBuilder &); - flatbuffers::Offset<FlatResult> Finish() { - const auto end = fbb_.EndTable(start_); - auto o = flatbuffers::Offset<FlatResult>(end); - return o; - } -}; - -inline flatbuffers::Offset<FlatResult> CreateFlatResult( - flatbuffers::FlatBufferBuilder &_fbb, - int64_t id = 0, - flatbuffers::Offset<flatbuffers::Vector<flatbuffers::Offset<FlatVariable>>> variables = 0, - flatbuffers::Offset<flatbuffers::Vector<flatbuffers::Offset<FlatTiming>>> timing = 0, - int64_t footprintForward = 0, - int64_t footprintBackward = 0) { - FlatResultBuilder builder_(_fbb); - builder_.add_footprintBackward(footprintBackward); - builder_.add_footprintForward(footprintForward); - builder_.add_id(id); - builder_.add_timing(timing); - builder_.add_variables(variables); - return builder_.Finish(); -} - -inline flatbuffers::Offset<FlatResult> CreateFlatResultDirect( - flatbuffers::FlatBufferBuilder &_fbb, - int64_t id = 0, - const std::vector<flatbuffers::Offset<FlatVariable>> *variables = nullptr, - const std::vector<flatbuffers::Offset<FlatTiming>> *timing = nullptr, - int64_t footprintForward = 0, - int64_t footprintBackward = 0) { - return sd::graph::CreateFlatResult( - _fbb, - id, - variables ? _fbb.CreateVector<flatbuffers::Offset<FlatVariable>>(*variables) : 0, - timing ? _fbb.CreateVector<flatbuffers::Offset<FlatTiming>>(*timing) : 0, - footprintForward, - footprintBackward); -} - -inline const sd::graph::FlatResult *GetFlatResult(const void *buf) { - return flatbuffers::GetRoot<sd::graph::FlatResult>(buf); -} - -inline const sd::graph::FlatResult *GetSizePrefixedFlatResult(const void *buf) { - return flatbuffers::GetSizePrefixedRoot<sd::graph::FlatResult>(buf); -} - -inline bool VerifyFlatResultBuffer( - flatbuffers::Verifier &verifier) { - return verifier.VerifyBuffer<sd::graph::FlatResult>(nullptr); -} - -inline bool VerifySizePrefixedFlatResultBuffer( - flatbuffers::Verifier &verifier) { - return verifier.VerifySizePrefixedBuffer<sd::graph::FlatResult>(nullptr); -} - -inline void FinishFlatResultBuffer( - flatbuffers::FlatBufferBuilder &fbb, - flatbuffers::Offset<sd::graph::FlatResult> root) { - fbb.Finish(root); -} - -inline void FinishSizePrefixedFlatResultBuffer( - flatbuffers::FlatBufferBuilder &fbb, - flatbuffers::Offset<sd::graph::FlatResult> root) { - fbb.FinishSizePrefixed(root); -} - -} // namespace graph -} // namespace sd - -#endif // FLATBUFFERS_GENERATED_RESULT_ND4J_GRAPH_H_ diff --git a/libnd4j/include/graph/generated/result_generated.js b/libnd4j/include/graph/generated/result_generated.js deleted file mode 100644 index 4cf13ebd8..000000000 --- a/libnd4j/include/graph/generated/result_generated.js +++ /dev/null @@ -1,336 +0,0 @@ -/* - * ****************************************************************************** - * * - * * - * * This program and the accompanying materials are made available under the - * * terms of the Apache License, Version 2.0 which is available at - * * https://www.apache.org/licenses/LICENSE-2.0. - * * - * * See the NOTICE file distributed with this work for additional - * * information regarding copyright ownership. - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * * License for the specific language governing permissions and limitations - * * under the License.
- * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ - -/** - * @const - * @namespace - */ -var nd4j = nd4j || {}; - -/** - * @const - * @namespace - */ -nd4j.graph = nd4j.graph || {}; - -/** - * @constructor - */ -nd4j.graph.FlatTiming = function() { - /** - * @type {flatbuffers.ByteBuffer} - */ - this.bb = null; - - /** - * @type {number} - */ - this.bb_pos = 0; -}; - -/** - * @param {number} i - * @param {flatbuffers.ByteBuffer} bb - * @returns {nd4j.graph.FlatTiming} - */ -nd4j.graph.FlatTiming.prototype.__init = function(i, bb) { - this.bb_pos = i; - this.bb = bb; - return this; -}; - -/** - * @param {flatbuffers.ByteBuffer} bb - * @param {nd4j.graph.FlatTiming=} obj - * @returns {nd4j.graph.FlatTiming} - */ -nd4j.graph.FlatTiming.getRootAsFlatTiming = function(bb, obj) { - return (obj || new nd4j.graph.FlatTiming).__init(bb.readInt32(bb.position()) + bb.position(), bb); -}; - -/** - * @returns {number} - */ -nd4j.graph.FlatTiming.prototype.id = function() { - var offset = this.bb.__offset(this.bb_pos, 4); - return offset ? this.bb.readInt32(this.bb_pos + offset) : 0; -}; - -/** - * @param {flatbuffers.Encoding=} optionalEncoding - * @returns {string|Uint8Array|null} - */ -nd4j.graph.FlatTiming.prototype.name = function(optionalEncoding) { - var offset = this.bb.__offset(this.bb_pos, 6); - return offset ? this.bb.__string(this.bb_pos + offset, optionalEncoding) : null; -}; - -/** - * @param {nd4j.graph.LongPair=} obj - * @returns {nd4j.graph.LongPair|null} - */ -nd4j.graph.FlatTiming.prototype.timing = function(obj) { - var offset = this.bb.__offset(this.bb_pos, 8); - return offset ? 
(obj || new nd4j.graph.LongPair).__init(this.bb.__indirect(this.bb_pos + offset), this.bb) : null; -}; - -/** - * @param {flatbuffers.Builder} builder - */ -nd4j.graph.FlatTiming.startFlatTiming = function(builder) { - builder.startObject(3); -}; - -/** - * @param {flatbuffers.Builder} builder - * @param {number} id - */ -nd4j.graph.FlatTiming.addId = function(builder, id) { - builder.addFieldInt32(0, id, 0); -}; - -/** - * @param {flatbuffers.Builder} builder - * @param {flatbuffers.Offset} nameOffset - */ -nd4j.graph.FlatTiming.addName = function(builder, nameOffset) { - builder.addFieldOffset(1, nameOffset, 0); -}; - -/** - * @param {flatbuffers.Builder} builder - * @param {flatbuffers.Offset} timingOffset - */ -nd4j.graph.FlatTiming.addTiming = function(builder, timingOffset) { - builder.addFieldOffset(2, timingOffset, 0); -}; - -/** - * @param {flatbuffers.Builder} builder - * @returns {flatbuffers.Offset} - */ -nd4j.graph.FlatTiming.endFlatTiming = function(builder) { - var offset = builder.endObject(); - return offset; -}; - -/** - * @constructor - */ -nd4j.graph.FlatResult = function() { - /** - * @type {flatbuffers.ByteBuffer} - */ - this.bb = null; - - /** - * @type {number} - */ - this.bb_pos = 0; -}; - -/** - * @param {number} i - * @param {flatbuffers.ByteBuffer} bb - * @returns {nd4j.graph.FlatResult} - */ -nd4j.graph.FlatResult.prototype.__init = function(i, bb) { - this.bb_pos = i; - this.bb = bb; - return this; -}; - -/** - * @param {flatbuffers.ByteBuffer} bb - * @param {nd4j.graph.FlatResult=} obj - * @returns {nd4j.graph.FlatResult} - */ -nd4j.graph.FlatResult.getRootAsFlatResult = function(bb, obj) { - return (obj || new nd4j.graph.FlatResult).__init(bb.readInt32(bb.position()) + bb.position(), bb); -}; - -/** - * @returns {flatbuffers.Long} - */ -nd4j.graph.FlatResult.prototype.id = function() { - var offset = this.bb.__offset(this.bb_pos, 4); - return offset ? 
this.bb.readInt64(this.bb_pos + offset) : this.bb.createLong(0, 0); -}; - -/** - * @param {number} index - * @param {nd4j.graph.FlatVariable=} obj - * @returns {nd4j.graph.FlatVariable} - */ -nd4j.graph.FlatResult.prototype.variables = function(index, obj) { - var offset = this.bb.__offset(this.bb_pos, 6); - return offset ? (obj || new nd4j.graph.FlatVariable).__init(this.bb.__indirect(this.bb.__vector(this.bb_pos + offset) + index * 4), this.bb) : null; -}; - -/** - * @returns {number} - */ -nd4j.graph.FlatResult.prototype.variablesLength = function() { - var offset = this.bb.__offset(this.bb_pos, 6); - return offset ? this.bb.__vector_len(this.bb_pos + offset) : 0; -}; - -/** - * @param {number} index - * @param {nd4j.graph.FlatTiming=} obj - * @returns {nd4j.graph.FlatTiming} - */ -nd4j.graph.FlatResult.prototype.timing = function(index, obj) { - var offset = this.bb.__offset(this.bb_pos, 8); - return offset ? (obj || new nd4j.graph.FlatTiming).__init(this.bb.__indirect(this.bb.__vector(this.bb_pos + offset) + index * 4), this.bb) : null; -}; - -/** - * @returns {number} - */ -nd4j.graph.FlatResult.prototype.timingLength = function() { - var offset = this.bb.__offset(this.bb_pos, 8); - return offset ? this.bb.__vector_len(this.bb_pos + offset) : 0; -}; - -/** - * @returns {flatbuffers.Long} - */ -nd4j.graph.FlatResult.prototype.footprintForward = function() { - var offset = this.bb.__offset(this.bb_pos, 10); - return offset ? this.bb.readInt64(this.bb_pos + offset) : this.bb.createLong(0, 0); -}; - -/** - * @returns {flatbuffers.Long} - */ -nd4j.graph.FlatResult.prototype.footprintBackward = function() { - var offset = this.bb.__offset(this.bb_pos, 12); - return offset ? 
this.bb.readInt64(this.bb_pos + offset) : this.bb.createLong(0, 0); -}; - -/** - * @param {flatbuffers.Builder} builder - */ -nd4j.graph.FlatResult.startFlatResult = function(builder) { - builder.startObject(5); -}; - -/** - * @param {flatbuffers.Builder} builder - * @param {flatbuffers.Long} id - */ -nd4j.graph.FlatResult.addId = function(builder, id) { - builder.addFieldInt64(0, id, builder.createLong(0, 0)); -}; - -/** - * @param {flatbuffers.Builder} builder - * @param {flatbuffers.Offset} variablesOffset - */ -nd4j.graph.FlatResult.addVariables = function(builder, variablesOffset) { - builder.addFieldOffset(1, variablesOffset, 0); -}; - -/** - * @param {flatbuffers.Builder} builder - * @param {Array.} data - * @returns {flatbuffers.Offset} - */ -nd4j.graph.FlatResult.createVariablesVector = function(builder, data) { - builder.startVector(4, data.length, 4); - for (var i = data.length - 1; i >= 0; i--) { - builder.addOffset(data[i]); - } - return builder.endVector(); -}; - -/** - * @param {flatbuffers.Builder} builder - * @param {number} numElems - */ -nd4j.graph.FlatResult.startVariablesVector = function(builder, numElems) { - builder.startVector(4, numElems, 4); -}; - -/** - * @param {flatbuffers.Builder} builder - * @param {flatbuffers.Offset} timingOffset - */ -nd4j.graph.FlatResult.addTiming = function(builder, timingOffset) { - builder.addFieldOffset(2, timingOffset, 0); -}; - -/** - * @param {flatbuffers.Builder} builder - * @param {Array.} data - * @returns {flatbuffers.Offset} - */ -nd4j.graph.FlatResult.createTimingVector = function(builder, data) { - builder.startVector(4, data.length, 4); - for (var i = data.length - 1; i >= 0; i--) { - builder.addOffset(data[i]); - } - return builder.endVector(); -}; - -/** - * @param {flatbuffers.Builder} builder - * @param {number} numElems - */ -nd4j.graph.FlatResult.startTimingVector = function(builder, numElems) { - builder.startVector(4, numElems, 4); -}; - -/** - * @param {flatbuffers.Builder} builder - * 
@param {flatbuffers.Long} footprintForward - */ -nd4j.graph.FlatResult.addFootprintForward = function(builder, footprintForward) { - builder.addFieldInt64(3, footprintForward, builder.createLong(0, 0)); -}; - -/** - * @param {flatbuffers.Builder} builder - * @param {flatbuffers.Long} footprintBackward - */ -nd4j.graph.FlatResult.addFootprintBackward = function(builder, footprintBackward) { - builder.addFieldInt64(4, footprintBackward, builder.createLong(0, 0)); -}; - -/** - * @param {flatbuffers.Builder} builder - * @returns {flatbuffers.Offset} - */ -nd4j.graph.FlatResult.endFlatResult = function(builder) { - var offset = builder.endObject(); - return offset; -}; - -/** - * @param {flatbuffers.Builder} builder - * @param {flatbuffers.Offset} offset - */ -nd4j.graph.FlatResult.finishFlatResultBuffer = function(builder, offset) { - builder.finish(offset); -}; - -// Exports for Node.js and RequireJS -this.nd4j = nd4j; diff --git a/libnd4j/include/graph/generated/uigraphevents_generated.h b/libnd4j/include/graph/generated/uigraphevents_generated.h deleted file mode 100644 index b3430a5c7..000000000 --- a/libnd4j/include/graph/generated/uigraphevents_generated.h +++ /dev/null @@ -1,752 +0,0 @@ -// automatically generated by the FlatBuffers compiler, do not modify - - -#ifndef FLATBUFFERS_GENERATED_UIGRAPHEVENTS_ND4J_GRAPH_H_ -#define FLATBUFFERS_GENERATED_UIGRAPHEVENTS_ND4J_GRAPH_H_ - -#include "flatbuffers/flatbuffers.h" - -#include "array_generated.h" - -namespace sd { -namespace graph { - -struct UIEvent; - -struct FrameIteration; - -struct UIAddName; - -struct FlatArrayList; - -struct UIHistogram; - -struct UISummaryStatistics; - -struct UIHardwareState; - -enum UIEventType { - UIEventType_ADD_NAME = 0, - UIEventType_SCALAR = 1, - UIEventType_ARRAY = 2, - UIEventType_ARRAY_LIST = 3, - UIEventType_HISTOGRAM = 4, - UIEventType_IMAGE = 5, - UIEventType_SUMMARY_STATISTICS = 6, - UIEventType_OP_TIMING = 7, - UIEventType_HARDWARE_STATE = 8, - UIEventType_MIN = 
UIEventType_ADD_NAME, - UIEventType_MAX = UIEventType_HARDWARE_STATE -}; - -inline const UIEventType (&EnumValuesUIEventType())[9] { - static const UIEventType values[] = { - UIEventType_ADD_NAME, - UIEventType_SCALAR, - UIEventType_ARRAY, - UIEventType_ARRAY_LIST, - UIEventType_HISTOGRAM, - UIEventType_IMAGE, - UIEventType_SUMMARY_STATISTICS, - UIEventType_OP_TIMING, - UIEventType_HARDWARE_STATE - }; - return values; -} - -inline const char * const *EnumNamesUIEventType() { - static const char * const names[] = { - "ADD_NAME", - "SCALAR", - "ARRAY", - "ARRAY_LIST", - "HISTOGRAM", - "IMAGE", - "SUMMARY_STATISTICS", - "OP_TIMING", - "HARDWARE_STATE", - nullptr - }; - return names; -} - -inline const char *EnumNameUIEventType(UIEventType e) { - const size_t index = static_cast(e); - return EnumNamesUIEventType()[index]; -} - -enum UIEventSubtype { - UIEventSubtype_NONE = 0, - UIEventSubtype_EVALUATION = 1, - UIEventSubtype_LOSS = 2, - UIEventSubtype_LEARNING_RATE = 3, - UIEventSubtype_TUNING_METRIC = 4, - UIEventSubtype_PERFORMANCE = 5, - UIEventSubtype_PROFILING = 6, - UIEventSubtype_FEATURE_LABEL = 7, - UIEventSubtype_PREDICTION = 8, - UIEventSubtype_USER_CUSTOM = 9, - UIEventSubtype_MIN = UIEventSubtype_NONE, - UIEventSubtype_MAX = UIEventSubtype_USER_CUSTOM -}; - -inline const UIEventSubtype (&EnumValuesUIEventSubtype())[10] { - static const UIEventSubtype values[] = { - UIEventSubtype_NONE, - UIEventSubtype_EVALUATION, - UIEventSubtype_LOSS, - UIEventSubtype_LEARNING_RATE, - UIEventSubtype_TUNING_METRIC, - UIEventSubtype_PERFORMANCE, - UIEventSubtype_PROFILING, - UIEventSubtype_FEATURE_LABEL, - UIEventSubtype_PREDICTION, - UIEventSubtype_USER_CUSTOM - }; - return values; -} - -inline const char * const *EnumNamesUIEventSubtype() { - static const char * const names[] = { - "NONE", - "EVALUATION", - "LOSS", - "LEARNING_RATE", - "TUNING_METRIC", - "PERFORMANCE", - "PROFILING", - "FEATURE_LABEL", - "PREDICTION", - "USER_CUSTOM", - nullptr - }; - return names; -} - 
-inline const char *EnumNameUIEventSubtype(UIEventSubtype e) { - const size_t index = static_cast(e); - return EnumNamesUIEventSubtype()[index]; -} - -enum UIHistogramType { - UIHistogramType_DISCRETE = 0, - UIHistogramType_EQUAL_SPACING = 1, - UIHistogramType_CUSTOM = 2, - UIHistogramType_MIN = UIHistogramType_DISCRETE, - UIHistogramType_MAX = UIHistogramType_CUSTOM -}; - -inline const UIHistogramType (&EnumValuesUIHistogramType())[3] { - static const UIHistogramType values[] = { - UIHistogramType_DISCRETE, - UIHistogramType_EQUAL_SPACING, - UIHistogramType_CUSTOM - }; - return values; -} - -inline const char * const *EnumNamesUIHistogramType() { - static const char * const names[] = { - "DISCRETE", - "EQUAL_SPACING", - "CUSTOM", - nullptr - }; - return names; -} - -inline const char *EnumNameUIHistogramType(UIHistogramType e) { - const size_t index = static_cast(e); - return EnumNamesUIHistogramType()[index]; -} - -struct UIEvent FLATBUFFERS_FINAL_CLASS : private flatbuffers::Table { - enum { - VT_EVENTTYPE = 4, - VT_EVENTSUBTYPE = 6, - VT_NAMEIDX = 8, - VT_TIMESTAMP = 10, - VT_ITERATION = 12, - VT_EPOCH = 14, - VT_VARIABLEID = 16, - VT_FRAMEITER = 18, - VT_PLUGIN = 20 - }; - UIEventType eventType() const { - return static_cast(GetField(VT_EVENTTYPE, 0)); - } - UIEventSubtype eventSubType() const { - return static_cast(GetField(VT_EVENTSUBTYPE, 0)); - } - int32_t nameIdx() const { - return GetField(VT_NAMEIDX, 0); - } - int64_t timestamp() const { - return GetField(VT_TIMESTAMP, 0); - } - int32_t iteration() const { - return GetField(VT_ITERATION, 0); - } - int32_t epoch() const { - return GetField(VT_EPOCH, 0); - } - int16_t variableId() const { - return GetField(VT_VARIABLEID, 0); - } - const FrameIteration *frameIter() const { - return GetPointer(VT_FRAMEITER); - } - uint16_t plugin() const { - return GetField(VT_PLUGIN, 0); - } - bool Verify(flatbuffers::Verifier &verifier) const { - return VerifyTableStart(verifier) && - VerifyField(verifier, VT_EVENTTYPE) 
&& - VerifyField(verifier, VT_EVENTSUBTYPE) && - VerifyField(verifier, VT_NAMEIDX) && - VerifyField(verifier, VT_TIMESTAMP) && - VerifyField(verifier, VT_ITERATION) && - VerifyField(verifier, VT_EPOCH) && - VerifyField(verifier, VT_VARIABLEID) && - VerifyOffset(verifier, VT_FRAMEITER) && - verifier.VerifyTable(frameIter()) && - VerifyField(verifier, VT_PLUGIN) && - verifier.EndTable(); - } -}; - -struct UIEventBuilder { - flatbuffers::FlatBufferBuilder &fbb_; - flatbuffers::uoffset_t start_; - void add_eventType(UIEventType eventType) { - fbb_.AddElement(UIEvent::VT_EVENTTYPE, static_cast(eventType), 0); - } - void add_eventSubType(UIEventSubtype eventSubType) { - fbb_.AddElement(UIEvent::VT_EVENTSUBTYPE, static_cast(eventSubType), 0); - } - void add_nameIdx(int32_t nameIdx) { - fbb_.AddElement(UIEvent::VT_NAMEIDX, nameIdx, 0); - } - void add_timestamp(int64_t timestamp) { - fbb_.AddElement(UIEvent::VT_TIMESTAMP, timestamp, 0); - } - void add_iteration(int32_t iteration) { - fbb_.AddElement(UIEvent::VT_ITERATION, iteration, 0); - } - void add_epoch(int32_t epoch) { - fbb_.AddElement(UIEvent::VT_EPOCH, epoch, 0); - } - void add_variableId(int16_t variableId) { - fbb_.AddElement(UIEvent::VT_VARIABLEID, variableId, 0); - } - void add_frameIter(flatbuffers::Offset frameIter) { - fbb_.AddOffset(UIEvent::VT_FRAMEITER, frameIter); - } - void add_plugin(uint16_t plugin) { - fbb_.AddElement(UIEvent::VT_PLUGIN, plugin, 0); - } - explicit UIEventBuilder(flatbuffers::FlatBufferBuilder &_fbb) - : fbb_(_fbb) { - start_ = fbb_.StartTable(); - } - UIEventBuilder &operator=(const UIEventBuilder &); - flatbuffers::Offset Finish() { - const auto end = fbb_.EndTable(start_); - auto o = flatbuffers::Offset(end); - return o; - } -}; - -inline flatbuffers::Offset CreateUIEvent( - flatbuffers::FlatBufferBuilder &_fbb, - UIEventType eventType = UIEventType_ADD_NAME, - UIEventSubtype eventSubType = UIEventSubtype_NONE, - int32_t nameIdx = 0, - int64_t timestamp = 0, - int32_t iteration = 0, 
- int32_t epoch = 0, - int16_t variableId = 0, - flatbuffers::Offset frameIter = 0, - uint16_t plugin = 0) { - UIEventBuilder builder_(_fbb); - builder_.add_timestamp(timestamp); - builder_.add_frameIter(frameIter); - builder_.add_epoch(epoch); - builder_.add_iteration(iteration); - builder_.add_nameIdx(nameIdx); - builder_.add_plugin(plugin); - builder_.add_variableId(variableId); - builder_.add_eventSubType(eventSubType); - builder_.add_eventType(eventType); - return builder_.Finish(); -} - -struct FrameIteration FLATBUFFERS_FINAL_CLASS : private flatbuffers::Table { - enum { - VT_FRAME = 4, - VT_ITERATION = 6 - }; - const flatbuffers::String *frame() const { - return GetPointer(VT_FRAME); - } - uint16_t iteration() const { - return GetField(VT_ITERATION, 0); - } - bool Verify(flatbuffers::Verifier &verifier) const { - return VerifyTableStart(verifier) && - VerifyOffset(verifier, VT_FRAME) && - verifier.VerifyString(frame()) && - VerifyField(verifier, VT_ITERATION) && - verifier.EndTable(); - } -}; - -struct FrameIterationBuilder { - flatbuffers::FlatBufferBuilder &fbb_; - flatbuffers::uoffset_t start_; - void add_frame(flatbuffers::Offset frame) { - fbb_.AddOffset(FrameIteration::VT_FRAME, frame); - } - void add_iteration(uint16_t iteration) { - fbb_.AddElement(FrameIteration::VT_ITERATION, iteration, 0); - } - explicit FrameIterationBuilder(flatbuffers::FlatBufferBuilder &_fbb) - : fbb_(_fbb) { - start_ = fbb_.StartTable(); - } - FrameIterationBuilder &operator=(const FrameIterationBuilder &); - flatbuffers::Offset Finish() { - const auto end = fbb_.EndTable(start_); - auto o = flatbuffers::Offset(end); - return o; - } -}; - -inline flatbuffers::Offset CreateFrameIteration( - flatbuffers::FlatBufferBuilder &_fbb, - flatbuffers::Offset frame = 0, - uint16_t iteration = 0) { - FrameIterationBuilder builder_(_fbb); - builder_.add_frame(frame); - builder_.add_iteration(iteration); - return builder_.Finish(); -} - -inline flatbuffers::Offset 
CreateFrameIterationDirect( - flatbuffers::FlatBufferBuilder &_fbb, - const char *frame = nullptr, - uint16_t iteration = 0) { - return sd::graph::CreateFrameIteration( - _fbb, - frame ? _fbb.CreateString(frame) : 0, - iteration); -} - -struct UIAddName FLATBUFFERS_FINAL_CLASS : private flatbuffers::Table { - enum { - VT_NAMEIDX = 4, - VT_NAME = 6 - }; - int32_t nameIdx() const { - return GetField(VT_NAMEIDX, 0); - } - const flatbuffers::String *name() const { - return GetPointer(VT_NAME); - } - bool Verify(flatbuffers::Verifier &verifier) const { - return VerifyTableStart(verifier) && - VerifyField(verifier, VT_NAMEIDX) && - VerifyOffset(verifier, VT_NAME) && - verifier.VerifyString(name()) && - verifier.EndTable(); - } -}; - -struct UIAddNameBuilder { - flatbuffers::FlatBufferBuilder &fbb_; - flatbuffers::uoffset_t start_; - void add_nameIdx(int32_t nameIdx) { - fbb_.AddElement(UIAddName::VT_NAMEIDX, nameIdx, 0); - } - void add_name(flatbuffers::Offset name) { - fbb_.AddOffset(UIAddName::VT_NAME, name); - } - explicit UIAddNameBuilder(flatbuffers::FlatBufferBuilder &_fbb) - : fbb_(_fbb) { - start_ = fbb_.StartTable(); - } - UIAddNameBuilder &operator=(const UIAddNameBuilder &); - flatbuffers::Offset Finish() { - const auto end = fbb_.EndTable(start_); - auto o = flatbuffers::Offset(end); - return o; - } -}; - -inline flatbuffers::Offset CreateUIAddName( - flatbuffers::FlatBufferBuilder &_fbb, - int32_t nameIdx = 0, - flatbuffers::Offset name = 0) { - UIAddNameBuilder builder_(_fbb); - builder_.add_name(name); - builder_.add_nameIdx(nameIdx); - return builder_.Finish(); -} - -inline flatbuffers::Offset CreateUIAddNameDirect( - flatbuffers::FlatBufferBuilder &_fbb, - int32_t nameIdx = 0, - const char *name = nullptr) { - return sd::graph::CreateUIAddName( - _fbb, - nameIdx, - name ? 
_fbb.CreateString(name) : 0); -} - -struct FlatArrayList FLATBUFFERS_FINAL_CLASS : private flatbuffers::Table { - enum { - VT_LIST = 4 - }; - const flatbuffers::Vector> *list() const { - return GetPointer> *>(VT_LIST); - } - bool Verify(flatbuffers::Verifier &verifier) const { - return VerifyTableStart(verifier) && - VerifyOffset(verifier, VT_LIST) && - verifier.VerifyVector(list()) && - verifier.VerifyVectorOfTables(list()) && - verifier.EndTable(); - } -}; - -struct FlatArrayListBuilder { - flatbuffers::FlatBufferBuilder &fbb_; - flatbuffers::uoffset_t start_; - void add_list(flatbuffers::Offset>> list) { - fbb_.AddOffset(FlatArrayList::VT_LIST, list); - } - explicit FlatArrayListBuilder(flatbuffers::FlatBufferBuilder &_fbb) - : fbb_(_fbb) { - start_ = fbb_.StartTable(); - } - FlatArrayListBuilder &operator=(const FlatArrayListBuilder &); - flatbuffers::Offset Finish() { - const auto end = fbb_.EndTable(start_); - auto o = flatbuffers::Offset(end); - return o; - } -}; - -inline flatbuffers::Offset CreateFlatArrayList( - flatbuffers::FlatBufferBuilder &_fbb, - flatbuffers::Offset>> list = 0) { - FlatArrayListBuilder builder_(_fbb); - builder_.add_list(list); - return builder_.Finish(); -} - -inline flatbuffers::Offset CreateFlatArrayListDirect( - flatbuffers::FlatBufferBuilder &_fbb, - const std::vector> *list = nullptr) { - return sd::graph::CreateFlatArrayList( - _fbb, - list ? 
_fbb.CreateVector>(*list) : 0); -} - -struct UIHistogram FLATBUFFERS_FINAL_CLASS : private flatbuffers::Table { - enum { - VT_TYPE = 4, - VT_NUMBINS = 6, - VT_BINRANGES = 8, - VT_Y = 10, - VT_BINLABELS = 12 - }; - UIHistogramType type() const { - return static_cast(GetField(VT_TYPE, 0)); - } - uint32_t numbins() const { - return GetField(VT_NUMBINS, 0); - } - const FlatArray *binranges() const { - return GetPointer(VT_BINRANGES); - } - const FlatArray *y() const { - return GetPointer(VT_Y); - } - const flatbuffers::Vector> *binlabels() const { - return GetPointer> *>(VT_BINLABELS); - } - bool Verify(flatbuffers::Verifier &verifier) const { - return VerifyTableStart(verifier) && - VerifyField(verifier, VT_TYPE) && - VerifyField(verifier, VT_NUMBINS) && - VerifyOffset(verifier, VT_BINRANGES) && - verifier.VerifyTable(binranges()) && - VerifyOffset(verifier, VT_Y) && - verifier.VerifyTable(y()) && - VerifyOffset(verifier, VT_BINLABELS) && - verifier.VerifyVector(binlabels()) && - verifier.VerifyVectorOfStrings(binlabels()) && - verifier.EndTable(); - } -}; - -struct UIHistogramBuilder { - flatbuffers::FlatBufferBuilder &fbb_; - flatbuffers::uoffset_t start_; - void add_type(UIHistogramType type) { - fbb_.AddElement(UIHistogram::VT_TYPE, static_cast(type), 0); - } - void add_numbins(uint32_t numbins) { - fbb_.AddElement(UIHistogram::VT_NUMBINS, numbins, 0); - } - void add_binranges(flatbuffers::Offset binranges) { - fbb_.AddOffset(UIHistogram::VT_BINRANGES, binranges); - } - void add_y(flatbuffers::Offset y) { - fbb_.AddOffset(UIHistogram::VT_Y, y); - } - void add_binlabels(flatbuffers::Offset>> binlabels) { - fbb_.AddOffset(UIHistogram::VT_BINLABELS, binlabels); - } - explicit UIHistogramBuilder(flatbuffers::FlatBufferBuilder &_fbb) - : fbb_(_fbb) { - start_ = fbb_.StartTable(); - } - UIHistogramBuilder &operator=(const UIHistogramBuilder &); - flatbuffers::Offset Finish() { - const auto end = fbb_.EndTable(start_); - auto o = flatbuffers::Offset(end); - return o; - } 
-}; - -inline flatbuffers::Offset CreateUIHistogram( - flatbuffers::FlatBufferBuilder &_fbb, - UIHistogramType type = UIHistogramType_DISCRETE, - uint32_t numbins = 0, - flatbuffers::Offset binranges = 0, - flatbuffers::Offset y = 0, - flatbuffers::Offset>> binlabels = 0) { - UIHistogramBuilder builder_(_fbb); - builder_.add_binlabels(binlabels); - builder_.add_y(y); - builder_.add_binranges(binranges); - builder_.add_numbins(numbins); - builder_.add_type(type); - return builder_.Finish(); -} - -inline flatbuffers::Offset CreateUIHistogramDirect( - flatbuffers::FlatBufferBuilder &_fbb, - UIHistogramType type = UIHistogramType_DISCRETE, - uint32_t numbins = 0, - flatbuffers::Offset binranges = 0, - flatbuffers::Offset y = 0, - const std::vector> *binlabels = nullptr) { - return sd::graph::CreateUIHistogram( - _fbb, - type, - numbins, - binranges, - y, - binlabels ? _fbb.CreateVector>(*binlabels) : 0); -} - -struct UISummaryStatistics FLATBUFFERS_FINAL_CLASS : private flatbuffers::Table { - enum { - VT_BITMASK = 4, - VT_MIN = 6, - VT_MAX = 8, - VT_MEAN = 10, - VT_STDEV = 12, - VT_COUNTZERO = 14, - VT_COUNTPOSITIVE = 16, - VT_COUNTNEGATIVE = 18, - VT_COUNTNAN = 20, - VT_COUNTINF = 22 - }; - uint32_t bitmask() const { - return GetField(VT_BITMASK, 0); - } - const FlatArray *min() const { - return GetPointer(VT_MIN); - } - const FlatArray *max() const { - return GetPointer(VT_MAX); - } - double mean() const { - return GetField(VT_MEAN, 0.0); - } - double stdev() const { - return GetField(VT_STDEV, 0.0); - } - int64_t countzero() const { - return GetField(VT_COUNTZERO, 0); - } - int64_t countpositive() const { - return GetField(VT_COUNTPOSITIVE, 0); - } - int64_t countnegative() const { - return GetField(VT_COUNTNEGATIVE, 0); - } - int64_t countnan() const { - return GetField(VT_COUNTNAN, 0); - } - int64_t countinf() const { - return GetField(VT_COUNTINF, 0); - } - bool Verify(flatbuffers::Verifier &verifier) const { - return VerifyTableStart(verifier) && - 
VerifyField(verifier, VT_BITMASK) && - VerifyOffset(verifier, VT_MIN) && - verifier.VerifyTable(min()) && - VerifyOffset(verifier, VT_MAX) && - verifier.VerifyTable(max()) && - VerifyField(verifier, VT_MEAN) && - VerifyField(verifier, VT_STDEV) && - VerifyField(verifier, VT_COUNTZERO) && - VerifyField(verifier, VT_COUNTPOSITIVE) && - VerifyField(verifier, VT_COUNTNEGATIVE) && - VerifyField(verifier, VT_COUNTNAN) && - VerifyField(verifier, VT_COUNTINF) && - verifier.EndTable(); - } -}; - -struct UISummaryStatisticsBuilder { - flatbuffers::FlatBufferBuilder &fbb_; - flatbuffers::uoffset_t start_; - void add_bitmask(uint32_t bitmask) { - fbb_.AddElement(UISummaryStatistics::VT_BITMASK, bitmask, 0); - } - void add_min(flatbuffers::Offset min) { - fbb_.AddOffset(UISummaryStatistics::VT_MIN, min); - } - void add_max(flatbuffers::Offset max) { - fbb_.AddOffset(UISummaryStatistics::VT_MAX, max); - } - void add_mean(double mean) { - fbb_.AddElement(UISummaryStatistics::VT_MEAN, mean, 0.0); - } - void add_stdev(double stdev) { - fbb_.AddElement(UISummaryStatistics::VT_STDEV, stdev, 0.0); - } - void add_countzero(int64_t countzero) { - fbb_.AddElement(UISummaryStatistics::VT_COUNTZERO, countzero, 0); - } - void add_countpositive(int64_t countpositive) { - fbb_.AddElement(UISummaryStatistics::VT_COUNTPOSITIVE, countpositive, 0); - } - void add_countnegative(int64_t countnegative) { - fbb_.AddElement(UISummaryStatistics::VT_COUNTNEGATIVE, countnegative, 0); - } - void add_countnan(int64_t countnan) { - fbb_.AddElement(UISummaryStatistics::VT_COUNTNAN, countnan, 0); - } - void add_countinf(int64_t countinf) { - fbb_.AddElement(UISummaryStatistics::VT_COUNTINF, countinf, 0); - } - explicit UISummaryStatisticsBuilder(flatbuffers::FlatBufferBuilder &_fbb) - : fbb_(_fbb) { - start_ = fbb_.StartTable(); - } - UISummaryStatisticsBuilder &operator=(const UISummaryStatisticsBuilder &); - flatbuffers::Offset Finish() { - const auto end = fbb_.EndTable(start_); - auto o = 
flatbuffers::Offset(end); - return o; - } -}; - -inline flatbuffers::Offset CreateUISummaryStatistics( - flatbuffers::FlatBufferBuilder &_fbb, - uint32_t bitmask = 0, - flatbuffers::Offset min = 0, - flatbuffers::Offset max = 0, - double mean = 0.0, - double stdev = 0.0, - int64_t countzero = 0, - int64_t countpositive = 0, - int64_t countnegative = 0, - int64_t countnan = 0, - int64_t countinf = 0) { - UISummaryStatisticsBuilder builder_(_fbb); - builder_.add_countinf(countinf); - builder_.add_countnan(countnan); - builder_.add_countnegative(countnegative); - builder_.add_countpositive(countpositive); - builder_.add_countzero(countzero); - builder_.add_stdev(stdev); - builder_.add_mean(mean); - builder_.add_max(max); - builder_.add_min(min); - builder_.add_bitmask(bitmask); - return builder_.Finish(); -} - -struct UIHardwareState FLATBUFFERS_FINAL_CLASS : private flatbuffers::Table { - enum { - VT_GPUMEMORY = 4, - VT_HOSTMEMORY = 6 - }; - const flatbuffers::Vector *gpuMemory() const { - return GetPointer *>(VT_GPUMEMORY); - } - int64_t hostMemory() const { - return GetField(VT_HOSTMEMORY, 0); - } - bool Verify(flatbuffers::Verifier &verifier) const { - return VerifyTableStart(verifier) && - VerifyOffset(verifier, VT_GPUMEMORY) && - verifier.VerifyVector(gpuMemory()) && - VerifyField(verifier, VT_HOSTMEMORY) && - verifier.EndTable(); - } -}; - -struct UIHardwareStateBuilder { - flatbuffers::FlatBufferBuilder &fbb_; - flatbuffers::uoffset_t start_; - void add_gpuMemory(flatbuffers::Offset> gpuMemory) { - fbb_.AddOffset(UIHardwareState::VT_GPUMEMORY, gpuMemory); - } - void add_hostMemory(int64_t hostMemory) { - fbb_.AddElement(UIHardwareState::VT_HOSTMEMORY, hostMemory, 0); - } - explicit UIHardwareStateBuilder(flatbuffers::FlatBufferBuilder &_fbb) - : fbb_(_fbb) { - start_ = fbb_.StartTable(); - } - UIHardwareStateBuilder &operator=(const UIHardwareStateBuilder &); - flatbuffers::Offset Finish() { - const auto end = fbb_.EndTable(start_); - auto o = 
flatbuffers::Offset(end); - return o; - } -}; - -inline flatbuffers::Offset CreateUIHardwareState( - flatbuffers::FlatBufferBuilder &_fbb, - flatbuffers::Offset> gpuMemory = 0, - int64_t hostMemory = 0) { - UIHardwareStateBuilder builder_(_fbb); - builder_.add_hostMemory(hostMemory); - builder_.add_gpuMemory(gpuMemory); - return builder_.Finish(); -} - -inline flatbuffers::Offset CreateUIHardwareStateDirect( - flatbuffers::FlatBufferBuilder &_fbb, - const std::vector *gpuMemory = nullptr, - int64_t hostMemory = 0) { - return sd::graph::CreateUIHardwareState( - _fbb, - gpuMemory ? _fbb.CreateVector(*gpuMemory) : 0, - hostMemory); -} - -} // namespace graph -} // namespace sd - -#endif // FLATBUFFERS_GENERATED_UIGRAPHEVENTS_ND4J_GRAPH_H_ diff --git a/libnd4j/include/graph/generated/uigraphevents_generated.js b/libnd4j/include/graph/generated/uigraphevents_generated.js deleted file mode 100644 index f6a7f4117..000000000 --- a/libnd4j/include/graph/generated/uigraphevents_generated.js +++ /dev/null @@ -1,1026 +0,0 @@ -/* - * ****************************************************************************** - * * - * * - * * This program and the accompanying materials are made available under the - * * terms of the Apache License, Version 2.0 which is available at - * * https://www.apache.org/licenses/LICENSE-2.0. - * * - * * See the NOTICE file distributed with this work for additional - * * information regarding copyright ownership. - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * * License for the specific language governing permissions and limitations - * * under the License. 
- * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ - -/** - * @const - * @namespace - */ -var nd4j = nd4j || {}; - -/** - * @const - * @namespace - */ -nd4j.graph = nd4j.graph || {}; - -/** - * @enum - */ -nd4j.graph.UIEventType = { - ADD_NAME: 0, - SCALAR: 1, - ARRAY: 2, - ARRAY_LIST: 3, - HISTOGRAM: 4, - IMAGE: 5, - SUMMARY_STATISTICS: 6, - OP_TIMING: 7, - HARDWARE_STATE: 8 -}; - -/** - * @enum - */ -nd4j.graph.UIEventSubtype = { - NONE: 0, - EVALUATION: 1, - LOSS: 2, - LEARNING_RATE: 3, - TUNING_METRIC: 4, - PERFORMANCE: 5, - PROFILING: 6, - FEATURE_LABEL: 7, - PREDICTION: 8, - USER_CUSTOM: 9 -}; - -/** - * @enum - */ -nd4j.graph.UIHistogramType = { - DISCRETE: 0, - EQUAL_SPACING: 1, - CUSTOM: 2 -}; - -/** - * @constructor - */ -nd4j.graph.UIEvent = function() { - /** - * @type {flatbuffers.ByteBuffer} - */ - this.bb = null; - - /** - * @type {number} - */ - this.bb_pos = 0; -}; - -/** - * @param {number} i - * @param {flatbuffers.ByteBuffer} bb - * @returns {nd4j.graph.UIEvent} - */ -nd4j.graph.UIEvent.prototype.__init = function(i, bb) { - this.bb_pos = i; - this.bb = bb; - return this; -}; - -/** - * @param {flatbuffers.ByteBuffer} bb - * @param {nd4j.graph.UIEvent=} obj - * @returns {nd4j.graph.UIEvent} - */ -nd4j.graph.UIEvent.getRootAsUIEvent = function(bb, obj) { - return (obj || new nd4j.graph.UIEvent).__init(bb.readInt32(bb.position()) + bb.position(), bb); -}; - -/** - * @returns {nd4j.graph.UIEventType} - */ -nd4j.graph.UIEvent.prototype.eventType = function() { - var offset = this.bb.__offset(this.bb_pos, 4); - return offset ? /** @type {nd4j.graph.UIEventType} */ (this.bb.readInt8(this.bb_pos + offset)) : nd4j.graph.UIEventType.ADD_NAME; -}; - -/** - * @returns {nd4j.graph.UIEventSubtype} - */ -nd4j.graph.UIEvent.prototype.eventSubType = function() { - var offset = this.bb.__offset(this.bb_pos, 6); - return offset ? 
/** @type {nd4j.graph.UIEventSubtype} */ (this.bb.readInt8(this.bb_pos + offset)) : nd4j.graph.UIEventSubtype.NONE; -}; - -/** - * @returns {number} - */ -nd4j.graph.UIEvent.prototype.nameIdx = function() { - var offset = this.bb.__offset(this.bb_pos, 8); - return offset ? this.bb.readInt32(this.bb_pos + offset) : 0; -}; - -/** - * @returns {flatbuffers.Long} - */ -nd4j.graph.UIEvent.prototype.timestamp = function() { - var offset = this.bb.__offset(this.bb_pos, 10); - return offset ? this.bb.readInt64(this.bb_pos + offset) : this.bb.createLong(0, 0); -}; - -/** - * @returns {number} - */ -nd4j.graph.UIEvent.prototype.iteration = function() { - var offset = this.bb.__offset(this.bb_pos, 12); - return offset ? this.bb.readInt32(this.bb_pos + offset) : 0; -}; - -/** - * @returns {number} - */ -nd4j.graph.UIEvent.prototype.epoch = function() { - var offset = this.bb.__offset(this.bb_pos, 14); - return offset ? this.bb.readInt32(this.bb_pos + offset) : 0; -}; - -/** - * @returns {number} - */ -nd4j.graph.UIEvent.prototype.variableId = function() { - var offset = this.bb.__offset(this.bb_pos, 16); - return offset ? this.bb.readInt16(this.bb_pos + offset) : 0; -}; - -/** - * @param {nd4j.graph.FrameIteration=} obj - * @returns {nd4j.graph.FrameIteration|null} - */ -nd4j.graph.UIEvent.prototype.frameIter = function(obj) { - var offset = this.bb.__offset(this.bb_pos, 18); - return offset ? (obj || new nd4j.graph.FrameIteration).__init(this.bb.__indirect(this.bb_pos + offset), this.bb) : null; -}; - -/** - * @returns {number} - */ -nd4j.graph.UIEvent.prototype.plugin = function() { - var offset = this.bb.__offset(this.bb_pos, 20); - return offset ? 
this.bb.readUint16(this.bb_pos + offset) : 0; -}; - -/** - * @param {flatbuffers.Builder} builder - */ -nd4j.graph.UIEvent.startUIEvent = function(builder) { - builder.startObject(9); -}; - -/** - * @param {flatbuffers.Builder} builder - * @param {nd4j.graph.UIEventType} eventType - */ -nd4j.graph.UIEvent.addEventType = function(builder, eventType) { - builder.addFieldInt8(0, eventType, nd4j.graph.UIEventType.ADD_NAME); -}; - -/** - * @param {flatbuffers.Builder} builder - * @param {nd4j.graph.UIEventSubtype} eventSubType - */ -nd4j.graph.UIEvent.addEventSubType = function(builder, eventSubType) { - builder.addFieldInt8(1, eventSubType, nd4j.graph.UIEventSubtype.NONE); -}; - -/** - * @param {flatbuffers.Builder} builder - * @param {number} nameIdx - */ -nd4j.graph.UIEvent.addNameIdx = function(builder, nameIdx) { - builder.addFieldInt32(2, nameIdx, 0); -}; - -/** - * @param {flatbuffers.Builder} builder - * @param {flatbuffers.Long} timestamp - */ -nd4j.graph.UIEvent.addTimestamp = function(builder, timestamp) { - builder.addFieldInt64(3, timestamp, builder.createLong(0, 0)); -}; - -/** - * @param {flatbuffers.Builder} builder - * @param {number} iteration - */ -nd4j.graph.UIEvent.addIteration = function(builder, iteration) { - builder.addFieldInt32(4, iteration, 0); -}; - -/** - * @param {flatbuffers.Builder} builder - * @param {number} epoch - */ -nd4j.graph.UIEvent.addEpoch = function(builder, epoch) { - builder.addFieldInt32(5, epoch, 0); -}; - -/** - * @param {flatbuffers.Builder} builder - * @param {number} variableId - */ -nd4j.graph.UIEvent.addVariableId = function(builder, variableId) { - builder.addFieldInt16(6, variableId, 0); -}; - -/** - * @param {flatbuffers.Builder} builder - * @param {flatbuffers.Offset} frameIterOffset - */ -nd4j.graph.UIEvent.addFrameIter = function(builder, frameIterOffset) { - builder.addFieldOffset(7, frameIterOffset, 0); -}; - -/** - * @param {flatbuffers.Builder} builder - * @param {number} plugin - */ 
-nd4j.graph.UIEvent.addPlugin = function(builder, plugin) { - builder.addFieldInt16(8, plugin, 0); -}; - -/** - * @param {flatbuffers.Builder} builder - * @returns {flatbuffers.Offset} - */ -nd4j.graph.UIEvent.endUIEvent = function(builder) { - var offset = builder.endObject(); - return offset; -}; - -/** - * @constructor - */ -nd4j.graph.FrameIteration = function() { - /** - * @type {flatbuffers.ByteBuffer} - */ - this.bb = null; - - /** - * @type {number} - */ - this.bb_pos = 0; -}; - -/** - * @param {number} i - * @param {flatbuffers.ByteBuffer} bb - * @returns {nd4j.graph.FrameIteration} - */ -nd4j.graph.FrameIteration.prototype.__init = function(i, bb) { - this.bb_pos = i; - this.bb = bb; - return this; -}; - -/** - * @param {flatbuffers.ByteBuffer} bb - * @param {nd4j.graph.FrameIteration=} obj - * @returns {nd4j.graph.FrameIteration} - */ -nd4j.graph.FrameIteration.getRootAsFrameIteration = function(bb, obj) { - return (obj || new nd4j.graph.FrameIteration).__init(bb.readInt32(bb.position()) + bb.position(), bb); -}; - -/** - * @param {flatbuffers.Encoding=} optionalEncoding - * @returns {string|Uint8Array|null} - */ -nd4j.graph.FrameIteration.prototype.frame = function(optionalEncoding) { - var offset = this.bb.__offset(this.bb_pos, 4); - return offset ? this.bb.__string(this.bb_pos + offset, optionalEncoding) : null; -}; - -/** - * @returns {number} - */ -nd4j.graph.FrameIteration.prototype.iteration = function() { - var offset = this.bb.__offset(this.bb_pos, 6); - return offset ? 
this.bb.readUint16(this.bb_pos + offset) : 0; -}; - -/** - * @param {flatbuffers.Builder} builder - */ -nd4j.graph.FrameIteration.startFrameIteration = function(builder) { - builder.startObject(2); -}; - -/** - * @param {flatbuffers.Builder} builder - * @param {flatbuffers.Offset} frameOffset - */ -nd4j.graph.FrameIteration.addFrame = function(builder, frameOffset) { - builder.addFieldOffset(0, frameOffset, 0); -}; - -/** - * @param {flatbuffers.Builder} builder - * @param {number} iteration - */ -nd4j.graph.FrameIteration.addIteration = function(builder, iteration) { - builder.addFieldInt16(1, iteration, 0); -}; - -/** - * @param {flatbuffers.Builder} builder - * @returns {flatbuffers.Offset} - */ -nd4j.graph.FrameIteration.endFrameIteration = function(builder) { - var offset = builder.endObject(); - return offset; -}; - -/** - * @constructor - */ -nd4j.graph.UIAddName = function() { - /** - * @type {flatbuffers.ByteBuffer} - */ - this.bb = null; - - /** - * @type {number} - */ - this.bb_pos = 0; -}; - -/** - * @param {number} i - * @param {flatbuffers.ByteBuffer} bb - * @returns {nd4j.graph.UIAddName} - */ -nd4j.graph.UIAddName.prototype.__init = function(i, bb) { - this.bb_pos = i; - this.bb = bb; - return this; -}; - -/** - * @param {flatbuffers.ByteBuffer} bb - * @param {nd4j.graph.UIAddName=} obj - * @returns {nd4j.graph.UIAddName} - */ -nd4j.graph.UIAddName.getRootAsUIAddName = function(bb, obj) { - return (obj || new nd4j.graph.UIAddName).__init(bb.readInt32(bb.position()) + bb.position(), bb); -}; - -/** - * @returns {number} - */ -nd4j.graph.UIAddName.prototype.nameIdx = function() { - var offset = this.bb.__offset(this.bb_pos, 4); - return offset ? this.bb.readInt32(this.bb_pos + offset) : 0; -}; - -/** - * @param {flatbuffers.Encoding=} optionalEncoding - * @returns {string|Uint8Array|null} - */ -nd4j.graph.UIAddName.prototype.name = function(optionalEncoding) { - var offset = this.bb.__offset(this.bb_pos, 6); - return offset ? 
this.bb.__string(this.bb_pos + offset, optionalEncoding) : null; -}; - -/** - * @param {flatbuffers.Builder} builder - */ -nd4j.graph.UIAddName.startUIAddName = function(builder) { - builder.startObject(2); -}; - -/** - * @param {flatbuffers.Builder} builder - * @param {number} nameIdx - */ -nd4j.graph.UIAddName.addNameIdx = function(builder, nameIdx) { - builder.addFieldInt32(0, nameIdx, 0); -}; - -/** - * @param {flatbuffers.Builder} builder - * @param {flatbuffers.Offset} nameOffset - */ -nd4j.graph.UIAddName.addName = function(builder, nameOffset) { - builder.addFieldOffset(1, nameOffset, 0); -}; - -/** - * @param {flatbuffers.Builder} builder - * @returns {flatbuffers.Offset} - */ -nd4j.graph.UIAddName.endUIAddName = function(builder) { - var offset = builder.endObject(); - return offset; -}; - -/** - * @constructor - */ -nd4j.graph.FlatArrayList = function() { - /** - * @type {flatbuffers.ByteBuffer} - */ - this.bb = null; - - /** - * @type {number} - */ - this.bb_pos = 0; -}; - -/** - * @param {number} i - * @param {flatbuffers.ByteBuffer} bb - * @returns {nd4j.graph.FlatArrayList} - */ -nd4j.graph.FlatArrayList.prototype.__init = function(i, bb) { - this.bb_pos = i; - this.bb = bb; - return this; -}; - -/** - * @param {flatbuffers.ByteBuffer} bb - * @param {nd4j.graph.FlatArrayList=} obj - * @returns {nd4j.graph.FlatArrayList} - */ -nd4j.graph.FlatArrayList.getRootAsFlatArrayList = function(bb, obj) { - return (obj || new nd4j.graph.FlatArrayList).__init(bb.readInt32(bb.position()) + bb.position(), bb); -}; - -/** - * @param {number} index - * @param {nd4j.graph.FlatArray=} obj - * @returns {nd4j.graph.FlatArray} - */ -nd4j.graph.FlatArrayList.prototype.list = function(index, obj) { - var offset = this.bb.__offset(this.bb_pos, 4); - return offset ? 
(obj || new nd4j.graph.FlatArray).__init(this.bb.__indirect(this.bb.__vector(this.bb_pos + offset) + index * 4), this.bb) : null; -}; - -/** - * @returns {number} - */ -nd4j.graph.FlatArrayList.prototype.listLength = function() { - var offset = this.bb.__offset(this.bb_pos, 4); - return offset ? this.bb.__vector_len(this.bb_pos + offset) : 0; -}; - -/** - * @param {flatbuffers.Builder} builder - */ -nd4j.graph.FlatArrayList.startFlatArrayList = function(builder) { - builder.startObject(1); -}; - -/** - * @param {flatbuffers.Builder} builder - * @param {flatbuffers.Offset} listOffset - */ -nd4j.graph.FlatArrayList.addList = function(builder, listOffset) { - builder.addFieldOffset(0, listOffset, 0); -}; - -/** - * @param {flatbuffers.Builder} builder - * @param {Array.<flatbuffers.Offset>} data - * @returns {flatbuffers.Offset} - */ -nd4j.graph.FlatArrayList.createListVector = function(builder, data) { - builder.startVector(4, data.length, 4); - for (var i = data.length - 1; i >= 0; i--) { - builder.addOffset(data[i]); - } - return builder.endVector(); -}; - -/** - * @param {flatbuffers.Builder} builder - * @param {number} numElems - */ -nd4j.graph.FlatArrayList.startListVector = function(builder, numElems) { - builder.startVector(4, numElems, 4); -}; - -/** - * @param {flatbuffers.Builder} builder - * @returns {flatbuffers.Offset} - */ -nd4j.graph.FlatArrayList.endFlatArrayList = function(builder) { - var offset = builder.endObject(); - return offset; -}; - -/** - * @constructor - */ -nd4j.graph.UIHistogram = function() { - /** - * @type {flatbuffers.ByteBuffer} - */ - this.bb = null; - - /** - * @type {number} - */ - this.bb_pos = 0; -}; - -/** - * @param {number} i - * @param {flatbuffers.ByteBuffer} bb - * @returns {nd4j.graph.UIHistogram} - */ -nd4j.graph.UIHistogram.prototype.__init = function(i, bb) { - this.bb_pos = i; - this.bb = bb; - return this; -}; - -/** - * @param {flatbuffers.ByteBuffer} bb - * @param {nd4j.graph.UIHistogram=} obj - * @returns {nd4j.graph.UIHistogram} - 
*/ -nd4j.graph.UIHistogram.getRootAsUIHistogram = function(bb, obj) { - return (obj || new nd4j.graph.UIHistogram).__init(bb.readInt32(bb.position()) + bb.position(), bb); -}; - -/** - * @returns {nd4j.graph.UIHistogramType} - */ -nd4j.graph.UIHistogram.prototype.type = function() { - var offset = this.bb.__offset(this.bb_pos, 4); - return offset ? /** @type {nd4j.graph.UIHistogramType} */ (this.bb.readInt8(this.bb_pos + offset)) : nd4j.graph.UIHistogramType.DISCRETE; -}; - -/** - * @returns {number} - */ -nd4j.graph.UIHistogram.prototype.numbins = function() { - var offset = this.bb.__offset(this.bb_pos, 6); - return offset ? this.bb.readUint32(this.bb_pos + offset) : 0; -}; - -/** - * @param {nd4j.graph.FlatArray=} obj - * @returns {nd4j.graph.FlatArray|null} - */ -nd4j.graph.UIHistogram.prototype.binranges = function(obj) { - var offset = this.bb.__offset(this.bb_pos, 8); - return offset ? (obj || new nd4j.graph.FlatArray).__init(this.bb.__indirect(this.bb_pos + offset), this.bb) : null; -}; - -/** - * @param {nd4j.graph.FlatArray=} obj - * @returns {nd4j.graph.FlatArray|null} - */ -nd4j.graph.UIHistogram.prototype.y = function(obj) { - var offset = this.bb.__offset(this.bb_pos, 10); - return offset ? (obj || new nd4j.graph.FlatArray).__init(this.bb.__indirect(this.bb_pos + offset), this.bb) : null; -}; - -/** - * @param {number} index - * @param {flatbuffers.Encoding=} optionalEncoding - * @returns {string|Uint8Array} - */ -nd4j.graph.UIHistogram.prototype.binlabels = function(index, optionalEncoding) { - var offset = this.bb.__offset(this.bb_pos, 12); - return offset ? this.bb.__string(this.bb.__vector(this.bb_pos + offset) + index * 4, optionalEncoding) : null; -}; - -/** - * @returns {number} - */ -nd4j.graph.UIHistogram.prototype.binlabelsLength = function() { - var offset = this.bb.__offset(this.bb_pos, 12); - return offset ? 
this.bb.__vector_len(this.bb_pos + offset) : 0; -}; - -/** - * @param {flatbuffers.Builder} builder - */ -nd4j.graph.UIHistogram.startUIHistogram = function(builder) { - builder.startObject(5); -}; - -/** - * @param {flatbuffers.Builder} builder - * @param {nd4j.graph.UIHistogramType} type - */ -nd4j.graph.UIHistogram.addType = function(builder, type) { - builder.addFieldInt8(0, type, nd4j.graph.UIHistogramType.DISCRETE); -}; - -/** - * @param {flatbuffers.Builder} builder - * @param {number} numbins - */ -nd4j.graph.UIHistogram.addNumbins = function(builder, numbins) { - builder.addFieldInt32(1, numbins, 0); -}; - -/** - * @param {flatbuffers.Builder} builder - * @param {flatbuffers.Offset} binrangesOffset - */ -nd4j.graph.UIHistogram.addBinranges = function(builder, binrangesOffset) { - builder.addFieldOffset(2, binrangesOffset, 0); -}; - -/** - * @param {flatbuffers.Builder} builder - * @param {flatbuffers.Offset} yOffset - */ -nd4j.graph.UIHistogram.addY = function(builder, yOffset) { - builder.addFieldOffset(3, yOffset, 0); -}; - -/** - * @param {flatbuffers.Builder} builder - * @param {flatbuffers.Offset} binlabelsOffset - */ -nd4j.graph.UIHistogram.addBinlabels = function(builder, binlabelsOffset) { - builder.addFieldOffset(4, binlabelsOffset, 0); -}; - -/** - * @param {flatbuffers.Builder} builder - * @param {Array.<flatbuffers.Offset>} data - * @returns {flatbuffers.Offset} - */ -nd4j.graph.UIHistogram.createBinlabelsVector = function(builder, data) { - builder.startVector(4, data.length, 4); - for (var i = data.length - 1; i >= 0; i--) { - builder.addOffset(data[i]); - } - return builder.endVector(); -}; - -/** - * @param {flatbuffers.Builder} builder - * @param {number} numElems - */ -nd4j.graph.UIHistogram.startBinlabelsVector = function(builder, numElems) { - builder.startVector(4, numElems, 4); -}; - -/** - * @param {flatbuffers.Builder} builder - * @returns {flatbuffers.Offset} - */ -nd4j.graph.UIHistogram.endUIHistogram = function(builder) { - var offset = 
builder.endObject(); - return offset; -}; - -/** - * @constructor - */ -nd4j.graph.UISummaryStatistics = function() { - /** - * @type {flatbuffers.ByteBuffer} - */ - this.bb = null; - - /** - * @type {number} - */ - this.bb_pos = 0; -}; - -/** - * @param {number} i - * @param {flatbuffers.ByteBuffer} bb - * @returns {nd4j.graph.UISummaryStatistics} - */ -nd4j.graph.UISummaryStatistics.prototype.__init = function(i, bb) { - this.bb_pos = i; - this.bb = bb; - return this; -}; - -/** - * @param {flatbuffers.ByteBuffer} bb - * @param {nd4j.graph.UISummaryStatistics=} obj - * @returns {nd4j.graph.UISummaryStatistics} - */ -nd4j.graph.UISummaryStatistics.getRootAsUISummaryStatistics = function(bb, obj) { - return (obj || new nd4j.graph.UISummaryStatistics).__init(bb.readInt32(bb.position()) + bb.position(), bb); -}; - -/** - * @returns {number} - */ -nd4j.graph.UISummaryStatistics.prototype.bitmask = function() { - var offset = this.bb.__offset(this.bb_pos, 4); - return offset ? this.bb.readUint32(this.bb_pos + offset) : 0; -}; - -/** - * @param {nd4j.graph.FlatArray=} obj - * @returns {nd4j.graph.FlatArray|null} - */ -nd4j.graph.UISummaryStatistics.prototype.min = function(obj) { - var offset = this.bb.__offset(this.bb_pos, 6); - return offset ? (obj || new nd4j.graph.FlatArray).__init(this.bb.__indirect(this.bb_pos + offset), this.bb) : null; -}; - -/** - * @param {nd4j.graph.FlatArray=} obj - * @returns {nd4j.graph.FlatArray|null} - */ -nd4j.graph.UISummaryStatistics.prototype.max = function(obj) { - var offset = this.bb.__offset(this.bb_pos, 8); - return offset ? (obj || new nd4j.graph.FlatArray).__init(this.bb.__indirect(this.bb_pos + offset), this.bb) : null; -}; - -/** - * @returns {number} - */ -nd4j.graph.UISummaryStatistics.prototype.mean = function() { - var offset = this.bb.__offset(this.bb_pos, 10); - return offset ? 
this.bb.readFloat64(this.bb_pos + offset) : 0.0; -}; - -/** - * @returns {number} - */ -nd4j.graph.UISummaryStatistics.prototype.stdev = function() { - var offset = this.bb.__offset(this.bb_pos, 12); - return offset ? this.bb.readFloat64(this.bb_pos + offset) : 0.0; -}; - -/** - * @returns {flatbuffers.Long} - */ -nd4j.graph.UISummaryStatistics.prototype.countzero = function() { - var offset = this.bb.__offset(this.bb_pos, 14); - return offset ? this.bb.readInt64(this.bb_pos + offset) : this.bb.createLong(0, 0); -}; - -/** - * @returns {flatbuffers.Long} - */ -nd4j.graph.UISummaryStatistics.prototype.countpositive = function() { - var offset = this.bb.__offset(this.bb_pos, 16); - return offset ? this.bb.readInt64(this.bb_pos + offset) : this.bb.createLong(0, 0); -}; - -/** - * @returns {flatbuffers.Long} - */ -nd4j.graph.UISummaryStatistics.prototype.countnegative = function() { - var offset = this.bb.__offset(this.bb_pos, 18); - return offset ? this.bb.readInt64(this.bb_pos + offset) : this.bb.createLong(0, 0); -}; - -/** - * @returns {flatbuffers.Long} - */ -nd4j.graph.UISummaryStatistics.prototype.countnan = function() { - var offset = this.bb.__offset(this.bb_pos, 20); - return offset ? this.bb.readInt64(this.bb_pos + offset) : this.bb.createLong(0, 0); -}; - -/** - * @returns {flatbuffers.Long} - */ -nd4j.graph.UISummaryStatistics.prototype.countinf = function() { - var offset = this.bb.__offset(this.bb_pos, 22); - return offset ? 
this.bb.readInt64(this.bb_pos + offset) : this.bb.createLong(0, 0); -}; - -/** - * @param {flatbuffers.Builder} builder - */ -nd4j.graph.UISummaryStatistics.startUISummaryStatistics = function(builder) { - builder.startObject(10); -}; - -/** - * @param {flatbuffers.Builder} builder - * @param {number} bitmask - */ -nd4j.graph.UISummaryStatistics.addBitmask = function(builder, bitmask) { - builder.addFieldInt32(0, bitmask, 0); -}; - -/** - * @param {flatbuffers.Builder} builder - * @param {flatbuffers.Offset} minOffset - */ -nd4j.graph.UISummaryStatistics.addMin = function(builder, minOffset) { - builder.addFieldOffset(1, minOffset, 0); -}; - -/** - * @param {flatbuffers.Builder} builder - * @param {flatbuffers.Offset} maxOffset - */ -nd4j.graph.UISummaryStatistics.addMax = function(builder, maxOffset) { - builder.addFieldOffset(2, maxOffset, 0); -}; - -/** - * @param {flatbuffers.Builder} builder - * @param {number} mean - */ -nd4j.graph.UISummaryStatistics.addMean = function(builder, mean) { - builder.addFieldFloat64(3, mean, 0.0); -}; - -/** - * @param {flatbuffers.Builder} builder - * @param {number} stdev - */ -nd4j.graph.UISummaryStatistics.addStdev = function(builder, stdev) { - builder.addFieldFloat64(4, stdev, 0.0); -}; - -/** - * @param {flatbuffers.Builder} builder - * @param {flatbuffers.Long} countzero - */ -nd4j.graph.UISummaryStatistics.addCountzero = function(builder, countzero) { - builder.addFieldInt64(5, countzero, builder.createLong(0, 0)); -}; - -/** - * @param {flatbuffers.Builder} builder - * @param {flatbuffers.Long} countpositive - */ -nd4j.graph.UISummaryStatistics.addCountpositive = function(builder, countpositive) { - builder.addFieldInt64(6, countpositive, builder.createLong(0, 0)); -}; - -/** - * @param {flatbuffers.Builder} builder - * @param {flatbuffers.Long} countnegative - */ -nd4j.graph.UISummaryStatistics.addCountnegative = function(builder, countnegative) { - builder.addFieldInt64(7, countnegative, builder.createLong(0, 0)); -}; 
- -/** - * @param {flatbuffers.Builder} builder - * @param {flatbuffers.Long} countnan - */ -nd4j.graph.UISummaryStatistics.addCountnan = function(builder, countnan) { - builder.addFieldInt64(8, countnan, builder.createLong(0, 0)); -}; - -/** - * @param {flatbuffers.Builder} builder - * @param {flatbuffers.Long} countinf - */ -nd4j.graph.UISummaryStatistics.addCountinf = function(builder, countinf) { - builder.addFieldInt64(9, countinf, builder.createLong(0, 0)); -}; - -/** - * @param {flatbuffers.Builder} builder - * @returns {flatbuffers.Offset} - */ -nd4j.graph.UISummaryStatistics.endUISummaryStatistics = function(builder) { - var offset = builder.endObject(); - return offset; -}; - -/** - * @constructor - */ -nd4j.graph.UIHardwareState = function() { - /** - * @type {flatbuffers.ByteBuffer} - */ - this.bb = null; - - /** - * @type {number} - */ - this.bb_pos = 0; -}; - -/** - * @param {number} i - * @param {flatbuffers.ByteBuffer} bb - * @returns {nd4j.graph.UIHardwareState} - */ -nd4j.graph.UIHardwareState.prototype.__init = function(i, bb) { - this.bb_pos = i; - this.bb = bb; - return this; -}; - -/** - * @param {flatbuffers.ByteBuffer} bb - * @param {nd4j.graph.UIHardwareState=} obj - * @returns {nd4j.graph.UIHardwareState} - */ -nd4j.graph.UIHardwareState.getRootAsUIHardwareState = function(bb, obj) { - return (obj || new nd4j.graph.UIHardwareState).__init(bb.readInt32(bb.position()) + bb.position(), bb); -}; - -/** - * @param {number} index - * @returns {flatbuffers.Long} - */ -nd4j.graph.UIHardwareState.prototype.gpuMemory = function(index) { - var offset = this.bb.__offset(this.bb_pos, 4); - return offset ? this.bb.readInt64(this.bb.__vector(this.bb_pos + offset) + index * 8) : this.bb.createLong(0, 0); -}; - -/** - * @returns {number} - */ -nd4j.graph.UIHardwareState.prototype.gpuMemoryLength = function() { - var offset = this.bb.__offset(this.bb_pos, 4); - return offset ? 
this.bb.__vector_len(this.bb_pos + offset) : 0; -}; - -/** - * @returns {flatbuffers.Long} - */ -nd4j.graph.UIHardwareState.prototype.hostMemory = function() { - var offset = this.bb.__offset(this.bb_pos, 6); - return offset ? this.bb.readInt64(this.bb_pos + offset) : this.bb.createLong(0, 0); -}; - -/** - * @param {flatbuffers.Builder} builder - */ -nd4j.graph.UIHardwareState.startUIHardwareState = function(builder) { - builder.startObject(2); -}; - -/** - * @param {flatbuffers.Builder} builder - * @param {flatbuffers.Offset} gpuMemoryOffset - */ -nd4j.graph.UIHardwareState.addGpuMemory = function(builder, gpuMemoryOffset) { - builder.addFieldOffset(0, gpuMemoryOffset, 0); -}; - -/** - * @param {flatbuffers.Builder} builder - * @param {Array.<flatbuffers.Long>} data - * @returns {flatbuffers.Offset} - */ -nd4j.graph.UIHardwareState.createGpuMemoryVector = function(builder, data) { - builder.startVector(8, data.length, 8); - for (var i = data.length - 1; i >= 0; i--) { - builder.addInt64(data[i]); - } - return builder.endVector(); -}; - -/** - * @param {flatbuffers.Builder} builder - * @param {number} numElems - */ -nd4j.graph.UIHardwareState.startGpuMemoryVector = function(builder, numElems) { - builder.startVector(8, numElems, 8); -}; - -/** - * @param {flatbuffers.Builder} builder - * @param {flatbuffers.Long} hostMemory - */ -nd4j.graph.UIHardwareState.addHostMemory = function(builder, hostMemory) { - builder.addFieldInt64(1, hostMemory, builder.createLong(0, 0)); -}; - -/** - * @param {flatbuffers.Builder} builder - * @returns {flatbuffers.Offset} - */ -nd4j.graph.UIHardwareState.endUIHardwareState = function(builder) { - var offset = builder.endObject(); - return offset; -}; - -// Exports for Node.js and RequireJS -this.nd4j = nd4j; diff --git a/libnd4j/include/graph/generated/uigraphstatic_generated.h b/libnd4j/include/graph/generated/uigraphstatic_generated.h deleted file mode 100644 index b6545f53a..000000000 --- a/libnd4j/include/graph/generated/uigraphstatic_generated.h 
+++ /dev/null @@ -1,571 +0,0 @@ -// automatically generated by the FlatBuffers compiler, do not modify - - -#ifndef FLATBUFFERS_GENERATED_UIGRAPHSTATIC_ND4J_GRAPH_H_ -#define FLATBUFFERS_GENERATED_UIGRAPHSTATIC_ND4J_GRAPH_H_ - -#include "flatbuffers/flatbuffers.h" - -#include "array_generated.h" -#include "utils_generated.h" -#include "variable_generated.h" - -namespace sd { -namespace graph { - -struct UIStaticInfoRecord; - -struct UISystemInfo; - -struct UIGraphStructure; - -struct UIVariable; - -struct UIOp; - -enum UIInfoType { - UIInfoType_GRAPH_STRUCTURE = 0, - UIInfoType_SYTEM_INFO = 1, - UIInfoType_START_EVENTS = 2, - UIInfoType_MIN = UIInfoType_GRAPH_STRUCTURE, - UIInfoType_MAX = UIInfoType_START_EVENTS -}; - -inline const UIInfoType (&EnumValuesUIInfoType())[3] { - static const UIInfoType values[] = { - UIInfoType_GRAPH_STRUCTURE, - UIInfoType_SYTEM_INFO, - UIInfoType_START_EVENTS - }; - return values; -} - -inline const char * const *EnumNamesUIInfoType() { - static const char * const names[] = { - "GRAPH_STRUCTURE", - "SYTEM_INFO", - "START_EVENTS", - nullptr - }; - return names; -} - -inline const char *EnumNameUIInfoType(UIInfoType e) { - const size_t index = static_cast<size_t>(e); - return EnumNamesUIInfoType()[index]; -} - -struct UIStaticInfoRecord FLATBUFFERS_FINAL_CLASS : private flatbuffers::Table { - enum { - VT_INFOTYPE = 4 - }; - UIInfoType infoType() const { - return static_cast<UIInfoType>(GetField<int8_t>(VT_INFOTYPE, 0)); - } - bool Verify(flatbuffers::Verifier &verifier) const { - return VerifyTableStart(verifier) && - VerifyField<int8_t>(verifier, VT_INFOTYPE) && - verifier.EndTable(); - } -}; - -struct UIStaticInfoRecordBuilder { - flatbuffers::FlatBufferBuilder &fbb_; - flatbuffers::uoffset_t start_; - void add_infoType(UIInfoType infoType) { - fbb_.AddElement<int8_t>(UIStaticInfoRecord::VT_INFOTYPE, static_cast<int8_t>(infoType), 0); - } - explicit UIStaticInfoRecordBuilder(flatbuffers::FlatBufferBuilder &_fbb) - : fbb_(_fbb) { - start_ = fbb_.StartTable(); - } - 
UIStaticInfoRecordBuilder &operator=(const UIStaticInfoRecordBuilder &); - flatbuffers::Offset<UIStaticInfoRecord> Finish() { - const auto end = fbb_.EndTable(start_); - auto o = flatbuffers::Offset<UIStaticInfoRecord>(end); - return o; - } -}; - -inline flatbuffers::Offset<UIStaticInfoRecord> CreateUIStaticInfoRecord( - flatbuffers::FlatBufferBuilder &_fbb, - UIInfoType infoType = UIInfoType_GRAPH_STRUCTURE) { - UIStaticInfoRecordBuilder builder_(_fbb); - builder_.add_infoType(infoType); - return builder_.Finish(); -} - -struct UISystemInfo FLATBUFFERS_FINAL_CLASS : private flatbuffers::Table { - enum { - VT_PHYSICALCORES = 4 - }; - int32_t physicalCores() const { - return GetField<int32_t>(VT_PHYSICALCORES, 0); - } - bool Verify(flatbuffers::Verifier &verifier) const { - return VerifyTableStart(verifier) && - VerifyField<int32_t>(verifier, VT_PHYSICALCORES) && - verifier.EndTable(); - } -}; - -struct UISystemInfoBuilder { - flatbuffers::FlatBufferBuilder &fbb_; - flatbuffers::uoffset_t start_; - void add_physicalCores(int32_t physicalCores) { - fbb_.AddElement<int32_t>(UISystemInfo::VT_PHYSICALCORES, physicalCores, 0); - } - explicit UISystemInfoBuilder(flatbuffers::FlatBufferBuilder &_fbb) - : fbb_(_fbb) { - start_ = fbb_.StartTable(); - } - UISystemInfoBuilder &operator=(const UISystemInfoBuilder &); - flatbuffers::Offset<UISystemInfo> Finish() { - const auto end = fbb_.EndTable(start_); - auto o = flatbuffers::Offset<UISystemInfo>(end); - return o; - } -}; - -inline flatbuffers::Offset<UISystemInfo> CreateUISystemInfo( - flatbuffers::FlatBufferBuilder &_fbb, - int32_t physicalCores = 0) { - UISystemInfoBuilder builder_(_fbb); - builder_.add_physicalCores(physicalCores); - return builder_.Finish(); -} - -struct UIGraphStructure FLATBUFFERS_FINAL_CLASS : private flatbuffers::Table { - enum { - VT_INPUTS = 4, - VT_INPUTSPAIR = 6, - VT_OUTPUTS = 8, - VT_VARIABLES = 10, - VT_OPS = 12 - }; - const flatbuffers::Vector<flatbuffers::Offset<flatbuffers::String>> *inputs() const { - return GetPointer<const flatbuffers::Vector<flatbuffers::Offset<flatbuffers::String>> *>(VT_INPUTS); - } - const flatbuffers::Vector<flatbuffers::Offset<IntPair>> *inputsPair() const { - return GetPointer<const flatbuffers::Vector<flatbuffers::Offset<IntPair>> *>(VT_INPUTSPAIR); - } - const 
flatbuffers::Vector> *outputs() const { - return GetPointer> *>(VT_OUTPUTS); - } - const flatbuffers::Vector> *variables() const { - return GetPointer> *>(VT_VARIABLES); - } - const flatbuffers::Vector> *ops() const { - return GetPointer> *>(VT_OPS); - } - bool Verify(flatbuffers::Verifier &verifier) const { - return VerifyTableStart(verifier) && - VerifyOffset(verifier, VT_INPUTS) && - verifier.VerifyVector(inputs()) && - verifier.VerifyVectorOfStrings(inputs()) && - VerifyOffset(verifier, VT_INPUTSPAIR) && - verifier.VerifyVector(inputsPair()) && - verifier.VerifyVectorOfTables(inputsPair()) && - VerifyOffset(verifier, VT_OUTPUTS) && - verifier.VerifyVector(outputs()) && - verifier.VerifyVectorOfStrings(outputs()) && - VerifyOffset(verifier, VT_VARIABLES) && - verifier.VerifyVector(variables()) && - verifier.VerifyVectorOfTables(variables()) && - VerifyOffset(verifier, VT_OPS) && - verifier.VerifyVector(ops()) && - verifier.VerifyVectorOfTables(ops()) && - verifier.EndTable(); - } -}; - -struct UIGraphStructureBuilder { - flatbuffers::FlatBufferBuilder &fbb_; - flatbuffers::uoffset_t start_; - void add_inputs(flatbuffers::Offset>> inputs) { - fbb_.AddOffset(UIGraphStructure::VT_INPUTS, inputs); - } - void add_inputsPair(flatbuffers::Offset>> inputsPair) { - fbb_.AddOffset(UIGraphStructure::VT_INPUTSPAIR, inputsPair); - } - void add_outputs(flatbuffers::Offset>> outputs) { - fbb_.AddOffset(UIGraphStructure::VT_OUTPUTS, outputs); - } - void add_variables(flatbuffers::Offset>> variables) { - fbb_.AddOffset(UIGraphStructure::VT_VARIABLES, variables); - } - void add_ops(flatbuffers::Offset>> ops) { - fbb_.AddOffset(UIGraphStructure::VT_OPS, ops); - } - explicit UIGraphStructureBuilder(flatbuffers::FlatBufferBuilder &_fbb) - : fbb_(_fbb) { - start_ = fbb_.StartTable(); - } - UIGraphStructureBuilder &operator=(const UIGraphStructureBuilder &); - flatbuffers::Offset Finish() { - const auto end = fbb_.EndTable(start_); - auto o = flatbuffers::Offset(end); - return o; - } 
-}; - -inline flatbuffers::Offset CreateUIGraphStructure( - flatbuffers::FlatBufferBuilder &_fbb, - flatbuffers::Offset>> inputs = 0, - flatbuffers::Offset>> inputsPair = 0, - flatbuffers::Offset>> outputs = 0, - flatbuffers::Offset>> variables = 0, - flatbuffers::Offset>> ops = 0) { - UIGraphStructureBuilder builder_(_fbb); - builder_.add_ops(ops); - builder_.add_variables(variables); - builder_.add_outputs(outputs); - builder_.add_inputsPair(inputsPair); - builder_.add_inputs(inputs); - return builder_.Finish(); -} - -inline flatbuffers::Offset CreateUIGraphStructureDirect( - flatbuffers::FlatBufferBuilder &_fbb, - const std::vector> *inputs = nullptr, - const std::vector> *inputsPair = nullptr, - const std::vector> *outputs = nullptr, - const std::vector> *variables = nullptr, - const std::vector> *ops = nullptr) { - return sd::graph::CreateUIGraphStructure( - _fbb, - inputs ? _fbb.CreateVector>(*inputs) : 0, - inputsPair ? _fbb.CreateVector>(*inputsPair) : 0, - outputs ? _fbb.CreateVector>(*outputs) : 0, - variables ? _fbb.CreateVector>(*variables) : 0, - ops ? 
_fbb.CreateVector>(*ops) : 0); -} - -struct UIVariable FLATBUFFERS_FINAL_CLASS : private flatbuffers::Table { - enum { - VT_ID = 4, - VT_NAME = 6, - VT_TYPE = 8, - VT_DATATYPE = 10, - VT_SHAPE = 12, - VT_CONTROLDEPS = 14, - VT_OUTPUTOFOP = 16, - VT_INPUTSFOROP = 18, - VT_CONTROLDEPSFOROP = 20, - VT_CONTROLDEPSFORVAR = 22, - VT_GRADIENTVARIABLE = 24, - VT_UILABELEXTRA = 26, - VT_CONSTANTVALUE = 28 - }; - const IntPair *id() const { - return GetPointer(VT_ID); - } - const flatbuffers::String *name() const { - return GetPointer(VT_NAME); - } - VarType type() const { - return static_cast(GetField(VT_TYPE, 0)); - } - DType datatype() const { - return static_cast(GetField(VT_DATATYPE, 0)); - } - const flatbuffers::Vector *shape() const { - return GetPointer *>(VT_SHAPE); - } - const flatbuffers::Vector> *controlDeps() const { - return GetPointer> *>(VT_CONTROLDEPS); - } - const flatbuffers::String *outputOfOp() const { - return GetPointer(VT_OUTPUTOFOP); - } - const flatbuffers::Vector> *inputsForOp() const { - return GetPointer> *>(VT_INPUTSFOROP); - } - const flatbuffers::Vector> *controlDepsForOp() const { - return GetPointer> *>(VT_CONTROLDEPSFOROP); - } - const flatbuffers::Vector> *controlDepsForVar() const { - return GetPointer> *>(VT_CONTROLDEPSFORVAR); - } - const flatbuffers::String *gradientVariable() const { - return GetPointer(VT_GRADIENTVARIABLE); - } - const flatbuffers::String *uiLabelExtra() const { - return GetPointer(VT_UILABELEXTRA); - } - const FlatArray *constantValue() const { - return GetPointer(VT_CONSTANTVALUE); - } - bool Verify(flatbuffers::Verifier &verifier) const { - return VerifyTableStart(verifier) && - VerifyOffset(verifier, VT_ID) && - verifier.VerifyTable(id()) && - VerifyOffset(verifier, VT_NAME) && - verifier.VerifyString(name()) && - VerifyField(verifier, VT_TYPE) && - VerifyField(verifier, VT_DATATYPE) && - VerifyOffset(verifier, VT_SHAPE) && - verifier.VerifyVector(shape()) && - VerifyOffset(verifier, VT_CONTROLDEPS) && - 
verifier.VerifyVector(controlDeps()) && - verifier.VerifyVectorOfStrings(controlDeps()) && - VerifyOffset(verifier, VT_OUTPUTOFOP) && - verifier.VerifyString(outputOfOp()) && - VerifyOffset(verifier, VT_INPUTSFOROP) && - verifier.VerifyVector(inputsForOp()) && - verifier.VerifyVectorOfStrings(inputsForOp()) && - VerifyOffset(verifier, VT_CONTROLDEPSFOROP) && - verifier.VerifyVector(controlDepsForOp()) && - verifier.VerifyVectorOfStrings(controlDepsForOp()) && - VerifyOffset(verifier, VT_CONTROLDEPSFORVAR) && - verifier.VerifyVector(controlDepsForVar()) && - verifier.VerifyVectorOfStrings(controlDepsForVar()) && - VerifyOffset(verifier, VT_GRADIENTVARIABLE) && - verifier.VerifyString(gradientVariable()) && - VerifyOffset(verifier, VT_UILABELEXTRA) && - verifier.VerifyString(uiLabelExtra()) && - VerifyOffset(verifier, VT_CONSTANTVALUE) && - verifier.VerifyTable(constantValue()) && - verifier.EndTable(); - } -}; - -struct UIVariableBuilder { - flatbuffers::FlatBufferBuilder &fbb_; - flatbuffers::uoffset_t start_; - void add_id(flatbuffers::Offset id) { - fbb_.AddOffset(UIVariable::VT_ID, id); - } - void add_name(flatbuffers::Offset name) { - fbb_.AddOffset(UIVariable::VT_NAME, name); - } - void add_type(VarType type) { - fbb_.AddElement(UIVariable::VT_TYPE, static_cast(type), 0); - } - void add_datatype(DType datatype) { - fbb_.AddElement(UIVariable::VT_DATATYPE, static_cast(datatype), 0); - } - void add_shape(flatbuffers::Offset> shape) { - fbb_.AddOffset(UIVariable::VT_SHAPE, shape); - } - void add_controlDeps(flatbuffers::Offset>> controlDeps) { - fbb_.AddOffset(UIVariable::VT_CONTROLDEPS, controlDeps); - } - void add_outputOfOp(flatbuffers::Offset outputOfOp) { - fbb_.AddOffset(UIVariable::VT_OUTPUTOFOP, outputOfOp); - } - void add_inputsForOp(flatbuffers::Offset>> inputsForOp) { - fbb_.AddOffset(UIVariable::VT_INPUTSFOROP, inputsForOp); - } - void add_controlDepsForOp(flatbuffers::Offset>> controlDepsForOp) { - fbb_.AddOffset(UIVariable::VT_CONTROLDEPSFOROP, 
controlDepsForOp);
-  }
-  void add_controlDepsForVar(flatbuffers::Offset<flatbuffers::Vector<flatbuffers::Offset<flatbuffers::String>>> controlDepsForVar) {
-    fbb_.AddOffset(UIVariable::VT_CONTROLDEPSFORVAR, controlDepsForVar);
-  }
-  void add_gradientVariable(flatbuffers::Offset<flatbuffers::String> gradientVariable) {
-    fbb_.AddOffset(UIVariable::VT_GRADIENTVARIABLE, gradientVariable);
-  }
-  void add_uiLabelExtra(flatbuffers::Offset<flatbuffers::String> uiLabelExtra) {
-    fbb_.AddOffset(UIVariable::VT_UILABELEXTRA, uiLabelExtra);
-  }
-  void add_constantValue(flatbuffers::Offset<FlatArray> constantValue) {
-    fbb_.AddOffset(UIVariable::VT_CONSTANTVALUE, constantValue);
-  }
-  explicit UIVariableBuilder(flatbuffers::FlatBufferBuilder &_fbb)
-        : fbb_(_fbb) {
-    start_ = fbb_.StartTable();
-  }
-  UIVariableBuilder &operator=(const UIVariableBuilder &);
-  flatbuffers::Offset<UIVariable> Finish() {
-    const auto end = fbb_.EndTable(start_);
-    auto o = flatbuffers::Offset<UIVariable>(end);
-    return o;
-  }
-};
-
-inline flatbuffers::Offset<UIVariable> CreateUIVariable(
-    flatbuffers::FlatBufferBuilder &_fbb,
-    flatbuffers::Offset<IntPair> id = 0,
-    flatbuffers::Offset<flatbuffers::String> name = 0,
-    VarType type = VarType_VARIABLE,
-    DType datatype = DType_INHERIT,
-    flatbuffers::Offset<flatbuffers::Vector<int64_t>> shape = 0,
-    flatbuffers::Offset<flatbuffers::Vector<flatbuffers::Offset<flatbuffers::String>>> controlDeps = 0,
-    flatbuffers::Offset<flatbuffers::String> outputOfOp = 0,
-    flatbuffers::Offset<flatbuffers::Vector<flatbuffers::Offset<flatbuffers::String>>> inputsForOp = 0,
-    flatbuffers::Offset<flatbuffers::Vector<flatbuffers::Offset<flatbuffers::String>>> controlDepsForOp = 0,
-    flatbuffers::Offset<flatbuffers::Vector<flatbuffers::Offset<flatbuffers::String>>> controlDepsForVar = 0,
-    flatbuffers::Offset<flatbuffers::String> gradientVariable = 0,
-    flatbuffers::Offset<flatbuffers::String> uiLabelExtra = 0,
-    flatbuffers::Offset<FlatArray> constantValue = 0) {
-  UIVariableBuilder builder_(_fbb);
-  builder_.add_constantValue(constantValue);
-  builder_.add_uiLabelExtra(uiLabelExtra);
-  builder_.add_gradientVariable(gradientVariable);
-  builder_.add_controlDepsForVar(controlDepsForVar);
-  builder_.add_controlDepsForOp(controlDepsForOp);
-  builder_.add_inputsForOp(inputsForOp);
-  builder_.add_outputOfOp(outputOfOp);
-  builder_.add_controlDeps(controlDeps);
-  builder_.add_shape(shape);
-  builder_.add_name(name);
-  builder_.add_id(id);
-  builder_.add_datatype(datatype);
builder_.add_type(type);
-  return builder_.Finish();
-}
-
-inline flatbuffers::Offset<UIVariable> CreateUIVariableDirect(
-    flatbuffers::FlatBufferBuilder &_fbb,
-    flatbuffers::Offset<IntPair> id = 0,
-    const char *name = nullptr,
-    VarType type = VarType_VARIABLE,
-    DType datatype = DType_INHERIT,
-    const std::vector<int64_t> *shape = nullptr,
-    const std::vector<flatbuffers::Offset<flatbuffers::String>> *controlDeps = nullptr,
-    const char *outputOfOp = nullptr,
-    const std::vector<flatbuffers::Offset<flatbuffers::String>> *inputsForOp = nullptr,
-    const std::vector<flatbuffers::Offset<flatbuffers::String>> *controlDepsForOp = nullptr,
-    const std::vector<flatbuffers::Offset<flatbuffers::String>> *controlDepsForVar = nullptr,
-    const char *gradientVariable = nullptr,
-    const char *uiLabelExtra = nullptr,
-    flatbuffers::Offset<FlatArray> constantValue = 0) {
-  return sd::graph::CreateUIVariable(
-      _fbb,
-      id,
-      name ? _fbb.CreateString(name) : 0,
-      type,
-      datatype,
-      shape ? _fbb.CreateVector<int64_t>(*shape) : 0,
-      controlDeps ? _fbb.CreateVector<flatbuffers::Offset<flatbuffers::String>>(*controlDeps) : 0,
-      outputOfOp ? _fbb.CreateString(outputOfOp) : 0,
-      inputsForOp ? _fbb.CreateVector<flatbuffers::Offset<flatbuffers::String>>(*inputsForOp) : 0,
-      controlDepsForOp ? _fbb.CreateVector<flatbuffers::Offset<flatbuffers::String>>(*controlDepsForOp) : 0,
-      controlDepsForVar ? _fbb.CreateVector<flatbuffers::Offset<flatbuffers::String>>(*controlDepsForVar) : 0,
-      gradientVariable ? _fbb.CreateString(gradientVariable) : 0,
-      uiLabelExtra ?
_fbb.CreateString(uiLabelExtra) : 0,
-      constantValue);
-}
-
-struct UIOp FLATBUFFERS_FINAL_CLASS : private flatbuffers::Table {
-  enum {
-    VT_NAME = 4,
-    VT_OPNAME = 6,
-    VT_INPUTS = 8,
-    VT_OUTPUTS = 10,
-    VT_CONTROLDEPS = 12,
-    VT_UILABELEXTRA = 14
-  };
-  const flatbuffers::String *name() const {
-    return GetPointer<const flatbuffers::String *>(VT_NAME);
-  }
-  const flatbuffers::String *opName() const {
-    return GetPointer<const flatbuffers::String *>(VT_OPNAME);
-  }
-  const flatbuffers::Vector<flatbuffers::Offset<flatbuffers::String>> *inputs() const {
-    return GetPointer<const flatbuffers::Vector<flatbuffers::Offset<flatbuffers::String>> *>(VT_INPUTS);
-  }
-  const flatbuffers::Vector<flatbuffers::Offset<flatbuffers::String>> *outputs() const {
-    return GetPointer<const flatbuffers::Vector<flatbuffers::Offset<flatbuffers::String>> *>(VT_OUTPUTS);
-  }
-  const flatbuffers::Vector<flatbuffers::Offset<flatbuffers::String>> *controlDeps() const {
-    return GetPointer<const flatbuffers::Vector<flatbuffers::Offset<flatbuffers::String>> *>(VT_CONTROLDEPS);
-  }
-  const flatbuffers::String *uiLabelExtra() const {
-    return GetPointer<const flatbuffers::String *>(VT_UILABELEXTRA);
-  }
-  bool Verify(flatbuffers::Verifier &verifier) const {
-    return VerifyTableStart(verifier) &&
-           VerifyOffset(verifier, VT_NAME) &&
-           verifier.VerifyString(name()) &&
-           VerifyOffset(verifier, VT_OPNAME) &&
-           verifier.VerifyString(opName()) &&
-           VerifyOffset(verifier, VT_INPUTS) &&
-           verifier.VerifyVector(inputs()) &&
-           verifier.VerifyVectorOfStrings(inputs()) &&
-           VerifyOffset(verifier, VT_OUTPUTS) &&
-           verifier.VerifyVector(outputs()) &&
-           verifier.VerifyVectorOfStrings(outputs()) &&
-           VerifyOffset(verifier, VT_CONTROLDEPS) &&
-           verifier.VerifyVector(controlDeps()) &&
-           verifier.VerifyVectorOfStrings(controlDeps()) &&
-           VerifyOffset(verifier, VT_UILABELEXTRA) &&
-           verifier.VerifyString(uiLabelExtra()) &&
-           verifier.EndTable();
-  }
-};
-
-struct UIOpBuilder {
-  flatbuffers::FlatBufferBuilder &fbb_;
-  flatbuffers::uoffset_t start_;
-  void add_name(flatbuffers::Offset<flatbuffers::String> name) {
-    fbb_.AddOffset(UIOp::VT_NAME, name);
-  }
-  void add_opName(flatbuffers::Offset<flatbuffers::String> opName) {
-    fbb_.AddOffset(UIOp::VT_OPNAME, opName);
-  }
-  void add_inputs(flatbuffers::Offset<flatbuffers::Vector<flatbuffers::Offset<flatbuffers::String>>> inputs) {
-    fbb_.AddOffset(UIOp::VT_INPUTS, inputs);
-  }
-  void add_outputs(flatbuffers::Offset<flatbuffers::Vector<flatbuffers::Offset<flatbuffers::String>>> outputs) {
-    fbb_.AddOffset(UIOp::VT_OUTPUTS,
outputs);
-  }
-  void add_controlDeps(flatbuffers::Offset<flatbuffers::Vector<flatbuffers::Offset<flatbuffers::String>>> controlDeps) {
-    fbb_.AddOffset(UIOp::VT_CONTROLDEPS, controlDeps);
-  }
-  void add_uiLabelExtra(flatbuffers::Offset<flatbuffers::String> uiLabelExtra) {
-    fbb_.AddOffset(UIOp::VT_UILABELEXTRA, uiLabelExtra);
-  }
-  explicit UIOpBuilder(flatbuffers::FlatBufferBuilder &_fbb)
-        : fbb_(_fbb) {
-    start_ = fbb_.StartTable();
-  }
-  UIOpBuilder &operator=(const UIOpBuilder &);
-  flatbuffers::Offset<UIOp> Finish() {
-    const auto end = fbb_.EndTable(start_);
-    auto o = flatbuffers::Offset<UIOp>(end);
-    return o;
-  }
-};
-
-inline flatbuffers::Offset<UIOp> CreateUIOp(
-    flatbuffers::FlatBufferBuilder &_fbb,
-    flatbuffers::Offset<flatbuffers::String> name = 0,
-    flatbuffers::Offset<flatbuffers::String> opName = 0,
-    flatbuffers::Offset<flatbuffers::Vector<flatbuffers::Offset<flatbuffers::String>>> inputs = 0,
-    flatbuffers::Offset<flatbuffers::Vector<flatbuffers::Offset<flatbuffers::String>>> outputs = 0,
-    flatbuffers::Offset<flatbuffers::Vector<flatbuffers::Offset<flatbuffers::String>>> controlDeps = 0,
-    flatbuffers::Offset<flatbuffers::String> uiLabelExtra = 0) {
-  UIOpBuilder builder_(_fbb);
-  builder_.add_uiLabelExtra(uiLabelExtra);
-  builder_.add_controlDeps(controlDeps);
-  builder_.add_outputs(outputs);
-  builder_.add_inputs(inputs);
-  builder_.add_opName(opName);
-  builder_.add_name(name);
-  return builder_.Finish();
-}
-
-inline flatbuffers::Offset<UIOp> CreateUIOpDirect(
-    flatbuffers::FlatBufferBuilder &_fbb,
-    const char *name = nullptr,
-    const char *opName = nullptr,
-    const std::vector<flatbuffers::Offset<flatbuffers::String>> *inputs = nullptr,
-    const std::vector<flatbuffers::Offset<flatbuffers::String>> *outputs = nullptr,
-    const std::vector<flatbuffers::Offset<flatbuffers::String>> *controlDeps = nullptr,
-    const char *uiLabelExtra = nullptr) {
-  return sd::graph::CreateUIOp(
-      _fbb,
-      name ? _fbb.CreateString(name) : 0,
-      opName ? _fbb.CreateString(opName) : 0,
-      inputs ? _fbb.CreateVector<flatbuffers::Offset<flatbuffers::String>>(*inputs) : 0,
-      outputs ? _fbb.CreateVector<flatbuffers::Offset<flatbuffers::String>>(*outputs) : 0,
-      controlDeps ? _fbb.CreateVector<flatbuffers::Offset<flatbuffers::String>>(*controlDeps) : 0,
-      uiLabelExtra ?
_fbb.CreateString(uiLabelExtra) : 0); -} - -} // namespace graph -} // namespace sd - -#endif // FLATBUFFERS_GENERATED_UIGRAPHSTATIC_ND4J_GRAPH_H_ diff --git a/libnd4j/include/graph/generated/uigraphstatic_generated.js b/libnd4j/include/graph/generated/uigraphstatic_generated.js deleted file mode 100644 index bdfebac71..000000000 --- a/libnd4j/include/graph/generated/uigraphstatic_generated.js +++ /dev/null @@ -1,1125 +0,0 @@ -/* - * ****************************************************************************** - * * - * * - * * This program and the accompanying materials are made available under the - * * terms of the Apache License, Version 2.0 which is available at - * * https://www.apache.org/licenses/LICENSE-2.0. - * * - * * See the NOTICE file distributed with this work for additional - * * information regarding copyright ownership. - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * * License for the specific language governing permissions and limitations - * * under the License. 
- * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ - -/** - * @const - * @namespace - */ -var nd4j = nd4j || {}; - -/** - * @const - * @namespace - */ -nd4j.graph = nd4j.graph || {}; - -/** - * @enum - */ -nd4j.graph.UIInfoType = { - GRAPH_STRUCTURE: 0, - SYTEM_INFO: 1, - START_EVENTS: 2 -}; - -/** - * @constructor - */ -nd4j.graph.UIStaticInfoRecord = function() { - /** - * @type {flatbuffers.ByteBuffer} - */ - this.bb = null; - - /** - * @type {number} - */ - this.bb_pos = 0; -}; - -/** - * @param {number} i - * @param {flatbuffers.ByteBuffer} bb - * @returns {nd4j.graph.UIStaticInfoRecord} - */ -nd4j.graph.UIStaticInfoRecord.prototype.__init = function(i, bb) { - this.bb_pos = i; - this.bb = bb; - return this; -}; - -/** - * @param {flatbuffers.ByteBuffer} bb - * @param {nd4j.graph.UIStaticInfoRecord=} obj - * @returns {nd4j.graph.UIStaticInfoRecord} - */ -nd4j.graph.UIStaticInfoRecord.getRootAsUIStaticInfoRecord = function(bb, obj) { - return (obj || new nd4j.graph.UIStaticInfoRecord).__init(bb.readInt32(bb.position()) + bb.position(), bb); -}; - -/** - * @returns {nd4j.graph.UIInfoType} - */ -nd4j.graph.UIStaticInfoRecord.prototype.infoType = function() { - var offset = this.bb.__offset(this.bb_pos, 4); - return offset ? 
/** @type {nd4j.graph.UIInfoType} */ (this.bb.readInt8(this.bb_pos + offset)) : nd4j.graph.UIInfoType.GRAPH_STRUCTURE; -}; - -/** - * @param {flatbuffers.Builder} builder - */ -nd4j.graph.UIStaticInfoRecord.startUIStaticInfoRecord = function(builder) { - builder.startObject(1); -}; - -/** - * @param {flatbuffers.Builder} builder - * @param {nd4j.graph.UIInfoType} infoType - */ -nd4j.graph.UIStaticInfoRecord.addInfoType = function(builder, infoType) { - builder.addFieldInt8(0, infoType, nd4j.graph.UIInfoType.GRAPH_STRUCTURE); -}; - -/** - * @param {flatbuffers.Builder} builder - * @returns {flatbuffers.Offset} - */ -nd4j.graph.UIStaticInfoRecord.endUIStaticInfoRecord = function(builder) { - var offset = builder.endObject(); - return offset; -}; - -/** - * @constructor - */ -nd4j.graph.UISystemInfo = function() { - /** - * @type {flatbuffers.ByteBuffer} - */ - this.bb = null; - - /** - * @type {number} - */ - this.bb_pos = 0; -}; - -/** - * @param {number} i - * @param {flatbuffers.ByteBuffer} bb - * @returns {nd4j.graph.UISystemInfo} - */ -nd4j.graph.UISystemInfo.prototype.__init = function(i, bb) { - this.bb_pos = i; - this.bb = bb; - return this; -}; - -/** - * @param {flatbuffers.ByteBuffer} bb - * @param {nd4j.graph.UISystemInfo=} obj - * @returns {nd4j.graph.UISystemInfo} - */ -nd4j.graph.UISystemInfo.getRootAsUISystemInfo = function(bb, obj) { - return (obj || new nd4j.graph.UISystemInfo).__init(bb.readInt32(bb.position()) + bb.position(), bb); -}; - -/** - * @returns {number} - */ -nd4j.graph.UISystemInfo.prototype.physicalCores = function() { - var offset = this.bb.__offset(this.bb_pos, 4); - return offset ? 
this.bb.readInt32(this.bb_pos + offset) : 0; -}; - -/** - * @param {flatbuffers.Builder} builder - */ -nd4j.graph.UISystemInfo.startUISystemInfo = function(builder) { - builder.startObject(1); -}; - -/** - * @param {flatbuffers.Builder} builder - * @param {number} physicalCores - */ -nd4j.graph.UISystemInfo.addPhysicalCores = function(builder, physicalCores) { - builder.addFieldInt32(0, physicalCores, 0); -}; - -/** - * @param {flatbuffers.Builder} builder - * @returns {flatbuffers.Offset} - */ -nd4j.graph.UISystemInfo.endUISystemInfo = function(builder) { - var offset = builder.endObject(); - return offset; -}; - -/** - * @constructor - */ -nd4j.graph.UIGraphStructure = function() { - /** - * @type {flatbuffers.ByteBuffer} - */ - this.bb = null; - - /** - * @type {number} - */ - this.bb_pos = 0; -}; - -/** - * @param {number} i - * @param {flatbuffers.ByteBuffer} bb - * @returns {nd4j.graph.UIGraphStructure} - */ -nd4j.graph.UIGraphStructure.prototype.__init = function(i, bb) { - this.bb_pos = i; - this.bb = bb; - return this; -}; - -/** - * @param {flatbuffers.ByteBuffer} bb - * @param {nd4j.graph.UIGraphStructure=} obj - * @returns {nd4j.graph.UIGraphStructure} - */ -nd4j.graph.UIGraphStructure.getRootAsUIGraphStructure = function(bb, obj) { - return (obj || new nd4j.graph.UIGraphStructure).__init(bb.readInt32(bb.position()) + bb.position(), bb); -}; - -/** - * @param {number} index - * @param {flatbuffers.Encoding=} optionalEncoding - * @returns {string|Uint8Array} - */ -nd4j.graph.UIGraphStructure.prototype.inputs = function(index, optionalEncoding) { - var offset = this.bb.__offset(this.bb_pos, 4); - return offset ? this.bb.__string(this.bb.__vector(this.bb_pos + offset) + index * 4, optionalEncoding) : null; -}; - -/** - * @returns {number} - */ -nd4j.graph.UIGraphStructure.prototype.inputsLength = function() { - var offset = this.bb.__offset(this.bb_pos, 4); - return offset ? 
this.bb.__vector_len(this.bb_pos + offset) : 0; -}; - -/** - * @param {number} index - * @param {nd4j.graph.IntPair=} obj - * @returns {nd4j.graph.IntPair} - */ -nd4j.graph.UIGraphStructure.prototype.inputsPair = function(index, obj) { - var offset = this.bb.__offset(this.bb_pos, 6); - return offset ? (obj || new nd4j.graph.IntPair).__init(this.bb.__indirect(this.bb.__vector(this.bb_pos + offset) + index * 4), this.bb) : null; -}; - -/** - * @returns {number} - */ -nd4j.graph.UIGraphStructure.prototype.inputsPairLength = function() { - var offset = this.bb.__offset(this.bb_pos, 6); - return offset ? this.bb.__vector_len(this.bb_pos + offset) : 0; -}; - -/** - * @param {number} index - * @param {flatbuffers.Encoding=} optionalEncoding - * @returns {string|Uint8Array} - */ -nd4j.graph.UIGraphStructure.prototype.outputs = function(index, optionalEncoding) { - var offset = this.bb.__offset(this.bb_pos, 8); - return offset ? this.bb.__string(this.bb.__vector(this.bb_pos + offset) + index * 4, optionalEncoding) : null; -}; - -/** - * @returns {number} - */ -nd4j.graph.UIGraphStructure.prototype.outputsLength = function() { - var offset = this.bb.__offset(this.bb_pos, 8); - return offset ? this.bb.__vector_len(this.bb_pos + offset) : 0; -}; - -/** - * @param {number} index - * @param {nd4j.graph.UIVariable=} obj - * @returns {nd4j.graph.UIVariable} - */ -nd4j.graph.UIGraphStructure.prototype.variables = function(index, obj) { - var offset = this.bb.__offset(this.bb_pos, 10); - return offset ? (obj || new nd4j.graph.UIVariable).__init(this.bb.__indirect(this.bb.__vector(this.bb_pos + offset) + index * 4), this.bb) : null; -}; - -/** - * @returns {number} - */ -nd4j.graph.UIGraphStructure.prototype.variablesLength = function() { - var offset = this.bb.__offset(this.bb_pos, 10); - return offset ? 
this.bb.__vector_len(this.bb_pos + offset) : 0; -}; - -/** - * @param {number} index - * @param {nd4j.graph.UIOp=} obj - * @returns {nd4j.graph.UIOp} - */ -nd4j.graph.UIGraphStructure.prototype.ops = function(index, obj) { - var offset = this.bb.__offset(this.bb_pos, 12); - return offset ? (obj || new nd4j.graph.UIOp).__init(this.bb.__indirect(this.bb.__vector(this.bb_pos + offset) + index * 4), this.bb) : null; -}; - -/** - * @returns {number} - */ -nd4j.graph.UIGraphStructure.prototype.opsLength = function() { - var offset = this.bb.__offset(this.bb_pos, 12); - return offset ? this.bb.__vector_len(this.bb_pos + offset) : 0; -}; - -/** - * @param {flatbuffers.Builder} builder - */ -nd4j.graph.UIGraphStructure.startUIGraphStructure = function(builder) { - builder.startObject(5); -}; - -/** - * @param {flatbuffers.Builder} builder - * @param {flatbuffers.Offset} inputsOffset - */ -nd4j.graph.UIGraphStructure.addInputs = function(builder, inputsOffset) { - builder.addFieldOffset(0, inputsOffset, 0); -}; - -/** - * @param {flatbuffers.Builder} builder - * @param {Array.} data - * @returns {flatbuffers.Offset} - */ -nd4j.graph.UIGraphStructure.createInputsVector = function(builder, data) { - builder.startVector(4, data.length, 4); - for (var i = data.length - 1; i >= 0; i--) { - builder.addOffset(data[i]); - } - return builder.endVector(); -}; - -/** - * @param {flatbuffers.Builder} builder - * @param {number} numElems - */ -nd4j.graph.UIGraphStructure.startInputsVector = function(builder, numElems) { - builder.startVector(4, numElems, 4); -}; - -/** - * @param {flatbuffers.Builder} builder - * @param {flatbuffers.Offset} inputsPairOffset - */ -nd4j.graph.UIGraphStructure.addInputsPair = function(builder, inputsPairOffset) { - builder.addFieldOffset(1, inputsPairOffset, 0); -}; - -/** - * @param {flatbuffers.Builder} builder - * @param {Array.} data - * @returns {flatbuffers.Offset} - */ -nd4j.graph.UIGraphStructure.createInputsPairVector = function(builder, data) { - 
builder.startVector(4, data.length, 4); - for (var i = data.length - 1; i >= 0; i--) { - builder.addOffset(data[i]); - } - return builder.endVector(); -}; - -/** - * @param {flatbuffers.Builder} builder - * @param {number} numElems - */ -nd4j.graph.UIGraphStructure.startInputsPairVector = function(builder, numElems) { - builder.startVector(4, numElems, 4); -}; - -/** - * @param {flatbuffers.Builder} builder - * @param {flatbuffers.Offset} outputsOffset - */ -nd4j.graph.UIGraphStructure.addOutputs = function(builder, outputsOffset) { - builder.addFieldOffset(2, outputsOffset, 0); -}; - -/** - * @param {flatbuffers.Builder} builder - * @param {Array.} data - * @returns {flatbuffers.Offset} - */ -nd4j.graph.UIGraphStructure.createOutputsVector = function(builder, data) { - builder.startVector(4, data.length, 4); - for (var i = data.length - 1; i >= 0; i--) { - builder.addOffset(data[i]); - } - return builder.endVector(); -}; - -/** - * @param {flatbuffers.Builder} builder - * @param {number} numElems - */ -nd4j.graph.UIGraphStructure.startOutputsVector = function(builder, numElems) { - builder.startVector(4, numElems, 4); -}; - -/** - * @param {flatbuffers.Builder} builder - * @param {flatbuffers.Offset} variablesOffset - */ -nd4j.graph.UIGraphStructure.addVariables = function(builder, variablesOffset) { - builder.addFieldOffset(3, variablesOffset, 0); -}; - -/** - * @param {flatbuffers.Builder} builder - * @param {Array.} data - * @returns {flatbuffers.Offset} - */ -nd4j.graph.UIGraphStructure.createVariablesVector = function(builder, data) { - builder.startVector(4, data.length, 4); - for (var i = data.length - 1; i >= 0; i--) { - builder.addOffset(data[i]); - } - return builder.endVector(); -}; - -/** - * @param {flatbuffers.Builder} builder - * @param {number} numElems - */ -nd4j.graph.UIGraphStructure.startVariablesVector = function(builder, numElems) { - builder.startVector(4, numElems, 4); -}; - -/** - * @param {flatbuffers.Builder} builder - * @param 
{flatbuffers.Offset} opsOffset - */ -nd4j.graph.UIGraphStructure.addOps = function(builder, opsOffset) { - builder.addFieldOffset(4, opsOffset, 0); -}; - -/** - * @param {flatbuffers.Builder} builder - * @param {Array.} data - * @returns {flatbuffers.Offset} - */ -nd4j.graph.UIGraphStructure.createOpsVector = function(builder, data) { - builder.startVector(4, data.length, 4); - for (var i = data.length - 1; i >= 0; i--) { - builder.addOffset(data[i]); - } - return builder.endVector(); -}; - -/** - * @param {flatbuffers.Builder} builder - * @param {number} numElems - */ -nd4j.graph.UIGraphStructure.startOpsVector = function(builder, numElems) { - builder.startVector(4, numElems, 4); -}; - -/** - * @param {flatbuffers.Builder} builder - * @returns {flatbuffers.Offset} - */ -nd4j.graph.UIGraphStructure.endUIGraphStructure = function(builder) { - var offset = builder.endObject(); - return offset; -}; - -/** - * @constructor - */ -nd4j.graph.UIVariable = function() { - /** - * @type {flatbuffers.ByteBuffer} - */ - this.bb = null; - - /** - * @type {number} - */ - this.bb_pos = 0; -}; - -/** - * @param {number} i - * @param {flatbuffers.ByteBuffer} bb - * @returns {nd4j.graph.UIVariable} - */ -nd4j.graph.UIVariable.prototype.__init = function(i, bb) { - this.bb_pos = i; - this.bb = bb; - return this; -}; - -/** - * @param {flatbuffers.ByteBuffer} bb - * @param {nd4j.graph.UIVariable=} obj - * @returns {nd4j.graph.UIVariable} - */ -nd4j.graph.UIVariable.getRootAsUIVariable = function(bb, obj) { - return (obj || new nd4j.graph.UIVariable).__init(bb.readInt32(bb.position()) + bb.position(), bb); -}; - -/** - * @param {nd4j.graph.IntPair=} obj - * @returns {nd4j.graph.IntPair|null} - */ -nd4j.graph.UIVariable.prototype.id = function(obj) { - var offset = this.bb.__offset(this.bb_pos, 4); - return offset ? 
(obj || new nd4j.graph.IntPair).__init(this.bb.__indirect(this.bb_pos + offset), this.bb) : null; -}; - -/** - * @param {flatbuffers.Encoding=} optionalEncoding - * @returns {string|Uint8Array|null} - */ -nd4j.graph.UIVariable.prototype.name = function(optionalEncoding) { - var offset = this.bb.__offset(this.bb_pos, 6); - return offset ? this.bb.__string(this.bb_pos + offset, optionalEncoding) : null; -}; - -/** - * @returns {nd4j.graph.VarType} - */ -nd4j.graph.UIVariable.prototype.type = function() { - var offset = this.bb.__offset(this.bb_pos, 8); - return offset ? /** @type {nd4j.graph.VarType} */ (this.bb.readInt8(this.bb_pos + offset)) : nd4j.graph.VarType.VARIABLE; -}; - -/** - * @returns {nd4j.graph.DType} - */ -nd4j.graph.UIVariable.prototype.datatype = function() { - var offset = this.bb.__offset(this.bb_pos, 10); - return offset ? /** @type {nd4j.graph.DType} */ (this.bb.readInt8(this.bb_pos + offset)) : nd4j.graph.DType.INHERIT; -}; - -/** - * @param {number} index - * @returns {flatbuffers.Long} - */ -nd4j.graph.UIVariable.prototype.shape = function(index) { - var offset = this.bb.__offset(this.bb_pos, 12); - return offset ? this.bb.readInt64(this.bb.__vector(this.bb_pos + offset) + index * 8) : this.bb.createLong(0, 0); -}; - -/** - * @returns {number} - */ -nd4j.graph.UIVariable.prototype.shapeLength = function() { - var offset = this.bb.__offset(this.bb_pos, 12); - return offset ? this.bb.__vector_len(this.bb_pos + offset) : 0; -}; - -/** - * @param {number} index - * @param {flatbuffers.Encoding=} optionalEncoding - * @returns {string|Uint8Array} - */ -nd4j.graph.UIVariable.prototype.controlDeps = function(index, optionalEncoding) { - var offset = this.bb.__offset(this.bb_pos, 14); - return offset ? 
this.bb.__string(this.bb.__vector(this.bb_pos + offset) + index * 4, optionalEncoding) : null; -}; - -/** - * @returns {number} - */ -nd4j.graph.UIVariable.prototype.controlDepsLength = function() { - var offset = this.bb.__offset(this.bb_pos, 14); - return offset ? this.bb.__vector_len(this.bb_pos + offset) : 0; -}; - -/** - * @param {flatbuffers.Encoding=} optionalEncoding - * @returns {string|Uint8Array|null} - */ -nd4j.graph.UIVariable.prototype.outputOfOp = function(optionalEncoding) { - var offset = this.bb.__offset(this.bb_pos, 16); - return offset ? this.bb.__string(this.bb_pos + offset, optionalEncoding) : null; -}; - -/** - * @param {number} index - * @param {flatbuffers.Encoding=} optionalEncoding - * @returns {string|Uint8Array} - */ -nd4j.graph.UIVariable.prototype.inputsForOp = function(index, optionalEncoding) { - var offset = this.bb.__offset(this.bb_pos, 18); - return offset ? this.bb.__string(this.bb.__vector(this.bb_pos + offset) + index * 4, optionalEncoding) : null; -}; - -/** - * @returns {number} - */ -nd4j.graph.UIVariable.prototype.inputsForOpLength = function() { - var offset = this.bb.__offset(this.bb_pos, 18); - return offset ? this.bb.__vector_len(this.bb_pos + offset) : 0; -}; - -/** - * @param {number} index - * @param {flatbuffers.Encoding=} optionalEncoding - * @returns {string|Uint8Array} - */ -nd4j.graph.UIVariable.prototype.controlDepsForOp = function(index, optionalEncoding) { - var offset = this.bb.__offset(this.bb_pos, 20); - return offset ? this.bb.__string(this.bb.__vector(this.bb_pos + offset) + index * 4, optionalEncoding) : null; -}; - -/** - * @returns {number} - */ -nd4j.graph.UIVariable.prototype.controlDepsForOpLength = function() { - var offset = this.bb.__offset(this.bb_pos, 20); - return offset ? 
this.bb.__vector_len(this.bb_pos + offset) : 0; -}; - -/** - * @param {number} index - * @param {flatbuffers.Encoding=} optionalEncoding - * @returns {string|Uint8Array} - */ -nd4j.graph.UIVariable.prototype.controlDepsForVar = function(index, optionalEncoding) { - var offset = this.bb.__offset(this.bb_pos, 22); - return offset ? this.bb.__string(this.bb.__vector(this.bb_pos + offset) + index * 4, optionalEncoding) : null; -}; - -/** - * @returns {number} - */ -nd4j.graph.UIVariable.prototype.controlDepsForVarLength = function() { - var offset = this.bb.__offset(this.bb_pos, 22); - return offset ? this.bb.__vector_len(this.bb_pos + offset) : 0; -}; - -/** - * @param {flatbuffers.Encoding=} optionalEncoding - * @returns {string|Uint8Array|null} - */ -nd4j.graph.UIVariable.prototype.gradientVariable = function(optionalEncoding) { - var offset = this.bb.__offset(this.bb_pos, 24); - return offset ? this.bb.__string(this.bb_pos + offset, optionalEncoding) : null; -}; - -/** - * @param {flatbuffers.Encoding=} optionalEncoding - * @returns {string|Uint8Array|null} - */ -nd4j.graph.UIVariable.prototype.uiLabelExtra = function(optionalEncoding) { - var offset = this.bb.__offset(this.bb_pos, 26); - return offset ? this.bb.__string(this.bb_pos + offset, optionalEncoding) : null; -}; - -/** - * @param {nd4j.graph.FlatArray=} obj - * @returns {nd4j.graph.FlatArray|null} - */ -nd4j.graph.UIVariable.prototype.constantValue = function(obj) { - var offset = this.bb.__offset(this.bb_pos, 28); - return offset ? 
(obj || new nd4j.graph.FlatArray).__init(this.bb.__indirect(this.bb_pos + offset), this.bb) : null; -}; - -/** - * @param {flatbuffers.Builder} builder - */ -nd4j.graph.UIVariable.startUIVariable = function(builder) { - builder.startObject(13); -}; - -/** - * @param {flatbuffers.Builder} builder - * @param {flatbuffers.Offset} idOffset - */ -nd4j.graph.UIVariable.addId = function(builder, idOffset) { - builder.addFieldOffset(0, idOffset, 0); -}; - -/** - * @param {flatbuffers.Builder} builder - * @param {flatbuffers.Offset} nameOffset - */ -nd4j.graph.UIVariable.addName = function(builder, nameOffset) { - builder.addFieldOffset(1, nameOffset, 0); -}; - -/** - * @param {flatbuffers.Builder} builder - * @param {nd4j.graph.VarType} type - */ -nd4j.graph.UIVariable.addType = function(builder, type) { - builder.addFieldInt8(2, type, nd4j.graph.VarType.VARIABLE); -}; - -/** - * @param {flatbuffers.Builder} builder - * @param {nd4j.graph.DType} datatype - */ -nd4j.graph.UIVariable.addDatatype = function(builder, datatype) { - builder.addFieldInt8(3, datatype, nd4j.graph.DType.INHERIT); -}; - -/** - * @param {flatbuffers.Builder} builder - * @param {flatbuffers.Offset} shapeOffset - */ -nd4j.graph.UIVariable.addShape = function(builder, shapeOffset) { - builder.addFieldOffset(4, shapeOffset, 0); -}; - -/** - * @param {flatbuffers.Builder} builder - * @param {Array.} data - * @returns {flatbuffers.Offset} - */ -nd4j.graph.UIVariable.createShapeVector = function(builder, data) { - builder.startVector(8, data.length, 8); - for (var i = data.length - 1; i >= 0; i--) { - builder.addInt64(data[i]); - } - return builder.endVector(); -}; - -/** - * @param {flatbuffers.Builder} builder - * @param {number} numElems - */ -nd4j.graph.UIVariable.startShapeVector = function(builder, numElems) { - builder.startVector(8, numElems, 8); -}; - -/** - * @param {flatbuffers.Builder} builder - * @param {flatbuffers.Offset} controlDepsOffset - */ -nd4j.graph.UIVariable.addControlDeps = 
function(builder, controlDepsOffset) { - builder.addFieldOffset(5, controlDepsOffset, 0); -}; - -/** - * @param {flatbuffers.Builder} builder - * @param {Array.} data - * @returns {flatbuffers.Offset} - */ -nd4j.graph.UIVariable.createControlDepsVector = function(builder, data) { - builder.startVector(4, data.length, 4); - for (var i = data.length - 1; i >= 0; i--) { - builder.addOffset(data[i]); - } - return builder.endVector(); -}; - -/** - * @param {flatbuffers.Builder} builder - * @param {number} numElems - */ -nd4j.graph.UIVariable.startControlDepsVector = function(builder, numElems) { - builder.startVector(4, numElems, 4); -}; - -/** - * @param {flatbuffers.Builder} builder - * @param {flatbuffers.Offset} outputOfOpOffset - */ -nd4j.graph.UIVariable.addOutputOfOp = function(builder, outputOfOpOffset) { - builder.addFieldOffset(6, outputOfOpOffset, 0); -}; - -/** - * @param {flatbuffers.Builder} builder - * @param {flatbuffers.Offset} inputsForOpOffset - */ -nd4j.graph.UIVariable.addInputsForOp = function(builder, inputsForOpOffset) { - builder.addFieldOffset(7, inputsForOpOffset, 0); -}; - -/** - * @param {flatbuffers.Builder} builder - * @param {Array.} data - * @returns {flatbuffers.Offset} - */ -nd4j.graph.UIVariable.createInputsForOpVector = function(builder, data) { - builder.startVector(4, data.length, 4); - for (var i = data.length - 1; i >= 0; i--) { - builder.addOffset(data[i]); - } - return builder.endVector(); -}; - -/** - * @param {flatbuffers.Builder} builder - * @param {number} numElems - */ -nd4j.graph.UIVariable.startInputsForOpVector = function(builder, numElems) { - builder.startVector(4, numElems, 4); -}; - -/** - * @param {flatbuffers.Builder} builder - * @param {flatbuffers.Offset} controlDepsForOpOffset - */ -nd4j.graph.UIVariable.addControlDepsForOp = function(builder, controlDepsForOpOffset) { - builder.addFieldOffset(8, controlDepsForOpOffset, 0); -}; - -/** - * @param {flatbuffers.Builder} builder - * @param {Array.} data - * 
@returns {flatbuffers.Offset} - */ -nd4j.graph.UIVariable.createControlDepsForOpVector = function(builder, data) { - builder.startVector(4, data.length, 4); - for (var i = data.length - 1; i >= 0; i--) { - builder.addOffset(data[i]); - } - return builder.endVector(); -}; - -/** - * @param {flatbuffers.Builder} builder - * @param {number} numElems - */ -nd4j.graph.UIVariable.startControlDepsForOpVector = function(builder, numElems) { - builder.startVector(4, numElems, 4); -}; - -/** - * @param {flatbuffers.Builder} builder - * @param {flatbuffers.Offset} controlDepsForVarOffset - */ -nd4j.graph.UIVariable.addControlDepsForVar = function(builder, controlDepsForVarOffset) { - builder.addFieldOffset(9, controlDepsForVarOffset, 0); -}; - -/** - * @param {flatbuffers.Builder} builder - * @param {Array.} data - * @returns {flatbuffers.Offset} - */ -nd4j.graph.UIVariable.createControlDepsForVarVector = function(builder, data) { - builder.startVector(4, data.length, 4); - for (var i = data.length - 1; i >= 0; i--) { - builder.addOffset(data[i]); - } - return builder.endVector(); -}; - -/** - * @param {flatbuffers.Builder} builder - * @param {number} numElems - */ -nd4j.graph.UIVariable.startControlDepsForVarVector = function(builder, numElems) { - builder.startVector(4, numElems, 4); -}; - -/** - * @param {flatbuffers.Builder} builder - * @param {flatbuffers.Offset} gradientVariableOffset - */ -nd4j.graph.UIVariable.addGradientVariable = function(builder, gradientVariableOffset) { - builder.addFieldOffset(10, gradientVariableOffset, 0); -}; - -/** - * @param {flatbuffers.Builder} builder - * @param {flatbuffers.Offset} uiLabelExtraOffset - */ -nd4j.graph.UIVariable.addUiLabelExtra = function(builder, uiLabelExtraOffset) { - builder.addFieldOffset(11, uiLabelExtraOffset, 0); -}; - -/** - * @param {flatbuffers.Builder} builder - * @param {flatbuffers.Offset} constantValueOffset - */ -nd4j.graph.UIVariable.addConstantValue = function(builder, constantValueOffset) { - 
builder.addFieldOffset(12, constantValueOffset, 0); -}; - -/** - * @param {flatbuffers.Builder} builder - * @returns {flatbuffers.Offset} - */ -nd4j.graph.UIVariable.endUIVariable = function(builder) { - var offset = builder.endObject(); - return offset; -}; - -/** - * @constructor - */ -nd4j.graph.UIOp = function() { - /** - * @type {flatbuffers.ByteBuffer} - */ - this.bb = null; - - /** - * @type {number} - */ - this.bb_pos = 0; -}; - -/** - * @param {number} i - * @param {flatbuffers.ByteBuffer} bb - * @returns {nd4j.graph.UIOp} - */ -nd4j.graph.UIOp.prototype.__init = function(i, bb) { - this.bb_pos = i; - this.bb = bb; - return this; -}; - -/** - * @param {flatbuffers.ByteBuffer} bb - * @param {nd4j.graph.UIOp=} obj - * @returns {nd4j.graph.UIOp} - */ -nd4j.graph.UIOp.getRootAsUIOp = function(bb, obj) { - return (obj || new nd4j.graph.UIOp).__init(bb.readInt32(bb.position()) + bb.position(), bb); -}; - -/** - * @param {flatbuffers.Encoding=} optionalEncoding - * @returns {string|Uint8Array|null} - */ -nd4j.graph.UIOp.prototype.name = function(optionalEncoding) { - var offset = this.bb.__offset(this.bb_pos, 4); - return offset ? this.bb.__string(this.bb_pos + offset, optionalEncoding) : null; -}; - -/** - * @param {flatbuffers.Encoding=} optionalEncoding - * @returns {string|Uint8Array|null} - */ -nd4j.graph.UIOp.prototype.opName = function(optionalEncoding) { - var offset = this.bb.__offset(this.bb_pos, 6); - return offset ? this.bb.__string(this.bb_pos + offset, optionalEncoding) : null; -}; - -/** - * @param {number} index - * @param {flatbuffers.Encoding=} optionalEncoding - * @returns {string|Uint8Array} - */ -nd4j.graph.UIOp.prototype.inputs = function(index, optionalEncoding) { - var offset = this.bb.__offset(this.bb_pos, 8); - return offset ? 
this.bb.__string(this.bb.__vector(this.bb_pos + offset) + index * 4, optionalEncoding) : null; -}; - -/** - * @returns {number} - */ -nd4j.graph.UIOp.prototype.inputsLength = function() { - var offset = this.bb.__offset(this.bb_pos, 8); - return offset ? this.bb.__vector_len(this.bb_pos + offset) : 0; -}; - -/** - * @param {number} index - * @param {flatbuffers.Encoding=} optionalEncoding - * @returns {string|Uint8Array} - */ -nd4j.graph.UIOp.prototype.outputs = function(index, optionalEncoding) { - var offset = this.bb.__offset(this.bb_pos, 10); - return offset ? this.bb.__string(this.bb.__vector(this.bb_pos + offset) + index * 4, optionalEncoding) : null; -}; - -/** - * @returns {number} - */ -nd4j.graph.UIOp.prototype.outputsLength = function() { - var offset = this.bb.__offset(this.bb_pos, 10); - return offset ? this.bb.__vector_len(this.bb_pos + offset) : 0; -}; - -/** - * @param {number} index - * @param {flatbuffers.Encoding=} optionalEncoding - * @returns {string|Uint8Array} - */ -nd4j.graph.UIOp.prototype.controlDeps = function(index, optionalEncoding) { - var offset = this.bb.__offset(this.bb_pos, 12); - return offset ? this.bb.__string(this.bb.__vector(this.bb_pos + offset) + index * 4, optionalEncoding) : null; -}; - -/** - * @returns {number} - */ -nd4j.graph.UIOp.prototype.controlDepsLength = function() { - var offset = this.bb.__offset(this.bb_pos, 12); - return offset ? this.bb.__vector_len(this.bb_pos + offset) : 0; -}; - -/** - * @param {flatbuffers.Encoding=} optionalEncoding - * @returns {string|Uint8Array|null} - */ -nd4j.graph.UIOp.prototype.uiLabelExtra = function(optionalEncoding) { - var offset = this.bb.__offset(this.bb_pos, 14); - return offset ? 
this.bb.__string(this.bb_pos + offset, optionalEncoding) : null; -}; - -/** - * @param {flatbuffers.Builder} builder - */ -nd4j.graph.UIOp.startUIOp = function(builder) { - builder.startObject(6); -}; - -/** - * @param {flatbuffers.Builder} builder - * @param {flatbuffers.Offset} nameOffset - */ -nd4j.graph.UIOp.addName = function(builder, nameOffset) { - builder.addFieldOffset(0, nameOffset, 0); -}; - -/** - * @param {flatbuffers.Builder} builder - * @param {flatbuffers.Offset} opNameOffset - */ -nd4j.graph.UIOp.addOpName = function(builder, opNameOffset) { - builder.addFieldOffset(1, opNameOffset, 0); -}; - -/** - * @param {flatbuffers.Builder} builder - * @param {flatbuffers.Offset} inputsOffset - */ -nd4j.graph.UIOp.addInputs = function(builder, inputsOffset) { - builder.addFieldOffset(2, inputsOffset, 0); -}; - -/** - * @param {flatbuffers.Builder} builder - * @param {Array.<flatbuffers.Offset>} data - * @returns {flatbuffers.Offset} - */ -nd4j.graph.UIOp.createInputsVector = function(builder, data) { - builder.startVector(4, data.length, 4); - for (var i = data.length - 1; i >= 0; i--) { - builder.addOffset(data[i]); - } - return builder.endVector(); -}; - -/** - * @param {flatbuffers.Builder} builder - * @param {number} numElems - */ -nd4j.graph.UIOp.startInputsVector = function(builder, numElems) { - builder.startVector(4, numElems, 4); -}; - -/** - * @param {flatbuffers.Builder} builder - * @param {flatbuffers.Offset} outputsOffset - */ -nd4j.graph.UIOp.addOutputs = function(builder, outputsOffset) { - builder.addFieldOffset(3, outputsOffset, 0); -}; - -/** - * @param {flatbuffers.Builder} builder - * @param {Array.<flatbuffers.Offset>} data - * @returns {flatbuffers.Offset} - */ -nd4j.graph.UIOp.createOutputsVector = function(builder, data) { - builder.startVector(4, data.length, 4); - for (var i = data.length - 1; i >= 0; i--) { - builder.addOffset(data[i]); - } - return builder.endVector(); -}; - -/** - * @param {flatbuffers.Builder} builder - * @param {number} numElems - */ 
-nd4j.graph.UIOp.startOutputsVector = function(builder, numElems) { - builder.startVector(4, numElems, 4); -}; - -/** - * @param {flatbuffers.Builder} builder - * @param {flatbuffers.Offset} controlDepsOffset - */ -nd4j.graph.UIOp.addControlDeps = function(builder, controlDepsOffset) { - builder.addFieldOffset(4, controlDepsOffset, 0); -}; - -/** - * @param {flatbuffers.Builder} builder - * @param {Array.<flatbuffers.Offset>} data - * @returns {flatbuffers.Offset} - */ -nd4j.graph.UIOp.createControlDepsVector = function(builder, data) { - builder.startVector(4, data.length, 4); - for (var i = data.length - 1; i >= 0; i--) { - builder.addOffset(data[i]); - } - return builder.endVector(); -}; - -/** - * @param {flatbuffers.Builder} builder - * @param {number} numElems - */ -nd4j.graph.UIOp.startControlDepsVector = function(builder, numElems) { - builder.startVector(4, numElems, 4); -}; - -/** - * @param {flatbuffers.Builder} builder - * @param {flatbuffers.Offset} uiLabelExtraOffset - */ -nd4j.graph.UIOp.addUiLabelExtra = function(builder, uiLabelExtraOffset) { - builder.addFieldOffset(5, uiLabelExtraOffset, 0); -}; - -/** - * @param {flatbuffers.Builder} builder - * @returns {flatbuffers.Offset} - */ -nd4j.graph.UIOp.endUIOp = function(builder) { - var offset = builder.endObject(); - return offset; -}; - -// Exports for Node.js and RequireJS -this.nd4j = nd4j; diff --git a/libnd4j/include/graph/generated/utils_generated.h b/libnd4j/include/graph/generated/utils_generated.h deleted file mode 100644 index 8e7896bb4..000000000 --- a/libnd4j/include/graph/generated/utils_generated.h +++ /dev/null @@ -1,517 +0,0 @@ -// automatically generated by the FlatBuffers compiler, do not modify - - -#ifndef FLATBUFFERS_GENERATED_UTILS_ND4J_GRAPH_H_ -#define FLATBUFFERS_GENERATED_UTILS_ND4J_GRAPH_H_ - -#include "flatbuffers/flatbuffers.h" - -namespace sd { -namespace graph { - -struct LongPair; - -struct LongTriple; - -struct IntPair; - -struct IntTriple; - -enum OpType { - OpType_TRANSFORM_FLOAT = 0, 
- OpType_TRANSFORM_SAME = 1, - OpType_TRANSFORM_BOOL = 2, - OpType_TRANSFORM_STRICT = 3, - OpType_TRANSFORM_ANY = 4, - OpType_REDUCE_FLOAT = 5, - OpType_REDUCE_SAME = 6, - OpType_REDUCE_LONG = 7, - OpType_REDUCE_BOOL = 8, - OpType_INDEX_REDUCE = 9, - OpType_SCALAR = 10, - OpType_SCALAR_BOOL = 11, - OpType_BROADCAST = 12, - OpType_BROADCAST_BOOL = 13, - OpType_PAIRWISE = 14, - OpType_PAIRWISE_BOOL = 15, - OpType_REDUCE_3 = 16, - OpType_SUMMARYSTATS = 17, - OpType_SHAPE = 18, - OpType_AGGREGATION = 19, - OpType_RANDOM = 20, - OpType_CUSTOM = 21, - OpType_GRAPH = 22, - OpType_VARIABLE = 40, - OpType_BOOLEAN = 60, - OpType_LOGIC = 119, - OpType_MIN = OpType_TRANSFORM_FLOAT, - OpType_MAX = OpType_LOGIC -}; - -inline const OpType (&EnumValuesOpType())[26] { - static const OpType values[] = { - OpType_TRANSFORM_FLOAT, - OpType_TRANSFORM_SAME, - OpType_TRANSFORM_BOOL, - OpType_TRANSFORM_STRICT, - OpType_TRANSFORM_ANY, - OpType_REDUCE_FLOAT, - OpType_REDUCE_SAME, - OpType_REDUCE_LONG, - OpType_REDUCE_BOOL, - OpType_INDEX_REDUCE, - OpType_SCALAR, - OpType_SCALAR_BOOL, - OpType_BROADCAST, - OpType_BROADCAST_BOOL, - OpType_PAIRWISE, - OpType_PAIRWISE_BOOL, - OpType_REDUCE_3, - OpType_SUMMARYSTATS, - OpType_SHAPE, - OpType_AGGREGATION, - OpType_RANDOM, - OpType_CUSTOM, - OpType_GRAPH, - OpType_VARIABLE, - OpType_BOOLEAN, - OpType_LOGIC - }; - return values; -} - -inline const char * const *EnumNamesOpType() { - static const char * const names[] = { - "TRANSFORM_FLOAT", - "TRANSFORM_SAME", - "TRANSFORM_BOOL", - "TRANSFORM_STRICT", - "TRANSFORM_ANY", - "REDUCE_FLOAT", - "REDUCE_SAME", - "REDUCE_LONG", - "REDUCE_BOOL", - "INDEX_REDUCE", - "SCALAR", - "SCALAR_BOOL", - "BROADCAST", - "BROADCAST_BOOL", - "PAIRWISE", - "PAIRWISE_BOOL", - "REDUCE_3", - "SUMMARYSTATS", - "SHAPE", - "AGGREGATION", - "RANDOM", - "CUSTOM", - "GRAPH", - "", - "", - "", - "", - "", - "", - "", - "", - "", - "", - "", - "", - "", - "", - "", - "", - "", - "VARIABLE", - "", - "", - "", - "", - "", - "", - "", 
- "", - "", - "", - "", - "", - "", - "", - "", - "", - "", - "", - "", - "BOOLEAN", - "", - "", - "", - "", - "", - "", - "", - "", - "", - "", - "", - "", - "", - "", - "", - "", - "", - "", - "", - "", - "", - "", - "", - "", - "", - "", - "", - "", - "", - "", - "", - "", - "", - "", - "", - "", - "", - "", - "", - "", - "", - "", - "", - "", - "", - "", - "", - "", - "", - "", - "", - "", - "", - "", - "", - "", - "", - "", - "LOGIC", - nullptr - }; - return names; -} - -inline const char *EnumNameOpType(OpType e) { - const size_t index = static_cast(e); - return EnumNamesOpType()[index]; -} - -enum InputType { - InputType_UNDEFINED = 0, - InputType_NUMERIC = 1, - InputType_STRINGULAR = 2, - InputType_NUMERIC_SET = 3, - InputType_STRINGULAR_SET = 4, - InputType_MIN = InputType_UNDEFINED, - InputType_MAX = InputType_STRINGULAR_SET -}; - -inline const InputType (&EnumValuesInputType())[5] { - static const InputType values[] = { - InputType_UNDEFINED, - InputType_NUMERIC, - InputType_STRINGULAR, - InputType_NUMERIC_SET, - InputType_STRINGULAR_SET - }; - return values; -} - -inline const char * const *EnumNamesInputType() { - static const char * const names[] = { - "UNDEFINED", - "NUMERIC", - "STRINGULAR", - "NUMERIC_SET", - "STRINGULAR_SET", - nullptr - }; - return names; -} - -inline const char *EnumNameInputType(InputType e) { - const size_t index = static_cast(e); - return EnumNamesInputType()[index]; -} - -enum OpClass { - OpClass_TRANSFORM = 0, - OpClass_REDUCTION = 1, - OpClass_MULTIPLICATOR = 2, - OpClass_GRAPH = 3, - OpClass_CONDITIONAL = 4, - OpClass_LOOP = 5, - OpClass_MIN = OpClass_TRANSFORM, - OpClass_MAX = OpClass_LOOP -}; - -inline const OpClass (&EnumValuesOpClass())[6] { - static const OpClass values[] = { - OpClass_TRANSFORM, - OpClass_REDUCTION, - OpClass_MULTIPLICATOR, - OpClass_GRAPH, - OpClass_CONDITIONAL, - OpClass_LOOP - }; - return values; -} - -inline const char * const *EnumNamesOpClass() { - static const char * const names[] = { - 
"TRANSFORM", - "REDUCTION", - "MULTIPLICATOR", - "GRAPH", - "CONDITIONAL", - "LOOP", - nullptr - }; - return names; -} - -inline const char *EnumNameOpClass(OpClass e) { - const size_t index = static_cast<size_t>(e); - return EnumNamesOpClass()[index]; -} - -struct LongPair FLATBUFFERS_FINAL_CLASS : private flatbuffers::Table { - enum { - VT_FIRST = 4, - VT_SECOND = 6 - }; - int64_t first() const { - return GetField<int64_t>(VT_FIRST, 0); - } - int64_t second() const { - return GetField<int64_t>(VT_SECOND, 0); - } - bool Verify(flatbuffers::Verifier &verifier) const { - return VerifyTableStart(verifier) && - VerifyField<int64_t>(verifier, VT_FIRST) && - VerifyField<int64_t>(verifier, VT_SECOND) && - verifier.EndTable(); - } -}; - -struct LongPairBuilder { - flatbuffers::FlatBufferBuilder &fbb_; - flatbuffers::uoffset_t start_; - void add_first(int64_t first) { - fbb_.AddElement<int64_t>(LongPair::VT_FIRST, first, 0); - } - void add_second(int64_t second) { - fbb_.AddElement<int64_t>(LongPair::VT_SECOND, second, 0); - } - explicit LongPairBuilder(flatbuffers::FlatBufferBuilder &_fbb) - : fbb_(_fbb) { - start_ = fbb_.StartTable(); - } - LongPairBuilder &operator=(const LongPairBuilder &); - flatbuffers::Offset<LongPair> Finish() { - const auto end = fbb_.EndTable(start_); - auto o = flatbuffers::Offset<LongPair>(end); - return o; - } -}; - -inline flatbuffers::Offset<LongPair> CreateLongPair( - flatbuffers::FlatBufferBuilder &_fbb, - int64_t first = 0, - int64_t second = 0) { - LongPairBuilder builder_(_fbb); - builder_.add_second(second); - builder_.add_first(first); - return builder_.Finish(); -} - -struct LongTriple FLATBUFFERS_FINAL_CLASS : private flatbuffers::Table { - enum { - VT_FIRST = 4, - VT_SECOND = 6, - VT_THIRD = 8 - }; - int64_t first() const { - return GetField<int64_t>(VT_FIRST, 0); - } - int64_t second() const { - return GetField<int64_t>(VT_SECOND, 0); - } - int64_t third() const { - return GetField<int64_t>(VT_THIRD, 0); - } - bool Verify(flatbuffers::Verifier &verifier) const { - return VerifyTableStart(verifier) && - VerifyField<int64_t>(verifier, VT_FIRST) && - VerifyField<int64_t>(verifier, VT_SECOND) && - VerifyField<int64_t>(verifier, VT_THIRD) && - verifier.EndTable(); - } -}; - -struct LongTripleBuilder { - flatbuffers::FlatBufferBuilder &fbb_; - flatbuffers::uoffset_t start_; - void add_first(int64_t first) { - fbb_.AddElement<int64_t>(LongTriple::VT_FIRST, first, 0); - } - void add_second(int64_t second) { - fbb_.AddElement<int64_t>(LongTriple::VT_SECOND, second, 0); - } - void add_third(int64_t third) { - fbb_.AddElement<int64_t>(LongTriple::VT_THIRD, third, 0); - } - explicit LongTripleBuilder(flatbuffers::FlatBufferBuilder &_fbb) - : fbb_(_fbb) { - start_ = fbb_.StartTable(); - } - LongTripleBuilder &operator=(const LongTripleBuilder &); - flatbuffers::Offset<LongTriple> Finish() { - const auto end = fbb_.EndTable(start_); - auto o = flatbuffers::Offset<LongTriple>(end); - return o; - } -}; - -inline flatbuffers::Offset<LongTriple> CreateLongTriple( - flatbuffers::FlatBufferBuilder &_fbb, - int64_t first = 0, - int64_t second = 0, - int64_t third = 0) { - LongTripleBuilder builder_(_fbb); - builder_.add_third(third); - builder_.add_second(second); - builder_.add_first(first); - return builder_.Finish(); -} - -struct IntPair FLATBUFFERS_FINAL_CLASS : private flatbuffers::Table { - enum { - VT_FIRST = 4, - VT_SECOND = 6 - }; - int32_t first() const { - return GetField<int32_t>(VT_FIRST, 0); - } - int32_t second() const { - return GetField<int32_t>(VT_SECOND, 0); - } - bool Verify(flatbuffers::Verifier &verifier) const { - return VerifyTableStart(verifier) && - VerifyField<int32_t>(verifier, VT_FIRST) && - VerifyField<int32_t>(verifier, VT_SECOND) && - verifier.EndTable(); - } -}; - -struct IntPairBuilder { - flatbuffers::FlatBufferBuilder &fbb_; - flatbuffers::uoffset_t start_; - void add_first(int32_t first) { - fbb_.AddElement<int32_t>(IntPair::VT_FIRST, first, 0); - } - void add_second(int32_t second) { - ffbb_.AddElement<int32_t>(IntPair::VT_SECOND, second, 0); - } - explicit IntPairBuilder(flatbuffers::FlatBufferBuilder &_fbb) - : fbb_(_fbb) { - start_ = fbb_.StartTable(); - } - IntPairBuilder &operator=(const IntPairBuilder &); - flatbuffers::Offset<IntPair> Finish() { - const auto end = fbb_.EndTable(start_); - auto o = flatbuffers::Offset<IntPair>(end); - return o; - } -}; - -inline flatbuffers::Offset<IntPair> CreateIntPair( - flatbuffers::FlatBufferBuilder &_fbb, - int32_t first = 0, - int32_t second = 0) { - IntPairBuilder builder_(_fbb); - builder_.add_second(second); - builder_.add_first(first); - return builder_.Finish(); -} - -struct IntTriple FLATBUFFERS_FINAL_CLASS : private flatbuffers::Table { - enum { - VT_FIRST = 4, - VT_SECOND = 6, - VT_THIRD = 8 - }; - int32_t first() const { - return GetField<int32_t>(VT_FIRST, 0); - } - int32_t second() const { - return GetField<int32_t>(VT_SECOND, 0); - } - int32_t third() const { - return GetField<int32_t>(VT_THIRD, 0); - } - bool Verify(flatbuffers::Verifier &verifier) const { - return VerifyTableStart(verifier) && - VerifyField<int32_t>(verifier, VT_FIRST) && - VerifyField<int32_t>(verifier, VT_SECOND) && - VerifyField<int32_t>(verifier, VT_THIRD) && - verifier.EndTable(); - } -}; - -struct IntTripleBuilder { - flatbuffers::FlatBufferBuilder &fbb_; - flatbuffers::uoffset_t start_; - void add_first(int32_t first) { - fbb_.AddElement<int32_t>(IntTriple::VT_FIRST, first, 0); - } - void add_second(int32_t second) { - fbb_.AddElement<int32_t>(IntTriple::VT_SECOND, second, 0); - } - void add_third(int32_t third) { - fbb_.AddElement<int32_t>(IntTriple::VT_THIRD, third, 0); - } - explicit IntTripleBuilder(flatbuffers::FlatBufferBuilder &_fbb) - : fbb_(_fbb) { - start_ = fbb_.StartTable(); - } - IntTripleBuilder &operator=(const IntTripleBuilder &); - flatbuffers::Offset<IntTriple> Finish() { - const auto end = fbb_.EndTable(start_); - auto o = flatbuffers::Offset<IntTriple>(end); - return o; - } -}; - -inline flatbuffers::Offset<IntTriple> CreateIntTriple( - flatbuffers::FlatBufferBuilder &_fbb, - int32_t first = 0, - int32_t second = 0, - int32_t third = 0) { - IntTripleBuilder builder_(_fbb); - builder_.add_third(third); - builder_.add_second(second); - builder_.add_first(first); - return builder_.Finish(); -} - -} // namespace graph -} // namespace sd - -#endif  // 
FLATBUFFERS_GENERATED_UTILS_ND4J_GRAPH_H_ diff --git a/libnd4j/include/graph/generated/utils_generated.js b/libnd4j/include/graph/generated/utils_generated.js deleted file mode 100644 index 9c0a86628..000000000 --- a/libnd4j/include/graph/generated/utils_generated.js +++ /dev/null @@ -1,453 +0,0 @@ -/* - * ****************************************************************************** - * * - * * - * * This program and the accompanying materials are made available under the - * * terms of the Apache License, Version 2.0 which is available at - * * https://www.apache.org/licenses/LICENSE-2.0. - * * - * * See the NOTICE file distributed with this work for additional - * * information regarding copyright ownership. - * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * * License for the specific language governing permissions and limitations - * * under the License. 
- * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ - -/** - * @const - * @namespace - */ -var nd4j = nd4j || {}; - -/** - * @const - * @namespace - */ -nd4j.graph = nd4j.graph || {}; - -/** - * @enum - */ -nd4j.graph.OpType = { - TRANSFORM_FLOAT: 0, - TRANSFORM_SAME: 1, - TRANSFORM_BOOL: 2, - TRANSFORM_STRICT: 3, - TRANSFORM_ANY: 4, - REDUCE_FLOAT: 5, - REDUCE_SAME: 6, - REDUCE_LONG: 7, - REDUCE_BOOL: 8, - INDEX_REDUCE: 9, - SCALAR: 10, - SCALAR_BOOL: 11, - BROADCAST: 12, - BROADCAST_BOOL: 13, - PAIRWISE: 14, - PAIRWISE_BOOL: 15, - REDUCE_3: 16, - SUMMARYSTATS: 17, - SHAPE: 18, - AGGREGATION: 19, - RANDOM: 20, - CUSTOM: 21, - GRAPH: 22, - VARIABLE: 40, - BOOLEAN: 60, - LOGIC: 119 -}; - -/** - * @enum - */ -nd4j.graph.InputType = { - UNDEFINED: 0, - NUMERIC: 1, - STRINGULAR: 2, - NUMERIC_SET: 3, - STRINGULAR_SET: 4 -}; - -/** - * @enum - */ -nd4j.graph.OpClass = { - TRANSFORM: 0, - REDUCTION: 1, - MULTIPLICATOR: 2, - GRAPH: 3, - CONDITIONAL: 4, - LOOP: 5 -}; - -/** - * @constructor - */ -nd4j.graph.LongPair = function() { - /** - * @type {flatbuffers.ByteBuffer} - */ - this.bb = null; - - /** - * @type {number} - */ - this.bb_pos = 0; -}; - -/** - * @param {number} i - * @param {flatbuffers.ByteBuffer} bb - * @returns {nd4j.graph.LongPair} - */ -nd4j.graph.LongPair.prototype.__init = function(i, bb) { - this.bb_pos = i; - this.bb = bb; - return this; -}; - -/** - * @param {flatbuffers.ByteBuffer} bb - * @param {nd4j.graph.LongPair=} obj - * @returns {nd4j.graph.LongPair} - */ -nd4j.graph.LongPair.getRootAsLongPair = function(bb, obj) { - return (obj || new nd4j.graph.LongPair).__init(bb.readInt32(bb.position()) + bb.position(), bb); -}; - -/** - * @returns {flatbuffers.Long} - */ -nd4j.graph.LongPair.prototype.first = function() { - var offset = this.bb.__offset(this.bb_pos, 4); - return offset ? 
this.bb.readInt64(this.bb_pos + offset) : this.bb.createLong(0, 0); -}; - -/** - * @returns {flatbuffers.Long} - */ -nd4j.graph.LongPair.prototype.second = function() { - var offset = this.bb.__offset(this.bb_pos, 6); - return offset ? this.bb.readInt64(this.bb_pos + offset) : this.bb.createLong(0, 0); -}; - -/** - * @param {flatbuffers.Builder} builder - */ -nd4j.graph.LongPair.startLongPair = function(builder) { - builder.startObject(2); -}; - -/** - * @param {flatbuffers.Builder} builder - * @param {flatbuffers.Long} first - */ -nd4j.graph.LongPair.addFirst = function(builder, first) { - builder.addFieldInt64(0, first, builder.createLong(0, 0)); -}; - -/** - * @param {flatbuffers.Builder} builder - * @param {flatbuffers.Long} second - */ -nd4j.graph.LongPair.addSecond = function(builder, second) { - builder.addFieldInt64(1, second, builder.createLong(0, 0)); -}; - -/** - * @param {flatbuffers.Builder} builder - * @returns {flatbuffers.Offset} - */ -nd4j.graph.LongPair.endLongPair = function(builder) { - var offset = builder.endObject(); - return offset; -}; - -/** - * @constructor - */ -nd4j.graph.LongTriple = function() { - /** - * @type {flatbuffers.ByteBuffer} - */ - this.bb = null; - - /** - * @type {number} - */ - this.bb_pos = 0; -}; - -/** - * @param {number} i - * @param {flatbuffers.ByteBuffer} bb - * @returns {nd4j.graph.LongTriple} - */ -nd4j.graph.LongTriple.prototype.__init = function(i, bb) { - this.bb_pos = i; - this.bb = bb; - return this; -}; - -/** - * @param {flatbuffers.ByteBuffer} bb - * @param {nd4j.graph.LongTriple=} obj - * @returns {nd4j.graph.LongTriple} - */ -nd4j.graph.LongTriple.getRootAsLongTriple = function(bb, obj) { - return (obj || new nd4j.graph.LongTriple).__init(bb.readInt32(bb.position()) + bb.position(), bb); -}; - -/** - * @returns {flatbuffers.Long} - */ -nd4j.graph.LongTriple.prototype.first = function() { - var offset = this.bb.__offset(this.bb_pos, 4); - return offset ? 
this.bb.readInt64(this.bb_pos + offset) : this.bb.createLong(0, 0); -}; - -/** - * @returns {flatbuffers.Long} - */ -nd4j.graph.LongTriple.prototype.second = function() { - var offset = this.bb.__offset(this.bb_pos, 6); - return offset ? this.bb.readInt64(this.bb_pos + offset) : this.bb.createLong(0, 0); -}; - -/** - * @returns {flatbuffers.Long} - */ -nd4j.graph.LongTriple.prototype.third = function() { - var offset = this.bb.__offset(this.bb_pos, 8); - return offset ? this.bb.readInt64(this.bb_pos + offset) : this.bb.createLong(0, 0); -}; - -/** - * @param {flatbuffers.Builder} builder - */ -nd4j.graph.LongTriple.startLongTriple = function(builder) { - builder.startObject(3); -}; - -/** - * @param {flatbuffers.Builder} builder - * @param {flatbuffers.Long} first - */ -nd4j.graph.LongTriple.addFirst = function(builder, first) { - builder.addFieldInt64(0, first, builder.createLong(0, 0)); -}; - -/** - * @param {flatbuffers.Builder} builder - * @param {flatbuffers.Long} second - */ -nd4j.graph.LongTriple.addSecond = function(builder, second) { - builder.addFieldInt64(1, second, builder.createLong(0, 0)); -}; - -/** - * @param {flatbuffers.Builder} builder - * @param {flatbuffers.Long} third - */ -nd4j.graph.LongTriple.addThird = function(builder, third) { - builder.addFieldInt64(2, third, builder.createLong(0, 0)); -}; - -/** - * @param {flatbuffers.Builder} builder - * @returns {flatbuffers.Offset} - */ -nd4j.graph.LongTriple.endLongTriple = function(builder) { - var offset = builder.endObject(); - return offset; -}; - -/** - * @constructor - */ -nd4j.graph.IntPair = function() { - /** - * @type {flatbuffers.ByteBuffer} - */ - this.bb = null; - - /** - * @type {number} - */ - this.bb_pos = 0; -}; - -/** - * @param {number} i - * @param {flatbuffers.ByteBuffer} bb - * @returns {nd4j.graph.IntPair} - */ -nd4j.graph.IntPair.prototype.__init = function(i, bb) { - this.bb_pos = i; - this.bb = bb; - return this; -}; - -/** - * @param {flatbuffers.ByteBuffer} bb - * 
@param {nd4j.graph.IntPair=} obj - * @returns {nd4j.graph.IntPair} - */ -nd4j.graph.IntPair.getRootAsIntPair = function(bb, obj) { - return (obj || new nd4j.graph.IntPair).__init(bb.readInt32(bb.position()) + bb.position(), bb); -}; - -/** - * @returns {number} - */ -nd4j.graph.IntPair.prototype.first = function() { - var offset = this.bb.__offset(this.bb_pos, 4); - return offset ? this.bb.readInt32(this.bb_pos + offset) : 0; -}; - -/** - * @returns {number} - */ -nd4j.graph.IntPair.prototype.second = function() { - var offset = this.bb.__offset(this.bb_pos, 6); - return offset ? this.bb.readInt32(this.bb_pos + offset) : 0; -}; - -/** - * @param {flatbuffers.Builder} builder - */ -nd4j.graph.IntPair.startIntPair = function(builder) { - builder.startObject(2); -}; - -/** - * @param {flatbuffers.Builder} builder - * @param {number} first - */ -nd4j.graph.IntPair.addFirst = function(builder, first) { - builder.addFieldInt32(0, first, 0); -}; - -/** - * @param {flatbuffers.Builder} builder - * @param {number} second - */ -nd4j.graph.IntPair.addSecond = function(builder, second) { - builder.addFieldInt32(1, second, 0); -}; - -/** - * @param {flatbuffers.Builder} builder - * @returns {flatbuffers.Offset} - */ -nd4j.graph.IntPair.endIntPair = function(builder) { - var offset = builder.endObject(); - return offset; -}; - -/** - * @constructor - */ -nd4j.graph.IntTriple = function() { - /** - * @type {flatbuffers.ByteBuffer} - */ - this.bb = null; - - /** - * @type {number} - */ - this.bb_pos = 0; -}; - -/** - * @param {number} i - * @param {flatbuffers.ByteBuffer} bb - * @returns {nd4j.graph.IntTriple} - */ -nd4j.graph.IntTriple.prototype.__init = function(i, bb) { - this.bb_pos = i; - this.bb = bb; - return this; -}; - -/** - * @param {flatbuffers.ByteBuffer} bb - * @param {nd4j.graph.IntTriple=} obj - * @returns {nd4j.graph.IntTriple} - */ -nd4j.graph.IntTriple.getRootAsIntTriple = function(bb, obj) { - return (obj || new 
nd4j.graph.IntTriple).__init(bb.readInt32(bb.position()) + bb.position(), bb); -}; - -/** - * @returns {number} - */ -nd4j.graph.IntTriple.prototype.first = function() { - var offset = this.bb.__offset(this.bb_pos, 4); - return offset ? this.bb.readInt32(this.bb_pos + offset) : 0; -}; - -/** - * @returns {number} - */ -nd4j.graph.IntTriple.prototype.second = function() { - var offset = this.bb.__offset(this.bb_pos, 6); - return offset ? this.bb.readInt32(this.bb_pos + offset) : 0; -}; - -/** - * @returns {number} - */ -nd4j.graph.IntTriple.prototype.third = function() { - var offset = this.bb.__offset(this.bb_pos, 8); - return offset ? this.bb.readInt32(this.bb_pos + offset) : 0; -}; - -/** - * @param {flatbuffers.Builder} builder - */ -nd4j.graph.IntTriple.startIntTriple = function(builder) { - builder.startObject(3); -}; - -/** - * @param {flatbuffers.Builder} builder - * @param {number} first - */ -nd4j.graph.IntTriple.addFirst = function(builder, first) { - builder.addFieldInt32(0, first, 0); -}; - -/** - * @param {flatbuffers.Builder} builder - * @param {number} second - */ -nd4j.graph.IntTriple.addSecond = function(builder, second) { - builder.addFieldInt32(1, second, 0); -}; - -/** - * @param {flatbuffers.Builder} builder - * @param {number} third - */ -nd4j.graph.IntTriple.addThird = function(builder, third) { - builder.addFieldInt32(2, third, 0); -}; - -/** - * @param {flatbuffers.Builder} builder - * @returns {flatbuffers.Offset} - */ -nd4j.graph.IntTriple.endIntTriple = function(builder) { - var offset = builder.endObject(); - return offset; -}; - -// Exports for Node.js and RequireJS -this.nd4j = nd4j; diff --git a/libnd4j/include/graph/generated/variable_generated.h b/libnd4j/include/graph/generated/variable_generated.h deleted file mode 100644 index a0e43a5af..000000000 --- a/libnd4j/include/graph/generated/variable_generated.h +++ /dev/null @@ -1,251 +0,0 @@ -// automatically generated by the FlatBuffers compiler, do not modify - - -#ifndef 
FLATBUFFERS_GENERATED_VARIABLE_ND4J_GRAPH_H_ -#define FLATBUFFERS_GENERATED_VARIABLE_ND4J_GRAPH_H_ - -#include "flatbuffers/flatbuffers.h" - -#include "array_generated.h" -#include "utils_generated.h" - -namespace sd { -namespace graph { - -struct FlatVariable; - -enum VarType { - VarType_VARIABLE = 0, - VarType_CONSTANT = 1, - VarType_ARRAY = 2, - VarType_PLACEHOLDER = 3, - VarType_MIN = VarType_VARIABLE, - VarType_MAX = VarType_PLACEHOLDER -}; - -inline const VarType (&EnumValuesVarType())[4] { - static const VarType values[] = { - VarType_VARIABLE, - VarType_CONSTANT, - VarType_ARRAY, - VarType_PLACEHOLDER - }; - return values; -} - -inline const char * const *EnumNamesVarType() { - static const char * const names[] = { - "VARIABLE", - "CONSTANT", - "ARRAY", - "PLACEHOLDER", - nullptr - }; - return names; -} - -inline const char *EnumNameVarType(VarType e) { - const size_t index = static_cast(e); - return EnumNamesVarType()[index]; -} - -struct FlatVariable FLATBUFFERS_FINAL_CLASS : private flatbuffers::Table { - enum { - VT_ID = 4, - VT_NAME = 6, - VT_DTYPE = 8, - VT_SHAPE = 10, - VT_NDARRAY = 12, - VT_DEVICE = 14, - VT_VARIABLETYPE = 16, - VT_CONTROLDEPS = 18, - VT_CONTROLDEPFOROP = 20, - VT_CONTROLDEPSFORVAR = 22 - }; - const IntPair *id() const { - return GetPointer(VT_ID); - } - const flatbuffers::String *name() const { - return GetPointer(VT_NAME); - } - DType dtype() const { - return static_cast(GetField(VT_DTYPE, 0)); - } - const flatbuffers::Vector *shape() const { - return GetPointer *>(VT_SHAPE); - } - const FlatArray *ndarray() const { - return GetPointer(VT_NDARRAY); - } - int32_t device() const { - return GetField(VT_DEVICE, 0); - } - VarType variabletype() const { - return static_cast(GetField(VT_VARIABLETYPE, 0)); - } - const flatbuffers::Vector> *controlDeps() const { - return GetPointer> *>(VT_CONTROLDEPS); - } - const flatbuffers::Vector> *controlDepForOp() const { - return GetPointer> *>(VT_CONTROLDEPFOROP); - } - const flatbuffers::Vector> 
*controlDepsForVar() const { - return GetPointer> *>(VT_CONTROLDEPSFORVAR); - } - bool Verify(flatbuffers::Verifier &verifier) const { - return VerifyTableStart(verifier) && - VerifyOffset(verifier, VT_ID) && - verifier.VerifyTable(id()) && - VerifyOffset(verifier, VT_NAME) && - verifier.VerifyString(name()) && - VerifyField(verifier, VT_DTYPE) && - VerifyOffset(verifier, VT_SHAPE) && - verifier.VerifyVector(shape()) && - VerifyOffset(verifier, VT_NDARRAY) && - verifier.VerifyTable(ndarray()) && - VerifyField(verifier, VT_DEVICE) && - VerifyField(verifier, VT_VARIABLETYPE) && - VerifyOffset(verifier, VT_CONTROLDEPS) && - verifier.VerifyVector(controlDeps()) && - verifier.VerifyVectorOfStrings(controlDeps()) && - VerifyOffset(verifier, VT_CONTROLDEPFOROP) && - verifier.VerifyVector(controlDepForOp()) && - verifier.VerifyVectorOfStrings(controlDepForOp()) && - VerifyOffset(verifier, VT_CONTROLDEPSFORVAR) && - verifier.VerifyVector(controlDepsForVar()) && - verifier.VerifyVectorOfStrings(controlDepsForVar()) && - verifier.EndTable(); - } -}; - -struct FlatVariableBuilder { - flatbuffers::FlatBufferBuilder &fbb_; - flatbuffers::uoffset_t start_; - void add_id(flatbuffers::Offset id) { - fbb_.AddOffset(FlatVariable::VT_ID, id); - } - void add_name(flatbuffers::Offset name) { - fbb_.AddOffset(FlatVariable::VT_NAME, name); - } - void add_dtype(DType dtype) { - fbb_.AddElement(FlatVariable::VT_DTYPE, static_cast(dtype), 0); - } - void add_shape(flatbuffers::Offset> shape) { - fbb_.AddOffset(FlatVariable::VT_SHAPE, shape); - } - void add_ndarray(flatbuffers::Offset ndarray) { - fbb_.AddOffset(FlatVariable::VT_NDARRAY, ndarray); - } - void add_device(int32_t device) { - fbb_.AddElement(FlatVariable::VT_DEVICE, device, 0); - } - void add_variabletype(VarType variabletype) { - fbb_.AddElement(FlatVariable::VT_VARIABLETYPE, static_cast(variabletype), 0); - } - void add_controlDeps(flatbuffers::Offset>> controlDeps) { - fbb_.AddOffset(FlatVariable::VT_CONTROLDEPS, controlDeps); 
-  }
-  void add_controlDepForOp(flatbuffers::Offset<flatbuffers::Vector<flatbuffers::Offset<flatbuffers::String>>> controlDepForOp) {
-    fbb_.AddOffset(FlatVariable::VT_CONTROLDEPFOROP, controlDepForOp);
-  }
-  void add_controlDepsForVar(flatbuffers::Offset<flatbuffers::Vector<flatbuffers::Offset<flatbuffers::String>>> controlDepsForVar) {
-    fbb_.AddOffset(FlatVariable::VT_CONTROLDEPSFORVAR, controlDepsForVar);
-  }
-  explicit FlatVariableBuilder(flatbuffers::FlatBufferBuilder &_fbb)
-        : fbb_(_fbb) {
-    start_ = fbb_.StartTable();
-  }
-  FlatVariableBuilder &operator=(const FlatVariableBuilder &);
-  flatbuffers::Offset<FlatVariable> Finish() {
-    const auto end = fbb_.EndTable(start_);
-    auto o = flatbuffers::Offset<FlatVariable>(end);
-    return o;
-  }
-};
-
-inline flatbuffers::Offset<FlatVariable> CreateFlatVariable(
-    flatbuffers::FlatBufferBuilder &_fbb,
-    flatbuffers::Offset<IntPair> id = 0,
-    flatbuffers::Offset<flatbuffers::String> name = 0,
-    DType dtype = DType_INHERIT,
-    flatbuffers::Offset<flatbuffers::Vector<int64_t>> shape = 0,
-    flatbuffers::Offset<FlatArray> ndarray = 0,
-    int32_t device = 0,
-    VarType variabletype = VarType_VARIABLE,
-    flatbuffers::Offset<flatbuffers::Vector<flatbuffers::Offset<flatbuffers::String>>> controlDeps = 0,
-    flatbuffers::Offset<flatbuffers::Vector<flatbuffers::Offset<flatbuffers::String>>> controlDepForOp = 0,
-    flatbuffers::Offset<flatbuffers::Vector<flatbuffers::Offset<flatbuffers::String>>> controlDepsForVar = 0) {
-  FlatVariableBuilder builder_(_fbb);
-  builder_.add_controlDepsForVar(controlDepsForVar);
-  builder_.add_controlDepForOp(controlDepForOp);
-  builder_.add_controlDeps(controlDeps);
-  builder_.add_device(device);
-  builder_.add_ndarray(ndarray);
-  builder_.add_shape(shape);
-  builder_.add_name(name);
-  builder_.add_id(id);
-  builder_.add_variabletype(variabletype);
-  builder_.add_dtype(dtype);
-  return builder_.Finish();
-}
-
-inline flatbuffers::Offset<FlatVariable> CreateFlatVariableDirect(
-    flatbuffers::FlatBufferBuilder &_fbb,
-    flatbuffers::Offset<IntPair> id = 0,
-    const char *name = nullptr,
-    DType dtype = DType_INHERIT,
-    const std::vector<int64_t> *shape = nullptr,
-    flatbuffers::Offset<FlatArray> ndarray = 0,
-    int32_t device = 0,
-    VarType variabletype = VarType_VARIABLE,
-    const std::vector<flatbuffers::Offset<flatbuffers::String>> *controlDeps = nullptr,
-    const std::vector<flatbuffers::Offset<flatbuffers::String>> *controlDepForOp = nullptr,
-    const std::vector<flatbuffers::Offset<flatbuffers::String>> *controlDepsForVar = nullptr) {
-  return sd::graph::CreateFlatVariable(
-      _fbb,
-      id,
-      name ? _fbb.CreateString(name) : 0,
-      dtype,
-      shape ? _fbb.CreateVector<int64_t>(*shape) : 0,
-      ndarray,
-      device,
-      variabletype,
-      controlDeps ? _fbb.CreateVector<flatbuffers::Offset<flatbuffers::String>>(*controlDeps) : 0,
-      controlDepForOp ? _fbb.CreateVector<flatbuffers::Offset<flatbuffers::String>>(*controlDepForOp) : 0,
-      controlDepsForVar ? _fbb.CreateVector<flatbuffers::Offset<flatbuffers::String>>(*controlDepsForVar) : 0);
-}
-
-inline const sd::graph::FlatVariable *GetFlatVariable(const void *buf) {
-  return flatbuffers::GetRoot<sd::graph::FlatVariable>(buf);
-}
-
-inline const sd::graph::FlatVariable *GetSizePrefixedFlatVariable(const void *buf) {
-  return flatbuffers::GetSizePrefixedRoot<sd::graph::FlatVariable>(buf);
-}
-
-inline bool VerifyFlatVariableBuffer(
-    flatbuffers::Verifier &verifier) {
-  return verifier.VerifyBuffer<sd::graph::FlatVariable>(nullptr);
-}
-
-inline bool VerifySizePrefixedFlatVariableBuffer(
-    flatbuffers::Verifier &verifier) {
-  return verifier.VerifySizePrefixedBuffer<sd::graph::FlatVariable>(nullptr);
-}
-
-inline void FinishFlatVariableBuffer(
-    flatbuffers::FlatBufferBuilder &fbb,
-    flatbuffers::Offset<sd::graph::FlatVariable> root) {
-  fbb.Finish(root);
-}
-
-inline void FinishSizePrefixedFlatVariableBuffer(
-    flatbuffers::FlatBufferBuilder &fbb,
-    flatbuffers::Offset<sd::graph::FlatVariable> root) {
-  fbb.FinishSizePrefixed(root);
-}
-
-}  // namespace graph
-}  // namespace sd
-
-#endif  // FLATBUFFERS_GENERATED_VARIABLE_ND4J_GRAPH_H_
diff --git a/libnd4j/include/graph/generated/variable_generated.js b/libnd4j/include/graph/generated/variable_generated.js
deleted file mode 100644
index 5b5ce1c10..000000000
--- a/libnd4j/include/graph/generated/variable_generated.js
+++ /dev/null
@@ -1,389 +0,0 @@
-/*
- * ******************************************************************************
- * *
- * *
- * * This program and the accompanying materials are made available under the
- * * terms of the Apache License, Version 2.0 which is available at
- * * https://www.apache.org/licenses/LICENSE-2.0.
- * *
- * * See the NOTICE file distributed with this work for additional
- * * information regarding copyright ownership.
- * * Unless required by applicable law or agreed to in writing, software - * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * * License for the specific language governing permissions and limitations - * * under the License. - * * - * * SPDX-License-Identifier: Apache-2.0 - * ***************************************************************************** - */ - -/** - * @const - * @namespace - */ -var nd4j = nd4j || {}; - -/** - * @const - * @namespace - */ -nd4j.graph = nd4j.graph || {}; - -/** - * @enum - */ -nd4j.graph.VarType = { - VARIABLE: 0, - CONSTANT: 1, - ARRAY: 2, - PLACEHOLDER: 3 -}; - -/** - * @constructor - */ -nd4j.graph.FlatVariable = function() { - /** - * @type {flatbuffers.ByteBuffer} - */ - this.bb = null; - - /** - * @type {number} - */ - this.bb_pos = 0; -}; - -/** - * @param {number} i - * @param {flatbuffers.ByteBuffer} bb - * @returns {nd4j.graph.FlatVariable} - */ -nd4j.graph.FlatVariable.prototype.__init = function(i, bb) { - this.bb_pos = i; - this.bb = bb; - return this; -}; - -/** - * @param {flatbuffers.ByteBuffer} bb - * @param {nd4j.graph.FlatVariable=} obj - * @returns {nd4j.graph.FlatVariable} - */ -nd4j.graph.FlatVariable.getRootAsFlatVariable = function(bb, obj) { - return (obj || new nd4j.graph.FlatVariable).__init(bb.readInt32(bb.position()) + bb.position(), bb); -}; - -/** - * @param {nd4j.graph.IntPair=} obj - * @returns {nd4j.graph.IntPair|null} - */ -nd4j.graph.FlatVariable.prototype.id = function(obj) { - var offset = this.bb.__offset(this.bb_pos, 4); - return offset ? (obj || new nd4j.graph.IntPair).__init(this.bb.__indirect(this.bb_pos + offset), this.bb) : null; -}; - -/** - * @param {flatbuffers.Encoding=} optionalEncoding - * @returns {string|Uint8Array|null} - */ -nd4j.graph.FlatVariable.prototype.name = function(optionalEncoding) { - var offset = this.bb.__offset(this.bb_pos, 6); - return offset ? 
this.bb.__string(this.bb_pos + offset, optionalEncoding) : null; -}; - -/** - * @returns {nd4j.graph.DType} - */ -nd4j.graph.FlatVariable.prototype.dtype = function() { - var offset = this.bb.__offset(this.bb_pos, 8); - return offset ? /** @type {nd4j.graph.DType} */ (this.bb.readInt8(this.bb_pos + offset)) : nd4j.graph.DType.INHERIT; -}; - -/** - * @param {number} index - * @returns {flatbuffers.Long} - */ -nd4j.graph.FlatVariable.prototype.shape = function(index) { - var offset = this.bb.__offset(this.bb_pos, 10); - return offset ? this.bb.readInt64(this.bb.__vector(this.bb_pos + offset) + index * 8) : this.bb.createLong(0, 0); -}; - -/** - * @returns {number} - */ -nd4j.graph.FlatVariable.prototype.shapeLength = function() { - var offset = this.bb.__offset(this.bb_pos, 10); - return offset ? this.bb.__vector_len(this.bb_pos + offset) : 0; -}; - -/** - * @param {nd4j.graph.FlatArray=} obj - * @returns {nd4j.graph.FlatArray|null} - */ -nd4j.graph.FlatVariable.prototype.ndarray = function(obj) { - var offset = this.bb.__offset(this.bb_pos, 12); - return offset ? (obj || new nd4j.graph.FlatArray).__init(this.bb.__indirect(this.bb_pos + offset), this.bb) : null; -}; - -/** - * @returns {number} - */ -nd4j.graph.FlatVariable.prototype.device = function() { - var offset = this.bb.__offset(this.bb_pos, 14); - return offset ? this.bb.readInt32(this.bb_pos + offset) : 0; -}; - -/** - * @returns {nd4j.graph.VarType} - */ -nd4j.graph.FlatVariable.prototype.variabletype = function() { - var offset = this.bb.__offset(this.bb_pos, 16); - return offset ? /** @type {nd4j.graph.VarType} */ (this.bb.readInt8(this.bb_pos + offset)) : nd4j.graph.VarType.VARIABLE; -}; - -/** - * @param {number} index - * @param {flatbuffers.Encoding=} optionalEncoding - * @returns {string|Uint8Array} - */ -nd4j.graph.FlatVariable.prototype.controlDeps = function(index, optionalEncoding) { - var offset = this.bb.__offset(this.bb_pos, 18); - return offset ? 
this.bb.__string(this.bb.__vector(this.bb_pos + offset) + index * 4, optionalEncoding) : null; -}; - -/** - * @returns {number} - */ -nd4j.graph.FlatVariable.prototype.controlDepsLength = function() { - var offset = this.bb.__offset(this.bb_pos, 18); - return offset ? this.bb.__vector_len(this.bb_pos + offset) : 0; -}; - -/** - * @param {number} index - * @param {flatbuffers.Encoding=} optionalEncoding - * @returns {string|Uint8Array} - */ -nd4j.graph.FlatVariable.prototype.controlDepForOp = function(index, optionalEncoding) { - var offset = this.bb.__offset(this.bb_pos, 20); - return offset ? this.bb.__string(this.bb.__vector(this.bb_pos + offset) + index * 4, optionalEncoding) : null; -}; - -/** - * @returns {number} - */ -nd4j.graph.FlatVariable.prototype.controlDepForOpLength = function() { - var offset = this.bb.__offset(this.bb_pos, 20); - return offset ? this.bb.__vector_len(this.bb_pos + offset) : 0; -}; - -/** - * @param {number} index - * @param {flatbuffers.Encoding=} optionalEncoding - * @returns {string|Uint8Array} - */ -nd4j.graph.FlatVariable.prototype.controlDepsForVar = function(index, optionalEncoding) { - var offset = this.bb.__offset(this.bb_pos, 22); - return offset ? this.bb.__string(this.bb.__vector(this.bb_pos + offset) + index * 4, optionalEncoding) : null; -}; - -/** - * @returns {number} - */ -nd4j.graph.FlatVariable.prototype.controlDepsForVarLength = function() { - var offset = this.bb.__offset(this.bb_pos, 22); - return offset ? 
this.bb.__vector_len(this.bb_pos + offset) : 0; -}; - -/** - * @param {flatbuffers.Builder} builder - */ -nd4j.graph.FlatVariable.startFlatVariable = function(builder) { - builder.startObject(10); -}; - -/** - * @param {flatbuffers.Builder} builder - * @param {flatbuffers.Offset} idOffset - */ -nd4j.graph.FlatVariable.addId = function(builder, idOffset) { - builder.addFieldOffset(0, idOffset, 0); -}; - -/** - * @param {flatbuffers.Builder} builder - * @param {flatbuffers.Offset} nameOffset - */ -nd4j.graph.FlatVariable.addName = function(builder, nameOffset) { - builder.addFieldOffset(1, nameOffset, 0); -}; - -/** - * @param {flatbuffers.Builder} builder - * @param {nd4j.graph.DType} dtype - */ -nd4j.graph.FlatVariable.addDtype = function(builder, dtype) { - builder.addFieldInt8(2, dtype, nd4j.graph.DType.INHERIT); -}; - -/** - * @param {flatbuffers.Builder} builder - * @param {flatbuffers.Offset} shapeOffset - */ -nd4j.graph.FlatVariable.addShape = function(builder, shapeOffset) { - builder.addFieldOffset(3, shapeOffset, 0); -}; - -/** - * @param {flatbuffers.Builder} builder - * @param {Array.} data - * @returns {flatbuffers.Offset} - */ -nd4j.graph.FlatVariable.createShapeVector = function(builder, data) { - builder.startVector(8, data.length, 8); - for (var i = data.length - 1; i >= 0; i--) { - builder.addInt64(data[i]); - } - return builder.endVector(); -}; - -/** - * @param {flatbuffers.Builder} builder - * @param {number} numElems - */ -nd4j.graph.FlatVariable.startShapeVector = function(builder, numElems) { - builder.startVector(8, numElems, 8); -}; - -/** - * @param {flatbuffers.Builder} builder - * @param {flatbuffers.Offset} ndarrayOffset - */ -nd4j.graph.FlatVariable.addNdarray = function(builder, ndarrayOffset) { - builder.addFieldOffset(4, ndarrayOffset, 0); -}; - -/** - * @param {flatbuffers.Builder} builder - * @param {number} device - */ -nd4j.graph.FlatVariable.addDevice = function(builder, device) { - builder.addFieldInt32(5, device, 0); -}; - 
-/** - * @param {flatbuffers.Builder} builder - * @param {nd4j.graph.VarType} variabletype - */ -nd4j.graph.FlatVariable.addVariabletype = function(builder, variabletype) { - builder.addFieldInt8(6, variabletype, nd4j.graph.VarType.VARIABLE); -}; - -/** - * @param {flatbuffers.Builder} builder - * @param {flatbuffers.Offset} controlDepsOffset - */ -nd4j.graph.FlatVariable.addControlDeps = function(builder, controlDepsOffset) { - builder.addFieldOffset(7, controlDepsOffset, 0); -}; - -/** - * @param {flatbuffers.Builder} builder - * @param {Array.} data - * @returns {flatbuffers.Offset} - */ -nd4j.graph.FlatVariable.createControlDepsVector = function(builder, data) { - builder.startVector(4, data.length, 4); - for (var i = data.length - 1; i >= 0; i--) { - builder.addOffset(data[i]); - } - return builder.endVector(); -}; - -/** - * @param {flatbuffers.Builder} builder - * @param {number} numElems - */ -nd4j.graph.FlatVariable.startControlDepsVector = function(builder, numElems) { - builder.startVector(4, numElems, 4); -}; - -/** - * @param {flatbuffers.Builder} builder - * @param {flatbuffers.Offset} controlDepForOpOffset - */ -nd4j.graph.FlatVariable.addControlDepForOp = function(builder, controlDepForOpOffset) { - builder.addFieldOffset(8, controlDepForOpOffset, 0); -}; - -/** - * @param {flatbuffers.Builder} builder - * @param {Array.} data - * @returns {flatbuffers.Offset} - */ -nd4j.graph.FlatVariable.createControlDepForOpVector = function(builder, data) { - builder.startVector(4, data.length, 4); - for (var i = data.length - 1; i >= 0; i--) { - builder.addOffset(data[i]); - } - return builder.endVector(); -}; - -/** - * @param {flatbuffers.Builder} builder - * @param {number} numElems - */ -nd4j.graph.FlatVariable.startControlDepForOpVector = function(builder, numElems) { - builder.startVector(4, numElems, 4); -}; - -/** - * @param {flatbuffers.Builder} builder - * @param {flatbuffers.Offset} controlDepsForVarOffset - */ 
-nd4j.graph.FlatVariable.addControlDepsForVar = function(builder, controlDepsForVarOffset) { - builder.addFieldOffset(9, controlDepsForVarOffset, 0); -}; - -/** - * @param {flatbuffers.Builder} builder - * @param {Array.} data - * @returns {flatbuffers.Offset} - */ -nd4j.graph.FlatVariable.createControlDepsForVarVector = function(builder, data) { - builder.startVector(4, data.length, 4); - for (var i = data.length - 1; i >= 0; i--) { - builder.addOffset(data[i]); - } - return builder.endVector(); -}; - -/** - * @param {flatbuffers.Builder} builder - * @param {number} numElems - */ -nd4j.graph.FlatVariable.startControlDepsForVarVector = function(builder, numElems) { - builder.startVector(4, numElems, 4); -}; - -/** - * @param {flatbuffers.Builder} builder - * @returns {flatbuffers.Offset} - */ -nd4j.graph.FlatVariable.endFlatVariable = function(builder) { - var offset = builder.endObject(); - return offset; -}; - -/** - * @param {flatbuffers.Builder} builder - * @param {flatbuffers.Offset} offset - */ -nd4j.graph.FlatVariable.finishFlatVariableBuffer = function(builder, offset) { - builder.finish(offset); -}; - -// Exports for Node.js and RequireJS -this.nd4j = nd4j; diff --git a/libnd4j/include/graph/scheme/uigraphevents.fbs b/libnd4j/include/graph/scheme/uigraphevents.fbs index eb9fa13d6..6881dc7e7 100644 --- a/libnd4j/include/graph/scheme/uigraphevents.fbs +++ b/libnd4j/include/graph/scheme/uigraphevents.fbs @@ -1,18 +1,22 @@ -/******************************************************************************* - * Copyright (c) 2015-2019 Skymind, Inc. - * - * This program and the accompanying materials are made available under the - * terms of the Apache License, Version 2.0 which is available at - * https://www.apache.org/licenses/LICENSE-2.0. - * - * Unless required by applicable law or agreed to in writing, software - * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. 
See the - * License for the specific language governing permissions and limitations - * under the License. - * - * SPDX-License-Identifier: Apache-2.0 - ******************************************************************************/ +/* + * ****************************************************************************** + * * + * * + * * This program and the accompanying materials are made available under the + * * terms of the Apache License, Version 2.0 which is available at + * * https://www.apache.org/licenses/LICENSE-2.0. + * * + * * See the NOTICE file distributed with this work for additional + * * information regarding copyright ownership. + * * Unless required by applicable law or agreed to in writing, software + * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT + * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the + * * License for the specific language governing permissions and limitations + * * under the License. + * * + * * SPDX-License-Identifier: Apache-2.0 + * ***************************************************************************** + */ include "array.fbs"; //For FlatArray diff --git a/libnd4j/include/graph/scheme/uigraphstatic.fbs b/libnd4j/include/graph/scheme/uigraphstatic.fbs index b0b19ce17..7fce7b3e1 100644 --- a/libnd4j/include/graph/scheme/uigraphstatic.fbs +++ b/libnd4j/include/graph/scheme/uigraphstatic.fbs @@ -1,18 +1,22 @@ -/******************************************************************************* - * Copyright (c) 2015-2019 Skymind, Inc. - * - * This program and the accompanying materials are made available under the - * terms of the Apache License, Version 2.0 which is available at - * https://www.apache.org/licenses/LICENSE-2.0. - * - * Unless required by applicable law or agreed to in writing, software - * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. 
See the - * License for the specific language governing permissions and limitations - * under the License. - * - * SPDX-License-Identifier: Apache-2.0 - ******************************************************************************/ +/* + * ****************************************************************************** + * * + * * + * * This program and the accompanying materials are made available under the + * * terms of the Apache License, Version 2.0 which is available at + * * https://www.apache.org/licenses/LICENSE-2.0. + * * + * * See the NOTICE file distributed with this work for additional + * * information regarding copyright ownership. + * * Unless required by applicable law or agreed to in writing, software + * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT + * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the + * * License for the specific language governing permissions and limitations + * * under the License. + * * + * * SPDX-License-Identifier: Apache-2.0 + * ***************************************************************************** + */ include "utils.fbs"; //For: IntPair include "variable.fbs"; //For: VarType diff --git a/libnd4j/include/helpers/AttentionHelper.h b/libnd4j/include/helpers/AttentionHelper.h index 02d9da995..5d7474664 100644 --- a/libnd4j/include/helpers/AttentionHelper.h +++ b/libnd4j/include/helpers/AttentionHelper.h @@ -1,18 +1,22 @@ -/******************************************************************************* - * Copyright (c) 2015-2019 Skymind, Inc. - * - * This program and the accompanying materials are made available under the - * terms of the Apache License, Version 2.0 which is available at - * https://www.apache.org/licenses/LICENSE-2.0. - * - * Unless required by applicable law or agreed to in writing, software - * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. 
See the - * License for the specific language governing permissions and limitations - * under the License. - * - * SPDX-License-Identifier: Apache-2.0 - ******************************************************************************/ +/* + * ****************************************************************************** + * * + * * + * * This program and the accompanying materials are made available under the + * * terms of the Apache License, Version 2.0 which is available at + * * https://www.apache.org/licenses/LICENSE-2.0. + * * + * * See the NOTICE file distributed with this work for additional + * * information regarding copyright ownership. + * * Unless required by applicable law or agreed to in writing, software + * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT + * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the + * * License for the specific language governing permissions and limitations + * * under the License. + * * + * * SPDX-License-Identifier: Apache-2.0 + * ***************************************************************************** + */ // // @author Paul Dubs diff --git a/libnd4j/include/helpers/MKLDNNStream.h b/libnd4j/include/helpers/MKLDNNStream.h index f575c48d9..cddb149d8 100644 --- a/libnd4j/include/helpers/MKLDNNStream.h +++ b/libnd4j/include/helpers/MKLDNNStream.h @@ -1,18 +1,22 @@ -/******************************************************************************* - * Copyright (c) 2015-2019 Skymind, Inc. - * - * This program and the accompanying materials are made available under the - * terms of the Apache License, Version 2.0 which is available at - * https://www.apache.org/licenses/LICENSE-2.0. - * - * Unless required by applicable law or agreed to in writing, software - * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. 
See the - * License for the specific language governing permissions and limitations - * under the License. - * - * SPDX-License-Identifier: Apache-2.0 - ******************************************************************************/ +/* + * ****************************************************************************** + * * + * * + * * This program and the accompanying materials are made available under the + * * terms of the Apache License, Version 2.0 which is available at + * * https://www.apache.org/licenses/LICENSE-2.0. + * * + * * See the NOTICE file distributed with this work for additional + * * information regarding copyright ownership. + * * Unless required by applicable law or agreed to in writing, software + * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT + * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the + * * License for the specific language governing permissions and limitations + * * under the License. + * * + * * SPDX-License-Identifier: Apache-2.0 + * ***************************************************************************** + */ // // Created by saudet on 8/30/2018. diff --git a/libnd4j/include/helpers/MmulHelper.h b/libnd4j/include/helpers/MmulHelper.h index 517ca9888..3ed872e7b 100644 --- a/libnd4j/include/helpers/MmulHelper.h +++ b/libnd4j/include/helpers/MmulHelper.h @@ -1,19 +1,22 @@ -/******************************************************************************* - * Copyright (c) 2015-2018 Skymind, Inc. - * Copyright (c) 2019 Konduit K.K. - * - * This program and the accompanying materials are made available under the - * terms of the Apache License, Version 2.0 which is available at - * https://www.apache.org/licenses/LICENSE-2.0. - * - * Unless required by applicable law or agreed to in writing, software - * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. 
See the - * License for the specific language governing permissions and limitations - * under the License. - * - * SPDX-License-Identifier: Apache-2.0 - ******************************************************************************/ +/* + * ****************************************************************************** + * * + * * + * * This program and the accompanying materials are made available under the + * * terms of the Apache License, Version 2.0 which is available at + * * https://www.apache.org/licenses/LICENSE-2.0. + * * + * * See the NOTICE file distributed with this work for additional + * * information regarding copyright ownership. + * * Unless required by applicable law or agreed to in writing, software + * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT + * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the + * * License for the specific language governing permissions and limitations + * * under the License. + * * + * * SPDX-License-Identifier: Apache-2.0 + * ***************************************************************************** + */ // // @author Yurii Shyrma (iuriish@yahoo.com), created on 05.06.2018 diff --git a/libnd4j/include/helpers/OpBenchmark.h b/libnd4j/include/helpers/OpBenchmark.h index 328b20dce..d8fa2111d 100644 --- a/libnd4j/include/helpers/OpBenchmark.h +++ b/libnd4j/include/helpers/OpBenchmark.h @@ -1,18 +1,22 @@ -/******************************************************************************* - * Copyright (c) 2015-2019 Skymind, Inc. - * - * This program and the accompanying materials are made available under the - * terms of the Apache License, Version 2.0 which is available at - * https://www.apache.org/licenses/LICENSE-2.0. - * - * Unless required by applicable law or agreed to in writing, software - * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. 
See the - * License for the specific language governing permissions and limitations - * under the License. - * - * SPDX-License-Identifier: Apache-2.0 - ******************************************************************************/ +/* + * ****************************************************************************** + * * + * * + * * This program and the accompanying materials are made available under the + * * terms of the Apache License, Version 2.0 which is available at + * * https://www.apache.org/licenses/LICENSE-2.0. + * * + * * See the NOTICE file distributed with this work for additional + * * information regarding copyright ownership. + * * Unless required by applicable law or agreed to in writing, software + * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT + * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the + * * License for the specific language governing permissions and limitations + * * under the License. + * * + * * SPDX-License-Identifier: Apache-2.0 + * ***************************************************************************** + */ // // Created by raver on 2/28/2019. diff --git a/libnd4j/include/helpers/StringUtils.h b/libnd4j/include/helpers/StringUtils.h index e5f9f2990..722c68176 100644 --- a/libnd4j/include/helpers/StringUtils.h +++ b/libnd4j/include/helpers/StringUtils.h @@ -1,19 +1,22 @@ -/******************************************************************************* - * Copyright (c) 2015-2018 Skymind, Inc. - * Copyright (c) 2019-2020 Konduit K.K. - * - * This program and the accompanying materials are made available under the - * terms of the Apache License, Version 2.0 which is available at - * https://www.apache.org/licenses/LICENSE-2.0. - * - * Unless required by applicable law or agreed to in writing, software - * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. 
See the - * License for the specific language governing permissions and limitations - * under the License. - * - * SPDX-License-Identifier: Apache-2.0 - ******************************************************************************/ +/* + * ****************************************************************************** + * * + * * + * * This program and the accompanying materials are made available under the + * * terms of the Apache License, Version 2.0 which is available at + * * https://www.apache.org/licenses/LICENSE-2.0. + * * + * * See the NOTICE file distributed with this work for additional + * * information regarding copyright ownership. + * * Unless required by applicable law or agreed to in writing, software + * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT + * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the + * * License for the specific language governing permissions and limitations + * * under the License. + * * + * * SPDX-License-Identifier: Apache-2.0 + * ***************************************************************************** + */ // // Created by raver119 on 20/04/18. diff --git a/libnd4j/include/helpers/cpu/ConstantHelper.cpp b/libnd4j/include/helpers/cpu/ConstantHelper.cpp index be6eff65c..8b6937b18 100644 --- a/libnd4j/include/helpers/cpu/ConstantHelper.cpp +++ b/libnd4j/include/helpers/cpu/ConstantHelper.cpp @@ -1,19 +1,22 @@ -/******************************************************************************* - * Copyright (c) 2015-2018 Skymind, Inc. - * Copyright (c) 2019 Konduit K.K. - * - * This program and the accompanying materials are made available under the - * terms of the Apache License, Version 2.0 which is available at - * https://www.apache.org/licenses/LICENSE-2.0. 
- * - * Unless required by applicable law or agreed to in writing, software - * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * License for the specific language governing permissions and limitations - * under the License. - * - * SPDX-License-Identifier: Apache-2.0 - ******************************************************************************/ +/* + * ****************************************************************************** + * * + * * + * * This program and the accompanying materials are made available under the + * * terms of the Apache License, Version 2.0 which is available at + * * https://www.apache.org/licenses/LICENSE-2.0. + * * + * * See the NOTICE file distributed with this work for additional + * * information regarding copyright ownership. + * * Unless required by applicable law or agreed to in writing, software + * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT + * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the + * * License for the specific language governing permissions and limitations + * * under the License. + * * + * * SPDX-License-Identifier: Apache-2.0 + * ***************************************************************************** + */ // // @author raver119@gmail.com diff --git a/libnd4j/include/helpers/cpu/loops/IndexReductionLoops_int32.cpp.in b/libnd4j/include/helpers/cpu/loops/IndexReductionLoops_int32.cpp.in index 2030c8017..2516adf91 100644 --- a/libnd4j/include/helpers/cpu/loops/IndexReductionLoops_int32.cpp.in +++ b/libnd4j/include/helpers/cpu/loops/IndexReductionLoops_int32.cpp.in @@ -1,19 +1,22 @@ -/******************************************************************************* - * Copyright (c) 2015-2018 Skymind, Inc. - * Copyright (c) 2019 Konduit K.K. 
- * - * This program and the accompanying materials are made available under the - * terms of the Apache License, Version 2.0 which is available at - * https://www.apache.org/licenses/LICENSE-2.0. - * - * Unless required by applicable law or agreed to in writing, software - * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * License for the specific language governing permissions and limitations - * under the License. - * - * SPDX-License-Identifier: Apache-2.0 - ******************************************************************************/ +/* + * ****************************************************************************** + * * + * * + * * This program and the accompanying materials are made available under the + * * terms of the Apache License, Version 2.0 which is available at + * * https://www.apache.org/licenses/LICENSE-2.0. + * * + * * See the NOTICE file distributed with this work for additional + * * information regarding copyright ownership. + * * Unless required by applicable law or agreed to in writing, software + * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT + * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the + * * License for the specific language governing permissions and limitations + * * under the License. 
+ * * + * * SPDX-License-Identifier: Apache-2.0 + * ***************************************************************************** + */ // // @author Yurii Shyrma (iuriish@yahoo.com) diff --git a/libnd4j/include/helpers/cpu/loops/IndexReductionLoops_int64.cpp.in b/libnd4j/include/helpers/cpu/loops/IndexReductionLoops_int64.cpp.in index 0647ce17d..82c4714e3 100644 --- a/libnd4j/include/helpers/cpu/loops/IndexReductionLoops_int64.cpp.in +++ b/libnd4j/include/helpers/cpu/loops/IndexReductionLoops_int64.cpp.in @@ -1,19 +1,22 @@ -/******************************************************************************* - * Copyright (c) 2015-2018 Skymind, Inc. - * Copyright (c) 2019 Konduit K.K. - * - * This program and the accompanying materials are made available under the - * terms of the Apache License, Version 2.0 which is available at - * https://www.apache.org/licenses/LICENSE-2.0. - * - * Unless required by applicable law or agreed to in writing, software - * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * License for the specific language governing permissions and limitations - * under the License. - * - * SPDX-License-Identifier: Apache-2.0 - ******************************************************************************/ +/* + * ****************************************************************************** + * * + * * + * * This program and the accompanying materials are made available under the + * * terms of the Apache License, Version 2.0 which is available at + * * https://www.apache.org/licenses/LICENSE-2.0. + * * + * * See the NOTICE file distributed with this work for additional + * * information regarding copyright ownership. + * * Unless required by applicable law or agreed to in writing, software + * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT + * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. 
See the + * * License for the specific language governing permissions and limitations + * * under the License. + * * + * * SPDX-License-Identifier: Apache-2.0 + * ***************************************************************************** + */ // // @author Yurii Shyrma (iuriish@yahoo.com) diff --git a/libnd4j/include/helpers/cuda_off/MmulHelper.cu b/libnd4j/include/helpers/cuda_off/MmulHelper.cu index 36f48184a..7e9a336ef 100644 --- a/libnd4j/include/helpers/cuda_off/MmulHelper.cu +++ b/libnd4j/include/helpers/cuda_off/MmulHelper.cu @@ -1,19 +1,22 @@ -/******************************************************************************* - * Copyright (c) 2015-2018 Skymind, Inc. - * Copyright (c) 2019 Konduit K.K. - * - * This program and the accompanying materials are made available under the - * terms of the Apache License, Version 2.0 which is available at - * https://www.apache.org/licenses/LICENSE-2.0. - * - * Unless required by applicable law or agreed to in writing, software - * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * License for the specific language governing permissions and limitations - * under the License. - * - * SPDX-License-Identifier: Apache-2.0 - ******************************************************************************/ +/* + * ****************************************************************************** + * * + * * + * * This program and the accompanying materials are made available under the + * * terms of the Apache License, Version 2.0 which is available at + * * https://www.apache.org/licenses/LICENSE-2.0. + * * + * * See the NOTICE file distributed with this work for additional + * * information regarding copyright ownership. 
+ * * Unless required by applicable law or agreed to in writing, software + * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT + * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the + * * License for the specific language governing permissions and limitations + * * under the License. + * * + * * SPDX-License-Identifier: Apache-2.0 + * ***************************************************************************** + */ // // @author raver119@gmail.com diff --git a/libnd4j/include/helpers/impl/AttentionHelper.cpp b/libnd4j/include/helpers/impl/AttentionHelper.cpp index bd5d006f2..d32cd65ee 100644 --- a/libnd4j/include/helpers/impl/AttentionHelper.cpp +++ b/libnd4j/include/helpers/impl/AttentionHelper.cpp @@ -1,18 +1,22 @@ -/******************************************************************************* - * Copyright (c) 2015-2019 Skymind, Inc. - * - * This program and the accompanying materials are made available under the - * terms of the Apache License, Version 2.0 which is available at - * https://www.apache.org/licenses/LICENSE-2.0. - * - * Unless required by applicable law or agreed to in writing, software - * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * License for the specific language governing permissions and limitations - * under the License. - * - * SPDX-License-Identifier: Apache-2.0 - ******************************************************************************/ +/* + * ****************************************************************************** + * * + * * + * * This program and the accompanying materials are made available under the + * * terms of the Apache License, Version 2.0 which is available at + * * https://www.apache.org/licenses/LICENSE-2.0. + * * + * * See the NOTICE file distributed with this work for additional + * * information regarding copyright ownership. 
+ * * Unless required by applicable law or agreed to in writing, software + * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT + * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the + * * License for the specific language governing permissions and limitations + * * under the License. + * * + * * SPDX-License-Identifier: Apache-2.0 + * ***************************************************************************** + */ // // @author Paul Dubs diff --git a/libnd4j/include/helpers/impl/OpBenchmark.cpp b/libnd4j/include/helpers/impl/OpBenchmark.cpp index 6cb0dc08a..be594cbe4 100644 --- a/libnd4j/include/helpers/impl/OpBenchmark.cpp +++ b/libnd4j/include/helpers/impl/OpBenchmark.cpp @@ -1,18 +1,22 @@ -/******************************************************************************* - * Copyright (c) 2015-2019 Skymind, Inc. - * - * This program and the accompanying materials are made available under the - * terms of the Apache License, Version 2.0 which is available at - * https://www.apache.org/licenses/LICENSE-2.0. - * - * Unless required by applicable law or agreed to in writing, software - * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * License for the specific language governing permissions and limitations - * under the License. - * - * SPDX-License-Identifier: Apache-2.0 - ******************************************************************************/ +/* + * ****************************************************************************** + * * + * * + * * This program and the accompanying materials are made available under the + * * terms of the Apache License, Version 2.0 which is available at + * * https://www.apache.org/licenses/LICENSE-2.0. + * * + * * See the NOTICE file distributed with this work for additional + * * information regarding copyright ownership. 
+ * * Unless required by applicable law or agreed to in writing, software + * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT + * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the + * * License for the specific language governing permissions and limitations + * * under the License. + * * + * * SPDX-License-Identifier: Apache-2.0 + * ***************************************************************************** + */ // // Created by raver on 2/28/2019. diff --git a/libnd4j/include/helpers/impl/Parameters.cpp b/libnd4j/include/helpers/impl/Parameters.cpp index 356ad5a5a..0a7d7b332 100644 --- a/libnd4j/include/helpers/impl/Parameters.cpp +++ b/libnd4j/include/helpers/impl/Parameters.cpp @@ -1,18 +1,22 @@ -/******************************************************************************* - * Copyright (c) 2015-2019 Skymind, Inc. - * - * This program and the accompanying materials are made available under the - * terms of the Apache License, Version 2.0 which is available at - * https://www.apache.org/licenses/LICENSE-2.0. - * - * Unless required by applicable law or agreed to in writing, software - * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * License for the specific language governing permissions and limitations - * under the License. - * - * SPDX-License-Identifier: Apache-2.0 - ******************************************************************************/ +/* + * ****************************************************************************** + * * + * * + * * This program and the accompanying materials are made available under the + * * terms of the Apache License, Version 2.0 which is available at + * * https://www.apache.org/licenses/LICENSE-2.0. + * * + * * See the NOTICE file distributed with this work for additional + * * information regarding copyright ownership. 
+ * * Unless required by applicable law or agreed to in writing, software + * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT + * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the + * * License for the specific language governing permissions and limitations + * * under the License. + * * + * * SPDX-License-Identifier: Apache-2.0 + * ***************************************************************************** + */ // // Created by raver on 2/28/2019. diff --git a/libnd4j/include/helpers/impl/StringUtils.cpp b/libnd4j/include/helpers/impl/StringUtils.cpp index 757def763..6a22be847 100644 --- a/libnd4j/include/helpers/impl/StringUtils.cpp +++ b/libnd4j/include/helpers/impl/StringUtils.cpp @@ -1,19 +1,22 @@ -/******************************************************************************* - * Copyright (c) 2015-2018 Skymind, Inc. - * Copyright (c) 2019-2020 Konduit K.K. - * - * This program and the accompanying materials are made available under the - * terms of the Apache License, Version 2.0 which is available at - * https://www.apache.org/licenses/LICENSE-2.0. - * - * Unless required by applicable law or agreed to in writing, software - * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * License for the specific language governing permissions and limitations - * under the License. - * - * SPDX-License-Identifier: Apache-2.0 - ******************************************************************************/ +/* + * ****************************************************************************** + * * + * * + * * This program and the accompanying materials are made available under the + * * terms of the Apache License, Version 2.0 which is available at + * * https://www.apache.org/licenses/LICENSE-2.0. 
+ * * + * * See the NOTICE file distributed with this work for additional + * * information regarding copyright ownership. + * * Unless required by applicable law or agreed to in writing, software + * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT + * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the + * * License for the specific language governing permissions and limitations + * * under the License. + * * + * * SPDX-License-Identifier: Apache-2.0 + * ***************************************************************************** + */ // // Created by raver119 on 20/04/18. diff --git a/libnd4j/include/helpers/unicode.h b/libnd4j/include/helpers/unicode.h index 6db4841db..06112a25a 100644 --- a/libnd4j/include/helpers/unicode.h +++ b/libnd4j/include/helpers/unicode.h @@ -1,18 +1,22 @@ -/******************************************************************************* - * Copyright (c) 2019-2020 Konduit K.K. - * - * This program and the accompanying materials are made available under the - * terms of the Apache License, Version 2.0 which is available at - * https://www.apache.org/licenses/LICENSE-2.0. - * - * Unless required by applicable law or agreed to in writing, software - * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * License for the specific language governing permissions and limitations - * under the License. - * - * SPDX-License-Identifier: Apache-2.0 - ******************************************************************************/ +/* + * ****************************************************************************** + * * + * * + * * This program and the accompanying materials are made available under the + * * terms of the Apache License, Version 2.0 which is available at + * * https://www.apache.org/licenses/LICENSE-2.0. 
+ * * + * * See the NOTICE file distributed with this work for additional + * * information regarding copyright ownership. + * * Unless required by applicable law or agreed to in writing, software + * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT + * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the + * * License for the specific language governing permissions and limitations + * * under the License. + * * + * * SPDX-License-Identifier: Apache-2.0 + * ***************************************************************************** + */ // // @author Oleg Semeniv diff --git a/libnd4j/include/loops/cpu/compilation_units/indexreduce_int32.cpp.in b/libnd4j/include/loops/cpu/compilation_units/indexreduce_int32.cpp.in index 97402d38e..269587776 100644 --- a/libnd4j/include/loops/cpu/compilation_units/indexreduce_int32.cpp.in +++ b/libnd4j/include/loops/cpu/compilation_units/indexreduce_int32.cpp.in @@ -1,19 +1,22 @@ -/******************************************************************************* - * Copyright (c) 2015-2018 Skymind, Inc. - * Copyright (c) 2019 Konduit K.K. - * - * This program and the accompanying materials are made available under the - * terms of the Apache License, Version 2.0 which is available at - * https://www.apache.org/licenses/LICENSE-2.0. - * - * Unless required by applicable law or agreed to in writing, software - * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * License for the specific language governing permissions and limitations - * under the License. 
- * - * SPDX-License-Identifier: Apache-2.0 - ******************************************************************************/ +/* + * ****************************************************************************** + * * + * * + * * This program and the accompanying materials are made available under the + * * terms of the Apache License, Version 2.0 which is available at + * * https://www.apache.org/licenses/LICENSE-2.0. + * * + * * See the NOTICE file distributed with this work for additional + * * information regarding copyright ownership. + * * Unless required by applicable law or agreed to in writing, software + * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT + * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the + * * License for the specific language governing permissions and limitations + * * under the License. + * * + * * SPDX-License-Identifier: Apache-2.0 + * ***************************************************************************** + */ // // @author raver119@gmail.com diff --git a/libnd4j/include/loops/cpu/compilation_units/indexreduce_int64.cpp.in b/libnd4j/include/loops/cpu/compilation_units/indexreduce_int64.cpp.in index 30fa30749..333bd0c12 100644 --- a/libnd4j/include/loops/cpu/compilation_units/indexreduce_int64.cpp.in +++ b/libnd4j/include/loops/cpu/compilation_units/indexreduce_int64.cpp.in @@ -1,19 +1,22 @@ -/******************************************************************************* - * Copyright (c) 2015-2018 Skymind, Inc. - * Copyright (c) 2019 Konduit K.K. - * - * This program and the accompanying materials are made available under the - * terms of the Apache License, Version 2.0 which is available at - * https://www.apache.org/licenses/LICENSE-2.0. - * - * Unless required by applicable law or agreed to in writing, software - * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. 
See the - * License for the specific language governing permissions and limitations - * under the License. - * - * SPDX-License-Identifier: Apache-2.0 - ******************************************************************************/ +/* + * ****************************************************************************** + * * + * * + * * This program and the accompanying materials are made available under the + * * terms of the Apache License, Version 2.0 which is available at + * * https://www.apache.org/licenses/LICENSE-2.0. + * * + * * See the NOTICE file distributed with this work for additional + * * information regarding copyright ownership. + * * Unless required by applicable law or agreed to in writing, software + * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT + * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the + * * License for the specific language governing permissions and limitations + * * under the License. + * * + * * SPDX-License-Identifier: Apache-2.0 + * ***************************************************************************** + */ // // @author raver119@gmail.com diff --git a/libnd4j/include/loops/cpu/compilation_units/reduce3_bfloat16.cpp.in b/libnd4j/include/loops/cpu/compilation_units/reduce3_bfloat16.cpp.in index 68616c3f9..e2836a717 100644 --- a/libnd4j/include/loops/cpu/compilation_units/reduce3_bfloat16.cpp.in +++ b/libnd4j/include/loops/cpu/compilation_units/reduce3_bfloat16.cpp.in @@ -1,19 +1,22 @@ -/******************************************************************************* - * Copyright (c) 2015-2018 Skymind, Inc. - * Copyright (c) 2019 Konduit K.K. - * - * This program and the accompanying materials are made available under the - * terms of the Apache License, Version 2.0 which is available at - * https://www.apache.org/licenses/LICENSE-2.0. 
- * - * Unless required by applicable law or agreed to in writing, software - * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * License for the specific language governing permissions and limitations - * under the License. - * - * SPDX-License-Identifier: Apache-2.0 - ******************************************************************************/ +/* + * ****************************************************************************** + * * + * * + * * This program and the accompanying materials are made available under the + * * terms of the Apache License, Version 2.0 which is available at + * * https://www.apache.org/licenses/LICENSE-2.0. + * * + * * See the NOTICE file distributed with this work for additional + * * information regarding copyright ownership. + * * Unless required by applicable law or agreed to in writing, software + * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT + * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the + * * License for the specific language governing permissions and limitations + * * under the License. + * * + * * SPDX-License-Identifier: Apache-2.0 + * ***************************************************************************** + */ // // @author raver119@gmail.com diff --git a/libnd4j/include/loops/cpu/compilation_units/reduce3_double.cpp.in b/libnd4j/include/loops/cpu/compilation_units/reduce3_double.cpp.in index 5c722838d..c4ee56d77 100644 --- a/libnd4j/include/loops/cpu/compilation_units/reduce3_double.cpp.in +++ b/libnd4j/include/loops/cpu/compilation_units/reduce3_double.cpp.in @@ -1,19 +1,22 @@ -/******************************************************************************* - * Copyright (c) 2015-2018 Skymind, Inc. - * Copyright (c) 2019 Konduit K.K. 
- * - * This program and the accompanying materials are made available under the - * terms of the Apache License, Version 2.0 which is available at - * https://www.apache.org/licenses/LICENSE-2.0. - * - * Unless required by applicable law or agreed to in writing, software - * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * License for the specific language governing permissions and limitations - * under the License. - * - * SPDX-License-Identifier: Apache-2.0 - ******************************************************************************/ +/* + * ****************************************************************************** + * * + * * + * * This program and the accompanying materials are made available under the + * * terms of the Apache License, Version 2.0 which is available at + * * https://www.apache.org/licenses/LICENSE-2.0. + * * + * * See the NOTICE file distributed with this work for additional + * * information regarding copyright ownership. + * * Unless required by applicable law or agreed to in writing, software + * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT + * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the + * * License for the specific language governing permissions and limitations + * * under the License. 
+ * * + * * SPDX-License-Identifier: Apache-2.0 + * ***************************************************************************** + */ // // @author raver119@gmail.com diff --git a/libnd4j/include/loops/cpu/compilation_units/reduce3_float.cpp.in b/libnd4j/include/loops/cpu/compilation_units/reduce3_float.cpp.in index ee127c2d9..b290d5279 100644 --- a/libnd4j/include/loops/cpu/compilation_units/reduce3_float.cpp.in +++ b/libnd4j/include/loops/cpu/compilation_units/reduce3_float.cpp.in @@ -1,19 +1,22 @@ -/******************************************************************************* - * Copyright (c) 2015-2018 Skymind, Inc. - * Copyright (c) 2019 Konduit K.K. - * - * This program and the accompanying materials are made available under the - * terms of the Apache License, Version 2.0 which is available at - * https://www.apache.org/licenses/LICENSE-2.0. - * - * Unless required by applicable law or agreed to in writing, software - * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * License for the specific language governing permissions and limitations - * under the License. - * - * SPDX-License-Identifier: Apache-2.0 - ******************************************************************************/ +/* + * ****************************************************************************** + * * + * * + * * This program and the accompanying materials are made available under the + * * terms of the Apache License, Version 2.0 which is available at + * * https://www.apache.org/licenses/LICENSE-2.0. + * * + * * See the NOTICE file distributed with this work for additional + * * information regarding copyright ownership. + * * Unless required by applicable law or agreed to in writing, software + * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT + * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. 
See the + * * License for the specific language governing permissions and limitations + * * under the License. + * * + * * SPDX-License-Identifier: Apache-2.0 + * ***************************************************************************** + */ // // @author raver119@gmail.com diff --git a/libnd4j/include/loops/cpu/compilation_units/reduce3_float16.cpp.in b/libnd4j/include/loops/cpu/compilation_units/reduce3_float16.cpp.in index 65c2b563a..612f2c471 100644 --- a/libnd4j/include/loops/cpu/compilation_units/reduce3_float16.cpp.in +++ b/libnd4j/include/loops/cpu/compilation_units/reduce3_float16.cpp.in @@ -1,19 +1,22 @@ -/******************************************************************************* - * Copyright (c) 2015-2018 Skymind, Inc. - * Copyright (c) 2019 Konduit K.K. - * - * This program and the accompanying materials are made available under the - * terms of the Apache License, Version 2.0 which is available at - * https://www.apache.org/licenses/LICENSE-2.0. - * - * Unless required by applicable law or agreed to in writing, software - * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * License for the specific language governing permissions and limitations - * under the License. - * - * SPDX-License-Identifier: Apache-2.0 - ******************************************************************************/ +/* + * ****************************************************************************** + * * + * * + * * This program and the accompanying materials are made available under the + * * terms of the Apache License, Version 2.0 which is available at + * * https://www.apache.org/licenses/LICENSE-2.0. + * * + * * See the NOTICE file distributed with this work for additional + * * information regarding copyright ownership. 
+ * * Unless required by applicable law or agreed to in writing, software + * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT + * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the + * * License for the specific language governing permissions and limitations + * * under the License. + * * + * * SPDX-License-Identifier: Apache-2.0 + * ***************************************************************************** + */ // // @author raver119@gmail.com diff --git a/libnd4j/include/loops/cpu/compilation_units/reduce_float.cpp.in b/libnd4j/include/loops/cpu/compilation_units/reduce_float.cpp.in index 3837c7810..6d8e7c452 100644 --- a/libnd4j/include/loops/cpu/compilation_units/reduce_float.cpp.in +++ b/libnd4j/include/loops/cpu/compilation_units/reduce_float.cpp.in @@ -1,19 +1,22 @@ -/******************************************************************************* - * Copyright (c) 2015-2018 Skymind, Inc. - * Copyright (c) 2019 Konduit K.K. - * - * This program and the accompanying materials are made available under the - * terms of the Apache License, Version 2.0 which is available at - * https://www.apache.org/licenses/LICENSE-2.0. - * - * Unless required by applicable law or agreed to in writing, software - * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * License for the specific language governing permissions and limitations - * under the License. - * - * SPDX-License-Identifier: Apache-2.0 - ******************************************************************************/ +/* + * ****************************************************************************** + * * + * * + * * This program and the accompanying materials are made available under the + * * terms of the Apache License, Version 2.0 which is available at + * * https://www.apache.org/licenses/LICENSE-2.0. 
+ * * + * * See the NOTICE file distributed with this work for additional + * * information regarding copyright ownership. + * * Unless required by applicable law or agreed to in writing, software + * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT + * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the + * * License for the specific language governing permissions and limitations + * * under the License. + * * + * * SPDX-License-Identifier: Apache-2.0 + * ***************************************************************************** + */ // // @author raver119@gmail.com diff --git a/libnd4j/include/math/templatemath.h b/libnd4j/include/math/templatemath.h index 0bfb4d511..24c27a157 100644 --- a/libnd4j/include/math/templatemath.h +++ b/libnd4j/include/math/templatemath.h @@ -1,19 +1,22 @@ -/******************************************************************************* - * Copyright (c) 2015-2018 Skymind, Inc. - * Copyright (c) 2019 Konduit K.K. - * - * This program and the accompanying materials are made available under the - * terms of the Apache License, Version 2.0 which is available at - * https://www.apache.org/licenses/LICENSE-2.0. - * - * Unless required by applicable law or agreed to in writing, software - * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * License for the specific language governing permissions and limitations - * under the License. - * - * SPDX-License-Identifier: Apache-2.0 - ******************************************************************************/ +/* + * ****************************************************************************** + * * + * * + * * This program and the accompanying materials are made available under the + * * terms of the Apache License, Version 2.0 which is available at + * * https://www.apache.org/licenses/LICENSE-2.0. 
+ * * + * * See the NOTICE file distributed with this work for additional + * * information regarding copyright ownership. + * * Unless required by applicable law or agreed to in writing, software + * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT + * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the + * * License for the specific language governing permissions and limitations + * * under the License. + * * + * * SPDX-License-Identifier: Apache-2.0 + * ***************************************************************************** + */ /* * templatemath.h diff --git a/libnd4j/include/ops/declarable/generic/bitwise/bits_hamming_distance.cpp b/libnd4j/include/ops/declarable/generic/bitwise/bits_hamming_distance.cpp index 693ebf7c6..e9aa289fe 100644 --- a/libnd4j/include/ops/declarable/generic/bitwise/bits_hamming_distance.cpp +++ b/libnd4j/include/ops/declarable/generic/bitwise/bits_hamming_distance.cpp @@ -1,18 +1,22 @@ -/******************************************************************************* - * Copyright (c) 2015-2019 Skymind, Inc. - * - * This program and the accompanying materials are made available under the - * terms of the Apache License, Version 2.0 which is available at - * https://www.apache.org/licenses/LICENSE-2.0. - * - * Unless required by applicable law or agreed to in writing, software - * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * License for the specific language governing permissions and limitations - * under the License. 
- *
- * SPDX-License-Identifier: Apache-2.0
- ******************************************************************************/
+/*
+ * ******************************************************************************
+ * *
+ * *
+ * * This program and the accompanying materials are made available under the
+ * * terms of the Apache License, Version 2.0 which is available at
+ * * https://www.apache.org/licenses/LICENSE-2.0.
+ * *
+ * * See the NOTICE file distributed with this work for additional
+ * * information regarding copyright ownership.
+ * * Unless required by applicable law or agreed to in writing, software
+ * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
+ * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
+ * * License for the specific language governing permissions and limitations
+ * * under the License.
+ * *
+ * * SPDX-License-Identifier: Apache-2.0
+ * *****************************************************************************
+ */

 //
 // @author raver119@gmail.com
diff --git a/libnd4j/include/ops/declarable/generic/bitwise/bitwise_and.cpp b/libnd4j/include/ops/declarable/generic/bitwise/bitwise_and.cpp
index 1e951c1d9..c53a8fbd6 100644
--- a/libnd4j/include/ops/declarable/generic/bitwise/bitwise_and.cpp
+++ b/libnd4j/include/ops/declarable/generic/bitwise/bitwise_and.cpp
@@ -1,18 +1,22 @@
-/*******************************************************************************
- * Copyright (c) 2015-2019 Skymind, Inc.
- *
- * This program and the accompanying materials are made available under the
- * terms of the Apache License, Version 2.0 which is available at
- * https://www.apache.org/licenses/LICENSE-2.0.
- *
- * Unless required by applicable law or agreed to in writing, software
- * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
- * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
- * License for the specific language governing permissions and limitations
- * under the License.
- *
- * SPDX-License-Identifier: Apache-2.0
- ******************************************************************************/
+/*
+ * ******************************************************************************
+ * *
+ * *
+ * * This program and the accompanying materials are made available under the
+ * * terms of the Apache License, Version 2.0 which is available at
+ * * https://www.apache.org/licenses/LICENSE-2.0.
+ * *
+ * * See the NOTICE file distributed with this work for additional
+ * * information regarding copyright ownership.
+ * * Unless required by applicable law or agreed to in writing, software
+ * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
+ * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
+ * * License for the specific language governing permissions and limitations
+ * * under the License.
+ * *
+ * * SPDX-License-Identifier: Apache-2.0
+ * *****************************************************************************
+ */

 //
 // @author raver119@gmail.com
diff --git a/libnd4j/include/ops/declarable/generic/bitwise/bitwise_or.cpp b/libnd4j/include/ops/declarable/generic/bitwise/bitwise_or.cpp
index cd20a8434..57f20de2d 100644
--- a/libnd4j/include/ops/declarable/generic/bitwise/bitwise_or.cpp
+++ b/libnd4j/include/ops/declarable/generic/bitwise/bitwise_or.cpp
@@ -1,18 +1,22 @@
-/*******************************************************************************
- * Copyright (c) 2015-2019 Skymind, Inc.
- *
- * This program and the accompanying materials are made available under the
- * terms of the Apache License, Version 2.0 which is available at
- * https://www.apache.org/licenses/LICENSE-2.0.
- *
- * Unless required by applicable law or agreed to in writing, software
- * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
- * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
- * License for the specific language governing permissions and limitations
- * under the License.
- *
- * SPDX-License-Identifier: Apache-2.0
- ******************************************************************************/
+/*
+ * ******************************************************************************
+ * *
+ * *
+ * * This program and the accompanying materials are made available under the
+ * * terms of the Apache License, Version 2.0 which is available at
+ * * https://www.apache.org/licenses/LICENSE-2.0.
+ * *
+ * * See the NOTICE file distributed with this work for additional
+ * * information regarding copyright ownership.
+ * * Unless required by applicable law or agreed to in writing, software
+ * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
+ * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
+ * * License for the specific language governing permissions and limitations
+ * * under the License.
+ * *
+ * * SPDX-License-Identifier: Apache-2.0
+ * *****************************************************************************
+ */

 //
 // @author raver119@gmail.com
diff --git a/libnd4j/include/ops/declarable/generic/bitwise/bitwise_xor.cpp b/libnd4j/include/ops/declarable/generic/bitwise/bitwise_xor.cpp
index 0af9fe759..6a0401c6e 100644
--- a/libnd4j/include/ops/declarable/generic/bitwise/bitwise_xor.cpp
+++ b/libnd4j/include/ops/declarable/generic/bitwise/bitwise_xor.cpp
@@ -1,18 +1,22 @@
-/*******************************************************************************
- * Copyright (c) 2015-2019 Skymind, Inc.
- *
- * This program and the accompanying materials are made available under the
- * terms of the Apache License, Version 2.0 which is available at
- * https://www.apache.org/licenses/LICENSE-2.0.
- *
- * Unless required by applicable law or agreed to in writing, software
- * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
- * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
- * License for the specific language governing permissions and limitations
- * under the License.
- *
- * SPDX-License-Identifier: Apache-2.0
- ******************************************************************************/
+/*
+ * ******************************************************************************
+ * *
+ * *
+ * * This program and the accompanying materials are made available under the
+ * * terms of the Apache License, Version 2.0 which is available at
+ * * https://www.apache.org/licenses/LICENSE-2.0.
+ * *
+ * * See the NOTICE file distributed with this work for additional
+ * * information regarding copyright ownership.
+ * * Unless required by applicable law or agreed to in writing, software
+ * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
+ * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
+ * * License for the specific language governing permissions and limitations
+ * * under the License.
+ * *
+ * * SPDX-License-Identifier: Apache-2.0
+ * *****************************************************************************
+ */

 //
 // @author raver119@gmail.com
diff --git a/libnd4j/include/ops/declarable/generic/bitwise/cyclic_rshift.cpp b/libnd4j/include/ops/declarable/generic/bitwise/cyclic_rshift.cpp
index cc0c4827b..94205bf39 100644
--- a/libnd4j/include/ops/declarable/generic/bitwise/cyclic_rshift.cpp
+++ b/libnd4j/include/ops/declarable/generic/bitwise/cyclic_rshift.cpp
@@ -1,18 +1,22 @@
-/*******************************************************************************
- * Copyright (c) 2015-2019 Skymind, Inc.
- *
- * This program and the accompanying materials are made available under the
- * terms of the Apache License, Version 2.0 which is available at
- * https://www.apache.org/licenses/LICENSE-2.0.
- *
- * Unless required by applicable law or agreed to in writing, software
- * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
- * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
- * License for the specific language governing permissions and limitations
- * under the License.
- *
- * SPDX-License-Identifier: Apache-2.0
- ******************************************************************************/
+/*
+ * ******************************************************************************
+ * *
+ * *
+ * * This program and the accompanying materials are made available under the
+ * * terms of the Apache License, Version 2.0 which is available at
+ * * https://www.apache.org/licenses/LICENSE-2.0.
+ * *
+ * * See the NOTICE file distributed with this work for additional
+ * * information regarding copyright ownership.
+ * * Unless required by applicable law or agreed to in writing, software
+ * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
+ * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
+ * * License for the specific language governing permissions and limitations
+ * * under the License.
+ * *
+ * * SPDX-License-Identifier: Apache-2.0
+ * *****************************************************************************
+ */

 //
 // @author raver119@gmail.com
diff --git a/libnd4j/include/ops/declarable/generic/bitwise/cyclic_shift.cpp b/libnd4j/include/ops/declarable/generic/bitwise/cyclic_shift.cpp
index f2b36a6d8..f87d91e2a 100644
--- a/libnd4j/include/ops/declarable/generic/bitwise/cyclic_shift.cpp
+++ b/libnd4j/include/ops/declarable/generic/bitwise/cyclic_shift.cpp
@@ -1,18 +1,22 @@
-/*******************************************************************************
- * Copyright (c) 2015-2019 Skymind, Inc.
- *
- * This program and the accompanying materials are made available under the
- * terms of the Apache License, Version 2.0 which is available at
- * https://www.apache.org/licenses/LICENSE-2.0.
- *
- * Unless required by applicable law or agreed to in writing, software
- * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
- * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
- * License for the specific language governing permissions and limitations
- * under the License.
- *
- * SPDX-License-Identifier: Apache-2.0
- ******************************************************************************/
+/*
+ * ******************************************************************************
+ * *
+ * *
+ * * This program and the accompanying materials are made available under the
+ * * terms of the Apache License, Version 2.0 which is available at
+ * * https://www.apache.org/licenses/LICENSE-2.0.
+ * *
+ * * See the NOTICE file distributed with this work for additional
+ * * information regarding copyright ownership.
+ * * Unless required by applicable law or agreed to in writing, software
+ * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
+ * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
+ * * License for the specific language governing permissions and limitations
+ * * under the License.
+ * *
+ * * SPDX-License-Identifier: Apache-2.0
+ * *****************************************************************************
+ */

 //
 // @author raver119@gmail.com
diff --git a/libnd4j/include/ops/declarable/generic/bitwise/rshift.cpp b/libnd4j/include/ops/declarable/generic/bitwise/rshift.cpp
index 8b44d2a6f..bd0a90d14 100644
--- a/libnd4j/include/ops/declarable/generic/bitwise/rshift.cpp
+++ b/libnd4j/include/ops/declarable/generic/bitwise/rshift.cpp
@@ -1,18 +1,22 @@
-/*******************************************************************************
- * Copyright (c) 2015-2019 Skymind, Inc.
- *
- * This program and the accompanying materials are made available under the
- * terms of the Apache License, Version 2.0 which is available at
- * https://www.apache.org/licenses/LICENSE-2.0.
- *
- * Unless required by applicable law or agreed to in writing, software
- * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
- * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
- * License for the specific language governing permissions and limitations
- * under the License.
- *
- * SPDX-License-Identifier: Apache-2.0
- ******************************************************************************/
+/*
+ * ******************************************************************************
+ * *
+ * *
+ * * This program and the accompanying materials are made available under the
+ * * terms of the Apache License, Version 2.0 which is available at
+ * * https://www.apache.org/licenses/LICENSE-2.0.
+ * *
+ * * See the NOTICE file distributed with this work for additional
+ * * information regarding copyright ownership.
+ * * Unless required by applicable law or agreed to in writing, software
+ * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
+ * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
+ * * License for the specific language governing permissions and limitations
+ * * under the License.
+ * *
+ * * SPDX-License-Identifier: Apache-2.0
+ * *****************************************************************************
+ */

 //
 // @author raver119@gmail.com
diff --git a/libnd4j/include/ops/declarable/generic/bitwise/shift.cpp b/libnd4j/include/ops/declarable/generic/bitwise/shift.cpp
index 7d0647e1b..bce70e516 100644
--- a/libnd4j/include/ops/declarable/generic/bitwise/shift.cpp
+++ b/libnd4j/include/ops/declarable/generic/bitwise/shift.cpp
@@ -1,18 +1,22 @@
-/*******************************************************************************
- * Copyright (c) 2015-2019 Skymind, Inc.
- *
- * This program and the accompanying materials are made available under the
- * terms of the Apache License, Version 2.0 which is available at
- * https://www.apache.org/licenses/LICENSE-2.0.
- *
- * Unless required by applicable law or agreed to in writing, software
- * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
- * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
- * License for the specific language governing permissions and limitations
- * under the License.
- *
- * SPDX-License-Identifier: Apache-2.0
- ******************************************************************************/
+/*
+ * ******************************************************************************
+ * *
+ * *
+ * * This program and the accompanying materials are made available under the
+ * * terms of the Apache License, Version 2.0 which is available at
+ * * https://www.apache.org/licenses/LICENSE-2.0.
+ * *
+ * * See the NOTICE file distributed with this work for additional
+ * * information regarding copyright ownership.
+ * * Unless required by applicable law or agreed to in writing, software
+ * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
+ * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
+ * * License for the specific language governing permissions and limitations
+ * * under the License.
+ * *
+ * * SPDX-License-Identifier: Apache-2.0
+ * *****************************************************************************
+ */

 //
 // @author raver119@gmail.com
diff --git a/libnd4j/include/ops/declarable/generic/images/resize_area.cpp b/libnd4j/include/ops/declarable/generic/images/resize_area.cpp
index 4ae03cc25..5eca1d1da 100644
--- a/libnd4j/include/ops/declarable/generic/images/resize_area.cpp
+++ b/libnd4j/include/ops/declarable/generic/images/resize_area.cpp
@@ -1,18 +1,22 @@
-/*******************************************************************************
- * Copyright (c) 2019-2020 Konduit K.K.
- *
- * This program and the accompanying materials are made available under the
- * terms of the Apache License, Version 2.0 which is available at
- * https://www.apache.org/licenses/LICENSE-2.0.
- *
- * Unless required by applicable law or agreed to in writing, software
- * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
- * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
- * License for the specific language governing permissions and limitations
- * under the License.
- *
- * SPDX-License-Identifier: Apache-2.0
- ******************************************************************************/
+/*
+ * ******************************************************************************
+ * *
+ * *
+ * * This program and the accompanying materials are made available under the
+ * * terms of the Apache License, Version 2.0 which is available at
+ * * https://www.apache.org/licenses/LICENSE-2.0.
+ * *
+ * * See the NOTICE file distributed with this work for additional
+ * * information regarding copyright ownership.
+ * * Unless required by applicable law or agreed to in writing, software
+ * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
+ * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
+ * * License for the specific language governing permissions and limitations
+ * * under the License.
+ * *
+ * * SPDX-License-Identifier: Apache-2.0
+ * *****************************************************************************
+ */

 //
 // @author sgazeos@gmail.com
diff --git a/libnd4j/include/ops/declarable/generic/images/resize_bicubic.cpp b/libnd4j/include/ops/declarable/generic/images/resize_bicubic.cpp
index a867a2147..569c29616 100644
--- a/libnd4j/include/ops/declarable/generic/images/resize_bicubic.cpp
+++ b/libnd4j/include/ops/declarable/generic/images/resize_bicubic.cpp
@@ -1,18 +1,22 @@
-/*******************************************************************************
- * Copyright (c) 2019-2020 Konduit K.K.
- *
- * This program and the accompanying materials are made available under the
- * terms of the Apache License, Version 2.0 which is available at
- * https://www.apache.org/licenses/LICENSE-2.0.
- *
- * Unless required by applicable law or agreed to in writing, software
- * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
- * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
- * License for the specific language governing permissions and limitations
- * under the License.
- *
- * SPDX-License-Identifier: Apache-2.0
- ******************************************************************************/
+/*
+ * ******************************************************************************
+ * *
+ * *
+ * * This program and the accompanying materials are made available under the
+ * * terms of the Apache License, Version 2.0 which is available at
+ * * https://www.apache.org/licenses/LICENSE-2.0.
+ * *
+ * * See the NOTICE file distributed with this work for additional
+ * * information regarding copyright ownership.
+ * * Unless required by applicable law or agreed to in writing, software
+ * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
+ * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
+ * * License for the specific language governing permissions and limitations
+ * * under the License.
+ * *
+ * * SPDX-License-Identifier: Apache-2.0
+ * *****************************************************************************
+ */

 //
 // @author sgazeos@gmail.com
diff --git a/libnd4j/include/ops/declarable/generic/images/resize_linear.cpp b/libnd4j/include/ops/declarable/generic/images/resize_linear.cpp
index 6d72bf889..624550f5c 100644
--- a/libnd4j/include/ops/declarable/generic/images/resize_linear.cpp
+++ b/libnd4j/include/ops/declarable/generic/images/resize_linear.cpp
@@ -1,19 +1,22 @@
-/*******************************************************************************
- * Copyright (c) 2015-2018 Skymind, Inc.
- * Copyright (c) 2019-2020 Konduit K.K.
- *
- * This program and the accompanying materials are made available under the
- * terms of the Apache License, Version 2.0 which is available at
- * https://www.apache.org/licenses/LICENSE-2.0.
- *
- * Unless required by applicable law or agreed to in writing, software
- * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
- * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
- * License for the specific language governing permissions and limitations
- * under the License.
- *
- * SPDX-License-Identifier: Apache-2.0
- ******************************************************************************/
+/*
+ * ******************************************************************************
+ * *
+ * *
+ * * This program and the accompanying materials are made available under the
+ * * terms of the Apache License, Version 2.0 which is available at
+ * * https://www.apache.org/licenses/LICENSE-2.0.
+ * *
+ * * See the NOTICE file distributed with this work for additional
+ * * information regarding copyright ownership.
+ * * Unless required by applicable law or agreed to in writing, software
+ * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
+ * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
+ * * License for the specific language governing permissions and limitations
+ * * under the License.
+ * *
+ * * SPDX-License-Identifier: Apache-2.0
+ * *****************************************************************************
+ */

 //
 // @author sgazeos@gmail.com
diff --git a/libnd4j/include/ops/declarable/generic/images/resize_neighbor.cpp b/libnd4j/include/ops/declarable/generic/images/resize_neighbor.cpp
index 3454fb897..3b0c928b3 100644
--- a/libnd4j/include/ops/declarable/generic/images/resize_neighbor.cpp
+++ b/libnd4j/include/ops/declarable/generic/images/resize_neighbor.cpp
@@ -1,19 +1,22 @@
-/*******************************************************************************
- * Copyright (c) 2015-2018 Skymind, Inc.
- * Copyright (c) 2019-2020 Konduit K.K.
- *
- * This program and the accompanying materials are made available under the
- * terms of the Apache License, Version 2.0 which is available at
- * https://www.apache.org/licenses/LICENSE-2.0.
- *
- * Unless required by applicable law or agreed to in writing, software
- * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
- * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
- * License for the specific language governing permissions and limitations
- * under the License.
- *
- * SPDX-License-Identifier: Apache-2.0
- ******************************************************************************/
+/*
+ * ******************************************************************************
+ * *
+ * *
+ * * This program and the accompanying materials are made available under the
+ * * terms of the Apache License, Version 2.0 which is available at
+ * * https://www.apache.org/licenses/LICENSE-2.0.
+ * *
+ * * See the NOTICE file distributed with this work for additional
+ * * information regarding copyright ownership.
+ * * Unless required by applicable law or agreed to in writing, software
+ * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
+ * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
+ * * License for the specific language governing permissions and limitations
+ * * under the License.
+ * *
+ * * SPDX-License-Identifier: Apache-2.0
+ * *****************************************************************************
+ */

 //
 // @author sgazeos@gmail.com
diff --git a/libnd4j/include/ops/declarable/generic/linalg/digamma.cpp b/libnd4j/include/ops/declarable/generic/linalg/digamma.cpp
index 17afcc10b..3074f54cc 100644
--- a/libnd4j/include/ops/declarable/generic/linalg/digamma.cpp
+++ b/libnd4j/include/ops/declarable/generic/linalg/digamma.cpp
@@ -1,19 +1,22 @@
-/*******************************************************************************
- * Copyright (c) 2015-2018 Skymind, Inc.
- * Copyright (c) 2019 Konduit K.K.
- *
- * This program and the accompanying materials are made available under the
- * terms of the Apache License, Version 2.0 which is available at
- * https://www.apache.org/licenses/LICENSE-2.0.
- *
- * Unless required by applicable law or agreed to in writing, software
- * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
- * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
- * License for the specific language governing permissions and limitations
- * under the License.
- *
- * SPDX-License-Identifier: Apache-2.0
- ******************************************************************************/
+/*
+ * ******************************************************************************
+ * *
+ * *
+ * * This program and the accompanying materials are made available under the
+ * * terms of the Apache License, Version 2.0 which is available at
+ * * https://www.apache.org/licenses/LICENSE-2.0.
+ * *
+ * * See the NOTICE file distributed with this work for additional
+ * * information regarding copyright ownership.
+ * * Unless required by applicable law or agreed to in writing, software
+ * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
+ * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
+ * * License for the specific language governing permissions and limitations
+ * * under the License.
+ * *
+ * * SPDX-License-Identifier: Apache-2.0
+ * *****************************************************************************
+ */

 //
 // @author Yurii Shyrma (iuriish@yahoo.com)
diff --git a/libnd4j/include/ops/declarable/generic/linalg/lgamma.cpp b/libnd4j/include/ops/declarable/generic/linalg/lgamma.cpp
index c39f8b55d..e1538702a 100644
--- a/libnd4j/include/ops/declarable/generic/linalg/lgamma.cpp
+++ b/libnd4j/include/ops/declarable/generic/linalg/lgamma.cpp
@@ -1,19 +1,22 @@
-/*******************************************************************************
- * Copyright (c) 2015-2018 Skymind, Inc.
- * Copyright (c) 2020 Konduit K.K.
- *
- * This program and the accompanying materials are made available under the
- * terms of the Apache License, Version 2.0 which is available at
- * https://www.apache.org/licenses/LICENSE-2.0.
- *
- * Unless required by applicable law or agreed to in writing, software
- * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
- * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
- * License for the specific language governing permissions and limitations
- * under the License.
- *
- * SPDX-License-Identifier: Apache-2.0
- ******************************************************************************/
+/*
+ * ******************************************************************************
+ * *
+ * *
+ * * This program and the accompanying materials are made available under the
+ * * terms of the Apache License, Version 2.0 which is available at
+ * * https://www.apache.org/licenses/LICENSE-2.0.
+ * *
+ * * See the NOTICE file distributed with this work for additional
+ * * information regarding copyright ownership.
+ * * Unless required by applicable law or agreed to in writing, software
+ * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
+ * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
+ * * License for the specific language governing permissions and limitations
+ * * under the License.
+ * *
+ * * SPDX-License-Identifier: Apache-2.0
+ * *****************************************************************************
+ */

 //
 // @author George A. Shulinok
diff --git a/libnd4j/include/ops/declarable/generic/linalg/lstsq.cpp b/libnd4j/include/ops/declarable/generic/linalg/lstsq.cpp
index 5078ff6f1..f01120fad 100644
--- a/libnd4j/include/ops/declarable/generic/linalg/lstsq.cpp
+++ b/libnd4j/include/ops/declarable/generic/linalg/lstsq.cpp
@@ -1,18 +1,22 @@
-/*******************************************************************************
- * Copyright (c) 2020 Konduit, K.K.
- *
- * This program and the accompanying materials are made available under the
- * terms of the Apache License, Version 2.0 which is available at
- * https://www.apache.org/licenses/LICENSE-2.0.
- *
- * Unless required by applicable law or agreed to in writing, software
- * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
- * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
- * License for the specific language governing permissions and limitations
- * under the License.
- *
- * SPDX-License-Identifier: Apache-2.0
- ******************************************************************************/
+/*
+ * ******************************************************************************
+ * *
+ * *
+ * * This program and the accompanying materials are made available under the
+ * * terms of the Apache License, Version 2.0 which is available at
+ * * https://www.apache.org/licenses/LICENSE-2.0.
+ * *
+ * * See the NOTICE file distributed with this work for additional
+ * * information regarding copyright ownership.
+ * * Unless required by applicable law or agreed to in writing, software
+ * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
+ * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
+ * * License for the specific language governing permissions and limitations
+ * * under the License.
+ * *
+ * * SPDX-License-Identifier: Apache-2.0
+ * *****************************************************************************
+ */

 //
 // Created by GS at 01/28/2020
diff --git a/libnd4j/include/ops/declarable/generic/linalg/qr.cpp b/libnd4j/include/ops/declarable/generic/linalg/qr.cpp
index 1cdfc6884..d3bd489ea 100644
--- a/libnd4j/include/ops/declarable/generic/linalg/qr.cpp
+++ b/libnd4j/include/ops/declarable/generic/linalg/qr.cpp
@@ -1,18 +1,22 @@
-/*******************************************************************************
- * Copyright (c) 2019-2020 Konduit K.K.
- *
- * This program and the accompanying materials are made available under the
- * terms of the Apache License, Version 2.0 which is available at
- * https://www.apache.org/licenses/LICENSE-2.0.
- *
- * Unless required by applicable law or agreed to in writing, software
- * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
- * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
- * License for the specific language governing permissions and limitations
- * under the License.
- *
- * SPDX-License-Identifier: Apache-2.0
- ******************************************************************************/
+/*
+ * ******************************************************************************
+ * *
+ * *
+ * * This program and the accompanying materials are made available under the
+ * * terms of the Apache License, Version 2.0 which is available at
+ * * https://www.apache.org/licenses/LICENSE-2.0.
+ * *
+ * * See the NOTICE file distributed with this work for additional
+ * * information regarding copyright ownership.
+ * * Unless required by applicable law or agreed to in writing, software
+ * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
+ * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
+ * * License for the specific language governing permissions and limitations
+ * * under the License.
+ * *
+ * * SPDX-License-Identifier: Apache-2.0
+ * *****************************************************************************
+ */

 //
 // Created by GS at 12/20/2019
diff --git a/libnd4j/include/ops/declarable/generic/linalg/solve.cpp b/libnd4j/include/ops/declarable/generic/linalg/solve.cpp
index 154001684..ed5b8135d 100644
--- a/libnd4j/include/ops/declarable/generic/linalg/solve.cpp
+++ b/libnd4j/include/ops/declarable/generic/linalg/solve.cpp
@@ -1,18 +1,22 @@
-/*******************************************************************************
- * Copyright (c) 2020 Konduit, K.K.
- *
- * This program and the accompanying materials are made available under the
- * terms of the Apache License, Version 2.0 which is available at
- * https://www.apache.org/licenses/LICENSE-2.0.
- *
- * Unless required by applicable law or agreed to in writing, software
- * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
- * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
- * License for the specific language governing permissions and limitations
- * under the License.
- *
- * SPDX-License-Identifier: Apache-2.0
- ******************************************************************************/
+/*
+ * ******************************************************************************
+ * *
+ * *
+ * * This program and the accompanying materials are made available under the
+ * * terms of the Apache License, Version 2.0 which is available at
+ * * https://www.apache.org/licenses/LICENSE-2.0.
+ * *
+ * * See the NOTICE file distributed with this work for additional
+ * * information regarding copyright ownership.
+ * * Unless required by applicable law or agreed to in writing, software
+ * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
+ * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
+ * * License for the specific language governing permissions and limitations
+ * * under the License.
+ * *
+ * * SPDX-License-Identifier: Apache-2.0
+ * *****************************************************************************
+ */

 //
 // Created by GS at 01/22/2020
diff --git a/libnd4j/include/ops/declarable/generic/linalg/triangular_solve.cpp b/libnd4j/include/ops/declarable/generic/linalg/triangular_solve.cpp
index 49ec1e135..8163a337b 100644
--- a/libnd4j/include/ops/declarable/generic/linalg/triangular_solve.cpp
+++ b/libnd4j/include/ops/declarable/generic/linalg/triangular_solve.cpp
@@ -1,18 +1,22 @@
-/*******************************************************************************
- * Copyright (c) 2020 Konduit, K.K.
- *
- * This program and the accompanying materials are made available under the
- * terms of the Apache License, Version 2.0 which is available at
- * https://www.apache.org/licenses/LICENSE-2.0.
- *
- * Unless required by applicable law or agreed to in writing, software
- * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
- * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
- * License for the specific language governing permissions and limitations
- * under the License.
- *
- * SPDX-License-Identifier: Apache-2.0
- ******************************************************************************/
+/*
+ * ******************************************************************************
+ * *
+ * *
+ * * This program and the accompanying materials are made available under the
+ * * terms of the Apache License, Version 2.0 which is available at
+ * * https://www.apache.org/licenses/LICENSE-2.0.
+ * *
+ * * See the NOTICE file distributed with this work for additional
+ * * information regarding copyright ownership.
+ * * Unless required by applicable law or agreed to in writing, software
+ * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
+ * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
+ * * License for the specific language governing permissions and limitations
+ * * under the License.
+ * *
+ * * SPDX-License-Identifier: Apache-2.0
+ * *****************************************************************************
+ */

 //
 // Created by GS at 01/14/2020
diff --git a/libnd4j/include/ops/declarable/generic/loss/meanPairWsSqErr.cpp b/libnd4j/include/ops/declarable/generic/loss/meanPairWsSqErr.cpp
index e604a3da8..ba9e0fa22 100644
--- a/libnd4j/include/ops/declarable/generic/loss/meanPairWsSqErr.cpp
+++ b/libnd4j/include/ops/declarable/generic/loss/meanPairWsSqErr.cpp
@@ -1,21 +1,25 @@
 #pragma clang diagnostic push
 #pragma ide diagnostic ignored "cert-err58-cpp"
-/*******************************************************************************
- * Copyright (c) 2015-2019 Skymind, Inc.
- *
- * This program and the accompanying materials are made available under the
- * terms of the Apache License, Version 2.0 which is available at
- * https://www.apache.org/licenses/LICENSE-2.0.
- *
- * Unless required by applicable law or agreed to in writing, software
- * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
- * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
- * License for the specific language governing permissions and limitations
- * under the License.
- *
- * SPDX-License-Identifier: Apache-2.0
- ******************************************************************************/
+/*
+ * ******************************************************************************
+ * *
+ * *
+ * * This program and the accompanying materials are made available under the
+ * * terms of the Apache License, Version 2.0 which is available at
+ * * https://www.apache.org/licenses/LICENSE-2.0.
+ * *
+ * * See the NOTICE file distributed with this work for additional
+ * * information regarding copyright ownership.
+ * * Unless required by applicable law or agreed to in writing, software
+ * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
+ * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
+ * * License for the specific language governing permissions and limitations
+ * * under the License.
+ * *
+ * * SPDX-License-Identifier: Apache-2.0
+ * *****************************************************************************
+ */

 //
 // @author Yurii Shyrma (iuriish@yahoo.com), created on 24.11.2017
diff --git a/libnd4j/include/ops/declarable/generic/nn/batchnorm.cpp b/libnd4j/include/ops/declarable/generic/nn/batchnorm.cpp
index 7018ae342..7ea2d3487 100644
--- a/libnd4j/include/ops/declarable/generic/nn/batchnorm.cpp
+++ b/libnd4j/include/ops/declarable/generic/nn/batchnorm.cpp
@@ -1,19 +1,22 @@
-/*******************************************************************************
- * Copyright (c) 2015-2018 Skymind, Inc.
- * Copyright (c) 2019 Konduit K.K.
- *
- * This program and the accompanying materials are made available under the
- * terms of the Apache License, Version 2.0 which is available at
- * https://www.apache.org/licenses/LICENSE-2.0.
- *
- * Unless required by applicable law or agreed to in writing, software
- * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
- * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
- * License for the specific language governing permissions and limitations
- * under the License.
- *
- * SPDX-License-Identifier: Apache-2.0
- ******************************************************************************/
+/*
+ * ******************************************************************************
+ * *
+ * *
+ * * This program and the accompanying materials are made available under the
+ * * terms of the Apache License, Version 2.0 which is available at
+ * * https://www.apache.org/licenses/LICENSE-2.0.
+ * *
+ * * See the NOTICE file distributed with this work for additional
+ * * information regarding copyright ownership.
+ * * Unless required by applicable law or agreed to in writing, software
+ * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
+ * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
+ * * License for the specific language governing permissions and limitations
+ * * under the License.
+ * *
+ * * SPDX-License-Identifier: Apache-2.0
+ * *****************************************************************************
+ */

 //
 // @author raver119@gmail.com, created on 29/10/17.
diff --git a/libnd4j/include/ops/declarable/generic/nn/convo/conv3d.cpp b/libnd4j/include/ops/declarable/generic/nn/convo/conv3d.cpp index 889a01b9a..573e824ab 100644 --- a/libnd4j/include/ops/declarable/generic/nn/convo/conv3d.cpp +++ b/libnd4j/include/ops/declarable/generic/nn/convo/conv3d.cpp @@ -1,18 +1,22 @@ -/******************************************************************************* - * Copyright (c) 2015-2019 Skymind, Inc. - * - * This program and the accompanying materials are made available under the - * terms of the Apache License, Version 2.0 which is available at - * https://www.apache.org/licenses/LICENSE-2.0. - * - * Unless required by applicable law or agreed to in writing, software - * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * License for the specific language governing permissions and limitations - * under the License. - * - * SPDX-License-Identifier: Apache-2.0 - ******************************************************************************/ +/* + * ****************************************************************************** + * * + * * + * * This program and the accompanying materials are made available under the + * * terms of the Apache License, Version 2.0 which is available at + * * https://www.apache.org/licenses/LICENSE-2.0. + * * + * * See the NOTICE file distributed with this work for additional + * * information regarding copyright ownership. + * * Unless required by applicable law or agreed to in writing, software + * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT + * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the + * * License for the specific language governing permissions and limitations + * * under the License. 
+ * * + * * SPDX-License-Identifier: Apache-2.0 + * ***************************************************************************** + */ // diff --git a/libnd4j/include/ops/declarable/generic/nn/dot_product_attention.cpp b/libnd4j/include/ops/declarable/generic/nn/dot_product_attention.cpp index 49dc52a03..bee1c1efa 100644 --- a/libnd4j/include/ops/declarable/generic/nn/dot_product_attention.cpp +++ b/libnd4j/include/ops/declarable/generic/nn/dot_product_attention.cpp @@ -1,18 +1,22 @@ -/******************************************************************************* - * Copyright (c) 2015-2019 Skymind, Inc. - * - * This program and the accompanying materials are made available under the - * terms of the Apache License, Version 2.0 which is available at - * https://www.apache.org/licenses/LICENSE-2.0. - * - * Unless required by applicable law or agreed to in writing, software - * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * License for the specific language governing permissions and limitations - * under the License. - * - * SPDX-License-Identifier: Apache-2.0 - ******************************************************************************/ +/* + * ****************************************************************************** + * * + * * + * * This program and the accompanying materials are made available under the + * * terms of the Apache License, Version 2.0 which is available at + * * https://www.apache.org/licenses/LICENSE-2.0. + * * + * * See the NOTICE file distributed with this work for additional + * * information regarding copyright ownership. + * * Unless required by applicable law or agreed to in writing, software + * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT + * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. 
See the + * * License for the specific language governing permissions and limitations + * * under the License. + * * + * * SPDX-License-Identifier: Apache-2.0 + * ***************************************************************************** + */ // // @author Paul Dubs diff --git a/libnd4j/include/ops/declarable/generic/nn/fusedBatchNorm.cpp b/libnd4j/include/ops/declarable/generic/nn/fusedBatchNorm.cpp index 926ba49ea..0a7525a35 100644 --- a/libnd4j/include/ops/declarable/generic/nn/fusedBatchNorm.cpp +++ b/libnd4j/include/ops/declarable/generic/nn/fusedBatchNorm.cpp @@ -156,7 +156,7 @@ namespace sd { return SHAPELIST(CONSTANT(outShapeInfo), CONSTANT(batchMeanShapeInfo), CONSTANT(batchVarShapeInfo)); } - + } } diff --git a/libnd4j/include/ops/declarable/generic/nn/layer_norm.cpp b/libnd4j/include/ops/declarable/generic/nn/layer_norm.cpp index 80c01bcd2..500aecac5 100644 --- a/libnd4j/include/ops/declarable/generic/nn/layer_norm.cpp +++ b/libnd4j/include/ops/declarable/generic/nn/layer_norm.cpp @@ -1,18 +1,22 @@ -/******************************************************************************* - * Copyright (c) 2015-2019 Skymind, Inc. - * - * This program and the accompanying materials are made available under the - * terms of the Apache License, Version 2.0 which is available at - * https://www.apache.org/licenses/LICENSE-2.0. - * - * Unless required by applicable law or agreed to in writing, software - * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * License for the specific language governing permissions and limitations - * under the License. 
- * - * SPDX-License-Identifier: Apache-2.0 - ******************************************************************************/ +/* + * ****************************************************************************** + * * + * * + * * This program and the accompanying materials are made available under the + * * terms of the Apache License, Version 2.0 which is available at + * * https://www.apache.org/licenses/LICENSE-2.0. + * * + * * See the NOTICE file distributed with this work for additional + * * information regarding copyright ownership. + * * Unless required by applicable law or agreed to in writing, software + * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT + * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the + * * License for the specific language governing permissions and limitations + * * under the License. + * * + * * SPDX-License-Identifier: Apache-2.0 + * ***************************************************************************** + */ // // @author Paul Dubs diff --git a/libnd4j/include/ops/declarable/generic/nn/multi_head_dot_product_attention.cpp b/libnd4j/include/ops/declarable/generic/nn/multi_head_dot_product_attention.cpp index 7ff8eb4c5..018c928e3 100644 --- a/libnd4j/include/ops/declarable/generic/nn/multi_head_dot_product_attention.cpp +++ b/libnd4j/include/ops/declarable/generic/nn/multi_head_dot_product_attention.cpp @@ -1,18 +1,22 @@ -/******************************************************************************* - * Copyright (c) 2015-2019 Skymind, Inc. - * - * This program and the accompanying materials are made available under the - * terms of the Apache License, Version 2.0 which is available at - * https://www.apache.org/licenses/LICENSE-2.0. - * - * Unless required by applicable law or agreed to in writing, software - * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. 
See the - * License for the specific language governing permissions and limitations - * under the License. - * - * SPDX-License-Identifier: Apache-2.0 - ******************************************************************************/ +/* + * ****************************************************************************** + * * + * * + * * This program and the accompanying materials are made available under the + * * terms of the Apache License, Version 2.0 which is available at + * * https://www.apache.org/licenses/LICENSE-2.0. + * * + * * See the NOTICE file distributed with this work for additional + * * information regarding copyright ownership. + * * Unless required by applicable law or agreed to in writing, software + * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT + * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the + * * License for the specific language governing permissions and limitations + * * under the License. + * * + * * SPDX-License-Identifier: Apache-2.0 + * ***************************************************************************** + */ // // @author Paul Dubs diff --git a/libnd4j/include/ops/declarable/generic/nn/recurrent/gru.cpp b/libnd4j/include/ops/declarable/generic/nn/recurrent/gru.cpp index 0be3c8393..cc1ae5d3e 100644 --- a/libnd4j/include/ops/declarable/generic/nn/recurrent/gru.cpp +++ b/libnd4j/include/ops/declarable/generic/nn/recurrent/gru.cpp @@ -1,19 +1,22 @@ -/******************************************************************************* - * Copyright (c) 2015-2018 Skymind, Inc. - * Copyright (c) 2020 Konduit K.K. - * - * This program and the accompanying materials are made available under the - * terms of the Apache License, Version 2.0 which is available at - * https://www.apache.org/licenses/LICENSE-2.0. 
- * - * Unless required by applicable law or agreed to in writing, software - * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * License for the specific language governing permissions and limitations - * under the License. - * - * SPDX-License-Identifier: Apache-2.0 - ******************************************************************************/ +/* + * ****************************************************************************** + * * + * * + * * This program and the accompanying materials are made available under the + * * terms of the Apache License, Version 2.0 which is available at + * * https://www.apache.org/licenses/LICENSE-2.0. + * * + * * See the NOTICE file distributed with this work for additional + * * information regarding copyright ownership. + * * Unless required by applicable law or agreed to in writing, software + * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT + * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the + * * License for the specific language governing permissions and limitations + * * under the License. + * * + * * SPDX-License-Identifier: Apache-2.0 + * ***************************************************************************** + */ // // @author Yurii Shyrma (iuriish@yahoo.com) diff --git a/libnd4j/include/ops/declarable/generic/nn/recurrent/lstm.cpp b/libnd4j/include/ops/declarable/generic/nn/recurrent/lstm.cpp index 915be3129..91c7a70a9 100644 --- a/libnd4j/include/ops/declarable/generic/nn/recurrent/lstm.cpp +++ b/libnd4j/include/ops/declarable/generic/nn/recurrent/lstm.cpp @@ -1,18 +1,22 @@ -/******************************************************************************* - * Copyright (c) 2015-2019 Skymind, Inc. 
- * - * This program and the accompanying materials are made available under the - * terms of the Apache License, Version 2.0 which is available at - * https://www.apache.org/licenses/LICENSE-2.0. - * - * Unless required by applicable law or agreed to in writing, software - * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * License for the specific language governing permissions and limitations - * under the License. - * - * SPDX-License-Identifier: Apache-2.0 - ******************************************************************************/ +/* + * ****************************************************************************** + * * + * * + * * This program and the accompanying materials are made available under the + * * terms of the Apache License, Version 2.0 which is available at + * * https://www.apache.org/licenses/LICENSE-2.0. + * * + * * See the NOTICE file distributed with this work for additional + * * information regarding copyright ownership. + * * Unless required by applicable law or agreed to in writing, software + * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT + * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the + * * License for the specific language governing permissions and limitations + * * under the License. 
+ * * + * * SPDX-License-Identifier: Apache-2.0 + * ***************************************************************************** + */ // // @author Yurii Shyrma, created on 15.02.2018 diff --git a/libnd4j/include/ops/declarable/generic/nn/recurrent/lstmBlock.cpp b/libnd4j/include/ops/declarable/generic/nn/recurrent/lstmBlock.cpp index 1fd7ec8cc..dafb3aac2 100644 --- a/libnd4j/include/ops/declarable/generic/nn/recurrent/lstmBlock.cpp +++ b/libnd4j/include/ops/declarable/generic/nn/recurrent/lstmBlock.cpp @@ -1,18 +1,22 @@ -/******************************************************************************* - * Copyright (c) 2015-2019 Skymind, Inc. - * - * This program and the accompanying materials are made available under the - * terms of the Apache License, Version 2.0 which is available at - * https://www.apache.org/licenses/LICENSE-2.0. - * - * Unless required by applicable law or agreed to in writing, software - * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * License for the specific language governing permissions and limitations - * under the License. - * - * SPDX-License-Identifier: Apache-2.0 - ******************************************************************************/ +/* + * ****************************************************************************** + * * + * * + * * This program and the accompanying materials are made available under the + * * terms of the Apache License, Version 2.0 which is available at + * * https://www.apache.org/licenses/LICENSE-2.0. + * * + * * See the NOTICE file distributed with this work for additional + * * information regarding copyright ownership. + * * Unless required by applicable law or agreed to in writing, software + * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT + * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. 
See the + * * License for the specific language governing permissions and limitations + * * under the License. + * * + * * SPDX-License-Identifier: Apache-2.0 + * ***************************************************************************** + */ // lstmBlock: Full LSTM layer in one op // @author Alex Black diff --git a/libnd4j/include/ops/declarable/generic/nn/recurrent/lstmBlockCell.cpp b/libnd4j/include/ops/declarable/generic/nn/recurrent/lstmBlockCell.cpp index 55d3a6b7a..f967d0be4 100644 --- a/libnd4j/include/ops/declarable/generic/nn/recurrent/lstmBlockCell.cpp +++ b/libnd4j/include/ops/declarable/generic/nn/recurrent/lstmBlockCell.cpp @@ -1,18 +1,22 @@ -/******************************************************************************* - * Copyright (c) 2015-2019 Skymind, Inc. - * - * This program and the accompanying materials are made available under the - * terms of the Apache License, Version 2.0 which is available at - * https://www.apache.org/licenses/LICENSE-2.0. - * - * Unless required by applicable law or agreed to in writing, software - * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * License for the specific language governing permissions and limitations - * under the License. - * - * SPDX-License-Identifier: Apache-2.0 - ******************************************************************************/ +/* + * ****************************************************************************** + * * + * * + * * This program and the accompanying materials are made available under the + * * terms of the Apache License, Version 2.0 which is available at + * * https://www.apache.org/licenses/LICENSE-2.0. + * * + * * See the NOTICE file distributed with this work for additional + * * information regarding copyright ownership. 
+ * * Unless required by applicable law or agreed to in writing, software + * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT + * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the + * * License for the specific language governing permissions and limitations + * * under the License. + * * + * * SPDX-License-Identifier: Apache-2.0 + * ***************************************************************************** + */ // // @author Alex Black diff --git a/libnd4j/include/ops/declarable/generic/nn/recurrent/lstmLayer.cpp b/libnd4j/include/ops/declarable/generic/nn/recurrent/lstmLayer.cpp index 0a0754a8e..021289173 100644 --- a/libnd4j/include/ops/declarable/generic/nn/recurrent/lstmLayer.cpp +++ b/libnd4j/include/ops/declarable/generic/nn/recurrent/lstmLayer.cpp @@ -1,18 +1,22 @@ -/******************************************************************************* - * Copyright (c) 2015-2019 Skymind, Inc. - * - * This program and the accompanying materials are made available under the - * terms of the Apache License, Version 2.0 which is available at - * https://www.apache.org/licenses/LICENSE-2.0. - * - * Unless required by applicable law or agreed to in writing, software - * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * License for the specific language governing permissions and limitations - * under the License. - * - * SPDX-License-Identifier: Apache-2.0 - ******************************************************************************/ +/* + * ****************************************************************************** + * * + * * + * * This program and the accompanying materials are made available under the + * * terms of the Apache License, Version 2.0 which is available at + * * https://www.apache.org/licenses/LICENSE-2.0. 
+ * * + * * See the NOTICE file distributed with this work for additional + * * information regarding copyright ownership. + * * Unless required by applicable law or agreed to in writing, software + * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT + * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the + * * License for the specific language governing permissions and limitations + * * under the License. + * * + * * SPDX-License-Identifier: Apache-2.0 + * ***************************************************************************** + */ // // @author Yurii Shyrma (iuriish@yahoo.com) diff --git a/libnd4j/include/ops/declarable/generic/random/multinomial.cpp b/libnd4j/include/ops/declarable/generic/random/multinomial.cpp index 2e8225d2c..1774bec71 100644 --- a/libnd4j/include/ops/declarable/generic/random/multinomial.cpp +++ b/libnd4j/include/ops/declarable/generic/random/multinomial.cpp @@ -1,19 +1,22 @@ -/******************************************************************************* - * Copyright (c) 2015-2018 Skymind, Inc. - * Copyright (c) 2019 Konduit K.K. - * - * This program and the accompanying materials are made available under the - * terms of the Apache License, Version 2.0 which is available at - * https://www.apache.org/licenses/LICENSE-2.0. - * - * Unless required by applicable law or agreed to in writing, software - * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * License for the specific language governing permissions and limitations - * under the License. 
- * - * SPDX-License-Identifier: Apache-2.0 - ******************************************************************************/ +/* + * ****************************************************************************** + * * + * * + * * This program and the accompanying materials are made available under the + * * terms of the Apache License, Version 2.0 which is available at + * * https://www.apache.org/licenses/LICENSE-2.0. + * * + * * See the NOTICE file distributed with this work for additional + * * information regarding copyright ownership. + * * Unless required by applicable law or agreed to in writing, software + * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT + * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the + * * License for the specific language governing permissions and limitations + * * under the License. + * * + * * SPDX-License-Identifier: Apache-2.0 + * ***************************************************************************** + */ // // @author Oleh Semeniv (oleg.semeniv@gmail.com) diff --git a/libnd4j/include/ops/declarable/generic/random/uniform.cpp b/libnd4j/include/ops/declarable/generic/random/uniform.cpp index d4abccf78..ffc02c204 100644 --- a/libnd4j/include/ops/declarable/generic/random/uniform.cpp +++ b/libnd4j/include/ops/declarable/generic/random/uniform.cpp @@ -1,19 +1,22 @@ -/******************************************************************************* - * Copyright (c) 2015-2018 Skymind, Inc. - * Copyright (c) 2019 Konduit K.K. - * - * This program and the accompanying materials are made available under the - * terms of the Apache License, Version 2.0 which is available at - * https://www.apache.org/licenses/LICENSE-2.0. - * - * Unless required by applicable law or agreed to in writing, software - * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. 
See the - * License for the specific language governing permissions and limitations - * under the License. - * - * SPDX-License-Identifier: Apache-2.0 - ******************************************************************************/ +/* + * ****************************************************************************** + * * + * * + * * This program and the accompanying materials are made available under the + * * terms of the Apache License, Version 2.0 which is available at + * * https://www.apache.org/licenses/LICENSE-2.0. + * * + * * See the NOTICE file distributed with this work for additional + * * information regarding copyright ownership. + * * Unless required by applicable law or agreed to in writing, software + * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT + * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the + * * License for the specific language governing permissions and limitations + * * under the License. + * * + * * SPDX-License-Identifier: Apache-2.0 + * ***************************************************************************** + */ // // Created by raver119 on 29/10/17. diff --git a/libnd4j/include/ops/declarable/generic/reduce/argamax.cpp b/libnd4j/include/ops/declarable/generic/reduce/argamax.cpp index a347c398a..b6a05639d 100644 --- a/libnd4j/include/ops/declarable/generic/reduce/argamax.cpp +++ b/libnd4j/include/ops/declarable/generic/reduce/argamax.cpp @@ -1,18 +1,22 @@ -/******************************************************************************* - * Copyright (c) 2015-2018 Skymind, Inc. - * Copyright (c) 2019 Konduit K.K. - * This program and the accompanying materials are made available under the - * terms of the Apache License, Version 2.0 which is available at - * https://www.apache.org/licenses/LICENSE-2.0. 
- * - * Unless required by applicable law or agreed to in writing, software - * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * License for the specific language governing permissions and limitations - * under the License. - * - * SPDX-License-Identifier: Apache-2.0 - ******************************************************************************/ +/* + * ****************************************************************************** + * * + * * + * * This program and the accompanying materials are made available under the + * * terms of the Apache License, Version 2.0 which is available at + * * https://www.apache.org/licenses/LICENSE-2.0. + * * + * * See the NOTICE file distributed with this work for additional + * * information regarding copyright ownership. + * * Unless required by applicable law or agreed to in writing, software + * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT + * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the + * * License for the specific language governing permissions and limitations + * * under the License. + * * + * * SPDX-License-Identifier: Apache-2.0 + * ***************************************************************************** + */ // Created by Abdelrauf 2020 (based on argmax) diff --git a/libnd4j/include/ops/declarable/generic/reduce/argamin.cpp b/libnd4j/include/ops/declarable/generic/reduce/argamin.cpp index 68ad9d2e5..174f87eed 100644 --- a/libnd4j/include/ops/declarable/generic/reduce/argamin.cpp +++ b/libnd4j/include/ops/declarable/generic/reduce/argamin.cpp @@ -1,18 +1,22 @@ -/******************************************************************************* - * Copyright (c) 2015-2018 Skymind, Inc. - * Copyright (c) 2019 Konduit K.K. 
- * This program and the accompanying materials are made available under the - * terms of the Apache License, Version 2.0 which is available at - * https://www.apache.org/licenses/LICENSE-2.0. - * - * Unless required by applicable law or agreed to in writing, software - * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * License for the specific language governing permissions and limitations - * under the License. - * - * SPDX-License-Identifier: Apache-2.0 - ******************************************************************************/ +/* + * ****************************************************************************** + * * + * * + * * This program and the accompanying materials are made available under the + * * terms of the Apache License, Version 2.0 which is available at + * * https://www.apache.org/licenses/LICENSE-2.0. + * * + * * See the NOTICE file distributed with this work for additional + * * information regarding copyright ownership. + * * Unless required by applicable law or agreed to in writing, software + * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT + * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the + * * License for the specific language governing permissions and limitations + * * under the License. 
+ * * + * * SPDX-License-Identifier: Apache-2.0 + * ***************************************************************************** + */ // Created by Abdelrauf 2020 (based on argmax) diff --git a/libnd4j/include/ops/declarable/generic/reduce/argmax.cpp b/libnd4j/include/ops/declarable/generic/reduce/argmax.cpp index f8a2486fa..86f3b5ef6 100644 --- a/libnd4j/include/ops/declarable/generic/reduce/argmax.cpp +++ b/libnd4j/include/ops/declarable/generic/reduce/argmax.cpp @@ -1,18 +1,22 @@ -/******************************************************************************* - * Copyright (c) 2015-2018 Skymind, Inc. - * Copyright (c) 2019 Konduit K.K. - * This program and the accompanying materials are made available under the - * terms of the Apache License, Version 2.0 which is available at - * https://www.apache.org/licenses/LICENSE-2.0. - * - * Unless required by applicable law or agreed to in writing, software - * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * License for the specific language governing permissions and limitations - * under the License. - * - * SPDX-License-Identifier: Apache-2.0 - ******************************************************************************/ +/* + * ****************************************************************************** + * * + * * + * * This program and the accompanying materials are made available under the + * * terms of the Apache License, Version 2.0 which is available at + * * https://www.apache.org/licenses/LICENSE-2.0. + * * + * * See the NOTICE file distributed with this work for additional + * * information regarding copyright ownership. + * * Unless required by applicable law or agreed to in writing, software + * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT + * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. 
See the + * * License for the specific language governing permissions and limitations + * * under the License. + * * + * * SPDX-License-Identifier: Apache-2.0 + * ***************************************************************************** + */ // // Created by raver119 on 01.11.2017. diff --git a/libnd4j/include/ops/declarable/generic/tensor/strided_slice.cpp b/libnd4j/include/ops/declarable/generic/tensor/strided_slice.cpp index 9974197b8..2ca678327 100644 --- a/libnd4j/include/ops/declarable/generic/tensor/strided_slice.cpp +++ b/libnd4j/include/ops/declarable/generic/tensor/strided_slice.cpp @@ -1,17 +1,22 @@ -/* Copyright 2015 The TensorFlow Authors. All Rights Reserved. - -Licensed under the Apache License, Version 2.0 (the "License"); -you may not use this file except in compliance with the License. -You may obtain a copy of the License at - - http://www.apache.org/licenses/LICENSE-2.0 - -Unless required by applicable law or agreed to in writing, software -distributed under the License is distributed on an "AS IS" BASIS, -WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. -See the License for the specific language governing permissions and -limitations under the License. -==============================================================================*/ +/* + * ****************************************************************************** + * * + * * + * * This program and the accompanying materials are made available under the + * * terms of the Apache License, Version 2.0 which is available at + * * https://www.apache.org/licenses/LICENSE-2.0. + * * + * * See the NOTICE file distributed with this work for additional + * * information regarding copyright ownership. + * * Unless required by applicable law or agreed to in writing, software + * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT + * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. 
See the + * * License for the specific language governing permissions and limitations + * * under the License. + * * + * * SPDX-License-Identifier: Apache-2.0 + * ***************************************************************************** + */ #include #if NOT_EXCLUDED(OP_strided_slice) diff --git a/libnd4j/include/ops/declarable/generic/transforms/standardize.cpp b/libnd4j/include/ops/declarable/generic/transforms/standardize.cpp index f4e8a6f7a..a4bb69442 100644 --- a/libnd4j/include/ops/declarable/generic/transforms/standardize.cpp +++ b/libnd4j/include/ops/declarable/generic/transforms/standardize.cpp @@ -1,18 +1,22 @@ -/******************************************************************************* - * Copyright (c) 2015-2019 Skymind, Inc. - * - * This program and the accompanying materials are made available under the - * terms of the Apache License, Version 2.0 which is available at - * https://www.apache.org/licenses/LICENSE-2.0. - * - * Unless required by applicable law or agreed to in writing, software - * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * License for the specific language governing permissions and limitations - * under the License. - * - * SPDX-License-Identifier: Apache-2.0 - ******************************************************************************/ +/* + * ****************************************************************************** + * * + * * + * * This program and the accompanying materials are made available under the + * * terms of the Apache License, Version 2.0 which is available at + * * https://www.apache.org/licenses/LICENSE-2.0. + * * + * * See the NOTICE file distributed with this work for additional + * * information regarding copyright ownership. 
+ * * Unless required by applicable law or agreed to in writing, software + * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT + * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the + * * License for the specific language governing permissions and limitations + * * under the License. + * * + * * SPDX-License-Identifier: Apache-2.0 + * ***************************************************************************** + */ // // @author Paul Dubs diff --git a/libnd4j/include/ops/declarable/generic/updaters/adaDeltaUpdater.cpp b/libnd4j/include/ops/declarable/generic/updaters/adaDeltaUpdater.cpp index 93f01ae1f..5dbb04fa3 100644 --- a/libnd4j/include/ops/declarable/generic/updaters/adaDeltaUpdater.cpp +++ b/libnd4j/include/ops/declarable/generic/updaters/adaDeltaUpdater.cpp @@ -1,18 +1,22 @@ -/******************************************************************************* - * Copyright (c) 2019-2020 Konduit K.K. - * - * This program and the accompanying materials are made available under the - * terms of the Apache License, Version 2.0 which is available at - * https://www.apache.org/licenses/LICENSE-2.0. - * - * Unless required by applicable law or agreed to in writing, software - * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * License for the specific language governing permissions and limitations - * under the License. - * - * SPDX-License-Identifier: Apache-2.0 - ******************************************************************************/ +/* + * ****************************************************************************** + * * + * * + * * This program and the accompanying materials are made available under the + * * terms of the Apache License, Version 2.0 which is available at + * * https://www.apache.org/licenses/LICENSE-2.0. 
+ * * + * * See the NOTICE file distributed with this work for additional + * * information regarding copyright ownership. + * * Unless required by applicable law or agreed to in writing, software + * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT + * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the + * * License for the specific language governing permissions and limitations + * * under the License. + * * + * * SPDX-License-Identifier: Apache-2.0 + * ***************************************************************************** + */ // // @author Oleh Semeniv (oleg.semeniv@gmail.com) diff --git a/libnd4j/include/ops/declarable/generic/updaters/adaGradUpdater.cpp b/libnd4j/include/ops/declarable/generic/updaters/adaGradUpdater.cpp index 4cd5b0504..6651a170c 100644 --- a/libnd4j/include/ops/declarable/generic/updaters/adaGradUpdater.cpp +++ b/libnd4j/include/ops/declarable/generic/updaters/adaGradUpdater.cpp @@ -1,18 +1,22 @@ -/******************************************************************************* - * Copyright (c) 2019-2020 Konduit K.K. - * - * This program and the accompanying materials are made available under the - * terms of the Apache License, Version 2.0 which is available at - * https://www.apache.org/licenses/LICENSE-2.0. - * - * Unless required by applicable law or agreed to in writing, software - * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * License for the specific language governing permissions and limitations - * under the License. 
- * - * SPDX-License-Identifier: Apache-2.0 - ******************************************************************************/ +/* + * ****************************************************************************** + * * + * * + * * This program and the accompanying materials are made available under the + * * terms of the Apache License, Version 2.0 which is available at + * * https://www.apache.org/licenses/LICENSE-2.0. + * * + * * See the NOTICE file distributed with this work for additional + * * information regarding copyright ownership. + * * Unless required by applicable law or agreed to in writing, software + * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT + * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the + * * License for the specific language governing permissions and limitations + * * under the License. + * * + * * SPDX-License-Identifier: Apache-2.0 + * ***************************************************************************** + */ // // @author Oleh Semeniv (oleg.semeniv@gmail.com) diff --git a/libnd4j/include/ops/declarable/generic/updaters/adaMaxUpdater.cpp b/libnd4j/include/ops/declarable/generic/updaters/adaMaxUpdater.cpp index 9f4bb574b..5ec9c87a3 100644 --- a/libnd4j/include/ops/declarable/generic/updaters/adaMaxUpdater.cpp +++ b/libnd4j/include/ops/declarable/generic/updaters/adaMaxUpdater.cpp @@ -1,18 +1,22 @@ -/******************************************************************************* - * Copyright (c) 2019-2020 Konduit K.K. - * - * This program and the accompanying materials are made available under the - * terms of the Apache License, Version 2.0 which is available at - * https://www.apache.org/licenses/LICENSE-2.0. - * - * Unless required by applicable law or agreed to in writing, software - * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. 
See the - * License for the specific language governing permissions and limitations - * under the License. - * - * SPDX-License-Identifier: Apache-2.0 - ******************************************************************************/ +/* + * ****************************************************************************** + * * + * * + * * This program and the accompanying materials are made available under the + * * terms of the Apache License, Version 2.0 which is available at + * * https://www.apache.org/licenses/LICENSE-2.0. + * * + * * See the NOTICE file distributed with this work for additional + * * information regarding copyright ownership. + * * Unless required by applicable law or agreed to in writing, software + * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT + * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the + * * License for the specific language governing permissions and limitations + * * under the License. + * * + * * SPDX-License-Identifier: Apache-2.0 + * ***************************************************************************** + */ // // @author Oleh Semeniv (oleg.semeniv@gmail.com) diff --git a/libnd4j/include/ops/declarable/generic/updaters/adamUpdater.cpp b/libnd4j/include/ops/declarable/generic/updaters/adamUpdater.cpp index 96386c45b..60c4ea3c6 100644 --- a/libnd4j/include/ops/declarable/generic/updaters/adamUpdater.cpp +++ b/libnd4j/include/ops/declarable/generic/updaters/adamUpdater.cpp @@ -1,18 +1,22 @@ -/******************************************************************************* - * Copyright (c) 2019-2020 Konduit K.K. - * - * This program and the accompanying materials are made available under the - * terms of the Apache License, Version 2.0 which is available at - * https://www.apache.org/licenses/LICENSE-2.0. 
- * - * Unless required by applicable law or agreed to in writing, software - * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * License for the specific language governing permissions and limitations - * under the License. - * - * SPDX-License-Identifier: Apache-2.0 - ******************************************************************************/ +/* + * ****************************************************************************** + * * + * * + * * This program and the accompanying materials are made available under the + * * terms of the Apache License, Version 2.0 which is available at + * * https://www.apache.org/licenses/LICENSE-2.0. + * * + * * See the NOTICE file distributed with this work for additional + * * information regarding copyright ownership. + * * Unless required by applicable law or agreed to in writing, software + * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT + * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the + * * License for the specific language governing permissions and limitations + * * under the License. + * * + * * SPDX-License-Identifier: Apache-2.0 + * ***************************************************************************** + */ // // @author Oleh Semeniv (oleg.semeniv@gmail.com) diff --git a/libnd4j/include/ops/declarable/generic/updaters/amsGradUpdater.cpp b/libnd4j/include/ops/declarable/generic/updaters/amsGradUpdater.cpp index 32084d970..1c6958954 100644 --- a/libnd4j/include/ops/declarable/generic/updaters/amsGradUpdater.cpp +++ b/libnd4j/include/ops/declarable/generic/updaters/amsGradUpdater.cpp @@ -1,18 +1,22 @@ -/******************************************************************************* - * Copyright (c) 2019-2020 Konduit K.K. 
- * - * This program and the accompanying materials are made available under the - * terms of the Apache License, Version 2.0 which is available at - * https://www.apache.org/licenses/LICENSE-2.0. - * - * Unless required by applicable law or agreed to in writing, software - * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * License for the specific language governing permissions and limitations - * under the License. - * - * SPDX-License-Identifier: Apache-2.0 - ******************************************************************************/ +/* + * ****************************************************************************** + * * + * * + * * This program and the accompanying materials are made available under the + * * terms of the Apache License, Version 2.0 which is available at + * * https://www.apache.org/licenses/LICENSE-2.0. + * * + * * See the NOTICE file distributed with this work for additional + * * information regarding copyright ownership. + * * Unless required by applicable law or agreed to in writing, software + * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT + * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the + * * License for the specific language governing permissions and limitations + * * under the License. 
+ * * + * * SPDX-License-Identifier: Apache-2.0 + * ***************************************************************************** + */ // // @author Oleh Semeniv (oleg.semeniv@gmail.com) diff --git a/libnd4j/include/ops/declarable/generic/updaters/nadamUpdater.cpp b/libnd4j/include/ops/declarable/generic/updaters/nadamUpdater.cpp index 4d5e4e12e..37508984d 100644 --- a/libnd4j/include/ops/declarable/generic/updaters/nadamUpdater.cpp +++ b/libnd4j/include/ops/declarable/generic/updaters/nadamUpdater.cpp @@ -1,18 +1,22 @@ -/******************************************************************************* - * Copyright (c) 2019-2020 Konduit K.K. - * - * This program and the accompanying materials are made available under the - * terms of the Apache License, Version 2.0 which is available at - * https://www.apache.org/licenses/LICENSE-2.0. - * - * Unless required by applicable law or agreed to in writing, software - * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * License for the specific language governing permissions and limitations - * under the License. - * - * SPDX-License-Identifier: Apache-2.0 - ******************************************************************************/ +/* + * ****************************************************************************** + * * + * * + * * This program and the accompanying materials are made available under the + * * terms of the Apache License, Version 2.0 which is available at + * * https://www.apache.org/licenses/LICENSE-2.0. + * * + * * See the NOTICE file distributed with this work for additional + * * information regarding copyright ownership. + * * Unless required by applicable law or agreed to in writing, software + * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT + * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. 
See the + * * License for the specific language governing permissions and limitations + * * under the License. + * * + * * SPDX-License-Identifier: Apache-2.0 + * ***************************************************************************** + */ // // @author Oleh Semeniv (oleg.semeniv@gmail.com) diff --git a/libnd4j/include/ops/declarable/generic/updaters/nesterovsUpdater.cpp b/libnd4j/include/ops/declarable/generic/updaters/nesterovsUpdater.cpp index bcbefe36b..ff9c55f21 100644 --- a/libnd4j/include/ops/declarable/generic/updaters/nesterovsUpdater.cpp +++ b/libnd4j/include/ops/declarable/generic/updaters/nesterovsUpdater.cpp @@ -1,18 +1,22 @@ -/******************************************************************************* - * Copyright (c) 2019-2020 Konduit K.K. - * - * This program and the accompanying materials are made available under the - * terms of the Apache License, Version 2.0 which is available at - * https://www.apache.org/licenses/LICENSE-2.0. - * - * Unless required by applicable law or agreed to in writing, software - * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * License for the specific language governing permissions and limitations - * under the License. - * - * SPDX-License-Identifier: Apache-2.0 - ******************************************************************************/ +/* + * ****************************************************************************** + * * + * * + * * This program and the accompanying materials are made available under the + * * terms of the Apache License, Version 2.0 which is available at + * * https://www.apache.org/licenses/LICENSE-2.0. + * * + * * See the NOTICE file distributed with this work for additional + * * information regarding copyright ownership. 
+ * * Unless required by applicable law or agreed to in writing, software + * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT + * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the + * * License for the specific language governing permissions and limitations + * * under the License. + * * + * * SPDX-License-Identifier: Apache-2.0 + * ***************************************************************************** + */ // // @author Oleh Semeniv (oleg.semeniv@gmail.com) diff --git a/libnd4j/include/ops/declarable/generic/updaters/rmsPropUpdater.cpp b/libnd4j/include/ops/declarable/generic/updaters/rmsPropUpdater.cpp index a611a4fbe..37ce3a301 100644 --- a/libnd4j/include/ops/declarable/generic/updaters/rmsPropUpdater.cpp +++ b/libnd4j/include/ops/declarable/generic/updaters/rmsPropUpdater.cpp @@ -1,18 +1,22 @@ -/******************************************************************************* - * Copyright (c) 2019-2020 Konduit K.K. - * - * This program and the accompanying materials are made available under the - * terms of the Apache License, Version 2.0 which is available at - * https://www.apache.org/licenses/LICENSE-2.0. - * - * Unless required by applicable law or agreed to in writing, software - * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * License for the specific language governing permissions and limitations - * under the License. - * - * SPDX-License-Identifier: Apache-2.0 - ******************************************************************************/ +/* + * ****************************************************************************** + * * + * * + * * This program and the accompanying materials are made available under the + * * terms of the Apache License, Version 2.0 which is available at + * * https://www.apache.org/licenses/LICENSE-2.0. 
+ * * + * * See the NOTICE file distributed with this work for additional + * * information regarding copyright ownership. + * * Unless required by applicable law or agreed to in writing, software + * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT + * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the + * * License for the specific language governing permissions and limitations + * * under the License. + * * + * * SPDX-License-Identifier: Apache-2.0 + * ***************************************************************************** + */ // // @author Oleh Semeniv (oleg.semeniv@gmail.com) diff --git a/libnd4j/include/ops/declarable/generic/updaters/sgdUpdater.cpp b/libnd4j/include/ops/declarable/generic/updaters/sgdUpdater.cpp index 491d7b53e..b9f84086a 100644 --- a/libnd4j/include/ops/declarable/generic/updaters/sgdUpdater.cpp +++ b/libnd4j/include/ops/declarable/generic/updaters/sgdUpdater.cpp @@ -1,18 +1,22 @@ -/******************************************************************************* - * Copyright (c) 2019-2020 Konduit K.K. - * - * This program and the accompanying materials are made available under the - * terms of the Apache License, Version 2.0 which is available at - * https://www.apache.org/licenses/LICENSE-2.0. - * - * Unless required by applicable law or agreed to in writing, software - * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * License for the specific language governing permissions and limitations - * under the License. 
- * - * SPDX-License-Identifier: Apache-2.0 - ******************************************************************************/ +/* + * ****************************************************************************** + * * + * * + * * This program and the accompanying materials are made available under the + * * terms of the Apache License, Version 2.0 which is available at + * * https://www.apache.org/licenses/LICENSE-2.0. + * * + * * See the NOTICE file distributed with this work for additional + * * information regarding copyright ownership. + * * Unless required by applicable law or agreed to in writing, software + * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT + * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the + * * License for the specific language governing permissions and limitations + * * under the License. + * * + * * SPDX-License-Identifier: Apache-2.0 + * ***************************************************************************** + */ // // @author Oleh Semeniv (oleg.semeniv@gmail.com) diff --git a/libnd4j/include/ops/declarable/headers/compression.h b/libnd4j/include/ops/declarable/headers/compression.h index 9c177f8a4..5d74be90d 100644 --- a/libnd4j/include/ops/declarable/headers/compression.h +++ b/libnd4j/include/ops/declarable/headers/compression.h @@ -1,19 +1,22 @@ -/******************************************************************************* - * Copyright (c) 2015-2018 Skymind, Inc. - * Copyright (c) 2020 Konduit K.K. - * - * This program and the accompanying materials are made available under the - * terms of the Apache License, Version 2.0 which is available at - * https://www.apache.org/licenses/LICENSE-2.0. - * - * Unless required by applicable law or agreed to in writing, software - * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. 
See the - * License for the specific language governing permissions and limitations - * under the License. - * - * SPDX-License-Identifier: Apache-2.0 - ******************************************************************************/ +/* + * ****************************************************************************** + * * + * * + * * This program and the accompanying materials are made available under the + * * terms of the Apache License, Version 2.0 which is available at + * * https://www.apache.org/licenses/LICENSE-2.0. + * * + * * See the NOTICE file distributed with this work for additional + * * information regarding copyright ownership. + * * Unless required by applicable law or agreed to in writing, software + * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT + * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the + * * License for the specific language governing permissions and limitations + * * under the License. + * * + * * SPDX-License-Identifier: Apache-2.0 + * ***************************************************************************** + */ // // @author sgazeos@gmail.com diff --git a/libnd4j/include/ops/declarable/headers/parity_ops.h b/libnd4j/include/ops/declarable/headers/parity_ops.h index 27c012214..b3363da9b 100644 --- a/libnd4j/include/ops/declarable/headers/parity_ops.h +++ b/libnd4j/include/ops/declarable/headers/parity_ops.h @@ -1,19 +1,22 @@ -/******************************************************************************* - * Copyright (c) 2015-2018 Skymind, Inc. - * Copyright (c) 2019-2020 Konduit K.K. - * - * This program and the accompanying materials are made available under the - * terms of the Apache License, Version 2.0 which is available at - * https://www.apache.org/licenses/LICENSE-2.0. 
- * - * Unless required by applicable law or agreed to in writing, software - * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * License for the specific language governing permissions and limitations - * under the License. - * - * SPDX-License-Identifier: Apache-2.0 - ******************************************************************************/ +/* + * ****************************************************************************** + * * + * * + * * This program and the accompanying materials are made available under the + * * terms of the Apache License, Version 2.0 which is available at + * * https://www.apache.org/licenses/LICENSE-2.0. + * * + * * See the NOTICE file distributed with this work for additional + * * information regarding copyright ownership. + * * Unless required by applicable law or agreed to in writing, software + * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT + * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the + * * License for the specific language governing permissions and limitations + * * under the License. + * * + * * SPDX-License-Identifier: Apache-2.0 + * ***************************************************************************** + */ // // @author raver119@gmail.com diff --git a/libnd4j/include/ops/declarable/headers/random.h b/libnd4j/include/ops/declarable/headers/random.h index 367a41995..e156dae66 100644 --- a/libnd4j/include/ops/declarable/headers/random.h +++ b/libnd4j/include/ops/declarable/headers/random.h @@ -1,19 +1,22 @@ -/******************************************************************************* - * Copyright (c) 2015-2018 Skymind, Inc. - * Copyright (c) 2019 Konduit K.K. 
- * - * This program and the accompanying materials are made available under the - * terms of the Apache License, Version 2.0 which is available at - * https://www.apache.org/licenses/LICENSE-2.0. - * - * Unless required by applicable law or agreed to in writing, software - * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * License for the specific language governing permissions and limitations - * under the License. - * - * SPDX-License-Identifier: Apache-2.0 - ******************************************************************************/ +/* + * ****************************************************************************** + * * + * * + * * This program and the accompanying materials are made available under the + * * terms of the Apache License, Version 2.0 which is available at + * * https://www.apache.org/licenses/LICENSE-2.0. + * * + * * See the NOTICE file distributed with this work for additional + * * information regarding copyright ownership. + * * Unless required by applicable law or agreed to in writing, software + * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT + * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the + * * License for the specific language governing permissions and limitations + * * under the License. + * * + * * SPDX-License-Identifier: Apache-2.0 + * ***************************************************************************** + */ // // @author raver119@gmail.com diff --git a/libnd4j/include/ops/declarable/headers/updaters.h b/libnd4j/include/ops/declarable/headers/updaters.h index dc08ff1f2..d8028821e 100644 --- a/libnd4j/include/ops/declarable/headers/updaters.h +++ b/libnd4j/include/ops/declarable/headers/updaters.h @@ -1,18 +1,22 @@ -/******************************************************************************* - * Copyright (c) 2019-2020 Konduit K.K. 
- * - * This program and the accompanying materials are made available under the - * terms of the Apache License, Version 2.0 which is available at - * https://www.apache.org/licenses/LICENSE-2.0. - * - * Unless required by applicable law or agreed to in writing, software - * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * License for the specific language governing permissions and limitations - * under the License. - * - * SPDX-License-Identifier: Apache-2.0 - ******************************************************************************/ +/* + * ****************************************************************************** + * * + * * + * * This program and the accompanying materials are made available under the + * * terms of the Apache License, Version 2.0 which is available at + * * https://www.apache.org/licenses/LICENSE-2.0. + * * + * * See the NOTICE file distributed with this work for additional + * * information regarding copyright ownership. + * * Unless required by applicable law or agreed to in writing, software + * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT + * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the + * * License for the specific language governing permissions and limitations + * * under the License. 
+ * * + * * SPDX-License-Identifier: Apache-2.0 + * ***************************************************************************** + */ // // @author Oleh Semeniv (oleg.semeniv@gmail.com) diff --git a/libnd4j/include/ops/declarable/helpers/compression.h b/libnd4j/include/ops/declarable/helpers/compression.h index b9c70a91b..ea5709355 100644 --- a/libnd4j/include/ops/declarable/helpers/compression.h +++ b/libnd4j/include/ops/declarable/helpers/compression.h @@ -1,19 +1,22 @@ -/******************************************************************************* - * Copyright (c) 2020 Skymind, Inc. - * - * This program and the accompanying materials are made available under the - * terms of the Apache License, Version 2.0 which is available at - * https://www.apache.org/licenses/LICENSE-2.0. - * - * Unless required by applicable law or agreed to in writing, software - * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * License for the specific language governing permissions and limitations - * under the License. - * - * SPDX-License-Identifier: Apache-2.0 - ******************************************************************************/ - +/* + * ****************************************************************************** + * * + * * + * * This program and the accompanying materials are made available under the + * * terms of the Apache License, Version 2.0 which is available at + * * https://www.apache.org/licenses/LICENSE-2.0. + * * + * * See the NOTICE file distributed with this work for additional + * * information regarding copyright ownership. + * * Unless required by applicable law or agreed to in writing, software + * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT + * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. 
See the + * * License for the specific language governing permissions and limitations + * * under the License. + * * + * * SPDX-License-Identifier: Apache-2.0 + * ***************************************************************************** + */ // // @author sgazeos@gmail.com // diff --git a/libnd4j/include/ops/declarable/helpers/cpu/addBias.cpp b/libnd4j/include/ops/declarable/helpers/cpu/addBias.cpp index aa86ea041..1210c1b01 100644 --- a/libnd4j/include/ops/declarable/helpers/cpu/addBias.cpp +++ b/libnd4j/include/ops/declarable/helpers/cpu/addBias.cpp @@ -1,19 +1,22 @@ -/******************************************************************************* - * Copyright (c) 2015-2018 Skymind, Inc. - * Copyright (c) 2019 Konduit K.K. - * - * This program and the accompanying materials are made available under the - * terms of the Apache License, Version 2.0 which is available at - * https://www.apache.org/licenses/LICENSE-2.0. - * - * Unless required by applicable law or agreed to in writing, software - * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * License for the specific language governing permissions and limitations - * under the License. - * - * SPDX-License-Identifier: Apache-2.0 - ******************************************************************************/ +/* + * ****************************************************************************** + * * + * * + * * This program and the accompanying materials are made available under the + * * terms of the Apache License, Version 2.0 which is available at + * * https://www.apache.org/licenses/LICENSE-2.0. + * * + * * See the NOTICE file distributed with this work for additional + * * information regarding copyright ownership. 
+ * * Unless required by applicable law or agreed to in writing, software + * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT + * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the + * * License for the specific language governing permissions and limitations + * * under the License. + * * + * * SPDX-License-Identifier: Apache-2.0 + * ***************************************************************************** + */ // // @author Yurii Shyrma, created on 26.02.2018 diff --git a/libnd4j/include/ops/declarable/helpers/cpu/compilation_units/crop_and_resize.cpp.in b/libnd4j/include/ops/declarable/helpers/cpu/compilation_units/crop_and_resize.cpp.in index b0cdafebd..a9ca7a198 100644 --- a/libnd4j/include/ops/declarable/helpers/cpu/compilation_units/crop_and_resize.cpp.in +++ b/libnd4j/include/ops/declarable/helpers/cpu/compilation_units/crop_and_resize.cpp.in @@ -1,19 +1,22 @@ -/******************************************************************************* - * Copyright (c) 2015-2018 Skymind, Inc. - * Copyright (c) 2020 Konduit K.K. - * - * This program and the accompanying materials are made available under the - * terms of the Apache License, Version 2.0 which is available at - * https://www.apache.org/licenses/LICENSE-2.0. - * - * Unless required by applicable law or agreed to in writing, software - * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * License for the specific language governing permissions and limitations - * under the License. 
- * - * SPDX-License-Identifier: Apache-2.0 - ******************************************************************************/ +/* + * ****************************************************************************** + * * + * * + * * This program and the accompanying materials are made available under the + * * terms of the Apache License, Version 2.0 which is available at + * * https://www.apache.org/licenses/LICENSE-2.0. + * * + * * See the NOTICE file distributed with this work for additional + * * information regarding copyright ownership. + * * Unless required by applicable law or agreed to in writing, software + * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT + * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the + * * License for the specific language governing permissions and limitations + * * under the License. + * * + * * SPDX-License-Identifier: Apache-2.0 + * ***************************************************************************** + */ // // @author sgazeos@gmail.com diff --git a/libnd4j/include/ops/declarable/helpers/cpu/compression/compression.cpp b/libnd4j/include/ops/declarable/helpers/cpu/compression/compression.cpp index 0911b0619..9347ced4a 100644 --- a/libnd4j/include/ops/declarable/helpers/cpu/compression/compression.cpp +++ b/libnd4j/include/ops/declarable/helpers/cpu/compression/compression.cpp @@ -1,19 +1,22 @@ -/******************************************************************************* - * Copyright (c) 2020 Skymind, Inc. - * - * This program and the accompanying materials are made available under the - * terms of the Apache License, Version 2.0 which is available at - * https://www.apache.org/licenses/LICENSE-2.0. - * - * Unless required by applicable law or agreed to in writing, software - * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. 
See the - * License for the specific language governing permissions and limitations - * under the License. - * - * SPDX-License-Identifier: Apache-2.0 - ******************************************************************************/ - +/* + * ****************************************************************************** + * * + * * + * * This program and the accompanying materials are made available under the + * * terms of the Apache License, Version 2.0 which is available at + * * https://www.apache.org/licenses/LICENSE-2.0. + * * + * * See the NOTICE file distributed with this work for additional + * * information regarding copyright ownership. + * * Unless required by applicable law or agreed to in writing, software + * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT + * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the + * * License for the specific language governing permissions and limitations + * * under the License. + * * + * * SPDX-License-Identifier: Apache-2.0 + * ***************************************************************************** + */ // // @author sgazeos@gmail.com // diff --git a/libnd4j/include/ops/declarable/helpers/cpu/crop_and_resize.cpp b/libnd4j/include/ops/declarable/helpers/cpu/crop_and_resize.cpp index ab6503946..2d3105b61 100644 --- a/libnd4j/include/ops/declarable/helpers/cpu/crop_and_resize.cpp +++ b/libnd4j/include/ops/declarable/helpers/cpu/crop_and_resize.cpp @@ -1,19 +1,22 @@ -/******************************************************************************* - * Copyright (c) 2015-2018 Skymind, Inc. - * Copyright (c) 2019-2020 Konduit K.K. - * - * This program and the accompanying materials are made available under the - * terms of the Apache License, Version 2.0 which is available at - * https://www.apache.org/licenses/LICENSE-2.0. 
- * - * Unless required by applicable law or agreed to in writing, software - * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * License for the specific language governing permissions and limitations - * under the License. - * - * SPDX-License-Identifier: Apache-2.0 - ******************************************************************************/ +/* + * ****************************************************************************** + * * + * * + * * This program and the accompanying materials are made available under the + * * terms of the Apache License, Version 2.0 which is available at + * * https://www.apache.org/licenses/LICENSE-2.0. + * * + * * See the NOTICE file distributed with this work for additional + * * information regarding copyright ownership. + * * Unless required by applicable law or agreed to in writing, software + * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT + * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the + * * License for the specific language governing permissions and limitations + * * under the License. + * * + * * SPDX-License-Identifier: Apache-2.0 + * ***************************************************************************** + */ /* Copyright 2016 The TensorFlow Authors. All Rights Reserved. Licensed under the Apache License, Version 2.0 (the "License"); diff --git a/libnd4j/include/ops/declarable/helpers/cpu/diGamma.cpp b/libnd4j/include/ops/declarable/helpers/cpu/diGamma.cpp index 37abaf559..8f4c8bf61 100644 --- a/libnd4j/include/ops/declarable/helpers/cpu/diGamma.cpp +++ b/libnd4j/include/ops/declarable/helpers/cpu/diGamma.cpp @@ -1,19 +1,22 @@ -/******************************************************************************* - * Copyright (c) 2015-2018 Skymind, Inc. - * Copyright (c) 2019 Konduit K.K. 
- * - * This program and the accompanying materials are made available under the - * terms of the Apache License, Version 2.0 which is available at - * https://www.apache.org/licenses/LICENSE-2.0. - * - * Unless required by applicable law or agreed to in writing, software - * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * License for the specific language governing permissions and limitations - * under the License. - * - * SPDX-License-Identifier: Apache-2.0 - ******************************************************************************/ +/* + * ****************************************************************************** + * * + * * + * * This program and the accompanying materials are made available under the + * * terms of the Apache License, Version 2.0 which is available at + * * https://www.apache.org/licenses/LICENSE-2.0. + * * + * * See the NOTICE file distributed with this work for additional + * * information regarding copyright ownership. + * * Unless required by applicable law or agreed to in writing, software + * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT + * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the + * * License for the specific language governing permissions and limitations + * * under the License. 
+ * * + * * SPDX-License-Identifier: Apache-2.0 + * ***************************************************************************** + */ // // @author Yurii Shyrma (iuriish@yahoo.com) diff --git a/libnd4j/include/ops/declarable/helpers/cpu/dilation2d.cpp b/libnd4j/include/ops/declarable/helpers/cpu/dilation2d.cpp index 1688dcbc4..9a870d411 100644 --- a/libnd4j/include/ops/declarable/helpers/cpu/dilation2d.cpp +++ b/libnd4j/include/ops/declarable/helpers/cpu/dilation2d.cpp @@ -1,18 +1,22 @@ -/******************************************************************************* - * Copyrigkht (c) 2015-2018 Skymind, Inc. - * - * Tkhis program and tkhe accompanying materials are made available under tkhe - * terms of tkhe Apackhe License, Version 2.0 wkhickh is available at - * khttps://www.apackhe.org/licenses/LICENSE-2.0. - * - * Unless required by applicable law or agreed to in writing, software - * distributed under tkhe License is distributed on an "AS IS" BASIS, WITHOUT - * WARRANTIES OR CONDITIONS OF ANY KIND, eitkher express or implied. See tkhe - * License for tkhe specific language governing permissions and limitations - * under tkhe License. - * - * SPDX-License-Identifier: Apackhe-2.0 - ******************************************************************************/ +/* + * ****************************************************************************** + * * + * * + * * This program and the accompanying materials are made available under the + * * terms of the Apache License, Version 2.0 which is available at + * * https://www.apache.org/licenses/LICENSE-2.0. + * * + * * See the NOTICE file distributed with this work for additional + * * information regarding copyright ownership. + * * Unless required by applicable law or agreed to in writing, software + * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT + * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. 
See the + * * License for the specific language governing permissions and limitations + * * under the License. + * * + * * SPDX-License-Identifier: Apache-2.0 + * ***************************************************************************** + */ // // @autkhor raver119@gmail.com diff --git a/libnd4j/include/ops/declarable/helpers/cpu/image_resize.cpp b/libnd4j/include/ops/declarable/helpers/cpu/image_resize.cpp index 7206b03e5..d13550821 100644 --- a/libnd4j/include/ops/declarable/helpers/cpu/image_resize.cpp +++ b/libnd4j/include/ops/declarable/helpers/cpu/image_resize.cpp @@ -1,19 +1,22 @@ -/******************************************************************************* - * Copyright (c) 2015-2018 Skymind, Inc. - * Copyright (c) 2019-2020 Konduit K.K. - * - * This program and the accompanying materials are made available under the - * terms of the Apache License, Version 2.0 which is available at - * https://www.apache.org/licenses/LICENSE-2.0. - * - * Unless required by applicable law or agreed to in writing, software - * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * License for the specific language governing permissions and limitations - * under the License. - * - * SPDX-License-Identifier: Apache-2.0 - ******************************************************************************/ +/* + * ****************************************************************************** + * * + * * + * * This program and the accompanying materials are made available under the + * * terms of the Apache License, Version 2.0 which is available at + * * https://www.apache.org/licenses/LICENSE-2.0. + * * + * * See the NOTICE file distributed with this work for additional + * * information regarding copyright ownership. 
+ * * Unless required by applicable law or agreed to in writing, software + * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT + * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the + * * License for the specific language governing permissions and limitations + * * under the License. + * * + * * SPDX-License-Identifier: Apache-2.0 + * ***************************************************************************** + */ /* Copyright 2016 The TensorFlow Authors. All Rights Reserved. Licensed under the Apache License, Version 2.0 (the "License"); diff --git a/libnd4j/include/ops/declarable/helpers/cpu/lgamma.cpp b/libnd4j/include/ops/declarable/helpers/cpu/lgamma.cpp index 3b71f7ce9..ebbe6a85c 100644 --- a/libnd4j/include/ops/declarable/helpers/cpu/lgamma.cpp +++ b/libnd4j/include/ops/declarable/helpers/cpu/lgamma.cpp @@ -1,19 +1,22 @@ -/******************************************************************************* - * Copyright (c) 2015-2018 Skymind, Inc. - * Copyright (c) 2019 Konduit K.K. - * - * This program and the accompanying materials are made available under the - * terms of the Apache License, Version 2.0 which is available at - * https://www.apache.org/licenses/LICENSE-2.0. - * - * Unless required by applicable law or agreed to in writing, software - * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * License for the specific language governing permissions and limitations - * under the License. 
- * - * SPDX-License-Identifier: Apache-2.0 - ******************************************************************************/ +/* + * ****************************************************************************** + * * + * * + * * This program and the accompanying materials are made available under the + * * terms of the Apache License, Version 2.0 which is available at + * * https://www.apache.org/licenses/LICENSE-2.0. + * * + * * See the NOTICE file distributed with this work for additional + * * information regarding copyright ownership. + * * Unless required by applicable law or agreed to in writing, software + * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT + * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the + * * License for the specific language governing permissions and limitations + * * under the License. + * * + * * SPDX-License-Identifier: Apache-2.0 + * ***************************************************************************** + */ // // @author George A. Shulinok diff --git a/libnd4j/include/ops/declarable/helpers/cpu/lstm.cpp b/libnd4j/include/ops/declarable/helpers/cpu/lstm.cpp index 02d4c9855..b62431625 100644 --- a/libnd4j/include/ops/declarable/helpers/cpu/lstm.cpp +++ b/libnd4j/include/ops/declarable/helpers/cpu/lstm.cpp @@ -1,18 +1,22 @@ -/******************************************************************************* - * Copyright (c) 2015-2019 Skymind, Inc. - * - * This program and the accompanying materials are made available under the - * terms of the Apache License, Version 2.0 which is available at - * https://www.apache.org/licenses/LICENSE-2.0. - * - * Unless required by applicable law or agreed to in writing, software - * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * License for the specific language governing permissions and limitations - * under the License. 
- * - * SPDX-License-Identifier: Apache-2.0 - ******************************************************************************/ +/* + * ****************************************************************************** + * * + * * + * * This program and the accompanying materials are made available under the + * * terms of the Apache License, Version 2.0 which is available at + * * https://www.apache.org/licenses/LICENSE-2.0. + * * + * * See the NOTICE file distributed with this work for additional + * * information regarding copyright ownership. + * * Unless required by applicable law or agreed to in writing, software + * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT + * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the + * * License for the specific language governing permissions and limitations + * * under the License. + * * + * * SPDX-License-Identifier: Apache-2.0 + * ***************************************************************************** + */ // // @author Yurii Shyrma, created on 14.02.2018 diff --git a/libnd4j/include/ops/declarable/helpers/cpu/lstsq.cpp b/libnd4j/include/ops/declarable/helpers/cpu/lstsq.cpp index 204b05530..0f6fe05ec 100644 --- a/libnd4j/include/ops/declarable/helpers/cpu/lstsq.cpp +++ b/libnd4j/include/ops/declarable/helpers/cpu/lstsq.cpp @@ -1,18 +1,22 @@ -/******************************************************************************* - * Copyright (c) 2020 Konduit, K.K. - * - * This program and the accompanying materials are made available under the - * terms of the Apache License, Version 2.0 which is available at - * https://www.apache.org/licenses/LICENSE-2.0. - * - * Unless required by applicable law or agreed to in writing, software - * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. 
See the - * License for the specific language governing permissions and limitations - * under the License. - * - * SPDX-License-Identifier: Apache-2.0 - ******************************************************************************/ +/* + * ****************************************************************************** + * * + * * + * * This program and the accompanying materials are made available under the + * * terms of the Apache License, Version 2.0 which is available at + * * https://www.apache.org/licenses/LICENSE-2.0. + * * + * * See the NOTICE file distributed with this work for additional + * * information regarding copyright ownership. + * * Unless required by applicable law or agreed to in writing, software + * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT + * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the + * * License for the specific language governing permissions and limitations + * * under the License. + * * + * * SPDX-License-Identifier: Apache-2.0 + * ***************************************************************************** + */ // // @author GS diff --git a/libnd4j/include/ops/declarable/helpers/cpu/merge.cpp b/libnd4j/include/ops/declarable/helpers/cpu/merge.cpp index 2a0c5af95..fdf4874b1 100644 --- a/libnd4j/include/ops/declarable/helpers/cpu/merge.cpp +++ b/libnd4j/include/ops/declarable/helpers/cpu/merge.cpp @@ -1,19 +1,22 @@ -/******************************************************************************* - * Copyright (c) 2015-2018 Skymind, Inc. - * Copyright (c) 2019-2020 Konduit K.K. - * - * This program and the accompanying materials are made available under the - * terms of the Apache License, Version 2.0 which is available at - * https://www.apache.org/licenses/LICENSE-2.0. 
- * - * Unless required by applicable law or agreed to in writing, software - * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * License for the specific language governing permissions and limitations - * under the License. - * - * SPDX-License-Identifier: Apache-2.0 - ******************************************************************************/ +/* + * ****************************************************************************** + * * + * * + * * This program and the accompanying materials are made available under the + * * terms of the Apache License, Version 2.0 which is available at + * * https://www.apache.org/licenses/LICENSE-2.0. + * * + * * See the NOTICE file distributed with this work for additional + * * information regarding copyright ownership. + * * Unless required by applicable law or agreed to in writing, software + * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT + * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the + * * License for the specific language governing permissions and limitations + * * under the License. + * * + * * SPDX-License-Identifier: Apache-2.0 + * ***************************************************************************** + */ // // @author Yurii Shyrma (iuriish@yahoo.com), created on 20.04.2018 diff --git a/libnd4j/include/ops/declarable/helpers/cpu/qr.cpp b/libnd4j/include/ops/declarable/helpers/cpu/qr.cpp index 1f980e553..7789700e9 100644 --- a/libnd4j/include/ops/declarable/helpers/cpu/qr.cpp +++ b/libnd4j/include/ops/declarable/helpers/cpu/qr.cpp @@ -1,18 +1,22 @@ -/******************************************************************************* - * Copyright (c) 2019-2020 Konduit K.K. 
- * - * This program and the accompanying materials are made available under the - * terms of the Apache License, Version 2.0 which is available at - * https://www.apache.org/licenses/LICENSE-2.0. - * - * Unless required by applicable law or agreed to in writing, software - * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * License for the specific language governing permissions and limitations - * under the License. - * - * SPDX-License-Identifier: Apache-2.0 - ******************************************************************************/ +/* + * ****************************************************************************** + * * + * * + * * This program and the accompanying materials are made available under the + * * terms of the Apache License, Version 2.0 which is available at + * * https://www.apache.org/licenses/LICENSE-2.0. + * * + * * See the NOTICE file distributed with this work for additional + * * information regarding copyright ownership. + * * Unless required by applicable law or agreed to in writing, software + * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT + * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the + * * License for the specific language governing permissions and limitations + * * under the License. + * * + * * SPDX-License-Identifier: Apache-2.0 + * ***************************************************************************** + */ // // @author George A. 
Shulinok diff --git a/libnd4j/include/ops/declarable/helpers/cpu/segment.cpp b/libnd4j/include/ops/declarable/helpers/cpu/segment.cpp index 50ff79679..0b3ab7847 100644 --- a/libnd4j/include/ops/declarable/helpers/cpu/segment.cpp +++ b/libnd4j/include/ops/declarable/helpers/cpu/segment.cpp @@ -1,19 +1,22 @@ -/******************************************************************************* - * Copyright (c) 2015-2018 Skymind, Inc. - * Copyright (c) 2019 Konduit K.K. - * - * This program and the accompanying materials are made available under the - * terms of the Apache License, Version 2.0 which is available at - * https://www.apache.org/licenses/LICENSE-2.0. - * - * Unless required by applicable law or agreed to in writing, software - * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * License for the specific language governing permissions and limitations - * under the License. - * - * SPDX-License-Identifier: Apache-2.0 - ******************************************************************************/ +/* + * ****************************************************************************** + * * + * * + * * This program and the accompanying materials are made available under the + * * terms of the Apache License, Version 2.0 which is available at + * * https://www.apache.org/licenses/LICENSE-2.0. + * * + * * See the NOTICE file distributed with this work for additional + * * information regarding copyright ownership. + * * Unless required by applicable law or agreed to in writing, software + * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT + * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the + * * License for the specific language governing permissions and limitations + * * under the License. 
+ * * + * * SPDX-License-Identifier: Apache-2.0 + * ***************************************************************************** + */ // // @author GS diff --git a/libnd4j/include/ops/declarable/helpers/cpu/shift.cpp b/libnd4j/include/ops/declarable/helpers/cpu/shift.cpp index 9dfeac2ec..f7ca523ef 100644 --- a/libnd4j/include/ops/declarable/helpers/cpu/shift.cpp +++ b/libnd4j/include/ops/declarable/helpers/cpu/shift.cpp @@ -1,18 +1,22 @@ -/******************************************************************************* - * Copyright (c) 2015-2019 Skymind, Inc. - * - * This program and the accompanying materials are made available under the - * terms of the Apache License, Version 2.0 which is available at - * https://www.apache.org/licenses/LICENSE-2.0. - * - * Unless required by applicable law or agreed to in writing, software - * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * License for the specific language governing permissions and limitations - * under the License. - * - * SPDX-License-Identifier: Apache-2.0 - ******************************************************************************/ +/* + * ****************************************************************************** + * * + * * + * * This program and the accompanying materials are made available under the + * * terms of the Apache License, Version 2.0 which is available at + * * https://www.apache.org/licenses/LICENSE-2.0. + * * + * * See the NOTICE file distributed with this work for additional + * * information regarding copyright ownership. + * * Unless required by applicable law or agreed to in writing, software + * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT + * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the + * * License for the specific language governing permissions and limitations + * * under the License. 
+ * * + * * SPDX-License-Identifier: Apache-2.0 + * ***************************************************************************** + */ // // @author raver119@gmail.com diff --git a/libnd4j/include/ops/declarable/helpers/cpu/solve.cpp b/libnd4j/include/ops/declarable/helpers/cpu/solve.cpp index a0034bb5d..f1fd57e3d 100644 --- a/libnd4j/include/ops/declarable/helpers/cpu/solve.cpp +++ b/libnd4j/include/ops/declarable/helpers/cpu/solve.cpp @@ -1,18 +1,22 @@ -/******************************************************************************* - * Copyright (c) 2020 Konduit, K.K. - * - * This program and the accompanying materials are made available under the - * terms of the Apache License, Version 2.0 which is available at - * https://www.apache.org/licenses/LICENSE-2.0. - * - * Unless required by applicable law or agreed to in writing, software - * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * License for the specific language governing permissions and limitations - * under the License. - * - * SPDX-License-Identifier: Apache-2.0 - ******************************************************************************/ +/* + * ****************************************************************************** + * * + * * + * * This program and the accompanying materials are made available under the + * * terms of the Apache License, Version 2.0 which is available at + * * https://www.apache.org/licenses/LICENSE-2.0. + * * + * * See the NOTICE file distributed with this work for additional + * * information regarding copyright ownership. + * * Unless required by applicable law or agreed to in writing, software + * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT + * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the + * * License for the specific language governing permissions and limitations + * * under the License. 
+ * * + * * SPDX-License-Identifier: Apache-2.0 + * ***************************************************************************** + */ // // @author GS diff --git a/libnd4j/include/ops/declarable/helpers/cpu/split.cpp b/libnd4j/include/ops/declarable/helpers/cpu/split.cpp index 48c6c4903..4823fbd38 100644 --- a/libnd4j/include/ops/declarable/helpers/cpu/split.cpp +++ b/libnd4j/include/ops/declarable/helpers/cpu/split.cpp @@ -1,18 +1,22 @@ -/******************************************************************************* - * Copyright (c) 2019-2020 Konduit K.K. - * - * This program and the accompanying materials are made available under the - * terms of the Apache License, Version 2.0 which is available at - * https://www.apache.org/licenses/LICENSE-2.0. - * - * Unless required by applicable law or agreed to in writing, software - * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * License for the specific language governing permissions and limitations - * under the License. - * - * SPDX-License-Identifier: Apache-2.0 - ******************************************************************************/ +/* + * ****************************************************************************** + * * + * * + * * This program and the accompanying materials are made available under the + * * terms of the Apache License, Version 2.0 which is available at + * * https://www.apache.org/licenses/LICENSE-2.0. + * * + * * See the NOTICE file distributed with this work for additional + * * information regarding copyright ownership. + * * Unless required by applicable law or agreed to in writing, software + * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT + * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the + * * License for the specific language governing permissions and limitations + * * under the License. 
+ * * + * * SPDX-License-Identifier: Apache-2.0 + * ***************************************************************************** + */ // // @author Oleh Semeniv (oleg.semeniv@gmail.com) diff --git a/libnd4j/include/ops/declarable/helpers/cpu/triangular_solve.cpp b/libnd4j/include/ops/declarable/helpers/cpu/triangular_solve.cpp index 86847da16..cff33e75b 100644 --- a/libnd4j/include/ops/declarable/helpers/cpu/triangular_solve.cpp +++ b/libnd4j/include/ops/declarable/helpers/cpu/triangular_solve.cpp @@ -1,18 +1,22 @@ -/******************************************************************************* - * Copyright (c) 2020 Konduit, K.K. - * - * This program and the accompanying materials are made available under the - * terms of the Apache License, Version 2.0 which is available at - * https://www.apache.org/licenses/LICENSE-2.0. - * - * Unless required by applicable law or agreed to in writing, software - * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * License for the specific language governing permissions and limitations - * under the License. - * - * SPDX-License-Identifier: Apache-2.0 - ******************************************************************************/ +/* + * ****************************************************************************** + * * + * * + * * This program and the accompanying materials are made available under the + * * terms of the Apache License, Version 2.0 which is available at + * * https://www.apache.org/licenses/LICENSE-2.0. + * * + * * See the NOTICE file distributed with this work for additional + * * information regarding copyright ownership. + * * Unless required by applicable law or agreed to in writing, software + * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT + * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. 
See the + * * License for the specific language governing permissions and limitations + * * under the License. + * * + * * SPDX-License-Identifier: Apache-2.0 + * ***************************************************************************** + */ // // @author GS diff --git a/libnd4j/include/ops/declarable/helpers/cpu/updaterAdaDelta.cpp b/libnd4j/include/ops/declarable/helpers/cpu/updaterAdaDelta.cpp index 78268b2dc..496b5e75e 100644 --- a/libnd4j/include/ops/declarable/helpers/cpu/updaterAdaDelta.cpp +++ b/libnd4j/include/ops/declarable/helpers/cpu/updaterAdaDelta.cpp @@ -1,18 +1,22 @@ -/******************************************************************************* - * Copyright (c) 2019-2020 Konduit K.K. - * - * This program and the accompanying materials are made available under the - * terms of the Apache License, Version 2.0 which is available at - * https://www.apache.org/licenses/LICENSE-2.0. - * - * Unless required by applicable law or agreed to in writing, software - * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * License for the specific language governing permissions and limitations - * under the License. - * - * SPDX-License-Identifier: Apache-2.0 - ******************************************************************************/ +/* + * ****************************************************************************** + * * + * * + * * This program and the accompanying materials are made available under the + * * terms of the Apache License, Version 2.0 which is available at + * * https://www.apache.org/licenses/LICENSE-2.0. + * * + * * See the NOTICE file distributed with this work for additional + * * information regarding copyright ownership. 
+ * * Unless required by applicable law or agreed to in writing, software + * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT + * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the + * * License for the specific language governing permissions and limitations + * * under the License. + * * + * * SPDX-License-Identifier: Apache-2.0 + * ***************************************************************************** + */ // // @author Oleh Semeniv (oleg.semeniv@gmail.com) diff --git a/libnd4j/include/ops/declarable/helpers/cpu/updaterAdaGrad.cpp b/libnd4j/include/ops/declarable/helpers/cpu/updaterAdaGrad.cpp index e65f34e72..0ecd474eb 100644 --- a/libnd4j/include/ops/declarable/helpers/cpu/updaterAdaGrad.cpp +++ b/libnd4j/include/ops/declarable/helpers/cpu/updaterAdaGrad.cpp @@ -1,18 +1,22 @@ -/******************************************************************************* - * Copyright (c) 2019-2020 Konduit K.K. - * - * This program and the accompanying materials are made available under the - * terms of the Apache License, Version 2.0 which is available at - * https://www.apache.org/licenses/LICENSE-2.0. - * - * Unless required by applicable law or agreed to in writing, software - * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * License for the specific language governing permissions and limitations - * under the License. - * - * SPDX-License-Identifier: Apache-2.0 - ******************************************************************************/ +/* + * ****************************************************************************** + * * + * * + * * This program and the accompanying materials are made available under the + * * terms of the Apache License, Version 2.0 which is available at + * * https://www.apache.org/licenses/LICENSE-2.0. 
+ * * + * * See the NOTICE file distributed with this work for additional + * * information regarding copyright ownership. + * * Unless required by applicable law or agreed to in writing, software + * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT + * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the + * * License for the specific language governing permissions and limitations + * * under the License. + * * + * * SPDX-License-Identifier: Apache-2.0 + * ***************************************************************************** + */ // // @author Oleh Semeniv (oleg.semeniv@gmail.com) diff --git a/libnd4j/include/ops/declarable/helpers/cpu/updaterAdaMax.cpp b/libnd4j/include/ops/declarable/helpers/cpu/updaterAdaMax.cpp index 6c7d0d322..b217e74b6 100644 --- a/libnd4j/include/ops/declarable/helpers/cpu/updaterAdaMax.cpp +++ b/libnd4j/include/ops/declarable/helpers/cpu/updaterAdaMax.cpp @@ -1,18 +1,22 @@ -/******************************************************************************* - * Copyright (c) 2019-2020 Konduit K.K. - * - * This program and the accompanying materials are made available under the - * terms of the Apache License, Version 2.0 which is available at - * https://www.apache.org/licenses/LICENSE-2.0. - * - * Unless required by applicable law or agreed to in writing, software - * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * License for the specific language governing permissions and limitations - * under the License. 
- * - * SPDX-License-Identifier: Apache-2.0 - ******************************************************************************/ +/* + * ****************************************************************************** + * * + * * + * * This program and the accompanying materials are made available under the + * * terms of the Apache License, Version 2.0 which is available at + * * https://www.apache.org/licenses/LICENSE-2.0. + * * + * * See the NOTICE file distributed with this work for additional + * * information regarding copyright ownership. + * * Unless required by applicable law or agreed to in writing, software + * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT + * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the + * * License for the specific language governing permissions and limitations + * * under the License. + * * + * * SPDX-License-Identifier: Apache-2.0 + * ***************************************************************************** + */ // // @author Oleh Semeniv (oleg.semeniv@gmail.com) diff --git a/libnd4j/include/ops/declarable/helpers/cpu/updaterAdam.cpp b/libnd4j/include/ops/declarable/helpers/cpu/updaterAdam.cpp index 2d670949f..e8d91c3e6 100644 --- a/libnd4j/include/ops/declarable/helpers/cpu/updaterAdam.cpp +++ b/libnd4j/include/ops/declarable/helpers/cpu/updaterAdam.cpp @@ -1,18 +1,22 @@ -/******************************************************************************* - * Copyright (c) 2019-2020 Konduit K.K. - * - * This program and the accompanying materials are made available under the - * terms of the Apache License, Version 2.0 which is available at - * https://www.apache.org/licenses/LICENSE-2.0. - * - * Unless required by applicable law or agreed to in writing, software - * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. 
See the - * License for the specific language governing permissions and limitations - * under the License. - * - * SPDX-License-Identifier: Apache-2.0 - ******************************************************************************/ +/* + * ****************************************************************************** + * * + * * + * * This program and the accompanying materials are made available under the + * * terms of the Apache License, Version 2.0 which is available at + * * https://www.apache.org/licenses/LICENSE-2.0. + * * + * * See the NOTICE file distributed with this work for additional + * * information regarding copyright ownership. + * * Unless required by applicable law or agreed to in writing, software + * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT + * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the + * * License for the specific language governing permissions and limitations + * * under the License. + * * + * * SPDX-License-Identifier: Apache-2.0 + * ***************************************************************************** + */ // // @author Oleh Semeniv (oleg.semeniv@gmail.com) diff --git a/libnd4j/include/ops/declarable/helpers/cpu/updaterAmsGrad.cpp b/libnd4j/include/ops/declarable/helpers/cpu/updaterAmsGrad.cpp index 7cb05075c..74cf0065b 100644 --- a/libnd4j/include/ops/declarable/helpers/cpu/updaterAmsGrad.cpp +++ b/libnd4j/include/ops/declarable/helpers/cpu/updaterAmsGrad.cpp @@ -1,18 +1,22 @@ -/******************************************************************************* - * Copyright (c) 2019-2020 Konduit K.K. - * - * This program and the accompanying materials are made available under the - * terms of the Apache License, Version 2.0 which is available at - * https://www.apache.org/licenses/LICENSE-2.0. 
- * - * Unless required by applicable law or agreed to in writing, software - * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * License for the specific language governing permissions and limitations - * under the License. - * - * SPDX-License-Identifier: Apache-2.0 - ******************************************************************************/ +/* + * ****************************************************************************** + * * + * * + * * This program and the accompanying materials are made available under the + * * terms of the Apache License, Version 2.0 which is available at + * * https://www.apache.org/licenses/LICENSE-2.0. + * * + * * See the NOTICE file distributed with this work for additional + * * information regarding copyright ownership. + * * Unless required by applicable law or agreed to in writing, software + * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT + * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the + * * License for the specific language governing permissions and limitations + * * under the License. + * * + * * SPDX-License-Identifier: Apache-2.0 + * ***************************************************************************** + */ // // @author Oleh Semeniv (oleg.semeniv@gmail.com) diff --git a/libnd4j/include/ops/declarable/helpers/cpu/updaterNadam.cpp b/libnd4j/include/ops/declarable/helpers/cpu/updaterNadam.cpp index 40f9c9407..78167e56c 100644 --- a/libnd4j/include/ops/declarable/helpers/cpu/updaterNadam.cpp +++ b/libnd4j/include/ops/declarable/helpers/cpu/updaterNadam.cpp @@ -1,18 +1,22 @@ -/******************************************************************************* - * Copyright (c) 2019-2020 Konduit K.K. 
- * - * This program and the accompanying materials are made available under the - * terms of the Apache License, Version 2.0 which is available at - * https://www.apache.org/licenses/LICENSE-2.0. - * - * Unless required by applicable law or agreed to in writing, software - * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * License for the specific language governing permissions and limitations - * under the License. - * - * SPDX-License-Identifier: Apache-2.0 - ******************************************************************************/ +/* + * ****************************************************************************** + * * + * * + * * This program and the accompanying materials are made available under the + * * terms of the Apache License, Version 2.0 which is available at + * * https://www.apache.org/licenses/LICENSE-2.0. + * * + * * See the NOTICE file distributed with this work for additional + * * information regarding copyright ownership. + * * Unless required by applicable law or agreed to in writing, software + * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT + * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the + * * License for the specific language governing permissions and limitations + * * under the License. 
+ * * + * * SPDX-License-Identifier: Apache-2.0 + * ***************************************************************************** + */ // // @author Oleh Semeniv (oleg.semeniv@gmail.com) diff --git a/libnd4j/include/ops/declarable/helpers/cpu/updaterNesterovs.cpp b/libnd4j/include/ops/declarable/helpers/cpu/updaterNesterovs.cpp index 1d8bb8d45..37211fa93 100644 --- a/libnd4j/include/ops/declarable/helpers/cpu/updaterNesterovs.cpp +++ b/libnd4j/include/ops/declarable/helpers/cpu/updaterNesterovs.cpp @@ -1,18 +1,22 @@ -/******************************************************************************* - * Copyright (c) 2019-2020 Konduit K.K. - * - * This program and the accompanying materials are made available under the - * terms of the Apache License, Version 2.0 which is available at - * https://www.apache.org/licenses/LICENSE-2.0. - * - * Unless required by applicable law or agreed to in writing, software - * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * License for the specific language governing permissions and limitations - * under the License. - * - * SPDX-License-Identifier: Apache-2.0 - ******************************************************************************/ +/* + * ****************************************************************************** + * * + * * + * * This program and the accompanying materials are made available under the + * * terms of the Apache License, Version 2.0 which is available at + * * https://www.apache.org/licenses/LICENSE-2.0. + * * + * * See the NOTICE file distributed with this work for additional + * * information regarding copyright ownership. + * * Unless required by applicable law or agreed to in writing, software + * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT + * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. 
See the + * * License for the specific language governing permissions and limitations + * * under the License. + * * + * * SPDX-License-Identifier: Apache-2.0 + * ***************************************************************************** + */ // // @author Oleh Semeniv (oleg.semeniv@gmail.com) diff --git a/libnd4j/include/ops/declarable/helpers/cpu/updaterRmsProp.cpp b/libnd4j/include/ops/declarable/helpers/cpu/updaterRmsProp.cpp index 473b43cf8..89f3379a5 100644 --- a/libnd4j/include/ops/declarable/helpers/cpu/updaterRmsProp.cpp +++ b/libnd4j/include/ops/declarable/helpers/cpu/updaterRmsProp.cpp @@ -1,18 +1,22 @@ -/******************************************************************************* - * Copyright (c) 2019-2020 Konduit K.K. - * - * This program and the accompanying materials are made available under the - * terms of the Apache License, Version 2.0 which is available at - * https://www.apache.org/licenses/LICENSE-2.0. - * - * Unless required by applicable law or agreed to in writing, software - * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * License for the specific language governing permissions and limitations - * under the License. - * - * SPDX-License-Identifier: Apache-2.0 - ******************************************************************************/ +/* + * ****************************************************************************** + * * + * * + * * This program and the accompanying materials are made available under the + * * terms of the Apache License, Version 2.0 which is available at + * * https://www.apache.org/licenses/LICENSE-2.0. + * * + * * See the NOTICE file distributed with this work for additional + * * information regarding copyright ownership. 
+ * * Unless required by applicable law or agreed to in writing, software + * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT + * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the + * * License for the specific language governing permissions and limitations + * * under the License. + * * + * * SPDX-License-Identifier: Apache-2.0 + * ***************************************************************************** + */ // // @author Oleh Semeniv (oleg.semeniv@gmail.com) diff --git a/libnd4j/include/ops/declarable/helpers/cuda/betaInc.cu b/libnd4j/include/ops/declarable/helpers/cuda/betaInc.cu index a18ec1fda..ab1dfce9a 100644 --- a/libnd4j/include/ops/declarable/helpers/cuda/betaInc.cu +++ b/libnd4j/include/ops/declarable/helpers/cuda/betaInc.cu @@ -1,18 +1,22 @@ -/******************************************************************************* - * Copyright (t2) 2015-2018 Skymind, Inc. - * - * This program and the accompanying materials are made available under the - * terms of the Apache License, Version 2.0 which is available at - * https://www.apache.org/licenses/LICENSE-2.0. - * - * Unless required by applicable law or agreed to in writing, software - * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * License for the specific language governing permissions and limitations - * under the License. - * - * SPDX-License-Identifier: Apache-2.0 - ******************************************************************************/ +/* + * ****************************************************************************** + * * + * * + * * This program and the accompanying materials are made available under the + * * terms of the Apache License, Version 2.0 which is available at + * * https://www.apache.org/licenses/LICENSE-2.0. 
+ * * + * * See the NOTICE file distributed with this work for additional + * * information regarding copyright ownership. + * * Unless required by applicable law or agreed to in writing, software + * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT + * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the + * * License for the specific language governing permissions and limitations + * * under the License. + * * + * * SPDX-License-Identifier: Apache-2.0 + * ***************************************************************************** + */ // // @author Yurii Shyrma (iuriish@yahoo.com) diff --git a/libnd4j/include/ops/declarable/helpers/cuda/compression/compression.cu b/libnd4j/include/ops/declarable/helpers/cuda/compression/compression.cu index 5de20c57f..b0feae84b 100644 --- a/libnd4j/include/ops/declarable/helpers/cuda/compression/compression.cu +++ b/libnd4j/include/ops/declarable/helpers/cuda/compression/compression.cu @@ -1,19 +1,22 @@ -/******************************************************************************* - * Copyright (c) 2020 Skymind, Inc. - * - * This program and the accompanying materials are made available under the - * terms of the Apache License, Version 2.0 which is available at - * https://www.apache.org/licenses/LICENSE-2.0. - * - * Unless required by applicable law or agreed to in writing, software - * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * License for the specific language governing permissions and limitations - * under the License. 
- * - * SPDX-License-Identifier: Apache-2.0 - ******************************************************************************/ - +/* + * ****************************************************************************** + * * + * * + * * This program and the accompanying materials are made available under the + * * terms of the Apache License, Version 2.0 which is available at + * * https://www.apache.org/licenses/LICENSE-2.0. + * * + * * See the NOTICE file distributed with this work for additional + * * information regarding copyright ownership. + * * Unless required by applicable law or agreed to in writing, software + * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT + * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the + * * License for the specific language governing permissions and limitations + * * under the License. + * * + * * SPDX-License-Identifier: Apache-2.0 + * ***************************************************************************** + */ // // @author sgazeos@gmail.com // diff --git a/libnd4j/include/ops/declarable/helpers/cuda/convolutions_col2vol.cu b/libnd4j/include/ops/declarable/helpers/cuda/convolutions_col2vol.cu index 80df76c91..0d790eca0 100644 --- a/libnd4j/include/ops/declarable/helpers/cuda/convolutions_col2vol.cu +++ b/libnd4j/include/ops/declarable/helpers/cuda/convolutions_col2vol.cu @@ -1,19 +1,22 @@ -/******************************************************************************* - * Copyright (c) 2015-2018 Skymind, Inc. - * Copyright (c) 2019 Konduit K.K. - * - * This program and the accompanying materials are made available under the - * terms of the Apache License, Version 2.0 which is available at - * https://www.apache.org/licenses/LICENSE-2.0. - * - * Unless required by applicable law or agreed to in writing, software - * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. 
See the - * License for the specific language governing permissions and limitations - * under the License. - * - * SPDX-License-Identifier: Apache-2.0 - ******************************************************************************/ +/* + * ****************************************************************************** + * * + * * + * * This program and the accompanying materials are made available under the + * * terms of the Apache License, Version 2.0 which is available at + * * https://www.apache.org/licenses/LICENSE-2.0. + * * + * * See the NOTICE file distributed with this work for additional + * * information regarding copyright ownership. + * * Unless required by applicable law or agreed to in writing, software + * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT + * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the + * * License for the specific language governing permissions and limitations + * * under the License. + * * + * * SPDX-License-Identifier: Apache-2.0 + * ***************************************************************************** + */ // // @author Yurii Shyrma (iuriish@yahoo.com) diff --git a/libnd4j/include/ops/declarable/helpers/cuda/convolutions_conv2d.cu b/libnd4j/include/ops/declarable/helpers/cuda/convolutions_conv2d.cu index 494ce4a81..f443a924e 100644 --- a/libnd4j/include/ops/declarable/helpers/cuda/convolutions_conv2d.cu +++ b/libnd4j/include/ops/declarable/helpers/cuda/convolutions_conv2d.cu @@ -1,19 +1,22 @@ -/******************************************************************************* - * Copyright (c) 2015-2018 Skymind, Inc. - * Copyright (c) 2019 Konduit K.K. - * - * This program and the accompanying materials are made available under the - * terms of the Apache License, Version 2.0 which is available at - * https://www.apache.org/licenses/LICENSE-2.0. 
- * - * Unless required by applicable law or agreed to in writing, software - * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * License for the specific language governing permissions and limitations - * under the License. - * - * SPDX-License-Identifier: Apache-2.0 - ******************************************************************************/ +/* + * ****************************************************************************** + * * + * * + * * This program and the accompanying materials are made available under the + * * terms of the Apache License, Version 2.0 which is available at + * * https://www.apache.org/licenses/LICENSE-2.0. + * * + * * See the NOTICE file distributed with this work for additional + * * information regarding copyright ownership. + * * Unless required by applicable law or agreed to in writing, software + * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT + * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the + * * License for the specific language governing permissions and limitations + * * under the License. + * * + * * SPDX-License-Identifier: Apache-2.0 + * ***************************************************************************** + */ // // @author Yurii Shyrma (iuriish@yahoo.com) diff --git a/libnd4j/include/ops/declarable/helpers/cuda/convolutions_conv2dBP.cu b/libnd4j/include/ops/declarable/helpers/cuda/convolutions_conv2dBP.cu index dbf4ee390..dff0a2626 100644 --- a/libnd4j/include/ops/declarable/helpers/cuda/convolutions_conv2dBP.cu +++ b/libnd4j/include/ops/declarable/helpers/cuda/convolutions_conv2dBP.cu @@ -1,19 +1,22 @@ -/******************************************************************************* - * Copyright (c) 2015-2018 Skymind, Inc. - * Copyright (c) 2019 Konduit K.K. 
- * - * This program and the accompanying materials are made available under the - * terms of the Apache License, Version 2.0 which is available at - * https://www.apache.org/licenses/LICENSE-2.0. - * - * Unless required by applicable law or agreed to in writing, software - * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * License for the specific language governing permissions and limitations - * under the License. - * - * SPDX-License-Identifier: Apache-2.0 - ******************************************************************************/ +/* + * ****************************************************************************** + * * + * * + * * This program and the accompanying materials are made available under the + * * terms of the Apache License, Version 2.0 which is available at + * * https://www.apache.org/licenses/LICENSE-2.0. + * * + * * See the NOTICE file distributed with this work for additional + * * information regarding copyright ownership. + * * Unless required by applicable law or agreed to in writing, software + * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT + * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the + * * License for the specific language governing permissions and limitations + * * under the License. 
+ * * + * * SPDX-License-Identifier: Apache-2.0 + * ***************************************************************************** + */ // // @author Yurii Shyrma (iuriish@yahoo.com) diff --git a/libnd4j/include/ops/declarable/helpers/cuda/convolutions_depthwiseConv2d.cu b/libnd4j/include/ops/declarable/helpers/cuda/convolutions_depthwiseConv2d.cu index bbf5d5892..0205db4c3 100644 --- a/libnd4j/include/ops/declarable/helpers/cuda/convolutions_depthwiseConv2d.cu +++ b/libnd4j/include/ops/declarable/helpers/cuda/convolutions_depthwiseConv2d.cu @@ -1,19 +1,22 @@ -/******************************************************************************* - * Copyright (c) 2015-2018 Skymind, Inc. - * Copyright (c) 2019 Konduit K.K. - * - * This program and the accompanying materials are made available under the - * terms of the Apache License, Version 2.0 which is available at - * https://www.apache.org/licenses/LICENSE-2.0. - * - * Unless required by applicable law or agreed to in writing, software - * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * License for the specific language governing permissions and limitations - * under the License. - * - * SPDX-License-Identifier: Apache-2.0 - ******************************************************************************/ +/* + * ****************************************************************************** + * * + * * + * * This program and the accompanying materials are made available under the + * * terms of the Apache License, Version 2.0 which is available at + * * https://www.apache.org/licenses/LICENSE-2.0. + * * + * * See the NOTICE file distributed with this work for additional + * * information regarding copyright ownership. 
+ * * Unless required by applicable law or agreed to in writing, software + * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT + * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the + * * License for the specific language governing permissions and limitations + * * under the License. + * * + * * SPDX-License-Identifier: Apache-2.0 + * ***************************************************************************** + */ // // @author Yurii Shyrma (iuriish@yahoo.com) diff --git a/libnd4j/include/ops/declarable/helpers/cuda/convolutions_depthwiseConv2dBP.cu b/libnd4j/include/ops/declarable/helpers/cuda/convolutions_depthwiseConv2dBP.cu index b06af6166..1947e2919 100644 --- a/libnd4j/include/ops/declarable/helpers/cuda/convolutions_depthwiseConv2dBP.cu +++ b/libnd4j/include/ops/declarable/helpers/cuda/convolutions_depthwiseConv2dBP.cu @@ -1,19 +1,22 @@ -/******************************************************************************* - * Copyright (c) 2015-2018 Skymind, Inc. - * Copyright (c) 2019 Konduit K.K. - * - * This program and the accompanying materials are made available under the - * terms of the Apache License, Version 2.0 which is available at - * https://www.apache.org/licenses/LICENSE-2.0. - * - * Unless required by applicable law or agreed to in writing, software - * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * License for the specific language governing permissions and limitations - * under the License. 
- * - * SPDX-License-Identifier: Apache-2.0 - ******************************************************************************/ +/* + * ****************************************************************************** + * * + * * + * * This program and the accompanying materials are made available under the + * * terms of the Apache License, Version 2.0 which is available at + * * https://www.apache.org/licenses/LICENSE-2.0. + * * + * * See the NOTICE file distributed with this work for additional + * * information regarding copyright ownership. + * * Unless required by applicable law or agreed to in writing, software + * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT + * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the + * * License for the specific language governing permissions and limitations + * * under the License. + * * + * * SPDX-License-Identifier: Apache-2.0 + * ***************************************************************************** + */ // // @author Yurii Shyrma (iuriish@yahoo.com) diff --git a/libnd4j/include/ops/declarable/helpers/cuda/convolutions_pooling2d.cu b/libnd4j/include/ops/declarable/helpers/cuda/convolutions_pooling2d.cu index cb7052d4b..bea1f8a4b 100644 --- a/libnd4j/include/ops/declarable/helpers/cuda/convolutions_pooling2d.cu +++ b/libnd4j/include/ops/declarable/helpers/cuda/convolutions_pooling2d.cu @@ -1,19 +1,22 @@ -/******************************************************************************* - * Copyright (c) 2015-2018 Skymind, Inc. - * Copyright (c) 2019 Konduit K.K. - * - * This program and the accompanying materials are made available under the - * terms of the Apache License, Version 2.0 which is available at - * https://www.apache.org/licenses/LICENSE-2.0. 
- * - * Unless required by applicable law or agreed to in writing, software - * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * License for the specific language governing permissions and limitations - * under the License. - * - * SPDX-License-Identifier: Apache-2.0 - ******************************************************************************/ +/* + * ****************************************************************************** + * * + * * + * * This program and the accompanying materials are made available under the + * * terms of the Apache License, Version 2.0 which is available at + * * https://www.apache.org/licenses/LICENSE-2.0. + * * + * * See the NOTICE file distributed with this work for additional + * * information regarding copyright ownership. + * * Unless required by applicable law or agreed to in writing, software + * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT + * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the + * * License for the specific language governing permissions and limitations + * * under the License. + * * + * * SPDX-License-Identifier: Apache-2.0 + * ***************************************************************************** + */ // // @author Yurii Shyrma (iuriish@yahoo.com) diff --git a/libnd4j/include/ops/declarable/helpers/cuda/convolutions_pooling2dBP.cu b/libnd4j/include/ops/declarable/helpers/cuda/convolutions_pooling2dBP.cu index 6ed62c6d4..90174a557 100644 --- a/libnd4j/include/ops/declarable/helpers/cuda/convolutions_pooling2dBP.cu +++ b/libnd4j/include/ops/declarable/helpers/cuda/convolutions_pooling2dBP.cu @@ -1,19 +1,22 @@ -/******************************************************************************* - * Copyright (c) 2015-2018 Skymind, Inc. - * Copyright (c) 2019 Konduit K.K. 
- * - * This program and the accompanying materials are made available under the - * terms of the Apache License, Version 2.0 which is available at - * https://www.apache.org/licenses/LICENSE-2.0. - * - * Unless required by applicable law or agreed to in writing, software - * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * License for the specific language governing permissions and limitations - * under the License. - * - * SPDX-License-Identifier: Apache-2.0 - ******************************************************************************/ +/* + * ****************************************************************************** + * * + * * + * * This program and the accompanying materials are made available under the + * * terms of the Apache License, Version 2.0 which is available at + * * https://www.apache.org/licenses/LICENSE-2.0. + * * + * * See the NOTICE file distributed with this work for additional + * * information regarding copyright ownership. + * * Unless required by applicable law or agreed to in writing, software + * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT + * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the + * * License for the specific language governing permissions and limitations + * * under the License. 
+ * * + * * SPDX-License-Identifier: Apache-2.0 + * ***************************************************************************** + */ // // @author Yurii Shyrma (iuriish@yahoo.com) diff --git a/libnd4j/include/ops/declarable/helpers/cuda/convolutions_pooling3d.cu b/libnd4j/include/ops/declarable/helpers/cuda/convolutions_pooling3d.cu index 0a3bfc9b6..a9dd3296e 100644 --- a/libnd4j/include/ops/declarable/helpers/cuda/convolutions_pooling3d.cu +++ b/libnd4j/include/ops/declarable/helpers/cuda/convolutions_pooling3d.cu @@ -1,19 +1,22 @@ -/******************************************************************************* - * Copyright (c) 2015-2018 Skymind, Inc. - * Copyright (c) 2019 Konduit K.K. - * - * This program and the accompanying materials are made available under the - * terms of the Apache License, Version 2.0 which is available at - * https://www.apache.org/licenses/LICENSE-2.0. - * - * Unless required by applicable law or agreed to in writing, software - * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * License for the specific language governing permissions and limitations - * under the License. - * - * SPDX-License-Identifier: Apache-2.0 - ******************************************************************************/ +/* + * ****************************************************************************** + * * + * * + * * This program and the accompanying materials are made available under the + * * terms of the Apache License, Version 2.0 which is available at + * * https://www.apache.org/licenses/LICENSE-2.0. + * * + * * See the NOTICE file distributed with this work for additional + * * information regarding copyright ownership. + * * Unless required by applicable law or agreed to in writing, software + * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT + * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. 
See the + * * License for the specific language governing permissions and limitations + * * under the License. + * * + * * SPDX-License-Identifier: Apache-2.0 + * ***************************************************************************** + */ // // @author Yurii Shyrma (iuriish@yahoo.com) diff --git a/libnd4j/include/ops/declarable/helpers/cuda/convolutions_pooling3dBP.cu b/libnd4j/include/ops/declarable/helpers/cuda/convolutions_pooling3dBP.cu index fd78bb80b..e5ba8d09f 100644 --- a/libnd4j/include/ops/declarable/helpers/cuda/convolutions_pooling3dBP.cu +++ b/libnd4j/include/ops/declarable/helpers/cuda/convolutions_pooling3dBP.cu @@ -1,19 +1,22 @@ -/******************************************************************************* - * Copyright (c) 2015-2018 Skymind, Inc. - * Copyright (c) 2019 Konduit K.K. - * - * This program and the accompanying materials are made available under the - * terms of the Apache License, Version 2.0 which is available at - * https://www.apache.org/licenses/LICENSE-2.0. - * - * Unless required by applicable law or agreed to in writing, software - * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * License for the specific language governing permissions and limitations - * under the License. - * - * SPDX-License-Identifier: Apache-2.0 - ******************************************************************************/ +/* + * ****************************************************************************** + * * + * * + * * This program and the accompanying materials are made available under the + * * terms of the Apache License, Version 2.0 which is available at + * * https://www.apache.org/licenses/LICENSE-2.0. + * * + * * See the NOTICE file distributed with this work for additional + * * information regarding copyright ownership. 
+ * * Unless required by applicable law or agreed to in writing, software + * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT + * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the + * * License for the specific language governing permissions and limitations + * * under the License. + * * + * * SPDX-License-Identifier: Apache-2.0 + * ***************************************************************************** + */ // // @author Yurii Shyrma (iuriish@yahoo.com) diff --git a/libnd4j/include/ops/declarable/helpers/cuda/convolutions_sconv2d.cu b/libnd4j/include/ops/declarable/helpers/cuda/convolutions_sconv2d.cu index 3a9ed5364..372bf5636 100644 --- a/libnd4j/include/ops/declarable/helpers/cuda/convolutions_sconv2d.cu +++ b/libnd4j/include/ops/declarable/helpers/cuda/convolutions_sconv2d.cu @@ -1,19 +1,22 @@ -/******************************************************************************* - * Copyright (c) 2015-2018 Skymind, Inc. - * Copyright (c) 2019 Konduit K.K. - * - * This program and the accompanying materials are made available under the - * terms of the Apache License, Version 2.0 which is available at - * https://www.apache.org/licenses/LICENSE-2.0. - * - * Unless required by applicable law or agreed to in writing, software - * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * License for the specific language governing permissions and limitations - * under the License. - * - * SPDX-License-Identifier: Apache-2.0 - ******************************************************************************/ +/* + * ****************************************************************************** + * * + * * + * * This program and the accompanying materials are made available under the + * * terms of the Apache License, Version 2.0 which is available at + * * https://www.apache.org/licenses/LICENSE-2.0. 
+ * * + * * See the NOTICE file distributed with this work for additional + * * information regarding copyright ownership. + * * Unless required by applicable law or agreed to in writing, software + * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT + * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the + * * License for the specific language governing permissions and limitations + * * under the License. + * * + * * SPDX-License-Identifier: Apache-2.0 + * ***************************************************************************** + */ // // @author Yurii Shyrma (iuriish@yahoo.com) diff --git a/libnd4j/include/ops/declarable/helpers/cuda/convolutions_upsampling2d.cu b/libnd4j/include/ops/declarable/helpers/cuda/convolutions_upsampling2d.cu index ee1fa8924..facf24e72 100644 --- a/libnd4j/include/ops/declarable/helpers/cuda/convolutions_upsampling2d.cu +++ b/libnd4j/include/ops/declarable/helpers/cuda/convolutions_upsampling2d.cu @@ -1,19 +1,22 @@ -/******************************************************************************* - * Copyright (c) 2015-2018 Skymind, Inc. - * Copyright (c) 2019 Konduit K.K. - * - * This program and the accompanying materials are made available under the - * terms of the Apache License, Version 2.0 which is available at - * https://www.apache.org/licenses/LICENSE-2.0. - * - * Unless required by applicable law or agreed to in writing, software - * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * License for the specific language governing permissions and limitations - * under the License. 
- * - * SPDX-License-Identifier: Apache-2.0 - ******************************************************************************/ +/* + * ****************************************************************************** + * * + * * + * * This program and the accompanying materials are made available under the + * * terms of the Apache License, Version 2.0 which is available at + * * https://www.apache.org/licenses/LICENSE-2.0. + * * + * * See the NOTICE file distributed with this work for additional + * * information regarding copyright ownership. + * * Unless required by applicable law or agreed to in writing, software + * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT + * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the + * * License for the specific language governing permissions and limitations + * * under the License. + * * + * * SPDX-License-Identifier: Apache-2.0 + * ***************************************************************************** + */ // // @author Yurii Shyrma (iuriish@yahoo.com) diff --git a/libnd4j/include/ops/declarable/helpers/cuda/convolutions_upsampling2dBP.cu b/libnd4j/include/ops/declarable/helpers/cuda/convolutions_upsampling2dBP.cu index c6864c48a..72e90f64a 100644 --- a/libnd4j/include/ops/declarable/helpers/cuda/convolutions_upsampling2dBP.cu +++ b/libnd4j/include/ops/declarable/helpers/cuda/convolutions_upsampling2dBP.cu @@ -1,19 +1,22 @@ -/******************************************************************************* - * Copyright (c) 2015-2018 Skymind, Inc. - * Copyright (c) 2019 Konduit K.K. - * - * This program and the accompanying materials are made available under the - * terms of the Apache License, Version 2.0 which is available at - * https://www.apache.org/licenses/LICENSE-2.0. 
- * - * Unless required by applicable law or agreed to in writing, software - * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * License for the specific language governing permissions and limitations - * under the License. - * - * SPDX-License-Identifier: Apache-2.0 - ******************************************************************************/ +/* + * ****************************************************************************** + * * + * * + * * This program and the accompanying materials are made available under the + * * terms of the Apache License, Version 2.0 which is available at + * * https://www.apache.org/licenses/LICENSE-2.0. + * * + * * See the NOTICE file distributed with this work for additional + * * information regarding copyright ownership. + * * Unless required by applicable law or agreed to in writing, software + * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT + * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the + * * License for the specific language governing permissions and limitations + * * under the License. + * * + * * SPDX-License-Identifier: Apache-2.0 + * ***************************************************************************** + */ // // @author Yurii Shyrma (iuriish@yahoo.com) diff --git a/libnd4j/include/ops/declarable/helpers/cuda/convolutions_upsampling3d.cu b/libnd4j/include/ops/declarable/helpers/cuda/convolutions_upsampling3d.cu index 1acb4307f..53ca189f5 100644 --- a/libnd4j/include/ops/declarable/helpers/cuda/convolutions_upsampling3d.cu +++ b/libnd4j/include/ops/declarable/helpers/cuda/convolutions_upsampling3d.cu @@ -1,19 +1,22 @@ -/******************************************************************************* - * Copyright (c) 2015-2018 Skymind, Inc. - * Copyright (c) 2019 Konduit K.K. 
- * - * This program and the accompanying materials are made available under the - * terms of the Apache License, Version 2.0 which is available at - * https://www.apache.org/licenses/LICENSE-2.0. - * - * Unless required by applicable law or agreed to in writing, software - * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * License for the specific language governing permissions and limitations - * under the License. - * - * SPDX-License-Identifier: Apache-2.0 - ******************************************************************************/ +/* + * ****************************************************************************** + * * + * * + * * This program and the accompanying materials are made available under the + * * terms of the Apache License, Version 2.0 which is available at + * * https://www.apache.org/licenses/LICENSE-2.0. + * * + * * See the NOTICE file distributed with this work for additional + * * information regarding copyright ownership. + * * Unless required by applicable law or agreed to in writing, software + * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT + * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the + * * License for the specific language governing permissions and limitations + * * under the License. 
+ * * + * * SPDX-License-Identifier: Apache-2.0 + * ***************************************************************************** + */ // // @author Yurii Shyrma (iuriish@yahoo.com) diff --git a/libnd4j/include/ops/declarable/helpers/cuda/convolutions_upsampling3dBP.cu b/libnd4j/include/ops/declarable/helpers/cuda/convolutions_upsampling3dBP.cu index 5a1e08c07..e013c2df9 100644 --- a/libnd4j/include/ops/declarable/helpers/cuda/convolutions_upsampling3dBP.cu +++ b/libnd4j/include/ops/declarable/helpers/cuda/convolutions_upsampling3dBP.cu @@ -1,19 +1,22 @@ -/******************************************************************************* - * Copyright (c) 2015-2018 Skymind, Inc. - * Copyright (c) 2019 Konduit K.K. - * - * This program and the accompanying materials are made available under the - * terms of the Apache License, Version 2.0 which is available at - * https://www.apache.org/licenses/LICENSE-2.0. - * - * Unless required by applicable law or agreed to in writing, software - * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * License for the specific language governing permissions and limitations - * under the License. - * - * SPDX-License-Identifier: Apache-2.0 - ******************************************************************************/ +/* + * ****************************************************************************** + * * + * * + * * This program and the accompanying materials are made available under the + * * terms of the Apache License, Version 2.0 which is available at + * * https://www.apache.org/licenses/LICENSE-2.0. + * * + * * See the NOTICE file distributed with this work for additional + * * information regarding copyright ownership. 
+ * * Unless required by applicable law or agreed to in writing, software + * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT + * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the + * * License for the specific language governing permissions and limitations + * * under the License. + * * + * * SPDX-License-Identifier: Apache-2.0 + * ***************************************************************************** + */ // // @author Yurii Shyrma (iuriish@yahoo.com) diff --git a/libnd4j/include/ops/declarable/helpers/cuda/convolutions_vol2col.cu b/libnd4j/include/ops/declarable/helpers/cuda/convolutions_vol2col.cu index c2c5fb3ef..87f47a90d 100644 --- a/libnd4j/include/ops/declarable/helpers/cuda/convolutions_vol2col.cu +++ b/libnd4j/include/ops/declarable/helpers/cuda/convolutions_vol2col.cu @@ -1,19 +1,22 @@ -/******************************************************************************* - * Copyright (c) 2015-2018 Skymind, Inc. - * Copyright (c) 2019 Konduit K.K. - * - * This program and the accompanying materials are made available under the - * terms of the Apache License, Version 2.0 which is available at - * https://www.apache.org/licenses/LICENSE-2.0. - * - * Unless required by applicable law or agreed to in writing, software - * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * License for the specific language governing permissions and limitations - * under the License. - * - * SPDX-License-Identifier: Apache-2.0 - ******************************************************************************/ +/* + * ****************************************************************************** + * * + * * + * * This program and the accompanying materials are made available under the + * * terms of the Apache License, Version 2.0 which is available at + * * https://www.apache.org/licenses/LICENSE-2.0. 
+ * * + * * See the NOTICE file distributed with this work for additional + * * information regarding copyright ownership. + * * Unless required by applicable law or agreed to in writing, software + * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT + * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the + * * License for the specific language governing permissions and limitations + * * under the License. + * * + * * SPDX-License-Identifier: Apache-2.0 + * ***************************************************************************** + */ // // @author Yurii Shyrma (iuriish@yahoo.com) diff --git a/libnd4j/include/ops/declarable/helpers/cuda/diGamma.cu b/libnd4j/include/ops/declarable/helpers/cuda/diGamma.cu index ff217bdb6..18af21e34 100644 --- a/libnd4j/include/ops/declarable/helpers/cuda/diGamma.cu +++ b/libnd4j/include/ops/declarable/helpers/cuda/diGamma.cu @@ -1,19 +1,22 @@ -/******************************************************************************* - * Copyright (c) 2015-2018 Skymind, Inc. - * Copyright (c) 2019 Konduit K.K. - * - * This program and the accompanying materials are made available under the - * terms of the Apache License, Version 2.0 which is available at - * https://www.apache.org/licenses/LICENSE-2.0. - * - * Unless required by applicable law or agreed to in writing, software - * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * License for the specific language governing permissions and limitations - * under the License. 
- * - * SPDX-License-Identifier: Apache-2.0 - ******************************************************************************/ +/* + * ****************************************************************************** + * * + * * + * * This program and the accompanying materials are made available under the + * * terms of the Apache License, Version 2.0 which is available at + * * https://www.apache.org/licenses/LICENSE-2.0. + * * + * * See the NOTICE file distributed with this work for additional + * * information regarding copyright ownership. + * * Unless required by applicable law or agreed to in writing, software + * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT + * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the + * * License for the specific language governing permissions and limitations + * * under the License. + * * + * * SPDX-License-Identifier: Apache-2.0 + * ***************************************************************************** + */ // // @author Yurii Shyrma (iuriish@yahoo.com) diff --git a/libnd4j/include/ops/declarable/helpers/cuda/image_resize.cu b/libnd4j/include/ops/declarable/helpers/cuda/image_resize.cu index 3365d5d62..fb7ca52ae 100644 --- a/libnd4j/include/ops/declarable/helpers/cuda/image_resize.cu +++ b/libnd4j/include/ops/declarable/helpers/cuda/image_resize.cu @@ -1,19 +1,22 @@ -/******************************************************************************* - * Copyright (c) 2015-2018 Skymind, Inc. - * Copyright (c) 2019-2020 Konduit K.K. - * - * This program and the accompanying materials are made available under the - * terms of the Apache License, Version 2.0 which is available at - * https://www.apache.org/licenses/LICENSE-2.0. - * - * Unless required by applicable law or agreed to in writing, software - * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. 
See the - * License for the specific language governing permissions and limitations - * under the License. - * - * SPDX-License-Identifier: Apache-2.0 - ******************************************************************************/ +/* + * ****************************************************************************** + * * + * * + * * This program and the accompanying materials are made available under the + * * terms of the Apache License, Version 2.0 which is available at + * * https://www.apache.org/licenses/LICENSE-2.0. + * * + * * See the NOTICE file distributed with this work for additional + * * information regarding copyright ownership. + * * Unless required by applicable law or agreed to in writing, software + * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT + * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the + * * License for the specific language governing permissions and limitations + * * under the License. + * * + * * SPDX-License-Identifier: Apache-2.0 + * ***************************************************************************** + */ /* Copyright 2016 The TensorFlow Authors. All Rights Reserved. Licensed under the Apache License, Version 2.0 (the "License"); diff --git a/libnd4j/include/ops/declarable/helpers/cuda/lgamma.cu b/libnd4j/include/ops/declarable/helpers/cuda/lgamma.cu index 2de455d0f..7667c1c29 100644 --- a/libnd4j/include/ops/declarable/helpers/cuda/lgamma.cu +++ b/libnd4j/include/ops/declarable/helpers/cuda/lgamma.cu @@ -1,19 +1,22 @@ -/******************************************************************************* - * Copyright (c) 2015-2018 Skymind, Inc. - * Copyright (c) 2019 Konduit K.K. - * - * This program and the accompanying materials are made available under the - * terms of the Apache License, Version 2.0 which is available at - * https://www.apache.org/licenses/LICENSE-2.0. 
- * - * Unless required by applicable law or agreed to in writing, software - * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * License for the specific language governing permissions and limitations - * under the License. - * - * SPDX-License-Identifier: Apache-2.0 - ******************************************************************************/ +/* + * ****************************************************************************** + * * + * * + * * This program and the accompanying materials are made available under the + * * terms of the Apache License, Version 2.0 which is available at + * * https://www.apache.org/licenses/LICENSE-2.0. + * * + * * See the NOTICE file distributed with this work for additional + * * information regarding copyright ownership. + * * Unless required by applicable law or agreed to in writing, software + * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT + * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the + * * License for the specific language governing permissions and limitations + * * under the License. + * * + * * SPDX-License-Identifier: Apache-2.0 + * ***************************************************************************** + */ // // @author George A. Shulinok diff --git a/libnd4j/include/ops/declarable/helpers/cuda/lstsq.cu b/libnd4j/include/ops/declarable/helpers/cuda/lstsq.cu index b28efff80..b10bea439 100644 --- a/libnd4j/include/ops/declarable/helpers/cuda/lstsq.cu +++ b/libnd4j/include/ops/declarable/helpers/cuda/lstsq.cu @@ -1,18 +1,22 @@ -/******************************************************************************* - * Copyright (c) 2020 Konduit, K.K. - * - * This program and the accompanying materials are made available under the - * terms of the Apache License, Version 2.0 which is available at - * https://www.apache.org/licenses/LICENSE-2.0. 
- * - * Unless required by applicable law or agreed to in writing, software - * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * License for the specific language governing permissions and limitations - * under the License. - * - * SPDX-License-Identifier: Apache-2.0 - ******************************************************************************/ +/* + * ****************************************************************************** + * * + * * + * * This program and the accompanying materials are made available under the + * * terms of the Apache License, Version 2.0 which is available at + * * https://www.apache.org/licenses/LICENSE-2.0. + * * + * * See the NOTICE file distributed with this work for additional + * * information regarding copyright ownership. + * * Unless required by applicable law or agreed to in writing, software + * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT + * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the + * * License for the specific language governing permissions and limitations + * * under the License. + * * + * * SPDX-License-Identifier: Apache-2.0 + * ***************************************************************************** + */ // // @author GS diff --git a/libnd4j/include/ops/declarable/helpers/cuda/shift.cu b/libnd4j/include/ops/declarable/helpers/cuda/shift.cu index c69285ef2..206e1df46 100644 --- a/libnd4j/include/ops/declarable/helpers/cuda/shift.cu +++ b/libnd4j/include/ops/declarable/helpers/cuda/shift.cu @@ -1,18 +1,22 @@ -/******************************************************************************* - * Copyright (c) 2015-2019 Skymind, Inc. - * - * This program and the accompanying materials are made available under the - * terms of the Apache License, Version 2.0 which is available at - * https://www.apache.org/licenses/LICENSE-2.0. 
- * - * Unless required by applicable law or agreed to in writing, software - * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * License for the specific language governing permissions and limitations - * under the License. - * - * SPDX-License-Identifier: Apache-2.0 - ******************************************************************************/ +/* + * ****************************************************************************** + * * + * * + * * This program and the accompanying materials are made available under the + * * terms of the Apache License, Version 2.0 which is available at + * * https://www.apache.org/licenses/LICENSE-2.0. + * * + * * See the NOTICE file distributed with this work for additional + * * information regarding copyright ownership. + * * Unless required by applicable law or agreed to in writing, software + * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT + * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the + * * License for the specific language governing permissions and limitations + * * under the License. + * * + * * SPDX-License-Identifier: Apache-2.0 + * ***************************************************************************** + */ // // @author raver119@gmail.com diff --git a/libnd4j/include/ops/declarable/helpers/cuda/solve.cu b/libnd4j/include/ops/declarable/helpers/cuda/solve.cu index 43ef78c3e..324fe6a76 100644 --- a/libnd4j/include/ops/declarable/helpers/cuda/solve.cu +++ b/libnd4j/include/ops/declarable/helpers/cuda/solve.cu @@ -1,18 +1,22 @@ -/******************************************************************************* - * Copyright (c) 2020 Konduit, K.K. - * - * This program and the accompanying materials are made available under the - * terms of the Apache License, Version 2.0 which is available at - * https://www.apache.org/licenses/LICENSE-2.0. 
- * - * Unless required by applicable law or agreed to in writing, software - * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * License for the specific language governing permissions and limitations - * under the License. - * - * SPDX-License-Identifier: Apache-2.0 - ******************************************************************************/ +/* + * ****************************************************************************** + * * + * * + * * This program and the accompanying materials are made available under the + * * terms of the Apache License, Version 2.0 which is available at + * * https://www.apache.org/licenses/LICENSE-2.0. + * * + * * See the NOTICE file distributed with this work for additional + * * information regarding copyright ownership. + * * Unless required by applicable law or agreed to in writing, software + * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT + * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the + * * License for the specific language governing permissions and limitations + * * under the License. + * * + * * SPDX-License-Identifier: Apache-2.0 + * ***************************************************************************** + */ // // @author GS diff --git a/libnd4j/include/ops/declarable/helpers/cuda/split.cu b/libnd4j/include/ops/declarable/helpers/cuda/split.cu index 19c58b89e..f754866a7 100644 --- a/libnd4j/include/ops/declarable/helpers/cuda/split.cu +++ b/libnd4j/include/ops/declarable/helpers/cuda/split.cu @@ -1,19 +1,22 @@ -/******************************************************************************* - * Copyright (c) 2015-2018 Skymind, Inc. - * Copyright (c) 2019 Konduit K.K. 
- * - * This program and the accompanying materials are made available under the - * terms of the Apache License, Version 2.0 which is available at - * https://www.apache.org/licenses/LICENSE-2.0. - * - * Unless required by applicable law or agreed to in writing, software - * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * License for the specific language governing permissions and limitations - * under the License. - * - * SPDX-License-Identifier: Apache-2.0 - ******************************************************************************/ +/* + * ****************************************************************************** + * * + * * + * * This program and the accompanying materials are made available under the + * * terms of the Apache License, Version 2.0 which is available at + * * https://www.apache.org/licenses/LICENSE-2.0. + * * + * * See the NOTICE file distributed with this work for additional + * * information regarding copyright ownership. + * * Unless required by applicable law or agreed to in writing, software + * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT + * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the + * * License for the specific language governing permissions and limitations + * * under the License. 
+ * * + * * SPDX-License-Identifier: Apache-2.0 + * ***************************************************************************** + */ // // @author Yurii Shyrma (iuriish@yahoo.com) diff --git a/libnd4j/include/ops/declarable/helpers/cuda/transforms.cu b/libnd4j/include/ops/declarable/helpers/cuda/transforms.cu index f1b57f52c..b285e8cb8 100644 --- a/libnd4j/include/ops/declarable/helpers/cuda/transforms.cu +++ b/libnd4j/include/ops/declarable/helpers/cuda/transforms.cu @@ -1,19 +1,22 @@ -/******************************************************************************* - * Copyright (c) 2015-2018 Skymind, Inc. - * Copyright (c) 2019 Konduit K.K. - * - * This program and the accompanying materials are made available under the - * terms of the Apache License, Version 2.0 which is available at - * https://www.apache.org/licenses/LICENSE-2.0. - * - * Unless required by applicable law or agreed to in writing, software - * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * License for the specific language governing permissions and limitations - * under the License. - * - * SPDX-License-Identifier: Apache-2.0 - ******************************************************************************/ +/* + * ****************************************************************************** + * * + * * + * * This program and the accompanying materials are made available under the + * * terms of the Apache License, Version 2.0 which is available at + * * https://www.apache.org/licenses/LICENSE-2.0. + * * + * * See the NOTICE file distributed with this work for additional + * * information regarding copyright ownership. + * * Unless required by applicable law or agreed to in writing, software + * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT + * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. 
See the + * * License for the specific language governing permissions and limitations + * * under the License. + * * + * * SPDX-License-Identifier: Apache-2.0 + * ***************************************************************************** + */ // // @author Yurii Shyrma (iuriish@yahoo.com), created on 20.04.2018 diff --git a/libnd4j/include/ops/declarable/helpers/cuda/triangular_solve.cu b/libnd4j/include/ops/declarable/helpers/cuda/triangular_solve.cu index e77bb4e19..68d062441 100644 --- a/libnd4j/include/ops/declarable/helpers/cuda/triangular_solve.cu +++ b/libnd4j/include/ops/declarable/helpers/cuda/triangular_solve.cu @@ -1,18 +1,22 @@ -/******************************************************************************* - * Copyright (c) 2020 Konduit, K.K. - * - * This program and the accompanying materials are made available under the - * terms of the Apache License, Version 2.0 which is available at - * https://www.apache.org/licenses/LICENSE-2.0. - * - * Unless required by applicable law or agreed to in writing, software - * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * License for the specific language governing permissions and limitations - * under the License. - * - * SPDX-License-Identifier: Apache-2.0 - ******************************************************************************/ +/* + * ****************************************************************************** + * * + * * + * * This program and the accompanying materials are made available under the + * * terms of the Apache License, Version 2.0 which is available at + * * https://www.apache.org/licenses/LICENSE-2.0. + * * + * * See the NOTICE file distributed with this work for additional + * * information regarding copyright ownership. 
+ * * Unless required by applicable law or agreed to in writing, software + * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT + * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the + * * License for the specific language governing permissions and limitations + * * under the License. + * * + * * SPDX-License-Identifier: Apache-2.0 + * ***************************************************************************** + */ // // @author GS diff --git a/libnd4j/include/ops/declarable/helpers/gammaMathFunc.h b/libnd4j/include/ops/declarable/helpers/gammaMathFunc.h index 2f99f3777..2e3f8b3f8 100644 --- a/libnd4j/include/ops/declarable/helpers/gammaMathFunc.h +++ b/libnd4j/include/ops/declarable/helpers/gammaMathFunc.h @@ -1,19 +1,22 @@ -/******************************************************************************* - * Copyright (c) 2015-2018 Skymind, Inc. - * Copyright (c) 2019 Konduit K.K. - * - * This program and the accompanying materials are made available under the - * terms of the Apache License, Version 2.0 which is available at - * https://www.apache.org/licenses/LICENSE-2.0. - * - * Unless required by applicable law or agreed to in writing, software - * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * License for the specific language governing permissions and limitations - * under the License. - * - * SPDX-License-Identifier: Apache-2.0 - ******************************************************************************/ +/* + * ****************************************************************************** + * * + * * + * * This program and the accompanying materials are made available under the + * * terms of the Apache License, Version 2.0 which is available at + * * https://www.apache.org/licenses/LICENSE-2.0. 
+ * * + * * See the NOTICE file distributed with this work for additional + * * information regarding copyright ownership. + * * Unless required by applicable law or agreed to in writing, software + * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT + * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the + * * License for the specific language governing permissions and limitations + * * under the License. + * * + * * SPDX-License-Identifier: Apache-2.0 + * ***************************************************************************** + */ // // @author Yurii Shyrma (iuriish@yahoo.com) diff --git a/libnd4j/include/ops/declarable/helpers/image_resize.h b/libnd4j/include/ops/declarable/helpers/image_resize.h index bd9e10b58..3dfa09247 100644 --- a/libnd4j/include/ops/declarable/helpers/image_resize.h +++ b/libnd4j/include/ops/declarable/helpers/image_resize.h @@ -1,19 +1,22 @@ -/******************************************************************************* - * Copyright (c) 2015-2018 Skymind, Inc. - * Copyright (c) 2019 Konduit K.K. - * - * This program and the accompanying materials are made available under the - * terms of the Apache License, Version 2.0 which is available at - * https://www.apache.org/licenses/LICENSE-2.0. - * - * Unless required by applicable law or agreed to in writing, software - * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * License for the specific language governing permissions and limitations - * under the License. 
- * - * SPDX-License-Identifier: Apache-2.0 - ******************************************************************************/ +/* + * ****************************************************************************** + * * + * * + * * This program and the accompanying materials are made available under the + * * terms of the Apache License, Version 2.0 which is available at + * * https://www.apache.org/licenses/LICENSE-2.0. + * * + * * See the NOTICE file distributed with this work for additional + * * information regarding copyright ownership. + * * Unless required by applicable law or agreed to in writing, software + * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT + * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the + * * License for the specific language governing permissions and limitations + * * under the License. + * * + * * SPDX-License-Identifier: Apache-2.0 + * ***************************************************************************** + */ // // @author sgazeos@gmail.com diff --git a/libnd4j/include/ops/declarable/helpers/impl/gru.cpp b/libnd4j/include/ops/declarable/helpers/impl/gru.cpp index 277188428..be16a0669 100644 --- a/libnd4j/include/ops/declarable/helpers/impl/gru.cpp +++ b/libnd4j/include/ops/declarable/helpers/impl/gru.cpp @@ -1,19 +1,22 @@ -/******************************************************************************* - * Copyright (c) 2015-2018 Skymind, Inc. - * Copyright (c) 2020 Konduit K.K. - * - * ThnIn program and the accompanying materials are made available under the - * terms of the Apache License, Version 2.0 which nIn available at - * https://www.apache.org/licenses/LICENSE-2.0. - * - * Unless required by applicable law or agreed to in writing, software - * dnIntributed under the License nIn dnIntributed on an "AS nIn" BASnIn, WITHOUT - * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. 
See the - * License for the specific language governing permnInsions and limitations - * under the License. - * - * SPDX-License-Identifier: Apache-2.0 - ******************************************************************************/ +/* + * ****************************************************************************** + * * + * * + * * This program and the accompanying materials are made available under the + * * terms of the Apache License, Version 2.0 which is available at + * * https://www.apache.org/licenses/LICENSE-2.0. + * * + * * See the NOTICE file distributed with this work for additional + * * information regarding copyright ownership. + * * Unless required by applicable law or agreed to in writing, software + * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT + * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the + * * License for the specific language governing permissions and limitations + * * under the License. + * * + * * SPDX-License-Identifier: Apache-2.0 + * ***************************************************************************** + */ // // @author Yurii Shyrma (iuriish@yahoo.com), created on 15.02.2018, Alex Black diff --git a/libnd4j/include/ops/declarable/helpers/impl/lstm.cpp b/libnd4j/include/ops/declarable/helpers/impl/lstm.cpp index 4ab585e26..7cb395c65 100644 --- a/libnd4j/include/ops/declarable/helpers/impl/lstm.cpp +++ b/libnd4j/include/ops/declarable/helpers/impl/lstm.cpp @@ -1,18 +1,22 @@ -/******************************************************************************* - * Copyright (c) 2015-2019 Skymind, Inc. - * - * This program and the accompanying materials are made available under the - * terms of the Apache License, Version 2.0 which is available at - * https://www.apache.org/licenses/LICENSE-2.0. 
- * - * Unless required by applicable law or agreed to in writing, software - * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * License for the specific language governing permissions and limitations - * under the License. - * - * SPDX-License-Identifier: Apache-2.0 - ******************************************************************************/ +/* + * ****************************************************************************** + * * + * * + * * This program and the accompanying materials are made available under the + * * terms of the Apache License, Version 2.0 which is available at + * * https://www.apache.org/licenses/LICENSE-2.0. + * * + * * See the NOTICE file distributed with this work for additional + * * information regarding copyright ownership. + * * Unless required by applicable law or agreed to in writing, software + * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT + * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the + * * License for the specific language governing permissions and limitations + * * under the License. + * * + * * SPDX-License-Identifier: Apache-2.0 + * ***************************************************************************** + */ // // @author Yurii Shyrma, created on 14.02.2018 diff --git a/libnd4j/include/ops/declarable/helpers/impl/sparse_to_dense.cpp b/libnd4j/include/ops/declarable/helpers/impl/sparse_to_dense.cpp index 4baa36d65..d10844ebb 100644 --- a/libnd4j/include/ops/declarable/helpers/impl/sparse_to_dense.cpp +++ b/libnd4j/include/ops/declarable/helpers/impl/sparse_to_dense.cpp @@ -1,18 +1,22 @@ -/******************************************************************************* - * Copyright (c) 2015-2019 Skymind, Inc. 
- * - * This program and the accompanying materials are made available under the - * terms of the Apache License, Version 2.0 which is available at - * https://www.apache.org/licenses/LICENSE-2.0. - * - * Unless required by applicable law or agreed to in writing, software - * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * License for the specific language governing permissions and limitations - * under the License. - * - * SPDX-License-Identifier: Apache-2.0 - ******************************************************************************/ +/* + * ****************************************************************************** + * * + * * + * * This program and the accompanying materials are made available under the + * * terms of the Apache License, Version 2.0 which is available at + * * https://www.apache.org/licenses/LICENSE-2.0. + * * + * * See the NOTICE file distributed with this work for additional + * * information regarding copyright ownership. + * * Unless required by applicable law or agreed to in writing, software + * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT + * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the + * * License for the specific language governing permissions and limitations + * * under the License. + * * + * * SPDX-License-Identifier: Apache-2.0 + * ***************************************************************************** + */ // // @author raver119@gmail.com diff --git a/libnd4j/include/ops/declarable/helpers/lgamma.h b/libnd4j/include/ops/declarable/helpers/lgamma.h index 184e33556..2f71fa5a1 100644 --- a/libnd4j/include/ops/declarable/helpers/lgamma.h +++ b/libnd4j/include/ops/declarable/helpers/lgamma.h @@ -1,19 +1,22 @@ -/******************************************************************************* - * Copyright (c) 2015-2018 Skymind, Inc. 
- * Copyright (c) 2020 Konduit K.K. - * - * This program and the accompanying materials are made available under the - * terms of the Apache License, Version 2.0 which is available at - * https://www.apache.org/licenses/LICENSE-2.0. - * - * Unless required by applicable law or agreed to in writing, software - * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * License for the specific language governing permissions and limitations - * under the License. - * - * SPDX-License-Identifier: Apache-2.0 - ******************************************************************************/ +/* + * ****************************************************************************** + * * + * * + * * This program and the accompanying materials are made available under the + * * terms of the Apache License, Version 2.0 which is available at + * * https://www.apache.org/licenses/LICENSE-2.0. + * * + * * See the NOTICE file distributed with this work for additional + * * information regarding copyright ownership. + * * Unless required by applicable law or agreed to in writing, software + * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT + * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the + * * License for the specific language governing permissions and limitations + * * under the License. + * * + * * SPDX-License-Identifier: Apache-2.0 + * ***************************************************************************** + */ // // @author George A. 
Shulinok diff --git a/libnd4j/include/ops/declarable/helpers/lstmBlock.h b/libnd4j/include/ops/declarable/helpers/lstmBlock.h index 7df9bb795..e7ba221d6 100644 --- a/libnd4j/include/ops/declarable/helpers/lstmBlock.h +++ b/libnd4j/include/ops/declarable/helpers/lstmBlock.h @@ -1,18 +1,22 @@ -/******************************************************************************* - * Copyright (c) 2015-2019 Skymind, Inc. - * - * This program and the accompanying materials are made available under the - * terms of the Apache License, Version 2.0 which is available at - * https://www.apache.org/licenses/LICENSE-2.0. - * - * Unless required by applicable law or agreed to in writing, software - * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * License for the specific language governing permissions and limitations - * under the License. - * - * SPDX-License-Identifier: Apache-2.0 - ******************************************************************************/ +/* + * ****************************************************************************** + * * + * * + * * This program and the accompanying materials are made available under the + * * terms of the Apache License, Version 2.0 which is available at + * * https://www.apache.org/licenses/LICENSE-2.0. + * * + * * See the NOTICE file distributed with this work for additional + * * information regarding copyright ownership. + * * Unless required by applicable law or agreed to in writing, software + * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT + * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the + * * License for the specific language governing permissions and limitations + * * under the License. 
+ * * + * * SPDX-License-Identifier: Apache-2.0 + * ***************************************************************************** + */ // // @author Yurii Shyrma, created on 14.02.2018 diff --git a/libnd4j/include/ops/declarable/helpers/lstsq.h b/libnd4j/include/ops/declarable/helpers/lstsq.h index 9cc629383..cee8e47be 100644 --- a/libnd4j/include/ops/declarable/helpers/lstsq.h +++ b/libnd4j/include/ops/declarable/helpers/lstsq.h @@ -1,18 +1,22 @@ -/******************************************************************************* - * Copyright (c) Konduit K.K. - * - * This program and the accompanying materials are made available under the - * terms of the Apache License, Version 2.0 which is available at - * https://www.apache.org/licenses/LICENSE-2.0. - * - * Unless required by applicable law or agreed to in writing, software - * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * License for the specific language governing permissions and limitations - * under the License. - * - * SPDX-License-Identifier: Apache-2.0 - ******************************************************************************/ +/* + * ****************************************************************************** + * * + * * + * * This program and the accompanying materials are made available under the + * * terms of the Apache License, Version 2.0 which is available at + * * https://www.apache.org/licenses/LICENSE-2.0. + * * + * * See the NOTICE file distributed with this work for additional + * * information regarding copyright ownership. + * * Unless required by applicable law or agreed to in writing, software + * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT + * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the + * * License for the specific language governing permissions and limitations + * * under the License. 
+ * * + * * SPDX-License-Identifier: Apache-2.0 + * ***************************************************************************** + */ // // @author GS diff --git a/libnd4j/include/ops/declarable/helpers/shift.h b/libnd4j/include/ops/declarable/helpers/shift.h index f1b21741c..0c3e8f033 100644 --- a/libnd4j/include/ops/declarable/helpers/shift.h +++ b/libnd4j/include/ops/declarable/helpers/shift.h @@ -1,18 +1,22 @@ -/******************************************************************************* - * Copyright (c) 2015-2019 Skymind, Inc. - * - * This program and the accompanying materials are made available under the - * terms of the Apache License, Version 2.0 which is available at - * https://www.apache.org/licenses/LICENSE-2.0. - * - * Unless required by applicable law or agreed to in writing, software - * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * License for the specific language governing permissions and limitations - * under the License. - * - * SPDX-License-Identifier: Apache-2.0 - ******************************************************************************/ +/* + * ****************************************************************************** + * * + * * + * * This program and the accompanying materials are made available under the + * * terms of the Apache License, Version 2.0 which is available at + * * https://www.apache.org/licenses/LICENSE-2.0. + * * + * * See the NOTICE file distributed with this work for additional + * * information regarding copyright ownership. + * * Unless required by applicable law or agreed to in writing, software + * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT + * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the + * * License for the specific language governing permissions and limitations + * * under the License. 
+ * * + * * SPDX-License-Identifier: Apache-2.0 + * ***************************************************************************** + */ // // @author raver119@gmail.com diff --git a/libnd4j/include/ops/declarable/helpers/solve.h b/libnd4j/include/ops/declarable/helpers/solve.h index 17234f313..a138029ae 100644 --- a/libnd4j/include/ops/declarable/helpers/solve.h +++ b/libnd4j/include/ops/declarable/helpers/solve.h @@ -1,18 +1,22 @@ -/******************************************************************************* - * Copyright (c) Konduit K.K. - * - * This program and the accompanying materials are made available under the - * terms of the Apache License, Version 2.0 which is available at - * https://www.apache.org/licenses/LICENSE-2.0. - * - * Unless required by applicable law or agreed to in writing, software - * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * License for the specific language governing permissions and limitations - * under the License. - * - * SPDX-License-Identifier: Apache-2.0 - ******************************************************************************/ +/* + * ****************************************************************************** + * * + * * + * * This program and the accompanying materials are made available under the + * * terms of the Apache License, Version 2.0 which is available at + * * https://www.apache.org/licenses/LICENSE-2.0. + * * + * * See the NOTICE file distributed with this work for additional + * * information regarding copyright ownership. + * * Unless required by applicable law or agreed to in writing, software + * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT + * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the + * * License for the specific language governing permissions and limitations + * * under the License. 
+ * * + * * SPDX-License-Identifier: Apache-2.0 + * ***************************************************************************** + */ // // @author GS diff --git a/libnd4j/include/ops/declarable/helpers/sparse_to_dense.h b/libnd4j/include/ops/declarable/helpers/sparse_to_dense.h index 541621257..3df845141 100644 --- a/libnd4j/include/ops/declarable/helpers/sparse_to_dense.h +++ b/libnd4j/include/ops/declarable/helpers/sparse_to_dense.h @@ -1,18 +1,22 @@ -/******************************************************************************* - * Copyright (c) 2015-2019 Skymind, Inc. - * - * This program and the accompanying materials are made available under the - * terms of the Apache License, Version 2.0 which is available at - * https://www.apache.org/licenses/LICENSE-2.0. - * - * Unless required by applicable law or agreed to in writing, software - * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * License for the specific language governing permissions and limitations - * under the License. - * - * SPDX-License-Identifier: Apache-2.0 - ******************************************************************************/ +/* + * ****************************************************************************** + * * + * * + * * This program and the accompanying materials are made available under the + * * terms of the Apache License, Version 2.0 which is available at + * * https://www.apache.org/licenses/LICENSE-2.0. + * * + * * See the NOTICE file distributed with this work for additional + * * information regarding copyright ownership. + * * Unless required by applicable law or agreed to in writing, software + * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT + * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the + * * License for the specific language governing permissions and limitations + * * under the License. 
+ * * + * * SPDX-License-Identifier: Apache-2.0 + * ***************************************************************************** + */ // // @author raver119@gmail.com diff --git a/libnd4j/include/ops/declarable/helpers/triangular_solve.h b/libnd4j/include/ops/declarable/helpers/triangular_solve.h index 94e0198af..d0d099998 100644 --- a/libnd4j/include/ops/declarable/helpers/triangular_solve.h +++ b/libnd4j/include/ops/declarable/helpers/triangular_solve.h @@ -1,18 +1,22 @@ -/******************************************************************************* - * Copyright (c) Konduit K.K. - * - * This program and the accompanying materials are made available under the - * terms of the Apache License, Version 2.0 which is available at - * https://www.apache.org/licenses/LICENSE-2.0. - * - * Unless required by applicable law or agreed to in writing, software - * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * License for the specific language governing permissions and limitations - * under the License. - * - * SPDX-License-Identifier: Apache-2.0 - ******************************************************************************/ +/* + * ****************************************************************************** + * * + * * + * * This program and the accompanying materials are made available under the + * * terms of the Apache License, Version 2.0 which is available at + * * https://www.apache.org/licenses/LICENSE-2.0. + * * + * * See the NOTICE file distributed with this work for additional + * * information regarding copyright ownership. + * * Unless required by applicable law or agreed to in writing, software + * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT + * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. 
See the + * * License for the specific language governing permissions and limitations + * * under the License. + * * + * * SPDX-License-Identifier: Apache-2.0 + * ***************************************************************************** + */ // // @author GS diff --git a/libnd4j/include/ops/declarable/helpers/updatersHelpers.h b/libnd4j/include/ops/declarable/helpers/updatersHelpers.h index 5bd89b487..2bc6d7d12 100644 --- a/libnd4j/include/ops/declarable/helpers/updatersHelpers.h +++ b/libnd4j/include/ops/declarable/helpers/updatersHelpers.h @@ -1,18 +1,22 @@ -/******************************************************************************* - * Copyright (c) 2019-2020 Konduit K.K. - * - * This program and the accompanying materials are made available under the - * terms of the Apache License, Version 2.0 which is available at - * https://www.apache.org/licenses/LICENSE-2.0. - * - * Unless required by applicable law or agreed to in writing, software - * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * License for the specific language governing permissions and limitations - * under the License. - * - * SPDX-License-Identifier: Apache-2.0 - ******************************************************************************/ +/* + * ****************************************************************************** + * * + * * + * * This program and the accompanying materials are made available under the + * * terms of the Apache License, Version 2.0 which is available at + * * https://www.apache.org/licenses/LICENSE-2.0. + * * + * * See the NOTICE file distributed with this work for additional + * * information regarding copyright ownership. + * * Unless required by applicable law or agreed to in writing, software + * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT + * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. 
See the + * * License for the specific language governing permissions and limitations + * * under the License. + * * + * * SPDX-License-Identifier: Apache-2.0 + * ***************************************************************************** + */ // // @author Oleh Semeniv (oleg.semeniv@gmail.com) diff --git a/libnd4j/include/ops/declarable/platform/armcompute/armcomputeUtils.cpp b/libnd4j/include/ops/declarable/platform/armcompute/armcomputeUtils.cpp index 64d254167..4d7d5f41e 100644 --- a/libnd4j/include/ops/declarable/platform/armcompute/armcomputeUtils.cpp +++ b/libnd4j/include/ops/declarable/platform/armcompute/armcomputeUtils.cpp @@ -1,17 +1,22 @@ -/******************************************************************************* - * Copyright (c) 2019 Konduit K.K. - * This program and the accompanying materials are made available under the - * terms of the Apache License, Version 2.0 which is available at - * https://www.apache.org/licenses/LICENSE-2.0. - * - * Unless required by applicable law or agreed to in writing, software - * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * License for the specific language governing permissions and limitations - * under the License. - * - * SPDX-License-Identifier: Apache-2.0 - ******************************************************************************/ +/* + * ****************************************************************************** + * * + * * + * * This program and the accompanying materials are made available under the + * * terms of the Apache License, Version 2.0 which is available at + * * https://www.apache.org/licenses/LICENSE-2.0. + * * + * * See the NOTICE file distributed with this work for additional + * * information regarding copyright ownership. 
+ * * Unless required by applicable law or agreed to in writing, software + * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT + * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the + * * License for the specific language governing permissions and limitations + * * under the License. + * * + * * SPDX-License-Identifier: Apache-2.0 + * ***************************************************************************** + */ // Created by Abdelrauf 2020 diff --git a/libnd4j/include/ops/declarable/platform/armcompute/armcomputeUtils.h b/libnd4j/include/ops/declarable/platform/armcompute/armcomputeUtils.h index b95d68b1e..9ef31cf1f 100644 --- a/libnd4j/include/ops/declarable/platform/armcompute/armcomputeUtils.h +++ b/libnd4j/include/ops/declarable/platform/armcompute/armcomputeUtils.h @@ -1,17 +1,22 @@ -/******************************************************************************* - * Copyright (c) 2019 Konduit K.K. - * This program and the accompanying materials are made available under the - * terms of the Apache License, Version 2.0 which is available at - * https://www.apache.org/licenses/LICENSE-2.0. - * - * Unless required by applicable law or agreed to in writing, software - * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * License for the specific language governing permissions and limitations - * under the License. - * - * SPDX-License-Identifier: Apache-2.0 - ******************************************************************************/ +/* + * ****************************************************************************** + * * + * * + * * This program and the accompanying materials are made available under the + * * terms of the Apache License, Version 2.0 which is available at + * * https://www.apache.org/licenses/LICENSE-2.0. 
+ * * + * * See the NOTICE file distributed with this work for additional + * * information regarding copyright ownership. + * * Unless required by applicable law or agreed to in writing, software + * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT + * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the + * * License for the specific language governing permissions and limitations + * * under the License. + * * + * * SPDX-License-Identifier: Apache-2.0 + * ***************************************************************************** + */ // Created by Abdelrauf 2020 diff --git a/libnd4j/include/ops/declarable/platform/armcompute/avgpooling2d.cpp b/libnd4j/include/ops/declarable/platform/armcompute/avgpooling2d.cpp index 6c43a1ce2..21adaf9be 100644 --- a/libnd4j/include/ops/declarable/platform/armcompute/avgpooling2d.cpp +++ b/libnd4j/include/ops/declarable/platform/armcompute/avgpooling2d.cpp @@ -1,17 +1,22 @@ -/******************************************************************************* - * Copyright (c) 2019 Konduit K.K. - * This program and the accompanying materials are made available under the - * terms of the Apache License, Version 2.0 which is available at - * https://www.apache.org/licenses/LICENSE-2.0. - * - * Unless required by applicable law or agreed to in writing, software - * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * License for the specific language governing permissions and limitations - * under the License. 
- * - * SPDX-License-Identifier: Apache-2.0 - ******************************************************************************/ +/* + * ****************************************************************************** + * * + * * + * * This program and the accompanying materials are made available under the + * * terms of the Apache License, Version 2.0 which is available at + * * https://www.apache.org/licenses/LICENSE-2.0. + * * + * * See the NOTICE file distributed with this work for additional + * * information regarding copyright ownership. + * * Unless required by applicable law or agreed to in writing, software + * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT + * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the + * * License for the specific language governing permissions and limitations + * * under the License. + * * + * * SPDX-License-Identifier: Apache-2.0 + * ***************************************************************************** + */ // Created by Abdelrauf (rauf@konduit.ai) 2020 diff --git a/libnd4j/include/ops/declarable/platform/armcompute/conv2d.cpp b/libnd4j/include/ops/declarable/platform/armcompute/conv2d.cpp index d361ce4e1..4b6c89fea 100644 --- a/libnd4j/include/ops/declarable/platform/armcompute/conv2d.cpp +++ b/libnd4j/include/ops/declarable/platform/armcompute/conv2d.cpp @@ -1,17 +1,22 @@ -/******************************************************************************* - * Copyright (c) 2019 Konduit K.K. - * This program and the accompanying materials are made available under the - * terms of the Apache License, Version 2.0 which is available at - * https://www.apache.org/licenses/LICENSE-2.0. - * - * Unless required by applicable law or agreed to in writing, software - * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. 
See the - * License for the specific language governing permissions and limitations - * under the License. - * - * SPDX-License-Identifier: Apache-2.0 - ******************************************************************************/ +/* + * ****************************************************************************** + * * + * * + * * This program and the accompanying materials are made available under the + * * terms of the Apache License, Version 2.0 which is available at + * * https://www.apache.org/licenses/LICENSE-2.0. + * * + * * See the NOTICE file distributed with this work for additional + * * information regarding copyright ownership. + * * Unless required by applicable law or agreed to in writing, software + * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT + * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the + * * License for the specific language governing permissions and limitations + * * under the License. + * * + * * SPDX-License-Identifier: Apache-2.0 + * ***************************************************************************** + */ // Created by Abdelrauf (rauf@konduit.ai) 2020 diff --git a/libnd4j/include/ops/declarable/platform/armcompute/deconv2d.cpp b/libnd4j/include/ops/declarable/platform/armcompute/deconv2d.cpp index 06742983d..481060073 100644 --- a/libnd4j/include/ops/declarable/platform/armcompute/deconv2d.cpp +++ b/libnd4j/include/ops/declarable/platform/armcompute/deconv2d.cpp @@ -1,17 +1,22 @@ -/******************************************************************************* - * Copyright (c) 2019 Konduit K.K. - * This program and the accompanying materials are made available under the - * terms of the Apache License, Version 2.0 which is available at - * https://www.apache.org/licenses/LICENSE-2.0. 
- * - * Unless required by applicable law or agreed to in writing, software - * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * License for the specific language governing permissions and limitations - * under the License. - * - * SPDX-License-Identifier: Apache-2.0 - ******************************************************************************/ +/* + * ****************************************************************************** + * * + * * + * * This program and the accompanying materials are made available under the + * * terms of the Apache License, Version 2.0 which is available at + * * https://www.apache.org/licenses/LICENSE-2.0. + * * + * * See the NOTICE file distributed with this work for additional + * * information regarding copyright ownership. + * * Unless required by applicable law or agreed to in writing, software + * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT + * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the + * * License for the specific language governing permissions and limitations + * * under the License. + * * + * * SPDX-License-Identifier: Apache-2.0 + * ***************************************************************************** + */ // Created by Abdelrauf (rauf@konduit.ai) 2020 diff --git a/libnd4j/include/ops/declarable/platform/armcompute/maxpooling2d.cpp b/libnd4j/include/ops/declarable/platform/armcompute/maxpooling2d.cpp index f06a0441b..09634efff 100644 --- a/libnd4j/include/ops/declarable/platform/armcompute/maxpooling2d.cpp +++ b/libnd4j/include/ops/declarable/platform/armcompute/maxpooling2d.cpp @@ -1,17 +1,22 @@ -/******************************************************************************* - * Copyright (c) 2019 Konduit K.K. 
- * This program and the accompanying materials are made available under the - * terms of the Apache License, Version 2.0 which is available at - * https://www.apache.org/licenses/LICENSE-2.0. - * - * Unless required by applicable law or agreed to in writing, software - * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * License for the specific language governing permissions and limitations - * under the License. - * - * SPDX-License-Identifier: Apache-2.0 - ******************************************************************************/ +/* + * ****************************************************************************** + * * + * * + * * This program and the accompanying materials are made available under the + * * terms of the Apache License, Version 2.0 which is available at + * * https://www.apache.org/licenses/LICENSE-2.0. + * * + * * See the NOTICE file distributed with this work for additional + * * information regarding copyright ownership. + * * Unless required by applicable law or agreed to in writing, software + * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT + * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the + * * License for the specific language governing permissions and limitations + * * under the License. 
+ * * + * * SPDX-License-Identifier: Apache-2.0 + * ***************************************************************************** + */ // Created by Abdelrauf 2020 diff --git a/libnd4j/include/ops/declarable/platform/mkldnn/batchnorm.cpp b/libnd4j/include/ops/declarable/platform/mkldnn/batchnorm.cpp index 6e0b1685a..3d2d266c7 100644 --- a/libnd4j/include/ops/declarable/platform/mkldnn/batchnorm.cpp +++ b/libnd4j/include/ops/declarable/platform/mkldnn/batchnorm.cpp @@ -1,19 +1,22 @@ -/******************************************************************************* - * Copyright (c) 2015-2018 Skymind, Inc. - * Copyright (c) 2019 Konduit K.K. - * - * This program and the accompanying materials are made available under the - * terms of the Apache License, Version 2.0 which is available at - * https://www.apache.org/licenses/LICENSE-2.0. - * - * Unless required by applicable law or agreed to in writing, software - * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * License for the specific language governing permissions and limitations - * under the License. - * - * SPDX-License-Identifier: Apache-2.0 - ******************************************************************************/ +/* + * ****************************************************************************** + * * + * * + * * This program and the accompanying materials are made available under the + * * terms of the Apache License, Version 2.0 which is available at + * * https://www.apache.org/licenses/LICENSE-2.0. + * * + * * See the NOTICE file distributed with this work for additional + * * information regarding copyright ownership. + * * Unless required by applicable law or agreed to in writing, software + * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT + * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. 
See the + * * License for the specific language governing permissions and limitations + * * under the License. + * * + * * SPDX-License-Identifier: Apache-2.0 + * ***************************************************************************** + */ // // @author saudet diff --git a/libnd4j/include/ops/declarable/platform/mkldnn/depthwiseConv2d.cpp b/libnd4j/include/ops/declarable/platform/mkldnn/depthwiseConv2d.cpp index 938494d5a..7706b2b07 100644 --- a/libnd4j/include/ops/declarable/platform/mkldnn/depthwiseConv2d.cpp +++ b/libnd4j/include/ops/declarable/platform/mkldnn/depthwiseConv2d.cpp @@ -1,19 +1,22 @@ -/******************************************************************************* - * Copyright (c) 2015-2018 Skymind, Inc. - * Copyright (c) 2019 Konduit K.K. - * - * This program and the accompanying materials are made available under the - * terms of the Apache License, Version 2.0 which is available at - * https://www.apache.org/licenses/LICENSE-2.0. - * - * Unless required by applicable law or agreed to in writing, software - * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * License for the specific language governing permissions and limitations - * under the License. - * - * SPDX-License-Identifier: Apache-2.0 - ******************************************************************************/ +/* + * ****************************************************************************** + * * + * * + * * This program and the accompanying materials are made available under the + * * terms of the Apache License, Version 2.0 which is available at + * * https://www.apache.org/licenses/LICENSE-2.0. + * * + * * See the NOTICE file distributed with this work for additional + * * information regarding copyright ownership. 
+ * * Unless required by applicable law or agreed to in writing, software + * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT + * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the + * * License for the specific language governing permissions and limitations + * * under the License. + * * + * * SPDX-License-Identifier: Apache-2.0 + * ***************************************************************************** + */ // // @author Yurii Shyrma (iuriish@yahoo.com) diff --git a/libnd4j/include/ops/declarable/platform/mkldnn/softmax.cpp b/libnd4j/include/ops/declarable/platform/mkldnn/softmax.cpp index 9935fd50f..8d30030e2 100644 --- a/libnd4j/include/ops/declarable/platform/mkldnn/softmax.cpp +++ b/libnd4j/include/ops/declarable/platform/mkldnn/softmax.cpp @@ -1,18 +1,22 @@ -/******************************************************************************* - * Copyright (c) 2019-2020 Konduit K.K. - * - * This program and the accompanying materials are made available under the - * terms of the Apache License, Version 2.0 which is available at - * https://www.apache.org/licenses/LICENSE-2.0. - * - * Unless required by applicable law or agreed to in writing, software - * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * License for the specific language governing permissions and limitations - * under the License. - * - * SPDX-License-Identifier: Apache-2.0 - ******************************************************************************/ +/* + * ****************************************************************************** + * * + * * + * * This program and the accompanying materials are made available under the + * * terms of the Apache License, Version 2.0 which is available at + * * https://www.apache.org/licenses/LICENSE-2.0. 
+ * * + * * See the NOTICE file distributed with this work for additional + * * information regarding copyright ownership. + * * Unless required by applicable law or agreed to in writing, software + * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT + * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the + * * License for the specific language governing permissions and limitations + * * under the License. + * * + * * SPDX-License-Identifier: Apache-2.0 + * ***************************************************************************** + */ // // @author Oleg Semeniv diff --git a/libnd4j/include/ops/declarable/platform/mkldnn/tanh.cpp b/libnd4j/include/ops/declarable/platform/mkldnn/tanh.cpp index a808239de..3d1fdbb62 100644 --- a/libnd4j/include/ops/declarable/platform/mkldnn/tanh.cpp +++ b/libnd4j/include/ops/declarable/platform/mkldnn/tanh.cpp @@ -1,18 +1,22 @@ -/******************************************************************************* - * Copyright (c) 2019-2020 Konduit K.K. - * - * This program and the accompanying materials are made available under the - * terms of the Apache License, Version 2.0 which is available at - * https://www.apache.org/licenses/LICENSE-2.0. - * - * Unless required by applicable law or agreed to in writing, software - * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * License for the specific language governing permissions and limitations - * under the License. - * - * SPDX-License-Identifier: Apache-2.0 - ******************************************************************************/ +/* + * ****************************************************************************** + * * + * * + * * This program and the accompanying materials are made available under the + * * terms of the Apache License, Version 2.0 which is available at + * * https://www.apache.org/licenses/LICENSE-2.0. 
+ * * + * * See the NOTICE file distributed with this work for additional + * * information regarding copyright ownership. + * * Unless required by applicable law or agreed to in writing, software + * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT + * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the + * * License for the specific language governing permissions and limitations + * * under the License. + * * + * * SPDX-License-Identifier: Apache-2.0 + * ***************************************************************************** + */ // // @author Oleg Semeniv diff --git a/libnd4j/include/ops/declarable/platform/mkldnn/xw_plus_b.cpp b/libnd4j/include/ops/declarable/platform/mkldnn/xw_plus_b.cpp index 1097ccd34..e00450f24 100644 --- a/libnd4j/include/ops/declarable/platform/mkldnn/xw_plus_b.cpp +++ b/libnd4j/include/ops/declarable/platform/mkldnn/xw_plus_b.cpp @@ -1,18 +1,22 @@ -/******************************************************************************* - * Copyright (c) 2019-2020 Konduit K.K. - * - * This program and the accompanying materials are made available under the - * terms of the Apache License, Version 2.0 which is available at - * https://www.apache.org/licenses/LICENSE-2.0. - * - * Unless required by applicable law or agreed to in writing, software - * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * License for the specific language governing permissions and limitations - * under the License. 
- * - * SPDX-License-Identifier: Apache-2.0 - ******************************************************************************/ +/* + * ****************************************************************************** + * * + * * + * * This program and the accompanying materials are made available under the + * * terms of the Apache License, Version 2.0 which is available at + * * https://www.apache.org/licenses/LICENSE-2.0. + * * + * * See the NOTICE file distributed with this work for additional + * * information regarding copyright ownership. + * * Unless required by applicable law or agreed to in writing, software + * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT + * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the + * * License for the specific language governing permissions and limitations + * * under the License. + * * + * * SPDX-License-Identifier: Apache-2.0 + * ***************************************************************************** + */ // // @author Oleg Semeniv diff --git a/libnd4j/include/ops/impl/compilation_units/specials_double.cpp.in b/libnd4j/include/ops/impl/compilation_units/specials_double.cpp.in index 00e0883f7..924e4c662 100644 --- a/libnd4j/include/ops/impl/compilation_units/specials_double.cpp.in +++ b/libnd4j/include/ops/impl/compilation_units/specials_double.cpp.in @@ -1,19 +1,22 @@ -/******************************************************************************* - * Copyright (c) 2015-2018 Skymind, Inc. - * Copyright (c) 2019-2020 Konduit K.K. - * - * This program and the accompanying materials are made available under the - * terms of the Apache License, Version 2.0 which is available at - * https://www.apache.org/licenses/LICENSE-2.0. - * - * Unless required by applicable law or agreed to in writing, software - * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. 
See the - * License for the specific language governing permissions and limitations - * under the License. - * - * SPDX-License-Identifier: Apache-2.0 - ******************************************************************************/ +/* + * ****************************************************************************** + * * + * * + * * This program and the accompanying materials are made available under the + * * terms of the Apache License, Version 2.0 which is available at + * * https://www.apache.org/licenses/LICENSE-2.0. + * * + * * See the NOTICE file distributed with this work for additional + * * information regarding copyright ownership. + * * Unless required by applicable law or agreed to in writing, software + * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT + * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the + * * License for the specific language governing permissions and limitations + * * under the License. + * * + * * SPDX-License-Identifier: Apache-2.0 + * ***************************************************************************** + */ // // @author raver119@gmail.com diff --git a/libnd4j/include/ops/impl/compilation_units/specials_single.cpp.in b/libnd4j/include/ops/impl/compilation_units/specials_single.cpp.in index 49110d829..b537293d1 100644 --- a/libnd4j/include/ops/impl/compilation_units/specials_single.cpp.in +++ b/libnd4j/include/ops/impl/compilation_units/specials_single.cpp.in @@ -1,19 +1,22 @@ -/******************************************************************************* - * Copyright (c) 2015-2018 Skymind, Inc. - * Copyright (c) 2019-2020 Konduit K.K. - * - * This program and the accompanying materials are made available under the - * terms of the Apache License, Version 2.0 which is available at - * https://www.apache.org/licenses/LICENSE-2.0. 
- * - * Unless required by applicable law or agreed to in writing, software - * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * License for the specific language governing permissions and limitations - * under the License. - * - * SPDX-License-Identifier: Apache-2.0 - ******************************************************************************/ +/* + * ****************************************************************************** + * * + * * + * * This program and the accompanying materials are made available under the + * * terms of the Apache License, Version 2.0 which is available at + * * https://www.apache.org/licenses/LICENSE-2.0. + * * + * * See the NOTICE file distributed with this work for additional + * * information regarding copyright ownership. + * * Unless required by applicable law or agreed to in writing, software + * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT + * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the + * * License for the specific language governing permissions and limitations + * * under the License. + * * + * * SPDX-License-Identifier: Apache-2.0 + * ***************************************************************************** + */ // // @author raver119@gmail.com diff --git a/libnd4j/include/system/nd4jmalloc.h b/libnd4j/include/system/nd4jmalloc.h index c808aad09..ec4f60501 100644 --- a/libnd4j/include/system/nd4jmalloc.h +++ b/libnd4j/include/system/nd4jmalloc.h @@ -1,18 +1,22 @@ -/******************************************************************************* - * Copyright (c) 2015-2019 Skymind, Inc. - * - * This program and the accompanying materials are made available under the - * terms of the Apache License, Version 2.0 which is available at - * https://www.apache.org/licenses/LICENSE-2.0. 
- * - * Unless required by applicable law or agreed to in writing, software - * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * License for the specific language governing permissions and limitations - * under the License. - * - * SPDX-License-Identifier: Apache-2.0 - ******************************************************************************/ +/* + * ****************************************************************************** + * * + * * + * * This program and the accompanying materials are made available under the + * * terms of the Apache License, Version 2.0 which is available at + * * https://www.apache.org/licenses/LICENSE-2.0. + * * + * * See the NOTICE file distributed with this work for additional + * * information regarding copyright ownership. + * * Unless required by applicable law or agreed to in writing, software + * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT + * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the + * * License for the specific language governing permissions and limitations + * * under the License. + * * + * * SPDX-License-Identifier: Apache-2.0 + * ***************************************************************************** + */ // // Created by Administrator on 3/6/2016. diff --git a/libnd4j/include/system/nd4jmemset.h b/libnd4j/include/system/nd4jmemset.h index 6482dcb8a..b02837a6f 100644 --- a/libnd4j/include/system/nd4jmemset.h +++ b/libnd4j/include/system/nd4jmemset.h @@ -1,18 +1,22 @@ -/******************************************************************************* - * Copyright (c) 2015-2019 Skymind, Inc. - * - * This program and the accompanying materials are made available under the - * terms of the Apache License, Version 2.0 which is available at - * https://www.apache.org/licenses/LICENSE-2.0. 
- * - * Unless required by applicable law or agreed to in writing, software - * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * License for the specific language governing permissions and limitations - * under the License. - * - * SPDX-License-Identifier: Apache-2.0 - ******************************************************************************/ +/* + * ****************************************************************************** + * * + * * + * * This program and the accompanying materials are made available under the + * * terms of the Apache License, Version 2.0 which is available at + * * https://www.apache.org/licenses/LICENSE-2.0. + * * + * * See the NOTICE file distributed with this work for additional + * * information regarding copyright ownership. + * * Unless required by applicable law or agreed to in writing, software + * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT + * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the + * * License for the specific language governing permissions and limitations + * * under the License. + * * + * * SPDX-License-Identifier: Apache-2.0 + * ***************************************************************************** + */ // // Created by Administrator on 3/6/2016. diff --git a/libnd4j/include/system/pairwise_util.h b/libnd4j/include/system/pairwise_util.h index d9e0965c8..65ff05286 100755 --- a/libnd4j/include/system/pairwise_util.h +++ b/libnd4j/include/system/pairwise_util.h @@ -1,19 +1,22 @@ -/******************************************************************************* - * Copyright (c) 2015-2018 Skymind, Inc. - * Copyright (c) 2019 Konduit K.K. - * - * This program and the accompanying materials are made available under the - * terms of the Apache License, Version 2.0 which is available at - * https://www.apache.org/licenses/LICENSE-2.0. 
- * - * Unless required by applicable law or agreed to in writing, software - * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * License for the specific language governing permissions and limitations - * under the License. - * - * SPDX-License-Identifier: Apache-2.0 - ******************************************************************************/ +/* + * ****************************************************************************** + * * + * * + * * This program and the accompanying materials are made available under the + * * terms of the Apache License, Version 2.0 which is available at + * * https://www.apache.org/licenses/LICENSE-2.0. + * * + * * See the NOTICE file distributed with this work for additional + * * information regarding copyright ownership. + * * Unless required by applicable law or agreed to in writing, software + * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT + * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the + * * License for the specific language governing permissions and limitations + * * under the License. + * * + * * SPDX-License-Identifier: Apache-2.0 + * ***************************************************************************** + */ // // Created by agibsonccc on 2/15/16. diff --git a/libnd4j/tests_cpu/layers_tests/AttentionTests.cpp b/libnd4j/tests_cpu/layers_tests/AttentionTests.cpp index ab6d50b53..74a7e5e6b 100644 --- a/libnd4j/tests_cpu/layers_tests/AttentionTests.cpp +++ b/libnd4j/tests_cpu/layers_tests/AttentionTests.cpp @@ -1,18 +1,22 @@ -/******************************************************************************* - * Copyright (c) 2015-2019 Skymind, Inc. - * - * This program and the accompanying materials are made available under the - * terms of the Apache License, Version 2.0 which is available at - * https://www.apache.org/licenses/LICENSE-2.0. 
- * - * Unless required by applicable law or agreed to in writing, software - * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * License for the specific language governing permissions and limitations - * under the License. - * - * SPDX-License-Identifier: Apache-2.0 - ******************************************************************************/ +/* + * ****************************************************************************** + * * + * * + * * This program and the accompanying materials are made available under the + * * terms of the Apache License, Version 2.0 which is available at + * * https://www.apache.org/licenses/LICENSE-2.0. + * * + * * See the NOTICE file distributed with this work for additional + * * information regarding copyright ownership. + * * Unless required by applicable law or agreed to in writing, software + * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT + * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the + * * License for the specific language governing permissions and limitations + * * under the License. + * * + * * SPDX-License-Identifier: Apache-2.0 + * ***************************************************************************** + */ // // @author raver119@gmail.com diff --git a/libnd4j/tests_cpu/layers_tests/DeclarableOpsTests10.cpp b/libnd4j/tests_cpu/layers_tests/DeclarableOpsTests10.cpp index 2ffc2c22d..01403e968 100644 --- a/libnd4j/tests_cpu/layers_tests/DeclarableOpsTests10.cpp +++ b/libnd4j/tests_cpu/layers_tests/DeclarableOpsTests10.cpp @@ -1,19 +1,22 @@ -/******************************************************************************* - * Copyright (c) 2015-2018 Skymind, Inc. - * Copyright (c) 2019 Konduit K.K. 
- * - * This program and the accompanying materials are made available under the - * terms of the Apache License, Version 2.0 which is available at - * https://www.apache.org/licenses/LICENSE-2.0. - * - * Unless required by applicable law or agreed to in writing, software - * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * License for the specific language governing permissions and limitations - * under the License. - * - * SPDX-License-Identifier: Apache-2.0 - ******************************************************************************/ +/* + * ****************************************************************************** + * * + * * + * * This program and the accompanying materials are made available under the + * * terms of the Apache License, Version 2.0 which is available at + * * https://www.apache.org/licenses/LICENSE-2.0. + * * + * * See the NOTICE file distributed with this work for additional + * * information regarding copyright ownership. + * * Unless required by applicable law or agreed to in writing, software + * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT + * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the + * * License for the specific language governing permissions and limitations + * * under the License. + * * + * * SPDX-License-Identifier: Apache-2.0 + * ***************************************************************************** + */ // diff --git a/libnd4j/tests_cpu/layers_tests/DeclarableOpsTests13.cpp b/libnd4j/tests_cpu/layers_tests/DeclarableOpsTests13.cpp index 639d90389..713548a0e 100644 --- a/libnd4j/tests_cpu/layers_tests/DeclarableOpsTests13.cpp +++ b/libnd4j/tests_cpu/layers_tests/DeclarableOpsTests13.cpp @@ -1,18 +1,22 @@ -/******************************************************************************* - * Copyright (c) 2015-2019 Skymind, Inc. 
- * - * This program and the accompanying materials are made available under the - * terms of the Apache License, Version 2.0 which is available at - * https://www.apache.org/licenses/LICENSE-2.0. - * - * Unless required by applicable law or agreed to in writing, software - * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * License for the specific language governing permissions and limitations - * under the License. - * - * SPDX-License-Identifier: Apache-2.0 - ******************************************************************************/ +/* + * ****************************************************************************** + * * + * * + * * This program and the accompanying materials are made available under the + * * terms of the Apache License, Version 2.0 which is available at + * * https://www.apache.org/licenses/LICENSE-2.0. + * * + * * See the NOTICE file distributed with this work for additional + * * information regarding copyright ownership. + * * Unless required by applicable law or agreed to in writing, software + * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT + * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the + * * License for the specific language governing permissions and limitations + * * under the License. + * * + * * SPDX-License-Identifier: Apache-2.0 + * ***************************************************************************** + */ // diff --git a/libnd4j/tests_cpu/layers_tests/DeclarableOpsTests16.cpp b/libnd4j/tests_cpu/layers_tests/DeclarableOpsTests16.cpp index cbec08c0c..dbdc87079 100644 --- a/libnd4j/tests_cpu/layers_tests/DeclarableOpsTests16.cpp +++ b/libnd4j/tests_cpu/layers_tests/DeclarableOpsTests16.cpp @@ -1,18 +1,22 @@ -/******************************************************************************* - * Copyright (c) 2015-2019 Skymind, Inc. 
- * - * This program and the accompanying materials are made available under the - * terms of the Apache License, Version 2.0 which is available at - * https://www.apache.org/licenses/LICENSE-2.0. - * - * Unless required by applicable law or agreed to in writing, software - * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * License for the specific language governing permissions and limitations - * under the License. - * - * SPDX-License-Identifier: Apache-2.0 - ******************************************************************************/ +/* + * ****************************************************************************** + * * + * * + * * This program and the accompanying materials are made available under the + * * terms of the Apache License, Version 2.0 which is available at + * * https://www.apache.org/licenses/LICENSE-2.0. + * * + * * See the NOTICE file distributed with this work for additional + * * information regarding copyright ownership. + * * Unless required by applicable law or agreed to in writing, software + * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT + * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the + * * License for the specific language governing permissions and limitations + * * under the License. + * * + * * SPDX-License-Identifier: Apache-2.0 + * ***************************************************************************** + */ // diff --git a/libnd4j/tests_cpu/layers_tests/DeclarableOpsTests17.cpp b/libnd4j/tests_cpu/layers_tests/DeclarableOpsTests17.cpp index 1341312f8..be157ac40 100644 --- a/libnd4j/tests_cpu/layers_tests/DeclarableOpsTests17.cpp +++ b/libnd4j/tests_cpu/layers_tests/DeclarableOpsTests17.cpp @@ -1,18 +1,22 @@ -/******************************************************************************* - * Copyright (c) 2015-2019 Skymind, Inc. 
- * - * This program and the accompanying materials are made available under the - * terms of the Apache License, Version 2.0 which is available at - * https://www.apache.org/licenses/LICENSE-2.0. - * - * Unless required by applicable law or agreed to in writing, software - * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * License for the specific language governing permissions and limitations - * under the License. - * - * SPDX-License-Identifier: Apache-2.0 - ******************************************************************************/ +/* + * ****************************************************************************** + * * + * * + * * This program and the accompanying materials are made available under the + * * terms of the Apache License, Version 2.0 which is available at + * * https://www.apache.org/licenses/LICENSE-2.0. + * * + * * See the NOTICE file distributed with this work for additional + * * information regarding copyright ownership. + * * Unless required by applicable law or agreed to in writing, software + * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT + * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the + * * License for the specific language governing permissions and limitations + * * under the License. + * * + * * SPDX-License-Identifier: Apache-2.0 + * ***************************************************************************** + */ // diff --git a/libnd4j/tests_cpu/layers_tests/DeclarableOpsTests18.cpp b/libnd4j/tests_cpu/layers_tests/DeclarableOpsTests18.cpp index 1f36a8f2c..14d37cbf9 100644 --- a/libnd4j/tests_cpu/layers_tests/DeclarableOpsTests18.cpp +++ b/libnd4j/tests_cpu/layers_tests/DeclarableOpsTests18.cpp @@ -1,18 +1,22 @@ -/******************************************************************************* - * Copyright (c) 2015-2019 Skymind, Inc. 
- * - * This program and the accompanying materials are made available under the - * terms of the Apache License, Version 2.0 which is available at - * https://www.apache.org/licenses/LICENSE-2.0. - * - * Unless required by applicable law or agreed to in writing, software - * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * License for the specific language governing permissions and limitations - * under the License. - * - * SPDX-License-Identifier: Apache-2.0 - ******************************************************************************/ +/* + * ****************************************************************************** + * * + * * + * * This program and the accompanying materials are made available under the + * * terms of the Apache License, Version 2.0 which is available at + * * https://www.apache.org/licenses/LICENSE-2.0. + * * + * * See the NOTICE file distributed with this work for additional + * * information regarding copyright ownership. + * * Unless required by applicable law or agreed to in writing, software + * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT + * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the + * * License for the specific language governing permissions and limitations + * * under the License. + * * + * * SPDX-License-Identifier: Apache-2.0 + * ***************************************************************************** + */ // diff --git a/libnd4j/tests_cpu/layers_tests/DeclarableOpsTests19.cpp b/libnd4j/tests_cpu/layers_tests/DeclarableOpsTests19.cpp index beccc1aae..0f2c73aa8 100644 --- a/libnd4j/tests_cpu/layers_tests/DeclarableOpsTests19.cpp +++ b/libnd4j/tests_cpu/layers_tests/DeclarableOpsTests19.cpp @@ -1,18 +1,22 @@ -/******************************************************************************* - * Copyright (c) 2015-2019 Skymind, Inc. 
- * - * This program and the accompanying materials are made available under the - * terms of the Apache License, Version 2.0 which is available at - * https://www.apache.org/licenses/LICENSE-2.0. - * - * Unless required by applicable law or agreed to in writing, software - * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * License for the specific language governing permissions and limitations - * under the License. - * - * SPDX-License-Identifier: Apache-2.0 - ******************************************************************************/ +/* + * ****************************************************************************** + * * + * * + * * This program and the accompanying materials are made available under the + * * terms of the Apache License, Version 2.0 which is available at + * * https://www.apache.org/licenses/LICENSE-2.0. + * * + * * See the NOTICE file distributed with this work for additional + * * information regarding copyright ownership. + * * Unless required by applicable law or agreed to in writing, software + * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT + * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the + * * License for the specific language governing permissions and limitations + * * under the License. + * * + * * SPDX-License-Identifier: Apache-2.0 + * ***************************************************************************** + */ // diff --git a/libnd4j/tests_cpu/layers_tests/FlatUtilsTests.cpp b/libnd4j/tests_cpu/layers_tests/FlatUtilsTests.cpp index f31a1c7ec..327a4e3c3 100644 --- a/libnd4j/tests_cpu/layers_tests/FlatUtilsTests.cpp +++ b/libnd4j/tests_cpu/layers_tests/FlatUtilsTests.cpp @@ -1,18 +1,22 @@ -/******************************************************************************* - * Copyright (c) 2015-2019 Skymind, Inc. 
- * - * This program and the accompanying materials are made available under the - * terms of the Apache License, Version 2.0 which is available at - * https://www.apache.org/licenses/LICENSE-2.0. - * - * Unless required by applicable law or agreed to in writing, software - * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * License for the specific language governing permissions and limitations - * under the License. - * - * SPDX-License-Identifier: Apache-2.0 - ******************************************************************************/ +/* + * ****************************************************************************** + * * + * * + * * This program and the accompanying materials are made available under the + * * terms of the Apache License, Version 2.0 which is available at + * * https://www.apache.org/licenses/LICENSE-2.0. + * * + * * See the NOTICE file distributed with this work for additional + * * information regarding copyright ownership. + * * Unless required by applicable law or agreed to in writing, software + * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT + * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the + * * License for the specific language governing permissions and limitations + * * under the License. + * * + * * SPDX-License-Identifier: Apache-2.0 + * ***************************************************************************** + */ // // @author raver119@gmail.com diff --git a/libnd4j/tests_cpu/layers_tests/HelpersTests1.cpp b/libnd4j/tests_cpu/layers_tests/HelpersTests1.cpp index fae8c4918..e8be972ee 100644 --- a/libnd4j/tests_cpu/layers_tests/HelpersTests1.cpp +++ b/libnd4j/tests_cpu/layers_tests/HelpersTests1.cpp @@ -1,19 +1,22 @@ -/******************************************************************************* - * Copyright (c) 2015-2018 Skymind, Inc. 
- * Copyright (c) 2019 Konduit K.K. - * - * This program and the accompanying materials are made available under the - * terms of the Apache License, Version 2.0 which is available at - * https://www.apache.org/licenses/LICENSE-2.0. - * - * Unless required by applicable law or agreed to in writing, software - * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * License for the specific language governing permissions and limitations - * under the License. - * - * SPDX-License-Identifier: Apache-2.0 - ******************************************************************************/ +/* + * ****************************************************************************** + * * + * * + * * This program and the accompanying materials are made available under the + * * terms of the Apache License, Version 2.0 which is available at + * * https://www.apache.org/licenses/LICENSE-2.0. + * * + * * See the NOTICE file distributed with this work for additional + * * information regarding copyright ownership. + * * Unless required by applicable law or agreed to in writing, software + * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT + * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the + * * License for the specific language governing permissions and limitations + * * under the License. 
+ * * + * * SPDX-License-Identifier: Apache-2.0 + * ***************************************************************************** + */ #include "testlayers.h" #include diff --git a/libnd4j/tests_cpu/layers_tests/HelpersTests2.cpp b/libnd4j/tests_cpu/layers_tests/HelpersTests2.cpp index 8a0cc28bf..110617937 100644 --- a/libnd4j/tests_cpu/layers_tests/HelpersTests2.cpp +++ b/libnd4j/tests_cpu/layers_tests/HelpersTests2.cpp @@ -1,19 +1,22 @@ -/******************************************************************************* - * Copyright (c) 2015-2018 Skymind, Inc. - * Copyright (c) 2019 Konduit K.K. - * - * This program and the accompanying materials are made available under the - * terms of the Apache License, Version 2.0 which is available at - * https://www.apache.org/licenses/LICENSE-2.0. - * - * Unless required by applicable law or agreed to in writing, software - * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * License for the specific language governing permissions and limitations - * under the License. - * - * SPDX-License-Identifier: Apache-2.0 - ******************************************************************************/ +/* + * ****************************************************************************** + * * + * * + * * This program and the accompanying materials are made available under the + * * terms of the Apache License, Version 2.0 which is available at + * * https://www.apache.org/licenses/LICENSE-2.0. + * * + * * See the NOTICE file distributed with this work for additional + * * information regarding copyright ownership. + * * Unless required by applicable law or agreed to in writing, software + * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT + * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. 
See the + * * License for the specific language governing permissions and limitations + * * under the License. + * * + * * SPDX-License-Identifier: Apache-2.0 + * ***************************************************************************** + */ #include "testlayers.h" #include diff --git a/libnd4j/tests_cpu/layers_tests/MultiDeviceTests.cpp b/libnd4j/tests_cpu/layers_tests/MultiDeviceTests.cpp index 1c12f2d72..3ea90eb27 100644 --- a/libnd4j/tests_cpu/layers_tests/MultiDeviceTests.cpp +++ b/libnd4j/tests_cpu/layers_tests/MultiDeviceTests.cpp @@ -1,18 +1,22 @@ -/******************************************************************************* - * Copyright (c) 2015-2019 Skymind, Inc. - * - * This program and the accompanying materials are made available under the - * terms of the Apache License, Version 2.0 which is available at - * https://www.apache.org/licenses/LICENSE-2.0. - * - * Unless required by applicable law or agreed to in writing, software - * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * License for the specific language governing permissions and limitations - * under the License. - * - * SPDX-License-Identifier: Apache-2.0 - ******************************************************************************/ +/* + * ****************************************************************************** + * * + * * + * * This program and the accompanying materials are made available under the + * * terms of the Apache License, Version 2.0 which is available at + * * https://www.apache.org/licenses/LICENSE-2.0. + * * + * * See the NOTICE file distributed with this work for additional + * * information regarding copyright ownership. + * * Unless required by applicable law or agreed to in writing, software + * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT + * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. 
See the + * * License for the specific language governing permissions and limitations + * * under the License. + * * + * * SPDX-License-Identifier: Apache-2.0 + * ***************************************************************************** + */ // // @author raver119@gmail.com diff --git a/libnd4j/tests_cpu/layers_tests/PlaygroundTests.cpp b/libnd4j/tests_cpu/layers_tests/PlaygroundTests.cpp index 97dd9f13c..f476fb05a 100644 --- a/libnd4j/tests_cpu/layers_tests/PlaygroundTests.cpp +++ b/libnd4j/tests_cpu/layers_tests/PlaygroundTests.cpp @@ -1,19 +1,22 @@ -/******************************************************************************* - * Copyright (c) 2015-2018 Skymind, Inc. - * Copyright (c) 2019 Konduit K.K. - * - * This program and the accompanying materials are made available under the - * terms of the Apache License, Version 2.0 which is available at - * https://www.apache.org/licenses/LICENSE-2.0. - * - * Unless required by applicable law or agreed to in writing, software - * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * License for the specific language governing permissions and limitations - * under the License. - * - * SPDX-License-Identifier: Apache-2.0 - ******************************************************************************/ +/* + * ****************************************************************************** + * * + * * + * * This program and the accompanying materials are made available under the + * * terms of the Apache License, Version 2.0 which is available at + * * https://www.apache.org/licenses/LICENSE-2.0. + * * + * * See the NOTICE file distributed with this work for additional + * * information regarding copyright ownership. 
+ * * Unless required by applicable law or agreed to in writing, software + * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT + * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the + * * License for the specific language governing permissions and limitations + * * under the License. + * * + * * SPDX-License-Identifier: Apache-2.0 + * ***************************************************************************** + */ // // Created by raver119 on 20.11.17. diff --git a/libnd4j/tests_cpu/layers_tests/RNGTests.cpp b/libnd4j/tests_cpu/layers_tests/RNGTests.cpp index dfc23f559..8da3542f6 100644 --- a/libnd4j/tests_cpu/layers_tests/RNGTests.cpp +++ b/libnd4j/tests_cpu/layers_tests/RNGTests.cpp @@ -1,19 +1,22 @@ -/******************************************************************************* - * Copyright (c) 2015-2018 Skymind, Inc. - * Copyright (c) 2019 Konduit K.K. - * - * This program and the accompanying materials are made available under the - * terms of the Apache License, Version 2.0 which is available at - * https://www.apache.org/licenses/LICENSE-2.0. - * - * Unless required by applicable law or agreed to in writing, software - * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * License for the specific language governing permissions and limitations - * under the License. - * - * SPDX-License-Identifier: Apache-2.0 - ******************************************************************************/ +/* + * ****************************************************************************** + * * + * * + * * This program and the accompanying materials are made available under the + * * terms of the Apache License, Version 2.0 which is available at + * * https://www.apache.org/licenses/LICENSE-2.0. + * * + * * See the NOTICE file distributed with this work for additional + * * information regarding copyright ownership. 
+ * * Unless required by applicable law or agreed to in writing, software + * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT + * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the + * * License for the specific language governing permissions and limitations + * * under the License. + * * + * * SPDX-License-Identifier: Apache-2.0 + * ***************************************************************************** + */ // // @author raver119@gmail.com diff --git a/libnd4j/tests_cpu/layers_tests/StringTests.cpp b/libnd4j/tests_cpu/layers_tests/StringTests.cpp index 41352246e..156831456 100644 --- a/libnd4j/tests_cpu/layers_tests/StringTests.cpp +++ b/libnd4j/tests_cpu/layers_tests/StringTests.cpp @@ -1,19 +1,22 @@ -/******************************************************************************* - * Copyright (c) 2015-2018 Skymind, Inc. - * Copyright (c) 2019-2020 Konduit K.K. - * - * This program and the accompanying materials are made available under the - * terms of the Apache License, Version 2.0 which is available at - * https://www.apache.org/licenses/LICENSE-2.0. - * - * Unless required by applicable law or agreed to in writing, software - * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * License for the specific language governing permissions and limitations - * under the License. - * - * SPDX-License-Identifier: Apache-2.0 - ******************************************************************************/ +/* + * ****************************************************************************** + * * + * * + * * This program and the accompanying materials are made available under the + * * terms of the Apache License, Version 2.0 which is available at + * * https://www.apache.org/licenses/LICENSE-2.0. 
+ * * + * * See the NOTICE file distributed with this work for additional + * * information regarding copyright ownership. + * * Unless required by applicable law or agreed to in writing, software + * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT + * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the + * * License for the specific language governing permissions and limitations + * * under the License. + * * + * * SPDX-License-Identifier: Apache-2.0 + * ***************************************************************************** + */ // // @author raver119@gmail.com diff --git a/nd4j/nd4j-backends/nd4j-api-parent/nd4j-api/src/main/java/org/nd4j/ir/MapperNamespace.java b/nd4j/nd4j-backends/nd4j-api-parent/nd4j-api/src/main/java/org/nd4j/ir/MapperNamespace.java index 0f1f7d837..51c920e2e 100644 --- a/nd4j/nd4j-backends/nd4j-api-parent/nd4j-api/src/main/java/org/nd4j/ir/MapperNamespace.java +++ b/nd4j/nd4j-backends/nd4j-api-parent/nd4j-api/src/main/java/org/nd4j/ir/MapperNamespace.java @@ -1,3 +1,23 @@ +/* + * ****************************************************************************** + * * + * * + * * This program and the accompanying materials are made available under the + * * terms of the Apache License, Version 2.0 which is available at + * * https://www.apache.org/licenses/LICENSE-2.0. + * * + * * See the NOTICE file distributed with this work for additional + * * information regarding copyright ownership. + * * Unless required by applicable law or agreed to in writing, software + * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT + * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the + * * License for the specific language governing permissions and limitations + * * under the License. 
+ * * + * * SPDX-License-Identifier: Apache-2.0 + * ***************************************************************************** + */ + // Generated by the protocol buffer compiler. DO NOT EDIT! // source: mapper.proto diff --git a/nd4j/nd4j-backends/nd4j-api-parent/nd4j-api/src/main/java/org/nd4j/ir/OpNamespace.java b/nd4j/nd4j-backends/nd4j-api-parent/nd4j-api/src/main/java/org/nd4j/ir/OpNamespace.java index a7e826739..a9fabbd74 100644 --- a/nd4j/nd4j-backends/nd4j-api-parent/nd4j-api/src/main/java/org/nd4j/ir/OpNamespace.java +++ b/nd4j/nd4j-backends/nd4j-api-parent/nd4j-api/src/main/java/org/nd4j/ir/OpNamespace.java @@ -1,3 +1,23 @@ +/* + * ****************************************************************************** + * * + * * + * * This program and the accompanying materials are made available under the + * * terms of the Apache License, Version 2.0 which is available at + * * https://www.apache.org/licenses/LICENSE-2.0. + * * + * * See the NOTICE file distributed with this work for additional + * * information regarding copyright ownership. + * * Unless required by applicable law or agreed to in writing, software + * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT + * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the + * * License for the specific language governing permissions and limitations + * * under the License. + * * + * * SPDX-License-Identifier: Apache-2.0 + * ***************************************************************************** + */ + // Generated by the protocol buffer compiler. DO NOT EDIT! 
// source: op.proto diff --git a/nd4j/nd4j-backends/nd4j-api-parent/nd4j-api/src/main/java/org/nd4j/ir/TensorNamespace.java b/nd4j/nd4j-backends/nd4j-api-parent/nd4j-api/src/main/java/org/nd4j/ir/TensorNamespace.java index 434bda3a8..33fb7e045 100644 --- a/nd4j/nd4j-backends/nd4j-api-parent/nd4j-api/src/main/java/org/nd4j/ir/TensorNamespace.java +++ b/nd4j/nd4j-backends/nd4j-api-parent/nd4j-api/src/main/java/org/nd4j/ir/TensorNamespace.java @@ -1,3 +1,23 @@ +/* + * ****************************************************************************** + * * + * * + * * This program and the accompanying materials are made available under the + * * terms of the Apache License, Version 2.0 which is available at + * * https://www.apache.org/licenses/LICENSE-2.0. + * * + * * See the NOTICE file distributed with this work for additional + * * information regarding copyright ownership. + * * Unless required by applicable law or agreed to in writing, software + * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT + * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the + * * License for the specific language governing permissions and limitations + * * under the License. + * * + * * SPDX-License-Identifier: Apache-2.0 + * ***************************************************************************** + */ + // Generated by the protocol buffer compiler. DO NOT EDIT! 
// source: tensor.proto diff --git a/nd4j/nd4j-backends/nd4j-api-parent/nd4j-api/src/main/resources/META-INF/services/org.nd4j.linalg.env.EnvironmentalAction b/nd4j/nd4j-backends/nd4j-api-parent/nd4j-api/src/main/resources/META-INF/services/org.nd4j.linalg.env.EnvironmentalAction index 30a14bbba..d70a36a21 100644 --- a/nd4j/nd4j-backends/nd4j-api-parent/nd4j-api/src/main/resources/META-INF/services/org.nd4j.linalg.env.EnvironmentalAction +++ b/nd4j/nd4j-backends/nd4j-api-parent/nd4j-api/src/main/resources/META-INF/services/org.nd4j.linalg.env.EnvironmentalAction @@ -1,3 +1,43 @@ +# +# /* ****************************************************************************** +# * +# * +# * This program and the accompanying materials are made available under the +# * terms of the Apache License, Version 2.0 which is available at +# * https://www.apache.org/licenses/LICENSE-2.0. +# * +# * See the NOTICE file distributed with this work for additional +# * information regarding copyright ownership. +# * Unless required by applicable law or agreed to in writing, software +# * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT +# * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the +# * License for the specific language governing permissions and limitations +# * under the License. +# * +# * SPDX-License-Identifier: Apache-2.0 +# ******************************************************************************/ +# + +# +# /* ****************************************************************************** +# * +# * +# * This program and the accompanying materials are made available under the +# * terms of the Apache License, Version 2.0 which is available at +# * https://www.apache.org/licenses/LICENSE-2.0. +# * +# * See the NOTICE file distributed with this work for additional +# * information regarding copyright ownership. 
+# * Unless required by applicable law or agreed to in writing, software +# * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT +# * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the +# * License for the specific language governing permissions and limitations +# * under the License. +# * +# * SPDX-License-Identifier: Apache-2.0 +# ******************************************************************************/ +# + # # /* ****************************************************************************** # * diff --git a/nd4j/nd4j-backends/nd4j-api-parent/nd4j-native-api/src/main/resources/META-INF/services/org.nd4j.common.base.PreconditionsFormat b/nd4j/nd4j-backends/nd4j-api-parent/nd4j-native-api/src/main/resources/META-INF/services/org.nd4j.common.base.PreconditionsFormat index ff268a3a9..df6002802 100644 --- a/nd4j/nd4j-backends/nd4j-api-parent/nd4j-native-api/src/main/resources/META-INF/services/org.nd4j.common.base.PreconditionsFormat +++ b/nd4j/nd4j-backends/nd4j-api-parent/nd4j-native-api/src/main/resources/META-INF/services/org.nd4j.common.base.PreconditionsFormat @@ -1,3 +1,43 @@ +# +# /* ****************************************************************************** +# * +# * +# * This program and the accompanying materials are made available under the +# * terms of the Apache License, Version 2.0 which is available at +# * https://www.apache.org/licenses/LICENSE-2.0. +# * +# * See the NOTICE file distributed with this work for additional +# * information regarding copyright ownership. +# * Unless required by applicable law or agreed to in writing, software +# * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT +# * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the +# * License for the specific language governing permissions and limitations +# * under the License. 
+# * +# * SPDX-License-Identifier: Apache-2.0 +# ******************************************************************************/ +# + +# +# /* ****************************************************************************** +# * +# * +# * This program and the accompanying materials are made available under the +# * terms of the Apache License, Version 2.0 which is available at +# * https://www.apache.org/licenses/LICENSE-2.0. +# * +# * See the NOTICE file distributed with this work for additional +# * information regarding copyright ownership. +# * Unless required by applicable law or agreed to in writing, software +# * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT +# * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the +# * License for the specific language governing permissions and limitations +# * under the License. +# * +# * SPDX-License-Identifier: Apache-2.0 +# ******************************************************************************/ +# + # # /* ****************************************************************************** # * diff --git a/nd4j/nd4j-backends/nd4j-api-parent/nd4j-native-api/src/main/resources/META-INF/services/org.nd4j.linalg.compression.NDArrayCompressor b/nd4j/nd4j-backends/nd4j-api-parent/nd4j-native-api/src/main/resources/META-INF/services/org.nd4j.linalg.compression.NDArrayCompressor index 8b4061419..4cba29246 100644 --- a/nd4j/nd4j-backends/nd4j-api-parent/nd4j-native-api/src/main/resources/META-INF/services/org.nd4j.linalg.compression.NDArrayCompressor +++ b/nd4j/nd4j-backends/nd4j-api-parent/nd4j-native-api/src/main/resources/META-INF/services/org.nd4j.linalg.compression.NDArrayCompressor @@ -1,3 +1,43 @@ +# +# /* ****************************************************************************** +# * +# * +# * This program and the accompanying materials are made available under the +# * terms of the Apache License, Version 2.0 which is available at +# * 
https://www.apache.org/licenses/LICENSE-2.0. +# * +# * See the NOTICE file distributed with this work for additional +# * information regarding copyright ownership. +# * Unless required by applicable law or agreed to in writing, software +# * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT +# * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the +# * License for the specific language governing permissions and limitations +# * under the License. +# * +# * SPDX-License-Identifier: Apache-2.0 +# ******************************************************************************/ +# + +# +# /* ****************************************************************************** +# * +# * +# * This program and the accompanying materials are made available under the +# * terms of the Apache License, Version 2.0 which is available at +# * https://www.apache.org/licenses/LICENSE-2.0. +# * +# * See the NOTICE file distributed with this work for additional +# * information regarding copyright ownership. +# * Unless required by applicable law or agreed to in writing, software +# * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT +# * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the +# * License for the specific language governing permissions and limitations +# * under the License. 
+# * +# * SPDX-License-Identifier: Apache-2.0 +# ******************************************************************************/ +# + # # /* ****************************************************************************** # * diff --git a/nd4j/nd4j-backends/nd4j-api-parent/nd4j-native-api/src/main/resources/META-INF/services/org.nd4j.systeminfo.GPUInfoProvider b/nd4j/nd4j-backends/nd4j-api-parent/nd4j-native-api/src/main/resources/META-INF/services/org.nd4j.systeminfo.GPUInfoProvider index 43bee5413..ec4ae9754 100644 --- a/nd4j/nd4j-backends/nd4j-api-parent/nd4j-native-api/src/main/resources/META-INF/services/org.nd4j.systeminfo.GPUInfoProvider +++ b/nd4j/nd4j-backends/nd4j-api-parent/nd4j-native-api/src/main/resources/META-INF/services/org.nd4j.systeminfo.GPUInfoProvider @@ -1,3 +1,43 @@ +# +# /* ****************************************************************************** +# * +# * +# * This program and the accompanying materials are made available under the +# * terms of the Apache License, Version 2.0 which is available at +# * https://www.apache.org/licenses/LICENSE-2.0. +# * +# * See the NOTICE file distributed with this work for additional +# * information regarding copyright ownership. +# * Unless required by applicable law or agreed to in writing, software +# * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT +# * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the +# * License for the specific language governing permissions and limitations +# * under the License. +# * +# * SPDX-License-Identifier: Apache-2.0 +# ******************************************************************************/ +# + +# +# /* ****************************************************************************** +# * +# * +# * This program and the accompanying materials are made available under the +# * terms of the Apache License, Version 2.0 which is available at +# * https://www.apache.org/licenses/LICENSE-2.0. 
+# * +# * See the NOTICE file distributed with this work for additional +# * information regarding copyright ownership. +# * Unless required by applicable law or agreed to in writing, software +# * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT +# * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the +# * License for the specific language governing permissions and limitations +# * under the License. +# * +# * SPDX-License-Identifier: Apache-2.0 +# ******************************************************************************/ +# + # # /* ****************************************************************************** # * diff --git a/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-cuda/pom.xml b/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-cuda/pom.xml index 3ac2c29d9..2c592d645 100644 --- a/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-cuda/pom.xml +++ b/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-cuda/pom.xml @@ -76,16 +76,7 @@ ${cuda.version}-${cudnn.version}-${javacpp-presets.cuda.version} ${dependency.platform} - + junit junit diff --git a/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-cuda/src/main/java/org/nd4j/jita/allocator/Allocator.java b/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-cuda/src/main/java/org/nd4j/jita/allocator/Allocator.java index df45f85e5..bd51f064a 100644 --- a/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-cuda/src/main/java/org/nd4j/jita/allocator/Allocator.java +++ b/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-cuda/src/main/java/org/nd4j/jita/allocator/Allocator.java @@ -1,18 +1,22 @@ -/******************************************************************************* - * Copyright (c) 2015-2018 Skymind, Inc. - * - * This program and the accompanying materials are made available under the - * terms of the Apache License, Version 2.0 which is available at - * https://www.apache.org/licenses/LICENSE-2.0. 
- * - * Unless required by applicable law or agreed to in writing, software - * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * License for the specific language governing permissions and limitations - * under the License. - * - * SPDX-License-Identifier: Apache-2.0 - ******************************************************************************/ +/* + * ****************************************************************************** + * * + * * + * * This program and the accompanying materials are made available under the + * * terms of the Apache License, Version 2.0 which is available at + * * https://www.apache.org/licenses/LICENSE-2.0. + * * + * * See the NOTICE file distributed with this work for additional + * * information regarding copyright ownership. + * * Unless required by applicable law or agreed to in writing, software + * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT + * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the + * * License for the specific language governing permissions and limitations + * * under the License. 
+ * * + * * SPDX-License-Identifier: Apache-2.0 + * ***************************************************************************** + */ package org.nd4j.jita.allocator; diff --git a/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-cuda/src/main/java/org/nd4j/jita/allocator/concurrency/AtomicState.java b/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-cuda/src/main/java/org/nd4j/jita/allocator/concurrency/AtomicState.java index 1925197a9..829dbd798 100644 --- a/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-cuda/src/main/java/org/nd4j/jita/allocator/concurrency/AtomicState.java +++ b/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-cuda/src/main/java/org/nd4j/jita/allocator/concurrency/AtomicState.java @@ -1,18 +1,22 @@ -/******************************************************************************* - * Copyright (c) 2015-2018 Skymind, Inc. - * - * This program and the accompanying materials are made available under the - * terms of the Apache License, Version 2.0 which is available at - * https://www.apache.org/licenses/LICENSE-2.0. - * - * Unless required by applicable law or agreed to in writing, software - * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * License for the specific language governing permissions and limitations - * under the License. - * - * SPDX-License-Identifier: Apache-2.0 - ******************************************************************************/ +/* + * ****************************************************************************** + * * + * * + * * This program and the accompanying materials are made available under the + * * terms of the Apache License, Version 2.0 which is available at + * * https://www.apache.org/licenses/LICENSE-2.0. + * * + * * See the NOTICE file distributed with this work for additional + * * information regarding copyright ownership. 
+ * * Unless required by applicable law or agreed to in writing, software + * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT + * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the + * * License for the specific language governing permissions and limitations + * * under the License. + * * + * * SPDX-License-Identifier: Apache-2.0 + * ***************************************************************************** + */ package org.nd4j.jita.allocator.concurrency; diff --git a/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-cuda/src/main/java/org/nd4j/jita/allocator/concurrency/DeviceAllocationsTracker.java b/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-cuda/src/main/java/org/nd4j/jita/allocator/concurrency/DeviceAllocationsTracker.java index 7ecd22de4..ca2c54529 100644 --- a/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-cuda/src/main/java/org/nd4j/jita/allocator/concurrency/DeviceAllocationsTracker.java +++ b/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-cuda/src/main/java/org/nd4j/jita/allocator/concurrency/DeviceAllocationsTracker.java @@ -1,18 +1,22 @@ -/******************************************************************************* - * Copyright (c) 2015-2018 Skymind, Inc. - * - * This program and the accompanying materials are made available under the - * terms of the Apache License, Version 2.0 which is available at - * https://www.apache.org/licenses/LICENSE-2.0. - * - * Unless required by applicable law or agreed to in writing, software - * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * License for the specific language governing permissions and limitations - * under the License. 
- * - * SPDX-License-Identifier: Apache-2.0 - ******************************************************************************/ +/* + * ****************************************************************************** + * * + * * + * * This program and the accompanying materials are made available under the + * * terms of the Apache License, Version 2.0 which is available at + * * https://www.apache.org/licenses/LICENSE-2.0. + * * + * * See the NOTICE file distributed with this work for additional + * * information regarding copyright ownership. + * * Unless required by applicable law or agreed to in writing, software + * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT + * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the + * * License for the specific language governing permissions and limitations + * * under the License. + * * + * * SPDX-License-Identifier: Apache-2.0 + * ***************************************************************************** + */ package org.nd4j.jita.allocator.concurrency; diff --git a/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-cuda/src/main/java/org/nd4j/jita/allocator/concurrency/Lock.java b/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-cuda/src/main/java/org/nd4j/jita/allocator/concurrency/Lock.java index 36ea29962..585c5144a 100644 --- a/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-cuda/src/main/java/org/nd4j/jita/allocator/concurrency/Lock.java +++ b/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-cuda/src/main/java/org/nd4j/jita/allocator/concurrency/Lock.java @@ -1,18 +1,22 @@ -/******************************************************************************* - * Copyright (c) 2015-2018 Skymind, Inc. - * - * This program and the accompanying materials are made available under the - * terms of the Apache License, Version 2.0 which is available at - * https://www.apache.org/licenses/LICENSE-2.0. 
- * - * Unless required by applicable law or agreed to in writing, software - * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * License for the specific language governing permissions and limitations - * under the License. - * - * SPDX-License-Identifier: Apache-2.0 - ******************************************************************************/ +/* + * ****************************************************************************** + * * + * * + * * This program and the accompanying materials are made available under the + * * terms of the Apache License, Version 2.0 which is available at + * * https://www.apache.org/licenses/LICENSE-2.0. + * * + * * See the NOTICE file distributed with this work for additional + * * information regarding copyright ownership. + * * Unless required by applicable law or agreed to in writing, software + * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT + * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the + * * License for the specific language governing permissions and limitations + * * under the License. 
+ * * + * * SPDX-License-Identifier: Apache-2.0 + * ***************************************************************************** + */ package org.nd4j.jita.allocator.concurrency; diff --git a/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-cuda/src/main/java/org/nd4j/jita/allocator/concurrency/RRWLock.java b/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-cuda/src/main/java/org/nd4j/jita/allocator/concurrency/RRWLock.java index a946ffe27..35aa72207 100644 --- a/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-cuda/src/main/java/org/nd4j/jita/allocator/concurrency/RRWLock.java +++ b/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-cuda/src/main/java/org/nd4j/jita/allocator/concurrency/RRWLock.java @@ -1,18 +1,22 @@ -/******************************************************************************* - * Copyright (c) 2015-2018 Skymind, Inc. - * - * This program and the accompanying materials are made available under the - * terms of the Apache License, Version 2.0 which is available at - * https://www.apache.org/licenses/LICENSE-2.0. - * - * Unless required by applicable law or agreed to in writing, software - * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * License for the specific language governing permissions and limitations - * under the License. - * - * SPDX-License-Identifier: Apache-2.0 - ******************************************************************************/ +/* + * ****************************************************************************** + * * + * * + * * This program and the accompanying materials are made available under the + * * terms of the Apache License, Version 2.0 which is available at + * * https://www.apache.org/licenses/LICENSE-2.0. + * * + * * See the NOTICE file distributed with this work for additional + * * information regarding copyright ownership. 
+ * * Unless required by applicable law or agreed to in writing, software + * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT + * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the + * * License for the specific language governing permissions and limitations + * * under the License. + * * + * * SPDX-License-Identifier: Apache-2.0 + * ***************************************************************************** + */ package org.nd4j.jita.allocator.concurrency; diff --git a/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-cuda/src/main/java/org/nd4j/jita/allocator/enums/AccessState.java b/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-cuda/src/main/java/org/nd4j/jita/allocator/enums/AccessState.java index 021e52c4d..cbee85b1e 100644 --- a/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-cuda/src/main/java/org/nd4j/jita/allocator/enums/AccessState.java +++ b/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-cuda/src/main/java/org/nd4j/jita/allocator/enums/AccessState.java @@ -1,18 +1,22 @@ -/******************************************************************************* - * Copyright (c) 2015-2018 Skymind, Inc. - * - * This program and the accompanying materials are made available under the - * terms of the Apache License, Version 2.0 which is available at - * https://www.apache.org/licenses/LICENSE-2.0. - * - * Unless required by applicable law or agreed to in writing, software - * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * License for the specific language governing permissions and limitations - * under the License. 
- * - * SPDX-License-Identifier: Apache-2.0 - ******************************************************************************/ +/* + * ****************************************************************************** + * * + * * + * * This program and the accompanying materials are made available under the + * * terms of the Apache License, Version 2.0 which is available at + * * https://www.apache.org/licenses/LICENSE-2.0. + * * + * * See the NOTICE file distributed with this work for additional + * * information regarding copyright ownership. + * * Unless required by applicable law or agreed to in writing, software + * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT + * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the + * * License for the specific language governing permissions and limitations + * * under the License. + * * + * * SPDX-License-Identifier: Apache-2.0 + * ***************************************************************************** + */ package org.nd4j.jita.allocator.enums; diff --git a/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-cuda/src/main/java/org/nd4j/jita/allocator/enums/Aggressiveness.java b/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-cuda/src/main/java/org/nd4j/jita/allocator/enums/Aggressiveness.java index 69fba654f..de834ca40 100644 --- a/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-cuda/src/main/java/org/nd4j/jita/allocator/enums/Aggressiveness.java +++ b/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-cuda/src/main/java/org/nd4j/jita/allocator/enums/Aggressiveness.java @@ -1,18 +1,22 @@ -/******************************************************************************* - * Copyright (c) 2015-2018 Skymind, Inc. - * - * This program and the accompanying materials are made available under the - * terms of the Apache License, Version 2.0 which is available at - * https://www.apache.org/licenses/LICENSE-2.0. 
- * - * Unless required by applicable law or agreed to in writing, software - * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * License for the specific language governing permissions and limitations - * under the License. - * - * SPDX-License-Identifier: Apache-2.0 - ******************************************************************************/ +/* + * ****************************************************************************** + * * + * * + * * This program and the accompanying materials are made available under the + * * terms of the Apache License, Version 2.0 which is available at + * * https://www.apache.org/licenses/LICENSE-2.0. + * * + * * See the NOTICE file distributed with this work for additional + * * information regarding copyright ownership. + * * Unless required by applicable law or agreed to in writing, software + * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT + * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the + * * License for the specific language governing permissions and limitations + * * under the License. 
+ * * + * * SPDX-License-Identifier: Apache-2.0 + * ***************************************************************************** + */ package org.nd4j.jita.allocator.enums; diff --git a/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-cuda/src/main/java/org/nd4j/jita/allocator/enums/AllocationStatus.java b/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-cuda/src/main/java/org/nd4j/jita/allocator/enums/AllocationStatus.java index 82aca0d42..e848e55d0 100644 --- a/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-cuda/src/main/java/org/nd4j/jita/allocator/enums/AllocationStatus.java +++ b/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-cuda/src/main/java/org/nd4j/jita/allocator/enums/AllocationStatus.java @@ -1,18 +1,22 @@ -/******************************************************************************* - * Copyright (c) 2015-2018 Skymind, Inc. - * - * This program and the accompanying materials are made available under the - * terms of the Apache License, Version 2.0 which is available at - * https://www.apache.org/licenses/LICENSE-2.0. - * - * Unless required by applicable law or agreed to in writing, software - * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * License for the specific language governing permissions and limitations - * under the License. - * - * SPDX-License-Identifier: Apache-2.0 - ******************************************************************************/ +/* + * ****************************************************************************** + * * + * * + * * This program and the accompanying materials are made available under the + * * terms of the Apache License, Version 2.0 which is available at + * * https://www.apache.org/licenses/LICENSE-2.0. + * * + * * See the NOTICE file distributed with this work for additional + * * information regarding copyright ownership. 
+ * * Unless required by applicable law or agreed to in writing, software + * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT + * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the + * * License for the specific language governing permissions and limitations + * * under the License. + * * + * * SPDX-License-Identifier: Apache-2.0 + * ***************************************************************************** + */ package org.nd4j.jita.allocator.enums; diff --git a/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-cuda/src/main/java/org/nd4j/jita/allocator/enums/CudaConstants.java b/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-cuda/src/main/java/org/nd4j/jita/allocator/enums/CudaConstants.java index 19965a92e..7e02b4fe1 100644 --- a/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-cuda/src/main/java/org/nd4j/jita/allocator/enums/CudaConstants.java +++ b/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-cuda/src/main/java/org/nd4j/jita/allocator/enums/CudaConstants.java @@ -1,18 +1,22 @@ -/******************************************************************************* - * Copyright (c) 2015-2018 Skymind, Inc. - * - * This program and the accompanying materials are made available under the - * terms of the Apache License, Version 2.0 which is available at - * https://www.apache.org/licenses/LICENSE-2.0. - * - * Unless required by applicable law or agreed to in writing, software - * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * License for the specific language governing permissions and limitations - * under the License. 
- * - * SPDX-License-Identifier: Apache-2.0 - ******************************************************************************/ +/* + * ****************************************************************************** + * * + * * + * * This program and the accompanying materials are made available under the + * * terms of the Apache License, Version 2.0 which is available at + * * https://www.apache.org/licenses/LICENSE-2.0. + * * + * * See the NOTICE file distributed with this work for additional + * * information regarding copyright ownership. + * * Unless required by applicable law or agreed to in writing, software + * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT + * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the + * * License for the specific language governing permissions and limitations + * * under the License. + * * + * * SPDX-License-Identifier: Apache-2.0 + * ***************************************************************************** + */ package org.nd4j.jita.allocator.enums; diff --git a/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-cuda/src/main/java/org/nd4j/jita/allocator/enums/SyncState.java b/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-cuda/src/main/java/org/nd4j/jita/allocator/enums/SyncState.java index 428da38dc..a39e00465 100644 --- a/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-cuda/src/main/java/org/nd4j/jita/allocator/enums/SyncState.java +++ b/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-cuda/src/main/java/org/nd4j/jita/allocator/enums/SyncState.java @@ -1,18 +1,22 @@ -/******************************************************************************* - * Copyright (c) 2015-2018 Skymind, Inc. - * - * This program and the accompanying materials are made available under the - * terms of the Apache License, Version 2.0 which is available at - * https://www.apache.org/licenses/LICENSE-2.0. 
- * - * Unless required by applicable law or agreed to in writing, software - * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * License for the specific language governing permissions and limitations - * under the License. - * - * SPDX-License-Identifier: Apache-2.0 - ******************************************************************************/ +/* + * ****************************************************************************** + * * + * * + * * This program and the accompanying materials are made available under the + * * terms of the Apache License, Version 2.0 which is available at + * * https://www.apache.org/licenses/LICENSE-2.0. + * * + * * See the NOTICE file distributed with this work for additional + * * information regarding copyright ownership. + * * Unless required by applicable law or agreed to in writing, software + * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT + * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the + * * License for the specific language governing permissions and limitations + * * under the License. 
+ * * + * * SPDX-License-Identifier: Apache-2.0 + * ***************************************************************************** + */ package org.nd4j.jita.allocator.enums; diff --git a/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-cuda/src/main/java/org/nd4j/jita/allocator/garbage/GarbageBufferReference.java b/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-cuda/src/main/java/org/nd4j/jita/allocator/garbage/GarbageBufferReference.java index 91a163794..5374b5c4f 100644 --- a/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-cuda/src/main/java/org/nd4j/jita/allocator/garbage/GarbageBufferReference.java +++ b/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-cuda/src/main/java/org/nd4j/jita/allocator/garbage/GarbageBufferReference.java @@ -1,18 +1,22 @@ -/******************************************************************************* - * Copyright (c) 2015-2018 Skymind, Inc. - * - * This program and the accompanying materials are made available under the - * terms of the Apache License, Version 2.0 which is available at - * https://www.apache.org/licenses/LICENSE-2.0. - * - * Unless required by applicable law or agreed to in writing, software - * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * License for the specific language governing permissions and limitations - * under the License. - * - * SPDX-License-Identifier: Apache-2.0 - ******************************************************************************/ +/* + * ****************************************************************************** + * * + * * + * * This program and the accompanying materials are made available under the + * * terms of the Apache License, Version 2.0 which is available at + * * https://www.apache.org/licenses/LICENSE-2.0. + * * + * * See the NOTICE file distributed with this work for additional + * * information regarding copyright ownership. 
+ * * Unless required by applicable law or agreed to in writing, software + * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT + * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the + * * License for the specific language governing permissions and limitations + * * under the License. + * * + * * SPDX-License-Identifier: Apache-2.0 + * ***************************************************************************** + */ package org.nd4j.jita.allocator.garbage; diff --git a/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-cuda/src/main/java/org/nd4j/jita/allocator/garbage/GarbageResourceReference.java b/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-cuda/src/main/java/org/nd4j/jita/allocator/garbage/GarbageResourceReference.java index fd566a73d..dcb14380b 100644 --- a/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-cuda/src/main/java/org/nd4j/jita/allocator/garbage/GarbageResourceReference.java +++ b/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-cuda/src/main/java/org/nd4j/jita/allocator/garbage/GarbageResourceReference.java @@ -1,18 +1,22 @@ -/******************************************************************************* - * Copyright (c) 2015-2018 Skymind, Inc. - * - * This program and the accompanying materials are made available under the - * terms of the Apache License, Version 2.0 which is available at - * https://www.apache.org/licenses/LICENSE-2.0. - * - * Unless required by applicable law or agreed to in writing, software - * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * License for the specific language governing permissions and limitations - * under the License. 
- * - * SPDX-License-Identifier: Apache-2.0 - ******************************************************************************/ +/* + * ****************************************************************************** + * * + * * + * * This program and the accompanying materials are made available under the + * * terms of the Apache License, Version 2.0 which is available at + * * https://www.apache.org/licenses/LICENSE-2.0. + * * + * * See the NOTICE file distributed with this work for additional + * * information regarding copyright ownership. + * * Unless required by applicable law or agreed to in writing, software + * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT + * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the + * * License for the specific language governing permissions and limitations + * * under the License. + * * + * * SPDX-License-Identifier: Apache-2.0 + * ***************************************************************************** + */ package org.nd4j.jita.allocator.garbage; diff --git a/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-cuda/src/main/java/org/nd4j/jita/allocator/impl/AllocationPoint.java b/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-cuda/src/main/java/org/nd4j/jita/allocator/impl/AllocationPoint.java index 9ddab88d5..1b702eff1 100644 --- a/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-cuda/src/main/java/org/nd4j/jita/allocator/impl/AllocationPoint.java +++ b/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-cuda/src/main/java/org/nd4j/jita/allocator/impl/AllocationPoint.java @@ -1,18 +1,22 @@ -/******************************************************************************* - * Copyright (c) 2015-2018 Skymind, Inc. - * - * This program and the accompanying materials are made available under the - * terms of the Apache License, Version 2.0 which is available at - * https://www.apache.org/licenses/LICENSE-2.0. 
- * - * Unless required by applicable law or agreed to in writing, software - * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * License for the specific language governing permissions and limitations - * under the License. - * - * SPDX-License-Identifier: Apache-2.0 - ******************************************************************************/ +/* + * ****************************************************************************** + * * + * * + * * This program and the accompanying materials are made available under the + * * terms of the Apache License, Version 2.0 which is available at + * * https://www.apache.org/licenses/LICENSE-2.0. + * * + * * See the NOTICE file distributed with this work for additional + * * information regarding copyright ownership. + * * Unless required by applicable law or agreed to in writing, software + * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT + * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the + * * License for the specific language governing permissions and limitations + * * under the License. 
+ * * + * * SPDX-License-Identifier: Apache-2.0 + * ***************************************************************************** + */ package org.nd4j.jita.allocator.impl; diff --git a/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-cuda/src/main/java/org/nd4j/jita/allocator/impl/AllocationShape.java b/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-cuda/src/main/java/org/nd4j/jita/allocator/impl/AllocationShape.java index 2d12522e0..0a1e802a1 100644 --- a/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-cuda/src/main/java/org/nd4j/jita/allocator/impl/AllocationShape.java +++ b/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-cuda/src/main/java/org/nd4j/jita/allocator/impl/AllocationShape.java @@ -1,18 +1,22 @@ -/******************************************************************************* - * Copyright (c) 2015-2018 Skymind, Inc. - * - * This program and the accompanying materials are made available under the - * terms of the Apache License, Version 2.0 which is available at - * https://www.apache.org/licenses/LICENSE-2.0. - * - * Unless required by applicable law or agreed to in writing, software - * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * License for the specific language governing permissions and limitations - * under the License. - * - * SPDX-License-Identifier: Apache-2.0 - ******************************************************************************/ +/* + * ****************************************************************************** + * * + * * + * * This program and the accompanying materials are made available under the + * * terms of the Apache License, Version 2.0 which is available at + * * https://www.apache.org/licenses/LICENSE-2.0. + * * + * * See the NOTICE file distributed with this work for additional + * * information regarding copyright ownership. 
+ * * Unless required by applicable law or agreed to in writing, software + * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT + * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the + * * License for the specific language governing permissions and limitations + * * under the License. + * * + * * SPDX-License-Identifier: Apache-2.0 + * ***************************************************************************** + */ package org.nd4j.jita.allocator.impl; diff --git a/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-cuda/src/main/java/org/nd4j/jita/allocator/impl/AtomicAllocator.java b/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-cuda/src/main/java/org/nd4j/jita/allocator/impl/AtomicAllocator.java index 46964c8f4..8b95febe7 100644 --- a/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-cuda/src/main/java/org/nd4j/jita/allocator/impl/AtomicAllocator.java +++ b/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-cuda/src/main/java/org/nd4j/jita/allocator/impl/AtomicAllocator.java @@ -1,18 +1,22 @@ -/******************************************************************************* - * Copyright (c) 2015-2018 Skymind, Inc. - * - * This program and the accompanying materials are made available under the - * terms of the Apache License, Version 2.0 which is available at - * https://www.apache.org/licenses/LICENSE-2.0. - * - * Unless required by applicable law or agreed to in writing, software - * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * License for the specific language governing permissions and limitations - * under the License. 
- * - * SPDX-License-Identifier: Apache-2.0 - ******************************************************************************/ +/* + * ****************************************************************************** + * * + * * + * * This program and the accompanying materials are made available under the + * * terms of the Apache License, Version 2.0 which is available at + * * https://www.apache.org/licenses/LICENSE-2.0. + * * + * * See the NOTICE file distributed with this work for additional + * * information regarding copyright ownership. + * * Unless required by applicable law or agreed to in writing, software + * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT + * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the + * * License for the specific language governing permissions and limitations + * * under the License. + * * + * * SPDX-License-Identifier: Apache-2.0 + * ***************************************************************************** + */ package org.nd4j.jita.allocator.impl; diff --git a/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-cuda/src/main/java/org/nd4j/jita/allocator/impl/NestedPoint.java b/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-cuda/src/main/java/org/nd4j/jita/allocator/impl/NestedPoint.java index e423279ff..118c3628b 100644 --- a/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-cuda/src/main/java/org/nd4j/jita/allocator/impl/NestedPoint.java +++ b/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-cuda/src/main/java/org/nd4j/jita/allocator/impl/NestedPoint.java @@ -1,18 +1,22 @@ -/******************************************************************************* - * Copyright (c) 2015-2018 Skymind, Inc. - * - * This program and the accompanying materials are made available under the - * terms of the Apache License, Version 2.0 which is available at - * https://www.apache.org/licenses/LICENSE-2.0. 
- * - * Unless required by applicable law or agreed to in writing, software - * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * License for the specific language governing permissions and limitations - * under the License. - * - * SPDX-License-Identifier: Apache-2.0 - ******************************************************************************/ +/* + * ****************************************************************************** + * * + * * + * * This program and the accompanying materials are made available under the + * * terms of the Apache License, Version 2.0 which is available at + * * https://www.apache.org/licenses/LICENSE-2.0. + * * + * * See the NOTICE file distributed with this work for additional + * * information regarding copyright ownership. + * * Unless required by applicable law or agreed to in writing, software + * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT + * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the + * * License for the specific language governing permissions and limitations + * * under the License. 
+ * * + * * SPDX-License-Identifier: Apache-2.0 + * ***************************************************************************** + */ package org.nd4j.jita.allocator.impl; diff --git a/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-cuda/src/main/java/org/nd4j/jita/allocator/pointers/CudaPointer.java b/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-cuda/src/main/java/org/nd4j/jita/allocator/pointers/CudaPointer.java index 8a0620cc0..73fca7aab 100644 --- a/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-cuda/src/main/java/org/nd4j/jita/allocator/pointers/CudaPointer.java +++ b/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-cuda/src/main/java/org/nd4j/jita/allocator/pointers/CudaPointer.java @@ -1,18 +1,22 @@ -/******************************************************************************* - * Copyright (c) 2015-2018 Skymind, Inc. - * - * This program and the accompanying materials are made available under the - * terms of the Apache License, Version 2.0 which is available at - * https://www.apache.org/licenses/LICENSE-2.0. - * - * Unless required by applicable law or agreed to in writing, software - * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * License for the specific language governing permissions and limitations - * under the License. - * - * SPDX-License-Identifier: Apache-2.0 - ******************************************************************************/ +/* + * ****************************************************************************** + * * + * * + * * This program and the accompanying materials are made available under the + * * terms of the Apache License, Version 2.0 which is available at + * * https://www.apache.org/licenses/LICENSE-2.0. + * * + * * See the NOTICE file distributed with this work for additional + * * information regarding copyright ownership. 
+ * * Unless required by applicable law or agreed to in writing, software + * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT + * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the + * * License for the specific language governing permissions and limitations + * * under the License. + * * + * * SPDX-License-Identifier: Apache-2.0 + * ***************************************************************************** + */ package org.nd4j.jita.allocator.pointers; diff --git a/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-cuda/src/main/java/org/nd4j/jita/allocator/pointers/PointersPair.java b/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-cuda/src/main/java/org/nd4j/jita/allocator/pointers/PointersPair.java index bd958e224..00084cf1b 100644 --- a/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-cuda/src/main/java/org/nd4j/jita/allocator/pointers/PointersPair.java +++ b/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-cuda/src/main/java/org/nd4j/jita/allocator/pointers/PointersPair.java @@ -1,18 +1,22 @@ -/******************************************************************************* - * Copyright (c) 2015-2018 Skymind, Inc. - * - * This program and the accompanying materials are made available under the - * terms of the Apache License, Version 2.0 which is available at - * https://www.apache.org/licenses/LICENSE-2.0. - * - * Unless required by applicable law or agreed to in writing, software - * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * License for the specific language governing permissions and limitations - * under the License. 
- * - * SPDX-License-Identifier: Apache-2.0 - ******************************************************************************/ +/* + * ****************************************************************************** + * * + * * + * * This program and the accompanying materials are made available under the + * * terms of the Apache License, Version 2.0 which is available at + * * https://www.apache.org/licenses/LICENSE-2.0. + * * + * * See the NOTICE file distributed with this work for additional + * * information regarding copyright ownership. + * * Unless required by applicable law or agreed to in writing, software + * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT + * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the + * * License for the specific language governing permissions and limitations + * * under the License. + * * + * * SPDX-License-Identifier: Apache-2.0 + * ***************************************************************************** + */ package org.nd4j.jita.allocator.pointers; diff --git a/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-cuda/src/main/java/org/nd4j/jita/allocator/pointers/cuda/CUcontext.java b/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-cuda/src/main/java/org/nd4j/jita/allocator/pointers/cuda/CUcontext.java index 0fe2ed5aa..6a6bff9b0 100644 --- a/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-cuda/src/main/java/org/nd4j/jita/allocator/pointers/cuda/CUcontext.java +++ b/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-cuda/src/main/java/org/nd4j/jita/allocator/pointers/cuda/CUcontext.java @@ -1,18 +1,22 @@ -/******************************************************************************* - * Copyright (c) 2015-2018 Skymind, Inc. - * - * This program and the accompanying materials are made available under the - * terms of the Apache License, Version 2.0 which is available at - * https://www.apache.org/licenses/LICENSE-2.0. 
- * - * Unless required by applicable law or agreed to in writing, software - * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * License for the specific language governing permissions and limitations - * under the License. - * - * SPDX-License-Identifier: Apache-2.0 - ******************************************************************************/ +/* + * ****************************************************************************** + * * + * * + * * This program and the accompanying materials are made available under the + * * terms of the Apache License, Version 2.0 which is available at + * * https://www.apache.org/licenses/LICENSE-2.0. + * * + * * See the NOTICE file distributed with this work for additional + * * information regarding copyright ownership. + * * Unless required by applicable law or agreed to in writing, software + * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT + * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the + * * License for the specific language governing permissions and limitations + * * under the License. 
+ * * + * * SPDX-License-Identifier: Apache-2.0 + * ***************************************************************************** + */ package org.nd4j.jita.allocator.pointers.cuda; diff --git a/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-cuda/src/main/java/org/nd4j/jita/allocator/pointers/cuda/cublasHandle_t.java b/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-cuda/src/main/java/org/nd4j/jita/allocator/pointers/cuda/cublasHandle_t.java index 305ef8d22..379d71a66 100644 --- a/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-cuda/src/main/java/org/nd4j/jita/allocator/pointers/cuda/cublasHandle_t.java +++ b/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-cuda/src/main/java/org/nd4j/jita/allocator/pointers/cuda/cublasHandle_t.java @@ -1,18 +1,22 @@ -/******************************************************************************* - * Copyright (c) 2015-2018 Skymind, Inc. - * - * This program and the accompanying materials are made available under the - * terms of the Apache License, Version 2.0 which is available at - * https://www.apache.org/licenses/LICENSE-2.0. - * - * Unless required by applicable law or agreed to in writing, software - * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * License for the specific language governing permissions and limitations - * under the License. - * - * SPDX-License-Identifier: Apache-2.0 - ******************************************************************************/ +/* + * ****************************************************************************** + * * + * * + * * This program and the accompanying materials are made available under the + * * terms of the Apache License, Version 2.0 which is available at + * * https://www.apache.org/licenses/LICENSE-2.0. + * * + * * See the NOTICE file distributed with this work for additional + * * information regarding copyright ownership. 
+ * * Unless required by applicable law or agreed to in writing, software + * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT + * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the + * * License for the specific language governing permissions and limitations + * * under the License. + * * + * * SPDX-License-Identifier: Apache-2.0 + * ***************************************************************************** + */ package org.nd4j.jita.allocator.pointers.cuda; diff --git a/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-cuda/src/main/java/org/nd4j/jita/allocator/pointers/cuda/cudaEvent_t.java b/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-cuda/src/main/java/org/nd4j/jita/allocator/pointers/cuda/cudaEvent_t.java index de1920f0a..46a459704 100644 --- a/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-cuda/src/main/java/org/nd4j/jita/allocator/pointers/cuda/cudaEvent_t.java +++ b/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-cuda/src/main/java/org/nd4j/jita/allocator/pointers/cuda/cudaEvent_t.java @@ -1,18 +1,22 @@ -/******************************************************************************* - * Copyright (c) 2015-2018 Skymind, Inc. - * - * This program and the accompanying materials are made available under the - * terms of the Apache License, Version 2.0 which is available at - * https://www.apache.org/licenses/LICENSE-2.0. - * - * Unless required by applicable law or agreed to in writing, software - * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * License for the specific language governing permissions and limitations - * under the License. 
- * - * SPDX-License-Identifier: Apache-2.0 - ******************************************************************************/ +/* + * ****************************************************************************** + * * + * * + * * This program and the accompanying materials are made available under the + * * terms of the Apache License, Version 2.0 which is available at + * * https://www.apache.org/licenses/LICENSE-2.0. + * * + * * See the NOTICE file distributed with this work for additional + * * information regarding copyright ownership. + * * Unless required by applicable law or agreed to in writing, software + * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT + * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the + * * License for the specific language governing permissions and limitations + * * under the License. + * * + * * SPDX-License-Identifier: Apache-2.0 + * ***************************************************************************** + */ package org.nd4j.jita.allocator.pointers.cuda; diff --git a/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-cuda/src/main/java/org/nd4j/jita/allocator/pointers/cuda/cudaStream_t.java b/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-cuda/src/main/java/org/nd4j/jita/allocator/pointers/cuda/cudaStream_t.java index 7d9bfb629..34216a7e1 100644 --- a/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-cuda/src/main/java/org/nd4j/jita/allocator/pointers/cuda/cudaStream_t.java +++ b/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-cuda/src/main/java/org/nd4j/jita/allocator/pointers/cuda/cudaStream_t.java @@ -1,18 +1,22 @@ -/******************************************************************************* - * Copyright (c) 2015-2018 Skymind, Inc. - * - * This program and the accompanying materials are made available under the - * terms of the Apache License, Version 2.0 which is available at - * https://www.apache.org/licenses/LICENSE-2.0. 
- * - * Unless required by applicable law or agreed to in writing, software - * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * License for the specific language governing permissions and limitations - * under the License. - * - * SPDX-License-Identifier: Apache-2.0 - ******************************************************************************/ +/* + * ****************************************************************************** + * * + * * + * * This program and the accompanying materials are made available under the + * * terms of the Apache License, Version 2.0 which is available at + * * https://www.apache.org/licenses/LICENSE-2.0. + * * + * * See the NOTICE file distributed with this work for additional + * * information regarding copyright ownership. + * * Unless required by applicable law or agreed to in writing, software + * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT + * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the + * * License for the specific language governing permissions and limitations + * * under the License. 
+ * * + * * SPDX-License-Identifier: Apache-2.0 + * ***************************************************************************** + */ package org.nd4j.jita.allocator.pointers.cuda; diff --git a/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-cuda/src/main/java/org/nd4j/jita/allocator/pointers/cuda/cusolverDnHandle_t.java b/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-cuda/src/main/java/org/nd4j/jita/allocator/pointers/cuda/cusolverDnHandle_t.java index f3e8fc740..e11a7d059 100644 --- a/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-cuda/src/main/java/org/nd4j/jita/allocator/pointers/cuda/cusolverDnHandle_t.java +++ b/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-cuda/src/main/java/org/nd4j/jita/allocator/pointers/cuda/cusolverDnHandle_t.java @@ -1,18 +1,22 @@ -/******************************************************************************* - * Copyright (c) 2015-2018 Skymind, Inc. - * - * This program and the accompanying materials are made available under the - * terms of the Apache License, Version 2.0 which is available at - * https://www.apache.org/licenses/LICENSE-2.0. - * - * Unless required by applicable law or agreed to in writing, software - * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * License for the specific language governing permissions and limitations - * under the License. - * - * SPDX-License-Identifier: Apache-2.0 - ******************************************************************************/ +/* + * ****************************************************************************** + * * + * * + * * This program and the accompanying materials are made available under the + * * terms of the Apache License, Version 2.0 which is available at + * * https://www.apache.org/licenses/LICENSE-2.0. + * * + * * See the NOTICE file distributed with this work for additional + * * information regarding copyright ownership. 
+ * * Unless required by applicable law or agreed to in writing, software + * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT + * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the + * * License for the specific language governing permissions and limitations + * * under the License. + * * + * * SPDX-License-Identifier: Apache-2.0 + * ***************************************************************************** + */ package org.nd4j.jita.allocator.pointers.cuda; diff --git a/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-cuda/src/main/java/org/nd4j/jita/allocator/tad/BasicTADManager.java b/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-cuda/src/main/java/org/nd4j/jita/allocator/tad/BasicTADManager.java index 981ffecdd..bc6a9d7f9 100644 --- a/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-cuda/src/main/java/org/nd4j/jita/allocator/tad/BasicTADManager.java +++ b/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-cuda/src/main/java/org/nd4j/jita/allocator/tad/BasicTADManager.java @@ -1,18 +1,22 @@ -/******************************************************************************* - * Copyright (c) 2015-2018 Skymind, Inc. - * - * This program and the accompanying materials are made available under the - * terms of the Apache License, Version 2.0 which is available at - * https://www.apache.org/licenses/LICENSE-2.0. - * - * Unless required by applicable law or agreed to in writing, software - * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * License for the specific language governing permissions and limitations - * under the License. 
- * - * SPDX-License-Identifier: Apache-2.0 - ******************************************************************************/ +/* + * ****************************************************************************** + * * + * * + * * This program and the accompanying materials are made available under the + * * terms of the Apache License, Version 2.0 which is available at + * * https://www.apache.org/licenses/LICENSE-2.0. + * * + * * See the NOTICE file distributed with this work for additional + * * information regarding copyright ownership. + * * Unless required by applicable law or agreed to in writing, software + * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT + * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the + * * License for the specific language governing permissions and limitations + * * under the License. + * * + * * SPDX-License-Identifier: Apache-2.0 + * ***************************************************************************** + */ package org.nd4j.jita.allocator.tad; diff --git a/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-cuda/src/main/java/org/nd4j/jita/allocator/tad/DeviceTADManager.java b/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-cuda/src/main/java/org/nd4j/jita/allocator/tad/DeviceTADManager.java index 4ad3e92aa..133a2044e 100644 --- a/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-cuda/src/main/java/org/nd4j/jita/allocator/tad/DeviceTADManager.java +++ b/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-cuda/src/main/java/org/nd4j/jita/allocator/tad/DeviceTADManager.java @@ -1,18 +1,22 @@ -/******************************************************************************* - * Copyright (c) 2015-2018 Skymind, Inc. - * - * This program and the accompanying materials are made available under the - * terms of the Apache License, Version 2.0 which is available at - * https://www.apache.org/licenses/LICENSE-2.0. 
- * - * Unless required by applicable law or agreed to in writing, software - * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * License for the specific language governing permissions and limitations - * under the License. - * - * SPDX-License-Identifier: Apache-2.0 - ******************************************************************************/ +/* + * ****************************************************************************** + * * + * * + * * This program and the accompanying materials are made available under the + * * terms of the Apache License, Version 2.0 which is available at + * * https://www.apache.org/licenses/LICENSE-2.0. + * * + * * See the NOTICE file distributed with this work for additional + * * information regarding copyright ownership. + * * Unless required by applicable law or agreed to in writing, software + * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT + * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the + * * License for the specific language governing permissions and limitations + * * under the License. 
+ * * + * * SPDX-License-Identifier: Apache-2.0 + * ***************************************************************************** + */ package org.nd4j.jita.allocator.tad; diff --git a/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-cuda/src/main/java/org/nd4j/jita/allocator/time/RateTimer.java b/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-cuda/src/main/java/org/nd4j/jita/allocator/time/RateTimer.java index a465e8101..5866cd333 100644 --- a/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-cuda/src/main/java/org/nd4j/jita/allocator/time/RateTimer.java +++ b/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-cuda/src/main/java/org/nd4j/jita/allocator/time/RateTimer.java @@ -1,18 +1,22 @@ -/******************************************************************************* - * Copyright (c) 2015-2018 Skymind, Inc. - * - * This program and the accompanying materials are made available under the - * terms of the Apache License, Version 2.0 which is available at - * https://www.apache.org/licenses/LICENSE-2.0. - * - * Unless required by applicable law or agreed to in writing, software - * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * License for the specific language governing permissions and limitations - * under the License. - * - * SPDX-License-Identifier: Apache-2.0 - ******************************************************************************/ +/* + * ****************************************************************************** + * * + * * + * * This program and the accompanying materials are made available under the + * * terms of the Apache License, Version 2.0 which is available at + * * https://www.apache.org/licenses/LICENSE-2.0. + * * + * * See the NOTICE file distributed with this work for additional + * * information regarding copyright ownership. 
+ * * Unless required by applicable law or agreed to in writing, software + * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT + * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the + * * License for the specific language governing permissions and limitations + * * under the License. + * * + * * SPDX-License-Identifier: Apache-2.0 + * ***************************************************************************** + */ package org.nd4j.jita.allocator.time; diff --git a/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-cuda/src/main/java/org/nd4j/jita/allocator/time/Ring.java b/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-cuda/src/main/java/org/nd4j/jita/allocator/time/Ring.java index cc8896e5c..64e8b177a 100644 --- a/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-cuda/src/main/java/org/nd4j/jita/allocator/time/Ring.java +++ b/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-cuda/src/main/java/org/nd4j/jita/allocator/time/Ring.java @@ -1,18 +1,22 @@ -/******************************************************************************* - * Copyright (c) 2015-2018 Skymind, Inc. - * - * This program and the accompanying materials are made available under the - * terms of the Apache License, Version 2.0 which is available at - * https://www.apache.org/licenses/LICENSE-2.0. - * - * Unless required by applicable law or agreed to in writing, software - * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * License for the specific language governing permissions and limitations - * under the License. 
- * - * SPDX-License-Identifier: Apache-2.0 - ******************************************************************************/ +/* + * ****************************************************************************** + * * + * * + * * This program and the accompanying materials are made available under the + * * terms of the Apache License, Version 2.0 which is available at + * * https://www.apache.org/licenses/LICENSE-2.0. + * * + * * See the NOTICE file distributed with this work for additional + * * information regarding copyright ownership. + * * Unless required by applicable law or agreed to in writing, software + * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT + * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the + * * License for the specific language governing permissions and limitations + * * under the License. + * * + * * SPDX-License-Identifier: Apache-2.0 + * ***************************************************************************** + */ package org.nd4j.jita.allocator.time; diff --git a/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-cuda/src/main/java/org/nd4j/jita/allocator/time/TimeProvider.java b/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-cuda/src/main/java/org/nd4j/jita/allocator/time/TimeProvider.java index 1186f0cbe..53f13955a 100644 --- a/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-cuda/src/main/java/org/nd4j/jita/allocator/time/TimeProvider.java +++ b/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-cuda/src/main/java/org/nd4j/jita/allocator/time/TimeProvider.java @@ -1,18 +1,22 @@ -/******************************************************************************* - * Copyright (c) 2015-2018 Skymind, Inc. - * - * This program and the accompanying materials are made available under the - * terms of the Apache License, Version 2.0 which is available at - * https://www.apache.org/licenses/LICENSE-2.0. 
- * - * Unless required by applicable law or agreed to in writing, software - * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * License for the specific language governing permissions and limitations - * under the License. - * - * SPDX-License-Identifier: Apache-2.0 - ******************************************************************************/ +/* + * ****************************************************************************** + * * + * * + * * This program and the accompanying materials are made available under the + * * terms of the Apache License, Version 2.0 which is available at + * * https://www.apache.org/licenses/LICENSE-2.0. + * * + * * See the NOTICE file distributed with this work for additional + * * information regarding copyright ownership. + * * Unless required by applicable law or agreed to in writing, software + * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT + * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the + * * License for the specific language governing permissions and limitations + * * under the License. 
+ * * + * * SPDX-License-Identifier: Apache-2.0 + * ***************************************************************************** + */ package org.nd4j.jita.allocator.time; diff --git a/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-cuda/src/main/java/org/nd4j/jita/allocator/time/impl/BinaryTimer.java b/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-cuda/src/main/java/org/nd4j/jita/allocator/time/impl/BinaryTimer.java index 7b940464c..f71b46bf8 100644 --- a/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-cuda/src/main/java/org/nd4j/jita/allocator/time/impl/BinaryTimer.java +++ b/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-cuda/src/main/java/org/nd4j/jita/allocator/time/impl/BinaryTimer.java @@ -1,18 +1,22 @@ -/******************************************************************************* - * Copyright (c) 2015-2018 Skymind, Inc. - * - * This program and the accompanying materials are made available under the - * terms of the Apache License, Version 2.0 which is available at - * https://www.apache.org/licenses/LICENSE-2.0. - * - * Unless required by applicable law or agreed to in writing, software - * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * License for the specific language governing permissions and limitations - * under the License. - * - * SPDX-License-Identifier: Apache-2.0 - ******************************************************************************/ +/* + * ****************************************************************************** + * * + * * + * * This program and the accompanying materials are made available under the + * * terms of the Apache License, Version 2.0 which is available at + * * https://www.apache.org/licenses/LICENSE-2.0. + * * + * * See the NOTICE file distributed with this work for additional + * * information regarding copyright ownership. 
+ * * Unless required by applicable law or agreed to in writing, software + * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT + * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the + * * License for the specific language governing permissions and limitations + * * under the License. + * * + * * SPDX-License-Identifier: Apache-2.0 + * ***************************************************************************** + */ package org.nd4j.jita.allocator.time.impl; diff --git a/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-cuda/src/main/java/org/nd4j/jita/allocator/time/impl/SimpleTimer.java b/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-cuda/src/main/java/org/nd4j/jita/allocator/time/impl/SimpleTimer.java index ec48ba4fc..1089ffdff 100644 --- a/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-cuda/src/main/java/org/nd4j/jita/allocator/time/impl/SimpleTimer.java +++ b/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-cuda/src/main/java/org/nd4j/jita/allocator/time/impl/SimpleTimer.java @@ -1,18 +1,22 @@ -/******************************************************************************* - * Copyright (c) 2015-2018 Skymind, Inc. - * - * This program and the accompanying materials are made available under the - * terms of the Apache License, Version 2.0 which is available at - * https://www.apache.org/licenses/LICENSE-2.0. - * - * Unless required by applicable law or agreed to in writing, software - * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * License for the specific language governing permissions and limitations - * under the License. 
- * - * SPDX-License-Identifier: Apache-2.0 - ******************************************************************************/ +/* + * ****************************************************************************** + * * + * * + * * This program and the accompanying materials are made available under the + * * terms of the Apache License, Version 2.0 which is available at + * * https://www.apache.org/licenses/LICENSE-2.0. + * * + * * See the NOTICE file distributed with this work for additional + * * information regarding copyright ownership. + * * Unless required by applicable law or agreed to in writing, software + * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT + * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the + * * License for the specific language governing permissions and limitations + * * under the License. + * * + * * SPDX-License-Identifier: Apache-2.0 + * ***************************************************************************** + */ package org.nd4j.jita.allocator.time.impl; diff --git a/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-cuda/src/main/java/org/nd4j/jita/allocator/time/providers/MillisecondsProvider.java b/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-cuda/src/main/java/org/nd4j/jita/allocator/time/providers/MillisecondsProvider.java index 988335e09..471da179a 100644 --- a/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-cuda/src/main/java/org/nd4j/jita/allocator/time/providers/MillisecondsProvider.java +++ b/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-cuda/src/main/java/org/nd4j/jita/allocator/time/providers/MillisecondsProvider.java @@ -1,18 +1,22 @@ -/******************************************************************************* - * Copyright (c) 2015-2018 Skymind, Inc. - * - * This program and the accompanying materials are made available under the - * terms of the Apache License, Version 2.0 which is available at - * https://www.apache.org/licenses/LICENSE-2.0. 
- * - * Unless required by applicable law or agreed to in writing, software - * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * License for the specific language governing permissions and limitations - * under the License. - * - * SPDX-License-Identifier: Apache-2.0 - ******************************************************************************/ +/* + * ****************************************************************************** + * * + * * + * * This program and the accompanying materials are made available under the + * * terms of the Apache License, Version 2.0 which is available at + * * https://www.apache.org/licenses/LICENSE-2.0. + * * + * * See the NOTICE file distributed with this work for additional + * * information regarding copyright ownership. + * * Unless required by applicable law or agreed to in writing, software + * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT + * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the + * * License for the specific language governing permissions and limitations + * * under the License. 
+ * * + * * SPDX-License-Identifier: Apache-2.0 + * ***************************************************************************** + */ package org.nd4j.jita.allocator.time.providers; diff --git a/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-cuda/src/main/java/org/nd4j/jita/allocator/time/providers/NanosecondsProvider.java b/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-cuda/src/main/java/org/nd4j/jita/allocator/time/providers/NanosecondsProvider.java index 742b5239d..cdd010673 100644 --- a/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-cuda/src/main/java/org/nd4j/jita/allocator/time/providers/NanosecondsProvider.java +++ b/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-cuda/src/main/java/org/nd4j/jita/allocator/time/providers/NanosecondsProvider.java @@ -1,18 +1,22 @@ -/******************************************************************************* - * Copyright (c) 2015-2018 Skymind, Inc. - * - * This program and the accompanying materials are made available under the - * terms of the Apache License, Version 2.0 which is available at - * https://www.apache.org/licenses/LICENSE-2.0. - * - * Unless required by applicable law or agreed to in writing, software - * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * License for the specific language governing permissions and limitations - * under the License. - * - * SPDX-License-Identifier: Apache-2.0 - ******************************************************************************/ +/* + * ****************************************************************************** + * * + * * + * * This program and the accompanying materials are made available under the + * * terms of the Apache License, Version 2.0 which is available at + * * https://www.apache.org/licenses/LICENSE-2.0. + * * + * * See the NOTICE file distributed with this work for additional + * * information regarding copyright ownership. 
+ * * Unless required by applicable law or agreed to in writing, software + * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT + * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the + * * License for the specific language governing permissions and limitations + * * under the License. + * * + * * SPDX-License-Identifier: Apache-2.0 + * ***************************************************************************** + */ package org.nd4j.jita.allocator.time.providers; diff --git a/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-cuda/src/main/java/org/nd4j/jita/allocator/time/providers/OperativeProvider.java b/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-cuda/src/main/java/org/nd4j/jita/allocator/time/providers/OperativeProvider.java index 7d40bf247..001796fae 100644 --- a/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-cuda/src/main/java/org/nd4j/jita/allocator/time/providers/OperativeProvider.java +++ b/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-cuda/src/main/java/org/nd4j/jita/allocator/time/providers/OperativeProvider.java @@ -1,18 +1,22 @@ -/******************************************************************************* - * Copyright (c) 2015-2018 Skymind, Inc. - * - * This program and the accompanying materials are made available under the - * terms of the Apache License, Version 2.0 which is available at - * https://www.apache.org/licenses/LICENSE-2.0. - * - * Unless required by applicable law or agreed to in writing, software - * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * License for the specific language governing permissions and limitations - * under the License. 
- * - * SPDX-License-Identifier: Apache-2.0 - ******************************************************************************/ +/* + * ****************************************************************************** + * * + * * + * * This program and the accompanying materials are made available under the + * * terms of the Apache License, Version 2.0 which is available at + * * https://www.apache.org/licenses/LICENSE-2.0. + * * + * * See the NOTICE file distributed with this work for additional + * * information regarding copyright ownership. + * * Unless required by applicable law or agreed to in writing, software + * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT + * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the + * * License for the specific language governing permissions and limitations + * * under the License. + * * + * * SPDX-License-Identifier: Apache-2.0 + * ***************************************************************************** + */ package org.nd4j.jita.allocator.time.providers; diff --git a/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-cuda/src/main/java/org/nd4j/jita/allocator/time/rings/LockedRing.java b/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-cuda/src/main/java/org/nd4j/jita/allocator/time/rings/LockedRing.java index 7b8b90381..86e6195f9 100644 --- a/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-cuda/src/main/java/org/nd4j/jita/allocator/time/rings/LockedRing.java +++ b/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-cuda/src/main/java/org/nd4j/jita/allocator/time/rings/LockedRing.java @@ -1,18 +1,22 @@ -/******************************************************************************* - * Copyright (c) 2015-2018 Skymind, Inc. - * - * This program and the accompanying materials are made available under the - * terms of the Apache License, Version 2.0 which is available at - * https://www.apache.org/licenses/LICENSE-2.0. 
- * - * Unless required by applicable law or agreed to in writing, software - * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * License for the specific language governing permissions and limitations - * under the License. - * - * SPDX-License-Identifier: Apache-2.0 - ******************************************************************************/ +/* + * ****************************************************************************** + * * + * * + * * This program and the accompanying materials are made available under the + * * terms of the Apache License, Version 2.0 which is available at + * * https://www.apache.org/licenses/LICENSE-2.0. + * * + * * See the NOTICE file distributed with this work for additional + * * information regarding copyright ownership. + * * Unless required by applicable law or agreed to in writing, software + * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT + * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the + * * License for the specific language governing permissions and limitations + * * under the License. 
+ * * + * * SPDX-License-Identifier: Apache-2.0 + * ***************************************************************************** + */ package org.nd4j.jita.allocator.time.rings; diff --git a/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-cuda/src/main/java/org/nd4j/jita/allocator/utils/AllocationUtils.java b/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-cuda/src/main/java/org/nd4j/jita/allocator/utils/AllocationUtils.java index 40eb8e4a8..e8f137506 100644 --- a/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-cuda/src/main/java/org/nd4j/jita/allocator/utils/AllocationUtils.java +++ b/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-cuda/src/main/java/org/nd4j/jita/allocator/utils/AllocationUtils.java @@ -1,18 +1,22 @@ -/******************************************************************************* - * Copyright (c) 2015-2018 Skymind, Inc. - * - * This program and the accompanying materials are made available under the - * terms of the Apache License, Version 2.0 which is available at - * https://www.apache.org/licenses/LICENSE-2.0. - * - * Unless required by applicable law or agreed to in writing, software - * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * License for the specific language governing permissions and limitations - * under the License. - * - * SPDX-License-Identifier: Apache-2.0 - ******************************************************************************/ +/* + * ****************************************************************************** + * * + * * + * * This program and the accompanying materials are made available under the + * * terms of the Apache License, Version 2.0 which is available at + * * https://www.apache.org/licenses/LICENSE-2.0. + * * + * * See the NOTICE file distributed with this work for additional + * * information regarding copyright ownership. 
+ * * Unless required by applicable law or agreed to in writing, software + * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT + * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the + * * License for the specific language governing permissions and limitations + * * under the License. + * * + * * SPDX-License-Identifier: Apache-2.0 + * ***************************************************************************** + */ package org.nd4j.jita.allocator.utils; diff --git a/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-cuda/src/main/java/org/nd4j/jita/balance/Balancer.java b/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-cuda/src/main/java/org/nd4j/jita/balance/Balancer.java index 8fceed28e..80b4c7fb7 100644 --- a/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-cuda/src/main/java/org/nd4j/jita/balance/Balancer.java +++ b/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-cuda/src/main/java/org/nd4j/jita/balance/Balancer.java @@ -1,18 +1,22 @@ -/******************************************************************************* - * Copyright (c) 2015-2018 Skymind, Inc. - * - * This program and the accompanying materials are made available under the - * terms of the Apache License, Version 2.0 which is available at - * https://www.apache.org/licenses/LICENSE-2.0. - * - * Unless required by applicable law or agreed to in writing, software - * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * License for the specific language governing permissions and limitations - * under the License. 
- * - * SPDX-License-Identifier: Apache-2.0 - ******************************************************************************/ +/* + * ****************************************************************************** + * * + * * + * * This program and the accompanying materials are made available under the + * * terms of the Apache License, Version 2.0 which is available at + * * https://www.apache.org/licenses/LICENSE-2.0. + * * + * * See the NOTICE file distributed with this work for additional + * * information regarding copyright ownership. + * * Unless required by applicable law or agreed to in writing, software + * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT + * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the + * * License for the specific language governing permissions and limitations + * * under the License. + * * + * * SPDX-License-Identifier: Apache-2.0 + * ***************************************************************************** + */ package org.nd4j.jita.balance; diff --git a/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-cuda/src/main/java/org/nd4j/jita/concurrency/CudaAffinityManager.java b/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-cuda/src/main/java/org/nd4j/jita/concurrency/CudaAffinityManager.java index 415fa487f..15f2cdba6 100644 --- a/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-cuda/src/main/java/org/nd4j/jita/concurrency/CudaAffinityManager.java +++ b/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-cuda/src/main/java/org/nd4j/jita/concurrency/CudaAffinityManager.java @@ -1,18 +1,22 @@ -/******************************************************************************* - * Copyright (c) 2015-2018 Skymind, Inc. - * - * This program and the accompanying materials are made available under the - * terms of the Apache License, Version 2.0 which is available at - * https://www.apache.org/licenses/LICENSE-2.0. 
- * - * Unless required by applicable law or agreed to in writing, software - * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * License for the specific language governing permissions and limitations - * under the License. - * - * SPDX-License-Identifier: Apache-2.0 - ******************************************************************************/ +/* + * ****************************************************************************** + * * + * * + * * This program and the accompanying materials are made available under the + * * terms of the Apache License, Version 2.0 which is available at + * * https://www.apache.org/licenses/LICENSE-2.0. + * * + * * See the NOTICE file distributed with this work for additional + * * information regarding copyright ownership. + * * Unless required by applicable law or agreed to in writing, software + * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT + * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the + * * License for the specific language governing permissions and limitations + * * under the License. 
+ * * + * * SPDX-License-Identifier: Apache-2.0 + * ***************************************************************************** + */ package org.nd4j.jita.concurrency; diff --git a/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-cuda/src/main/java/org/nd4j/jita/concurrency/EventsProvider.java b/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-cuda/src/main/java/org/nd4j/jita/concurrency/EventsProvider.java index 7cc3e6838..4e5ee96a8 100644 --- a/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-cuda/src/main/java/org/nd4j/jita/concurrency/EventsProvider.java +++ b/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-cuda/src/main/java/org/nd4j/jita/concurrency/EventsProvider.java @@ -1,18 +1,22 @@ -/******************************************************************************* - * Copyright (c) 2015-2018 Skymind, Inc. - * - * This program and the accompanying materials are made available under the - * terms of the Apache License, Version 2.0 which is available at - * https://www.apache.org/licenses/LICENSE-2.0. - * - * Unless required by applicable law or agreed to in writing, software - * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * License for the specific language governing permissions and limitations - * under the License. - * - * SPDX-License-Identifier: Apache-2.0 - ******************************************************************************/ +/* + * ****************************************************************************** + * * + * * + * * This program and the accompanying materials are made available under the + * * terms of the Apache License, Version 2.0 which is available at + * * https://www.apache.org/licenses/LICENSE-2.0. + * * + * * See the NOTICE file distributed with this work for additional + * * information regarding copyright ownership. 
+ * * Unless required by applicable law or agreed to in writing, software + * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT + * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the + * * License for the specific language governing permissions and limitations + * * under the License. + * * + * * SPDX-License-Identifier: Apache-2.0 + * ***************************************************************************** + */ package org.nd4j.jita.concurrency; diff --git a/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-cuda/src/main/java/org/nd4j/jita/conf/Configuration.java b/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-cuda/src/main/java/org/nd4j/jita/conf/Configuration.java index 49582f002..2b5b25dd9 100644 --- a/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-cuda/src/main/java/org/nd4j/jita/conf/Configuration.java +++ b/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-cuda/src/main/java/org/nd4j/jita/conf/Configuration.java @@ -1,18 +1,22 @@ -/******************************************************************************* - * Copyright (c) 2015-2018 Skymind, Inc. - * - * This program and the accompanying materials are made available under the - * terms of the Apache License, Version 2.0 which is available at - * https://www.apache.org/licenses/LICENSE-2.0. - * - * Unless required by applicable law or agreed to in writing, software - * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * License for the specific language governing permissions and limitations - * under the License. 
- * - * SPDX-License-Identifier: Apache-2.0 - ******************************************************************************/ +/* + * ****************************************************************************** + * * + * * + * * This program and the accompanying materials are made available under the + * * terms of the Apache License, Version 2.0 which is available at + * * https://www.apache.org/licenses/LICENSE-2.0. + * * + * * See the NOTICE file distributed with this work for additional + * * information regarding copyright ownership. + * * Unless required by applicable law or agreed to in writing, software + * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT + * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the + * * License for the specific language governing permissions and limitations + * * under the License. + * * + * * SPDX-License-Identifier: Apache-2.0 + * ***************************************************************************** + */ package org.nd4j.jita.conf; diff --git a/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-cuda/src/main/java/org/nd4j/jita/conf/CudaEnvironment.java b/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-cuda/src/main/java/org/nd4j/jita/conf/CudaEnvironment.java index 28523fac7..8e324629b 100644 --- a/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-cuda/src/main/java/org/nd4j/jita/conf/CudaEnvironment.java +++ b/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-cuda/src/main/java/org/nd4j/jita/conf/CudaEnvironment.java @@ -1,18 +1,22 @@ -/******************************************************************************* - * Copyright (c) 2015-2018 Skymind, Inc. - * - * This program and the accompanying materials are made available under the - * terms of the Apache License, Version 2.0 which is available at - * https://www.apache.org/licenses/LICENSE-2.0. 
- * - * Unless required by applicable law or agreed to in writing, software - * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * License for the specific language governing permissions and limitations - * under the License. - * - * SPDX-License-Identifier: Apache-2.0 - ******************************************************************************/ +/* + * ****************************************************************************** + * * + * * + * * This program and the accompanying materials are made available under the + * * terms of the Apache License, Version 2.0 which is available at + * * https://www.apache.org/licenses/LICENSE-2.0. + * * + * * See the NOTICE file distributed with this work for additional + * * information regarding copyright ownership. + * * Unless required by applicable law or agreed to in writing, software + * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT + * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the + * * License for the specific language governing permissions and limitations + * * under the License. + * * + * * SPDX-License-Identifier: Apache-2.0 + * ***************************************************************************** + */ package org.nd4j.jita.conf; diff --git a/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-cuda/src/main/java/org/nd4j/jita/conf/DeviceInformation.java b/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-cuda/src/main/java/org/nd4j/jita/conf/DeviceInformation.java index 450770262..74e388c81 100644 --- a/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-cuda/src/main/java/org/nd4j/jita/conf/DeviceInformation.java +++ b/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-cuda/src/main/java/org/nd4j/jita/conf/DeviceInformation.java @@ -1,18 +1,22 @@ -/******************************************************************************* - * Copyright (c) 2015-2018 Skymind, Inc. 
- * - * This program and the accompanying materials are made available under the - * terms of the Apache License, Version 2.0 which is available at - * https://www.apache.org/licenses/LICENSE-2.0. - * - * Unless required by applicable law or agreed to in writing, software - * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * License for the specific language governing permissions and limitations - * under the License. - * - * SPDX-License-Identifier: Apache-2.0 - ******************************************************************************/ +/* + * ****************************************************************************** + * * + * * + * * This program and the accompanying materials are made available under the + * * terms of the Apache License, Version 2.0 which is available at + * * https://www.apache.org/licenses/LICENSE-2.0. + * * + * * See the NOTICE file distributed with this work for additional + * * information regarding copyright ownership. + * * Unless required by applicable law or agreed to in writing, software + * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT + * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the + * * License for the specific language governing permissions and limitations + * * under the License. 
+ * * + * * SPDX-License-Identifier: Apache-2.0 + * ***************************************************************************** + */ package org.nd4j.jita.conf; diff --git a/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-cuda/src/main/java/org/nd4j/jita/constant/ConstantProtector.java b/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-cuda/src/main/java/org/nd4j/jita/constant/ConstantProtector.java index 388f49136..b6c5dd03d 100644 --- a/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-cuda/src/main/java/org/nd4j/jita/constant/ConstantProtector.java +++ b/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-cuda/src/main/java/org/nd4j/jita/constant/ConstantProtector.java @@ -1,18 +1,22 @@ -/******************************************************************************* - * Copyright (c) 2015-2018 Skymind, Inc. - * - * This program and the accompanying materials are made available under the - * terms of the Apache License, Version 2.0 which is available at - * https://www.apache.org/licenses/LICENSE-2.0. - * - * Unless required by applicable law or agreed to in writing, software - * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * License for the specific language governing permissions and limitations - * under the License. - * - * SPDX-License-Identifier: Apache-2.0 - ******************************************************************************/ +/* + * ****************************************************************************** + * * + * * + * * This program and the accompanying materials are made available under the + * * terms of the Apache License, Version 2.0 which is available at + * * https://www.apache.org/licenses/LICENSE-2.0. + * * + * * See the NOTICE file distributed with this work for additional + * * information regarding copyright ownership. 
+ * * Unless required by applicable law or agreed to in writing, software + * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT + * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the + * * License for the specific language governing permissions and limitations + * * under the License. + * * + * * SPDX-License-Identifier: Apache-2.0 + * ***************************************************************************** + */ package org.nd4j.jita.constant; diff --git a/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-cuda/src/main/java/org/nd4j/jita/constant/CudaConstantHandler.java b/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-cuda/src/main/java/org/nd4j/jita/constant/CudaConstantHandler.java index fc77ef349..1a616a3bb 100644 --- a/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-cuda/src/main/java/org/nd4j/jita/constant/CudaConstantHandler.java +++ b/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-cuda/src/main/java/org/nd4j/jita/constant/CudaConstantHandler.java @@ -1,18 +1,22 @@ -/******************************************************************************* - * Copyright (c) 2015-2018 Skymind, Inc. - * - * This program and the accompanying materials are made available under the - * terms of the Apache License, Version 2.0 which is available at - * https://www.apache.org/licenses/LICENSE-2.0. - * - * Unless required by applicable law or agreed to in writing, software - * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * License for the specific language governing permissions and limitations - * under the License. 
- * - * SPDX-License-Identifier: Apache-2.0 - ******************************************************************************/ +/* + * ****************************************************************************** + * * + * * + * * This program and the accompanying materials are made available under the + * * terms of the Apache License, Version 2.0 which is available at + * * https://www.apache.org/licenses/LICENSE-2.0. + * * + * * See the NOTICE file distributed with this work for additional + * * information regarding copyright ownership. + * * Unless required by applicable law or agreed to in writing, software + * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT + * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the + * * License for the specific language governing permissions and limitations + * * under the License. + * * + * * SPDX-License-Identifier: Apache-2.0 + * ***************************************************************************** + */ package org.nd4j.jita.constant; diff --git a/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-cuda/src/main/java/org/nd4j/jita/constant/ProtectedCudaConstantHandler.java b/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-cuda/src/main/java/org/nd4j/jita/constant/ProtectedCudaConstantHandler.java index b2e2a0361..e5be68a75 100644 --- a/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-cuda/src/main/java/org/nd4j/jita/constant/ProtectedCudaConstantHandler.java +++ b/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-cuda/src/main/java/org/nd4j/jita/constant/ProtectedCudaConstantHandler.java @@ -1,18 +1,22 @@ -/******************************************************************************* - * Copyright (c) 2015-2018 Skymind, Inc. - * - * This program and the accompanying materials are made available under the - * terms of the Apache License, Version 2.0 which is available at - * https://www.apache.org/licenses/LICENSE-2.0. 
- * - * Unless required by applicable law or agreed to in writing, software - * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * License for the specific language governing permissions and limitations - * under the License. - * - * SPDX-License-Identifier: Apache-2.0 - ******************************************************************************/ +/* + * ****************************************************************************** + * * + * * + * * This program and the accompanying materials are made available under the + * * terms of the Apache License, Version 2.0 which is available at + * * https://www.apache.org/licenses/LICENSE-2.0. + * * + * * See the NOTICE file distributed with this work for additional + * * information regarding copyright ownership. + * * Unless required by applicable law or agreed to in writing, software + * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT + * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the + * * License for the specific language governing permissions and limitations + * * under the License. 
+ * * + * * SPDX-License-Identifier: Apache-2.0 + * ***************************************************************************** + */ package org.nd4j.jita.constant; diff --git a/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-cuda/src/main/java/org/nd4j/jita/constant/ProtectedCudaShapeInfoProvider.java b/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-cuda/src/main/java/org/nd4j/jita/constant/ProtectedCudaShapeInfoProvider.java index 13b26f363..b232f67d6 100644 --- a/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-cuda/src/main/java/org/nd4j/jita/constant/ProtectedCudaShapeInfoProvider.java +++ b/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-cuda/src/main/java/org/nd4j/jita/constant/ProtectedCudaShapeInfoProvider.java @@ -1,18 +1,22 @@ -/******************************************************************************* - * Copyright (c) 2015-2018 Skymind, Inc. - * - * This program and the accompanying materials are made available under the - * terms of the Apache License, Version 2.0 which is available at - * https://www.apache.org/licenses/LICENSE-2.0. - * - * Unless required by applicable law or agreed to in writing, software - * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * License for the specific language governing permissions and limitations - * under the License. - * - * SPDX-License-Identifier: Apache-2.0 - ******************************************************************************/ +/* + * ****************************************************************************** + * * + * * + * * This program and the accompanying materials are made available under the + * * terms of the Apache License, Version 2.0 which is available at + * * https://www.apache.org/licenses/LICENSE-2.0. + * * + * * See the NOTICE file distributed with this work for additional + * * information regarding copyright ownership. 
+ * * Unless required by applicable law or agreed to in writing, software + * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT + * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the + * * License for the specific language governing permissions and limitations + * * under the License. + * * + * * SPDX-License-Identifier: Apache-2.0 + * ***************************************************************************** + */ package org.nd4j.jita.constant; diff --git a/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-cuda/src/main/java/org/nd4j/jita/flow/FlowController.java b/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-cuda/src/main/java/org/nd4j/jita/flow/FlowController.java index f79ba3385..9abde7cf9 100644 --- a/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-cuda/src/main/java/org/nd4j/jita/flow/FlowController.java +++ b/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-cuda/src/main/java/org/nd4j/jita/flow/FlowController.java @@ -1,18 +1,22 @@ -/******************************************************************************* - * Copyright (c) 2015-2018 Skymind, Inc. - * - * This program and the accompanying materials are made available under the - * terms of the Apache License, Version 2.0 which is available at - * https://www.apache.org/licenses/LICENSE-2.0. - * - * Unless required by applicable law or agreed to in writing, software - * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * License for the specific language governing permissions and limitations - * under the License. 
- * - * SPDX-License-Identifier: Apache-2.0 - ******************************************************************************/ +/* + * ****************************************************************************** + * * + * * + * * This program and the accompanying materials are made available under the + * * terms of the Apache License, Version 2.0 which is available at + * * https://www.apache.org/licenses/LICENSE-2.0. + * * + * * See the NOTICE file distributed with this work for additional + * * information regarding copyright ownership. + * * Unless required by applicable law or agreed to in writing, software + * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT + * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the + * * License for the specific language governing permissions and limitations + * * under the License. + * * + * * SPDX-License-Identifier: Apache-2.0 + * ***************************************************************************** + */ package org.nd4j.jita.flow; diff --git a/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-cuda/src/main/java/org/nd4j/jita/flow/impl/GridFlowController.java b/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-cuda/src/main/java/org/nd4j/jita/flow/impl/GridFlowController.java index df93aa314..7dd2291e5 100644 --- a/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-cuda/src/main/java/org/nd4j/jita/flow/impl/GridFlowController.java +++ b/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-cuda/src/main/java/org/nd4j/jita/flow/impl/GridFlowController.java @@ -1,18 +1,22 @@ -/******************************************************************************* - * Copyright (c) 2015-2018 Skymind, Inc. - * - * This program and the accompanying materials are made available under the - * terms of the Apache License, Version 2.0 which is available at - * https://www.apache.org/licenses/LICENSE-2.0. 
- * - * Unless required by applicable law or agreed to in writing, software - * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * License for the specific language governing permissions and limitations - * under the License. - * - * SPDX-License-Identifier: Apache-2.0 - ******************************************************************************/ +/* + * ****************************************************************************** + * * + * * + * * This program and the accompanying materials are made available under the + * * terms of the Apache License, Version 2.0 which is available at + * * https://www.apache.org/licenses/LICENSE-2.0. + * * + * * See the NOTICE file distributed with this work for additional + * * information regarding copyright ownership. + * * Unless required by applicable law or agreed to in writing, software + * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT + * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the + * * License for the specific language governing permissions and limitations + * * under the License. 
+ * * + * * SPDX-License-Identifier: Apache-2.0 + * ***************************************************************************** + */ package org.nd4j.jita.flow.impl; diff --git a/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-cuda/src/main/java/org/nd4j/jita/flow/impl/SynchronousFlowController.java b/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-cuda/src/main/java/org/nd4j/jita/flow/impl/SynchronousFlowController.java index 030ccad30..a97d836ed 100644 --- a/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-cuda/src/main/java/org/nd4j/jita/flow/impl/SynchronousFlowController.java +++ b/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-cuda/src/main/java/org/nd4j/jita/flow/impl/SynchronousFlowController.java @@ -1,18 +1,22 @@ -/******************************************************************************* - * Copyright (c) 2015-2018 Skymind, Inc. - * - * This program and the accompanying materials are made available under the - * terms of the Apache License, Version 2.0 which is available at - * https://www.apache.org/licenses/LICENSE-2.0. - * - * Unless required by applicable law or agreed to in writing, software - * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * License for the specific language governing permissions and limitations - * under the License. - * - * SPDX-License-Identifier: Apache-2.0 - ******************************************************************************/ +/* + * ****************************************************************************** + * * + * * + * * This program and the accompanying materials are made available under the + * * terms of the Apache License, Version 2.0 which is available at + * * https://www.apache.org/licenses/LICENSE-2.0. + * * + * * See the NOTICE file distributed with this work for additional + * * information regarding copyright ownership. 
+ * * Unless required by applicable law or agreed to in writing, software + * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT + * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the + * * License for the specific language governing permissions and limitations + * * under the License. + * * + * * SPDX-License-Identifier: Apache-2.0 + * ***************************************************************************** + */ package org.nd4j.jita.flow.impl; diff --git a/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-cuda/src/main/java/org/nd4j/jita/handler/MemoryHandler.java b/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-cuda/src/main/java/org/nd4j/jita/handler/MemoryHandler.java index 44d8e2042..2a82f06e5 100644 --- a/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-cuda/src/main/java/org/nd4j/jita/handler/MemoryHandler.java +++ b/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-cuda/src/main/java/org/nd4j/jita/handler/MemoryHandler.java @@ -1,18 +1,22 @@ -/******************************************************************************* - * Copyright (c) 2015-2018 Skymind, Inc. - * - * This program and the accompanying materials are made available under the - * terms of the Apache License, Version 2.0 which is available at - * https://www.apache.org/licenses/LICENSE-2.0. - * - * Unless required by applicable law or agreed to in writing, software - * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * License for the specific language governing permissions and limitations - * under the License. 
- * - * SPDX-License-Identifier: Apache-2.0 - ******************************************************************************/ +/* + * ****************************************************************************** + * * + * * + * * This program and the accompanying materials are made available under the + * * terms of the Apache License, Version 2.0 which is available at + * * https://www.apache.org/licenses/LICENSE-2.0. + * * + * * See the NOTICE file distributed with this work for additional + * * information regarding copyright ownership. + * * Unless required by applicable law or agreed to in writing, software + * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT + * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the + * * License for the specific language governing permissions and limitations + * * under the License. + * * + * * SPDX-License-Identifier: Apache-2.0 + * ***************************************************************************** + */ package org.nd4j.jita.handler; diff --git a/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-cuda/src/main/java/org/nd4j/jita/handler/impl/CudaZeroHandler.java b/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-cuda/src/main/java/org/nd4j/jita/handler/impl/CudaZeroHandler.java index 4056338a9..feb648962 100644 --- a/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-cuda/src/main/java/org/nd4j/jita/handler/impl/CudaZeroHandler.java +++ b/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-cuda/src/main/java/org/nd4j/jita/handler/impl/CudaZeroHandler.java @@ -1,18 +1,22 @@ -/******************************************************************************* - * Copyright (c) 2015-2018 Skymind, Inc. - * - * This program and the accompanying materials are made available under the - * terms of the Apache License, Version 2.0 which is available at - * https://www.apache.org/licenses/LICENSE-2.0. 
- * - * Unless required by applicable law or agreed to in writing, software - * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * License for the specific language governing permissions and limitations - * under the License. - * - * SPDX-License-Identifier: Apache-2.0 - ******************************************************************************/ +/* + * ****************************************************************************** + * * + * * + * * This program and the accompanying materials are made available under the + * * terms of the Apache License, Version 2.0 which is available at + * * https://www.apache.org/licenses/LICENSE-2.0. + * * + * * See the NOTICE file distributed with this work for additional + * * information regarding copyright ownership. + * * Unless required by applicable law or agreed to in writing, software + * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT + * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the + * * License for the specific language governing permissions and limitations + * * under the License. 
+ * * + * * SPDX-License-Identifier: Apache-2.0 + * ***************************************************************************** + */ package org.nd4j.jita.handler.impl; diff --git a/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-cuda/src/main/java/org/nd4j/jita/memory/CudaMemoryManager.java b/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-cuda/src/main/java/org/nd4j/jita/memory/CudaMemoryManager.java index c7e8f800f..f47eb38f9 100644 --- a/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-cuda/src/main/java/org/nd4j/jita/memory/CudaMemoryManager.java +++ b/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-cuda/src/main/java/org/nd4j/jita/memory/CudaMemoryManager.java @@ -1,18 +1,22 @@ -/******************************************************************************* - * Copyright (c) 2015-2018 Skymind, Inc. - * - * This program and the accompanying materials are made available under the - * terms of the Apache License, Version 2.0 which is available at - * https://www.apache.org/licenses/LICENSE-2.0. - * - * Unless required by applicable law or agreed to in writing, software - * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * License for the specific language governing permissions and limitations - * under the License. - * - * SPDX-License-Identifier: Apache-2.0 - ******************************************************************************/ +/* + * ****************************************************************************** + * * + * * + * * This program and the accompanying materials are made available under the + * * terms of the Apache License, Version 2.0 which is available at + * * https://www.apache.org/licenses/LICENSE-2.0. + * * + * * See the NOTICE file distributed with this work for additional + * * information regarding copyright ownership. 
+ * * Unless required by applicable law or agreed to in writing, software + * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT + * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the + * * License for the specific language governing permissions and limitations + * * under the License. + * * + * * SPDX-License-Identifier: Apache-2.0 + * ***************************************************************************** + */ package org.nd4j.jita.memory; diff --git a/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-cuda/src/main/java/org/nd4j/jita/memory/MemoryProvider.java b/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-cuda/src/main/java/org/nd4j/jita/memory/MemoryProvider.java index 923e4d00b..e634a55cd 100644 --- a/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-cuda/src/main/java/org/nd4j/jita/memory/MemoryProvider.java +++ b/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-cuda/src/main/java/org/nd4j/jita/memory/MemoryProvider.java @@ -1,18 +1,22 @@ -/******************************************************************************* - * Copyright (c) 2015-2018 Skymind, Inc. - * - * This program and the accompanying materials are made available under the - * terms of the Apache License, Version 2.0 which is available at - * https://www.apache.org/licenses/LICENSE-2.0. - * - * Unless required by applicable law or agreed to in writing, software - * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * License for the specific language governing permissions and limitations - * under the License. 
- * - * SPDX-License-Identifier: Apache-2.0 - ******************************************************************************/ +/* + * ****************************************************************************** + * * + * * + * * This program and the accompanying materials are made available under the + * * terms of the Apache License, Version 2.0 which is available at + * * https://www.apache.org/licenses/LICENSE-2.0. + * * + * * See the NOTICE file distributed with this work for additional + * * information regarding copyright ownership. + * * Unless required by applicable law or agreed to in writing, software + * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT + * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the + * * License for the specific language governing permissions and limitations + * * under the License. + * * + * * SPDX-License-Identifier: Apache-2.0 + * ***************************************************************************** + */ package org.nd4j.jita.memory; diff --git a/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-cuda/src/main/java/org/nd4j/jita/workspace/CudaWorkspace.java b/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-cuda/src/main/java/org/nd4j/jita/workspace/CudaWorkspace.java index a04dae5d5..f40102b6c 100644 --- a/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-cuda/src/main/java/org/nd4j/jita/workspace/CudaWorkspace.java +++ b/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-cuda/src/main/java/org/nd4j/jita/workspace/CudaWorkspace.java @@ -1,18 +1,22 @@ -/******************************************************************************* - * Copyright (c) 2015-2018 Skymind, Inc. - * - * This program and the accompanying materials are made available under the - * terms of the Apache License, Version 2.0 which is available at - * https://www.apache.org/licenses/LICENSE-2.0. 
- * - * Unless required by applicable law or agreed to in writing, software - * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * License for the specific language governing permissions and limitations - * under the License. - * - * SPDX-License-Identifier: Apache-2.0 - ******************************************************************************/ +/* + * ****************************************************************************** + * * + * * + * * This program and the accompanying materials are made available under the + * * terms of the Apache License, Version 2.0 which is available at + * * https://www.apache.org/licenses/LICENSE-2.0. + * * + * * See the NOTICE file distributed with this work for additional + * * information regarding copyright ownership. + * * Unless required by applicable law or agreed to in writing, software + * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT + * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the + * * License for the specific language governing permissions and limitations + * * under the License. 
+ * * + * * SPDX-License-Identifier: Apache-2.0 + * ***************************************************************************** + */ package org.nd4j.jita.workspace; diff --git a/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-cuda/src/main/java/org/nd4j/jita/workspace/CudaWorkspaceDeallocator.java b/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-cuda/src/main/java/org/nd4j/jita/workspace/CudaWorkspaceDeallocator.java index cb07bb776..4c7b15450 100644 --- a/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-cuda/src/main/java/org/nd4j/jita/workspace/CudaWorkspaceDeallocator.java +++ b/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-cuda/src/main/java/org/nd4j/jita/workspace/CudaWorkspaceDeallocator.java @@ -1,18 +1,22 @@ -/******************************************************************************* - * Copyright (c) 2015-2018 Skymind, Inc. - * - * This program and the accompanying materials are made available under the - * terms of the Apache License, Version 2.0 which is available at - * https://www.apache.org/licenses/LICENSE-2.0. - * - * Unless required by applicable law or agreed to in writing, software - * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * License for the specific language governing permissions and limitations - * under the License. - * - * SPDX-License-Identifier: Apache-2.0 - ******************************************************************************/ +/* + * ****************************************************************************** + * * + * * + * * This program and the accompanying materials are made available under the + * * terms of the Apache License, Version 2.0 which is available at + * * https://www.apache.org/licenses/LICENSE-2.0. + * * + * * See the NOTICE file distributed with this work for additional + * * information regarding copyright ownership. 
+ * * Unless required by applicable law or agreed to in writing, software + * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT + * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the + * * License for the specific language governing permissions and limitations + * * under the License. + * * + * * SPDX-License-Identifier: Apache-2.0 + * ***************************************************************************** + */ package org.nd4j.jita.workspace; diff --git a/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-cuda/src/main/java/org/nd4j/jita/workspace/CudaWorkspaceManager.java b/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-cuda/src/main/java/org/nd4j/jita/workspace/CudaWorkspaceManager.java index b14e26eb5..2e3b4d453 100644 --- a/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-cuda/src/main/java/org/nd4j/jita/workspace/CudaWorkspaceManager.java +++ b/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-cuda/src/main/java/org/nd4j/jita/workspace/CudaWorkspaceManager.java @@ -1,18 +1,22 @@ -/******************************************************************************* - * Copyright (c) 2015-2018 Skymind, Inc. - * - * This program and the accompanying materials are made available under the - * terms of the Apache License, Version 2.0 which is available at - * https://www.apache.org/licenses/LICENSE-2.0. - * - * Unless required by applicable law or agreed to in writing, software - * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * License for the specific language governing permissions and limitations - * under the License. 
- * - * SPDX-License-Identifier: Apache-2.0 - ******************************************************************************/ +/* + * ****************************************************************************** + * * + * * + * * This program and the accompanying materials are made available under the + * * terms of the Apache License, Version 2.0 which is available at + * * https://www.apache.org/licenses/LICENSE-2.0. + * * + * * See the NOTICE file distributed with this work for additional + * * information regarding copyright ownership. + * * Unless required by applicable law or agreed to in writing, software + * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT + * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the + * * License for the specific language governing permissions and limitations + * * under the License. + * * + * * SPDX-License-Identifier: Apache-2.0 + * ***************************************************************************** + */ package org.nd4j.jita.workspace; diff --git a/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-cuda/src/main/java/org/nd4j/linalg/jcublas/CachedShapeInfoProvider.java b/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-cuda/src/main/java/org/nd4j/linalg/jcublas/CachedShapeInfoProvider.java index e46650484..3fb92b838 100644 --- a/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-cuda/src/main/java/org/nd4j/linalg/jcublas/CachedShapeInfoProvider.java +++ b/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-cuda/src/main/java/org/nd4j/linalg/jcublas/CachedShapeInfoProvider.java @@ -1,18 +1,22 @@ -/******************************************************************************* - * Copyright (c) 2015-2018 Skymind, Inc. - * - * This program and the accompanying materials are made available under the - * terms of the Apache License, Version 2.0 which is available at - * https://www.apache.org/licenses/LICENSE-2.0. 
- * - * Unless required by applicable law or agreed to in writing, software - * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * License for the specific language governing permissions and limitations - * under the License. - * - * SPDX-License-Identifier: Apache-2.0 - ******************************************************************************/ +/* + * ****************************************************************************** + * * + * * + * * This program and the accompanying materials are made available under the + * * terms of the Apache License, Version 2.0 which is available at + * * https://www.apache.org/licenses/LICENSE-2.0. + * * + * * See the NOTICE file distributed with this work for additional + * * information regarding copyright ownership. + * * Unless required by applicable law or agreed to in writing, software + * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT + * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the + * * License for the specific language governing permissions and limitations + * * under the License. 
+ * * + * * SPDX-License-Identifier: Apache-2.0 + * ***************************************************************************** + */ package org.nd4j.linalg.jcublas; diff --git a/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-cuda/src/main/java/org/nd4j/linalg/jcublas/CublasPointer.java b/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-cuda/src/main/java/org/nd4j/linalg/jcublas/CublasPointer.java index d7109f5cb..9cd800ba6 100644 --- a/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-cuda/src/main/java/org/nd4j/linalg/jcublas/CublasPointer.java +++ b/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-cuda/src/main/java/org/nd4j/linalg/jcublas/CublasPointer.java @@ -1,18 +1,22 @@ -/******************************************************************************* - * Copyright (c) 2015-2018 Skymind, Inc. - * - * This program and the accompanying materials are made available under the - * terms of the Apache License, Version 2.0 which is available at - * https://www.apache.org/licenses/LICENSE-2.0. - * - * Unless required by applicable law or agreed to in writing, software - * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * License for the specific language governing permissions and limitations - * under the License. - * - * SPDX-License-Identifier: Apache-2.0 - ******************************************************************************/ +/* + * ****************************************************************************** + * * + * * + * * This program and the accompanying materials are made available under the + * * terms of the Apache License, Version 2.0 which is available at + * * https://www.apache.org/licenses/LICENSE-2.0. + * * + * * See the NOTICE file distributed with this work for additional + * * information regarding copyright ownership. 
+ * * Unless required by applicable law or agreed to in writing, software + * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT + * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the + * * License for the specific language governing permissions and limitations + * * under the License. + * * + * * SPDX-License-Identifier: Apache-2.0 + * ***************************************************************************** + */ package org.nd4j.linalg.jcublas; diff --git a/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-cuda/src/main/java/org/nd4j/linalg/jcublas/JCublasBackend.java b/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-cuda/src/main/java/org/nd4j/linalg/jcublas/JCublasBackend.java index 093d50bee..f082f414a 100644 --- a/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-cuda/src/main/java/org/nd4j/linalg/jcublas/JCublasBackend.java +++ b/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-cuda/src/main/java/org/nd4j/linalg/jcublas/JCublasBackend.java @@ -1,18 +1,22 @@ -/******************************************************************************* - * Copyright (c) 2015-2018 Skymind, Inc. - * - * This program and the accompanying materials are made available under the - * terms of the Apache License, Version 2.0 which is available at - * https://www.apache.org/licenses/LICENSE-2.0. - * - * Unless required by applicable law or agreed to in writing, software - * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * License for the specific language governing permissions and limitations - * under the License. 
- * - * SPDX-License-Identifier: Apache-2.0 - ******************************************************************************/ +/* + * ****************************************************************************** + * * + * * + * * This program and the accompanying materials are made available under the + * * terms of the Apache License, Version 2.0 which is available at + * * https://www.apache.org/licenses/LICENSE-2.0. + * * + * * See the NOTICE file distributed with this work for additional + * * information regarding copyright ownership. + * * Unless required by applicable law or agreed to in writing, software + * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT + * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the + * * License for the specific language governing permissions and limitations + * * under the License. + * * + * * SPDX-License-Identifier: Apache-2.0 + * ***************************************************************************** + */ package org.nd4j.linalg.jcublas; diff --git a/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-cuda/src/main/java/org/nd4j/linalg/jcublas/JCublasNDArray.java b/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-cuda/src/main/java/org/nd4j/linalg/jcublas/JCublasNDArray.java index 465778ff5..04dfb10cb 100644 --- a/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-cuda/src/main/java/org/nd4j/linalg/jcublas/JCublasNDArray.java +++ b/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-cuda/src/main/java/org/nd4j/linalg/jcublas/JCublasNDArray.java @@ -1,18 +1,22 @@ -/******************************************************************************* - * Copyright (c) 2015-2018 Skymind, Inc. - * - * This program and the accompanying materials are made available under the - * terms of the Apache License, Version 2.0 which is available at - * https://www.apache.org/licenses/LICENSE-2.0. 
- * - * Unless required by applicable law or agreed to in writing, software - * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * License for the specific language governing permissions and limitations - * under the License. - * - * SPDX-License-Identifier: Apache-2.0 - ******************************************************************************/ +/* + * ****************************************************************************** + * * + * * + * * This program and the accompanying materials are made available under the + * * terms of the Apache License, Version 2.0 which is available at + * * https://www.apache.org/licenses/LICENSE-2.0. + * * + * * See the NOTICE file distributed with this work for additional + * * information regarding copyright ownership. + * * Unless required by applicable law or agreed to in writing, software + * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT + * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the + * * License for the specific language governing permissions and limitations + * * under the License. 
+ * * + * * SPDX-License-Identifier: Apache-2.0 + * ***************************************************************************** + */ package org.nd4j.linalg.jcublas; diff --git a/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-cuda/src/main/java/org/nd4j/linalg/jcublas/JCublasNDArrayFactory.java b/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-cuda/src/main/java/org/nd4j/linalg/jcublas/JCublasNDArrayFactory.java index 3fd839e52..8a1369856 100644 --- a/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-cuda/src/main/java/org/nd4j/linalg/jcublas/JCublasNDArrayFactory.java +++ b/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-cuda/src/main/java/org/nd4j/linalg/jcublas/JCublasNDArrayFactory.java @@ -1,18 +1,22 @@ -/******************************************************************************* - * Copyright (c) 2015-2018 Skymind, Inc. - * - * This program and the accompanying materials are made available under the - * terms of the Apache License, Version 2.0 which is available at - * https://www.apache.org/licenses/LICENSE-2.0. - * - * Unless required by applicable law or agreed to in writing, software - * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * License for the specific language governing permissions and limitations - * under the License. - * - * SPDX-License-Identifier: Apache-2.0 - ******************************************************************************/ +/* + * ****************************************************************************** + * * + * * + * * This program and the accompanying materials are made available under the + * * terms of the Apache License, Version 2.0 which is available at + * * https://www.apache.org/licenses/LICENSE-2.0. + * * + * * See the NOTICE file distributed with this work for additional + * * information regarding copyright ownership. 
+ * * Unless required by applicable law or agreed to in writing, software + * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT + * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the + * * License for the specific language governing permissions and limitations + * * under the License. + * * + * * SPDX-License-Identifier: Apache-2.0 + * ***************************************************************************** + */ package org.nd4j.linalg.jcublas; diff --git a/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-cuda/src/main/java/org/nd4j/linalg/jcublas/JCublasWrapper.java b/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-cuda/src/main/java/org/nd4j/linalg/jcublas/JCublasWrapper.java index 1b79f62dd..768c1f3db 100644 --- a/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-cuda/src/main/java/org/nd4j/linalg/jcublas/JCublasWrapper.java +++ b/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-cuda/src/main/java/org/nd4j/linalg/jcublas/JCublasWrapper.java @@ -1,18 +1,22 @@ -/******************************************************************************* - * Copyright (c) 2015-2018 Skymind, Inc. - * - * This program and the accompanying materials are made available under the - * terms of the Apache License, Version 2.0 which is available at - * https://www.apache.org/licenses/LICENSE-2.0. - * - * Unless required by applicable law or agreed to in writing, software - * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * License for the specific language governing permissions and limitations - * under the License. 
- * - * SPDX-License-Identifier: Apache-2.0 - ******************************************************************************/ +/* + * ****************************************************************************** + * * + * * + * * This program and the accompanying materials are made available under the + * * terms of the Apache License, Version 2.0 which is available at + * * https://www.apache.org/licenses/LICENSE-2.0. + * * + * * See the NOTICE file distributed with this work for additional + * * information regarding copyright ownership. + * * Unless required by applicable law or agreed to in writing, software + * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT + * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the + * * License for the specific language governing permissions and limitations + * * under the License. + * * + * * SPDX-License-Identifier: Apache-2.0 + * ***************************************************************************** + */ package org.nd4j.linalg.jcublas; diff --git a/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-cuda/src/main/java/org/nd4j/linalg/jcublas/blas/CudaBlas.java b/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-cuda/src/main/java/org/nd4j/linalg/jcublas/blas/CudaBlas.java index 624460b50..ca70f9489 100644 --- a/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-cuda/src/main/java/org/nd4j/linalg/jcublas/blas/CudaBlas.java +++ b/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-cuda/src/main/java/org/nd4j/linalg/jcublas/blas/CudaBlas.java @@ -1,18 +1,22 @@ -/******************************************************************************* - * Copyright (c) 2015-2018 Skymind, Inc. - * - * This program and the accompanying materials are made available under the - * terms of the Apache License, Version 2.0 which is available at - * https://www.apache.org/licenses/LICENSE-2.0. 
- * - * Unless required by applicable law or agreed to in writing, software - * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * License for the specific language governing permissions and limitations - * under the License. - * - * SPDX-License-Identifier: Apache-2.0 - ******************************************************************************/ +/* + * ****************************************************************************** + * * + * * + * * This program and the accompanying materials are made available under the + * * terms of the Apache License, Version 2.0 which is available at + * * https://www.apache.org/licenses/LICENSE-2.0. + * * + * * See the NOTICE file distributed with this work for additional + * * information regarding copyright ownership. + * * Unless required by applicable law or agreed to in writing, software + * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT + * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the + * * License for the specific language governing permissions and limitations + * * under the License. 
+ * * + * * SPDX-License-Identifier: Apache-2.0 + * ***************************************************************************** + */ package org.nd4j.linalg.jcublas.blas; diff --git a/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-cuda/src/main/java/org/nd4j/linalg/jcublas/blas/JcublasLapack.java b/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-cuda/src/main/java/org/nd4j/linalg/jcublas/blas/JcublasLapack.java index 3e3191f0d..52dea6dea 100644 --- a/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-cuda/src/main/java/org/nd4j/linalg/jcublas/blas/JcublasLapack.java +++ b/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-cuda/src/main/java/org/nd4j/linalg/jcublas/blas/JcublasLapack.java @@ -1,18 +1,22 @@ -/******************************************************************************* - * Copyright (c) 2015-2018 Skymind, Inc. - * - * This program and the accompanying materials are made available under the - * terms of the Apache License, Version 2.0 which is available at - * https://www.apache.org/licenses/LICENSE-2.0. - * - * Unless required by applicable law or agreed to in writing, software - * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * License for the specific language governing permissions and limitations - * under the License. - * - * SPDX-License-Identifier: Apache-2.0 - ******************************************************************************/ +/* + * ****************************************************************************** + * * + * * + * * This program and the accompanying materials are made available under the + * * terms of the Apache License, Version 2.0 which is available at + * * https://www.apache.org/licenses/LICENSE-2.0. + * * + * * See the NOTICE file distributed with this work for additional + * * information regarding copyright ownership. 
+ * * Unless required by applicable law or agreed to in writing, software + * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT + * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the + * * License for the specific language governing permissions and limitations + * * under the License. + * * + * * SPDX-License-Identifier: Apache-2.0 + * ***************************************************************************** + */ package org.nd4j.linalg.jcublas.blas; diff --git a/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-cuda/src/main/java/org/nd4j/linalg/jcublas/blas/JcublasLevel1.java b/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-cuda/src/main/java/org/nd4j/linalg/jcublas/blas/JcublasLevel1.java index e19031e6c..e1be2f502 100644 --- a/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-cuda/src/main/java/org/nd4j/linalg/jcublas/blas/JcublasLevel1.java +++ b/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-cuda/src/main/java/org/nd4j/linalg/jcublas/blas/JcublasLevel1.java @@ -1,18 +1,22 @@ -/******************************************************************************* - * Copyright (c) 2015-2018 Skymind, Inc. - * - * This program and the accompanying materials are made available under the - * terms of the Apache License, Version 2.0 which is available at - * https://www.apache.org/licenses/LICENSE-2.0. - * - * Unless required by applicable law or agreed to in writing, software - * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * License for the specific language governing permissions and limitations - * under the License. 
- * - * SPDX-License-Identifier: Apache-2.0 - ******************************************************************************/ +/* + * ****************************************************************************** + * * + * * + * * This program and the accompanying materials are made available under the + * * terms of the Apache License, Version 2.0 which is available at + * * https://www.apache.org/licenses/LICENSE-2.0. + * * + * * See the NOTICE file distributed with this work for additional + * * information regarding copyright ownership. + * * Unless required by applicable law or agreed to in writing, software + * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT + * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the + * * License for the specific language governing permissions and limitations + * * under the License. + * * + * * SPDX-License-Identifier: Apache-2.0 + * ***************************************************************************** + */ package org.nd4j.linalg.jcublas.blas; diff --git a/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-cuda/src/main/java/org/nd4j/linalg/jcublas/blas/JcublasLevel2.java b/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-cuda/src/main/java/org/nd4j/linalg/jcublas/blas/JcublasLevel2.java index 811a7ff42..8ba0c77b5 100644 --- a/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-cuda/src/main/java/org/nd4j/linalg/jcublas/blas/JcublasLevel2.java +++ b/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-cuda/src/main/java/org/nd4j/linalg/jcublas/blas/JcublasLevel2.java @@ -1,18 +1,22 @@ -/******************************************************************************* - * Copyright (c) 2015-2018 Skymind, Inc. - * - * This program and the accompanying materials are made available under the - * terms of the Apache License, Version 2.0 which is available at - * https://www.apache.org/licenses/LICENSE-2.0. 
- * - * Unless required by applicable law or agreed to in writing, software - * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * License for the specific language governing permissions and limitations - * under the License. - * - * SPDX-License-Identifier: Apache-2.0 - ******************************************************************************/ +/* + * ****************************************************************************** + * * + * * + * * This program and the accompanying materials are made available under the + * * terms of the Apache License, Version 2.0 which is available at + * * https://www.apache.org/licenses/LICENSE-2.0. + * * + * * See the NOTICE file distributed with this work for additional + * * information regarding copyright ownership. + * * Unless required by applicable law or agreed to in writing, software + * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT + * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the + * * License for the specific language governing permissions and limitations + * * under the License. 
+ * * + * * SPDX-License-Identifier: Apache-2.0 + * ***************************************************************************** + */ package org.nd4j.linalg.jcublas.blas; diff --git a/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-cuda/src/main/java/org/nd4j/linalg/jcublas/blas/JcublasLevel3.java b/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-cuda/src/main/java/org/nd4j/linalg/jcublas/blas/JcublasLevel3.java index ffd21a333..8299dbb6a 100644 --- a/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-cuda/src/main/java/org/nd4j/linalg/jcublas/blas/JcublasLevel3.java +++ b/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-cuda/src/main/java/org/nd4j/linalg/jcublas/blas/JcublasLevel3.java @@ -1,18 +1,22 @@ -/******************************************************************************* - * Copyright (c) 2015-2018 Skymind, Inc. - * - * This program and the accompanying materials are made available under the - * terms of the Apache License, Version 2.0 which is available at - * https://www.apache.org/licenses/LICENSE-2.0. - * - * Unless required by applicable law or agreed to in writing, software - * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * License for the specific language governing permissions and limitations - * under the License. - * - * SPDX-License-Identifier: Apache-2.0 - ******************************************************************************/ +/* + * ****************************************************************************** + * * + * * + * * This program and the accompanying materials are made available under the + * * terms of the Apache License, Version 2.0 which is available at + * * https://www.apache.org/licenses/LICENSE-2.0. + * * + * * See the NOTICE file distributed with this work for additional + * * information regarding copyright ownership. 
+ * * Unless required by applicable law or agreed to in writing, software + * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT + * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the + * * License for the specific language governing permissions and limitations + * * under the License. + * * + * * SPDX-License-Identifier: Apache-2.0 + * ***************************************************************************** + */ package org.nd4j.linalg.jcublas.blas; diff --git a/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-cuda/src/main/java/org/nd4j/linalg/jcublas/buffer/AddressRetriever.java b/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-cuda/src/main/java/org/nd4j/linalg/jcublas/buffer/AddressRetriever.java index dfb20daf6..2c523325b 100644 --- a/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-cuda/src/main/java/org/nd4j/linalg/jcublas/buffer/AddressRetriever.java +++ b/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-cuda/src/main/java/org/nd4j/linalg/jcublas/buffer/AddressRetriever.java @@ -1,18 +1,22 @@ -/******************************************************************************* - * Copyright (c) 2015-2018 Skymind, Inc. - * - * This program and the accompanying materials are made available under the - * terms of the Apache License, Version 2.0 which is available at - * https://www.apache.org/licenses/LICENSE-2.0. - * - * Unless required by applicable law or agreed to in writing, software - * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * License for the specific language governing permissions and limitations - * under the License. 
- * - * SPDX-License-Identifier: Apache-2.0 - ******************************************************************************/ +/* + * ****************************************************************************** + * * + * * + * * This program and the accompanying materials are made available under the + * * terms of the Apache License, Version 2.0 which is available at + * * https://www.apache.org/licenses/LICENSE-2.0. + * * + * * See the NOTICE file distributed with this work for additional + * * information regarding copyright ownership. + * * Unless required by applicable law or agreed to in writing, software + * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT + * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the + * * License for the specific language governing permissions and limitations + * * under the License. + * * + * * SPDX-License-Identifier: Apache-2.0 + * ***************************************************************************** + */ package org.nd4j.linalg.jcublas.buffer; diff --git a/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-cuda/src/main/java/org/nd4j/linalg/jcublas/buffer/BaseCudaDataBuffer.java b/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-cuda/src/main/java/org/nd4j/linalg/jcublas/buffer/BaseCudaDataBuffer.java index 132607cb8..b81a20efb 100644 --- a/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-cuda/src/main/java/org/nd4j/linalg/jcublas/buffer/BaseCudaDataBuffer.java +++ b/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-cuda/src/main/java/org/nd4j/linalg/jcublas/buffer/BaseCudaDataBuffer.java @@ -1,18 +1,22 @@ -/******************************************************************************* - * Copyright (c) 2015-2018 Skymind, Inc. - * - * This program and the accompanying materials are made available under the - * terms of the Apache License, Version 2.0 which is available at - * https://www.apache.org/licenses/LICENSE-2.0. 
- * - * Unless required by applicable law or agreed to in writing, software - * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * License for the specific language governing permissions and limitations - * under the License. - * - * SPDX-License-Identifier: Apache-2.0 - ******************************************************************************/ +/* + * ****************************************************************************** + * * + * * + * * This program and the accompanying materials are made available under the + * * terms of the Apache License, Version 2.0 which is available at + * * https://www.apache.org/licenses/LICENSE-2.0. + * * + * * See the NOTICE file distributed with this work for additional + * * information regarding copyright ownership. + * * Unless required by applicable law or agreed to in writing, software + * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT + * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the + * * License for the specific language governing permissions and limitations + * * under the License. 
+ * * + * * SPDX-License-Identifier: Apache-2.0 + * ***************************************************************************** + */ package org.nd4j.linalg.jcublas.buffer; diff --git a/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-cuda/src/main/java/org/nd4j/linalg/jcublas/buffer/CudaBfloat16DataBuffer.java b/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-cuda/src/main/java/org/nd4j/linalg/jcublas/buffer/CudaBfloat16DataBuffer.java index d055a0824..7eddde576 100644 --- a/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-cuda/src/main/java/org/nd4j/linalg/jcublas/buffer/CudaBfloat16DataBuffer.java +++ b/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-cuda/src/main/java/org/nd4j/linalg/jcublas/buffer/CudaBfloat16DataBuffer.java @@ -1,18 +1,22 @@ -/******************************************************************************* - * Copyright (c) 2015-2018 Skymind, Inc. - * - * This program and the accompanying materials are made available under the - * terms of the Apache License, Version 2.0 which is available at - * https://www.apache.org/licenses/LICENSE-2.0. - * - * Unless required by applicable law or agreed to in writing, software - * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * License for the specific language governing permissions and limitations - * under the License. - * - * SPDX-License-Identifier: Apache-2.0 - ******************************************************************************/ +/* + * ****************************************************************************** + * * + * * + * * This program and the accompanying materials are made available under the + * * terms of the Apache License, Version 2.0 which is available at + * * https://www.apache.org/licenses/LICENSE-2.0. + * * + * * See the NOTICE file distributed with this work for additional + * * information regarding copyright ownership. 
+ * * Unless required by applicable law or agreed to in writing, software + * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT + * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the + * * License for the specific language governing permissions and limitations + * * under the License. + * * + * * SPDX-License-Identifier: Apache-2.0 + * ***************************************************************************** + */ package org.nd4j.linalg.jcublas.buffer; diff --git a/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-cuda/src/main/java/org/nd4j/linalg/jcublas/buffer/CudaBoolDataBuffer.java b/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-cuda/src/main/java/org/nd4j/linalg/jcublas/buffer/CudaBoolDataBuffer.java index 4bbed0004..a1b6e7f0c 100644 --- a/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-cuda/src/main/java/org/nd4j/linalg/jcublas/buffer/CudaBoolDataBuffer.java +++ b/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-cuda/src/main/java/org/nd4j/linalg/jcublas/buffer/CudaBoolDataBuffer.java @@ -1,18 +1,22 @@ -/******************************************************************************* - * Copyright (c) 2015-2018 Skymind, Inc. - * - * This program and the accompanying materials are made available under the - * terms of the Apache License, Version 2.0 which is available at - * https://www.apache.org/licenses/LICENSE-2.0. - * - * Unless required by applicable law or agreed to in writing, software - * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * License for the specific language governing permissions and limitations - * under the License. 
- * - * SPDX-License-Identifier: Apache-2.0 - ******************************************************************************/ +/* + * ****************************************************************************** + * * + * * + * * This program and the accompanying materials are made available under the + * * terms of the Apache License, Version 2.0 which is available at + * * https://www.apache.org/licenses/LICENSE-2.0. + * * + * * See the NOTICE file distributed with this work for additional + * * information regarding copyright ownership. + * * Unless required by applicable law or agreed to in writing, software + * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT + * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the + * * License for the specific language governing permissions and limitations + * * under the License. + * * + * * SPDX-License-Identifier: Apache-2.0 + * ***************************************************************************** + */ package org.nd4j.linalg.jcublas.buffer; diff --git a/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-cuda/src/main/java/org/nd4j/linalg/jcublas/buffer/CudaByteDataBuffer.java b/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-cuda/src/main/java/org/nd4j/linalg/jcublas/buffer/CudaByteDataBuffer.java index 9649f99c2..f6d5ac379 100644 --- a/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-cuda/src/main/java/org/nd4j/linalg/jcublas/buffer/CudaByteDataBuffer.java +++ b/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-cuda/src/main/java/org/nd4j/linalg/jcublas/buffer/CudaByteDataBuffer.java @@ -1,18 +1,22 @@ -/******************************************************************************* - * Copyright (c) 2015-2018 Skymind, Inc. - * - * This program and the accompanying materials are made available under the - * terms of the Apache License, Version 2.0 which is available at - * https://www.apache.org/licenses/LICENSE-2.0. 
- * - * Unless required by applicable law or agreed to in writing, software - * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * License for the specific language governing permissions and limitations - * under the License. - * - * SPDX-License-Identifier: Apache-2.0 - ******************************************************************************/ +/* + * ****************************************************************************** + * * + * * + * * This program and the accompanying materials are made available under the + * * terms of the Apache License, Version 2.0 which is available at + * * https://www.apache.org/licenses/LICENSE-2.0. + * * + * * See the NOTICE file distributed with this work for additional + * * information regarding copyright ownership. + * * Unless required by applicable law or agreed to in writing, software + * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT + * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the + * * License for the specific language governing permissions and limitations + * * under the License. 
+ * * + * * SPDX-License-Identifier: Apache-2.0 + * ***************************************************************************** + */ package org.nd4j.linalg.jcublas.buffer; diff --git a/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-cuda/src/main/java/org/nd4j/linalg/jcublas/buffer/CudaDoubleDataBuffer.java b/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-cuda/src/main/java/org/nd4j/linalg/jcublas/buffer/CudaDoubleDataBuffer.java index c3e00d32b..6a9ef4b49 100644 --- a/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-cuda/src/main/java/org/nd4j/linalg/jcublas/buffer/CudaDoubleDataBuffer.java +++ b/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-cuda/src/main/java/org/nd4j/linalg/jcublas/buffer/CudaDoubleDataBuffer.java @@ -1,18 +1,22 @@ -/******************************************************************************* - * Copyright (c) 2015-2018 Skymind, Inc. - * - * This program and the accompanying materials are made available under the - * terms of the Apache License, Version 2.0 which is available at - * https://www.apache.org/licenses/LICENSE-2.0. - * - * Unless required by applicable law or agreed to in writing, software - * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * License for the specific language governing permissions and limitations - * under the License. - * - * SPDX-License-Identifier: Apache-2.0 - ******************************************************************************/ +/* + * ****************************************************************************** + * * + * * + * * This program and the accompanying materials are made available under the + * * terms of the Apache License, Version 2.0 which is available at + * * https://www.apache.org/licenses/LICENSE-2.0. + * * + * * See the NOTICE file distributed with this work for additional + * * information regarding copyright ownership. 
+ * * Unless required by applicable law or agreed to in writing, software + * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT + * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the + * * License for the specific language governing permissions and limitations + * * under the License. + * * + * * SPDX-License-Identifier: Apache-2.0 + * ***************************************************************************** + */ package org.nd4j.linalg.jcublas.buffer; diff --git a/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-cuda/src/main/java/org/nd4j/linalg/jcublas/buffer/CudaFloatDataBuffer.java b/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-cuda/src/main/java/org/nd4j/linalg/jcublas/buffer/CudaFloatDataBuffer.java index 4e7b8b0f3..720f7d88d 100644 --- a/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-cuda/src/main/java/org/nd4j/linalg/jcublas/buffer/CudaFloatDataBuffer.java +++ b/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-cuda/src/main/java/org/nd4j/linalg/jcublas/buffer/CudaFloatDataBuffer.java @@ -1,18 +1,22 @@ -/******************************************************************************* - * Copyright (c) 2015-2018 Skymind, Inc. - * - * This program and the accompanying materials are made available under the - * terms of the Apache License, Version 2.0 which is available at - * https://www.apache.org/licenses/LICENSE-2.0. - * - * Unless required by applicable law or agreed to in writing, software - * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * License for the specific language governing permissions and limitations - * under the License. 
- * - * SPDX-License-Identifier: Apache-2.0 - ******************************************************************************/ +/* + * ****************************************************************************** + * * + * * + * * This program and the accompanying materials are made available under the + * * terms of the Apache License, Version 2.0 which is available at + * * https://www.apache.org/licenses/LICENSE-2.0. + * * + * * See the NOTICE file distributed with this work for additional + * * information regarding copyright ownership. + * * Unless required by applicable law or agreed to in writing, software + * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT + * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the + * * License for the specific language governing permissions and limitations + * * under the License. + * * + * * SPDX-License-Identifier: Apache-2.0 + * ***************************************************************************** + */ package org.nd4j.linalg.jcublas.buffer; diff --git a/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-cuda/src/main/java/org/nd4j/linalg/jcublas/buffer/CudaHalfDataBuffer.java b/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-cuda/src/main/java/org/nd4j/linalg/jcublas/buffer/CudaHalfDataBuffer.java index d15c6b07c..45521f138 100644 --- a/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-cuda/src/main/java/org/nd4j/linalg/jcublas/buffer/CudaHalfDataBuffer.java +++ b/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-cuda/src/main/java/org/nd4j/linalg/jcublas/buffer/CudaHalfDataBuffer.java @@ -1,18 +1,22 @@ -/******************************************************************************* - * Copyright (c) 2015-2018 Skymind, Inc. - * - * This program and the accompanying materials are made available under the - * terms of the Apache License, Version 2.0 which is available at - * https://www.apache.org/licenses/LICENSE-2.0. 
- * - * Unless required by applicable law or agreed to in writing, software - * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * License for the specific language governing permissions and limitations - * under the License. - * - * SPDX-License-Identifier: Apache-2.0 - ******************************************************************************/ +/* + * ****************************************************************************** + * * + * * + * * This program and the accompanying materials are made available under the + * * terms of the Apache License, Version 2.0 which is available at + * * https://www.apache.org/licenses/LICENSE-2.0. + * * + * * See the NOTICE file distributed with this work for additional + * * information regarding copyright ownership. + * * Unless required by applicable law or agreed to in writing, software + * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT + * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the + * * License for the specific language governing permissions and limitations + * * under the License. 
+ * * + * * SPDX-License-Identifier: Apache-2.0 + * ***************************************************************************** + */ package org.nd4j.linalg.jcublas.buffer; diff --git a/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-cuda/src/main/java/org/nd4j/linalg/jcublas/buffer/CudaIntDataBuffer.java b/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-cuda/src/main/java/org/nd4j/linalg/jcublas/buffer/CudaIntDataBuffer.java index 488f9182e..4408d6de8 100644 --- a/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-cuda/src/main/java/org/nd4j/linalg/jcublas/buffer/CudaIntDataBuffer.java +++ b/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-cuda/src/main/java/org/nd4j/linalg/jcublas/buffer/CudaIntDataBuffer.java @@ -1,18 +1,22 @@ -/******************************************************************************* - * Copyright (c) 2015-2018 Skymind, Inc. - * - * This program and the accompanying materials are made available under the - * terms of the Apache License, Version 2.0 which is available at - * https://www.apache.org/licenses/LICENSE-2.0. - * - * Unless required by applicable law or agreed to in writing, software - * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * License for the specific language governing permissions and limitations - * under the License. - * - * SPDX-License-Identifier: Apache-2.0 - ******************************************************************************/ +/* + * ****************************************************************************** + * * + * * + * * This program and the accompanying materials are made available under the + * * terms of the Apache License, Version 2.0 which is available at + * * https://www.apache.org/licenses/LICENSE-2.0. + * * + * * See the NOTICE file distributed with this work for additional + * * information regarding copyright ownership. 
+ * * Unless required by applicable law or agreed to in writing, software + * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT + * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the + * * License for the specific language governing permissions and limitations + * * under the License. + * * + * * SPDX-License-Identifier: Apache-2.0 + * ***************************************************************************** + */ package org.nd4j.linalg.jcublas.buffer; diff --git a/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-cuda/src/main/java/org/nd4j/linalg/jcublas/buffer/CudaLongDataBuffer.java b/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-cuda/src/main/java/org/nd4j/linalg/jcublas/buffer/CudaLongDataBuffer.java index 75156e694..d433e21c8 100644 --- a/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-cuda/src/main/java/org/nd4j/linalg/jcublas/buffer/CudaLongDataBuffer.java +++ b/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-cuda/src/main/java/org/nd4j/linalg/jcublas/buffer/CudaLongDataBuffer.java @@ -1,18 +1,22 @@ -/******************************************************************************* - * Copyright (c) 2015-2018 Skymind, Inc. - * - * This program and the accompanying materials are made available under the - * terms of the Apache License, Version 2.0 which is available at - * https://www.apache.org/licenses/LICENSE-2.0. - * - * Unless required by applicable law or agreed to in writing, software - * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * License for the specific language governing permissions and limitations - * under the License. 
- * - * SPDX-License-Identifier: Apache-2.0 - ******************************************************************************/ +/* + * ****************************************************************************** + * * + * * + * * This program and the accompanying materials are made available under the + * * terms of the Apache License, Version 2.0 which is available at + * * https://www.apache.org/licenses/LICENSE-2.0. + * * + * * See the NOTICE file distributed with this work for additional + * * information regarding copyright ownership. + * * Unless required by applicable law or agreed to in writing, software + * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT + * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the + * * License for the specific language governing permissions and limitations + * * under the License. + * * + * * SPDX-License-Identifier: Apache-2.0 + * ***************************************************************************** + */ package org.nd4j.linalg.jcublas.buffer; diff --git a/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-cuda/src/main/java/org/nd4j/linalg/jcublas/buffer/CudaShortDataBuffer.java b/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-cuda/src/main/java/org/nd4j/linalg/jcublas/buffer/CudaShortDataBuffer.java index 3a54513a7..3bba37f49 100644 --- a/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-cuda/src/main/java/org/nd4j/linalg/jcublas/buffer/CudaShortDataBuffer.java +++ b/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-cuda/src/main/java/org/nd4j/linalg/jcublas/buffer/CudaShortDataBuffer.java @@ -1,18 +1,22 @@ -/******************************************************************************* - * Copyright (c) 2015-2018 Skymind, Inc. - * - * This program and the accompanying materials are made available under the - * terms of the Apache License, Version 2.0 which is available at - * https://www.apache.org/licenses/LICENSE-2.0. 
- * - * Unless required by applicable law or agreed to in writing, software - * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * License for the specific language governing permissions and limitations - * under the License. - * - * SPDX-License-Identifier: Apache-2.0 - ******************************************************************************/ +/* + * ****************************************************************************** + * * + * * + * * This program and the accompanying materials are made available under the + * * terms of the Apache License, Version 2.0 which is available at + * * https://www.apache.org/licenses/LICENSE-2.0. + * * + * * See the NOTICE file distributed with this work for additional + * * information regarding copyright ownership. + * * Unless required by applicable law or agreed to in writing, software + * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT + * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the + * * License for the specific language governing permissions and limitations + * * under the License. 
+ * * + * * SPDX-License-Identifier: Apache-2.0 + * ***************************************************************************** + */ package org.nd4j.linalg.jcublas.buffer; diff --git a/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-cuda/src/main/java/org/nd4j/linalg/jcublas/buffer/CudaUByteDataBuffer.java b/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-cuda/src/main/java/org/nd4j/linalg/jcublas/buffer/CudaUByteDataBuffer.java index 54ff947ff..a5b6be52a 100644 --- a/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-cuda/src/main/java/org/nd4j/linalg/jcublas/buffer/CudaUByteDataBuffer.java +++ b/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-cuda/src/main/java/org/nd4j/linalg/jcublas/buffer/CudaUByteDataBuffer.java @@ -1,18 +1,22 @@ -/******************************************************************************* - * Copyright (c) 2015-2018 Skymind, Inc. - * - * This program and the accompanying materials are made available under the - * terms of the Apache License, Version 2.0 which is available at - * https://www.apache.org/licenses/LICENSE-2.0. - * - * Unless required by applicable law or agreed to in writing, software - * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * License for the specific language governing permissions and limitations - * under the License. - * - * SPDX-License-Identifier: Apache-2.0 - ******************************************************************************/ +/* + * ****************************************************************************** + * * + * * + * * This program and the accompanying materials are made available under the + * * terms of the Apache License, Version 2.0 which is available at + * * https://www.apache.org/licenses/LICENSE-2.0. + * * + * * See the NOTICE file distributed with this work for additional + * * information regarding copyright ownership. 
+ * * Unless required by applicable law or agreed to in writing, software + * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT + * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the + * * License for the specific language governing permissions and limitations + * * under the License. + * * + * * SPDX-License-Identifier: Apache-2.0 + * ***************************************************************************** + */ package org.nd4j.linalg.jcublas.buffer; diff --git a/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-cuda/src/main/java/org/nd4j/linalg/jcublas/buffer/CudaUInt16DataBuffer.java b/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-cuda/src/main/java/org/nd4j/linalg/jcublas/buffer/CudaUInt16DataBuffer.java index fc51b20a2..df877815d 100644 --- a/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-cuda/src/main/java/org/nd4j/linalg/jcublas/buffer/CudaUInt16DataBuffer.java +++ b/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-cuda/src/main/java/org/nd4j/linalg/jcublas/buffer/CudaUInt16DataBuffer.java @@ -1,18 +1,22 @@ -/******************************************************************************* - * Copyright (c) 2015-2018 Skymind, Inc. - * - * This program and the accompanying materials are made available under the - * terms of the Apache License, Version 2.0 which is available at - * https://www.apache.org/licenses/LICENSE-2.0. - * - * Unless required by applicable law or agreed to in writing, software - * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * License for the specific language governing permissions and limitations - * under the License. 
- * - * SPDX-License-Identifier: Apache-2.0 - ******************************************************************************/ +/* + * ****************************************************************************** + * * + * * + * * This program and the accompanying materials are made available under the + * * terms of the Apache License, Version 2.0 which is available at + * * https://www.apache.org/licenses/LICENSE-2.0. + * * + * * See the NOTICE file distributed with this work for additional + * * information regarding copyright ownership. + * * Unless required by applicable law or agreed to in writing, software + * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT + * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the + * * License for the specific language governing permissions and limitations + * * under the License. + * * + * * SPDX-License-Identifier: Apache-2.0 + * ***************************************************************************** + */ package org.nd4j.linalg.jcublas.buffer; diff --git a/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-cuda/src/main/java/org/nd4j/linalg/jcublas/buffer/CudaUInt32DataBuffer.java b/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-cuda/src/main/java/org/nd4j/linalg/jcublas/buffer/CudaUInt32DataBuffer.java index b7aeeed48..092bd37fd 100644 --- a/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-cuda/src/main/java/org/nd4j/linalg/jcublas/buffer/CudaUInt32DataBuffer.java +++ b/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-cuda/src/main/java/org/nd4j/linalg/jcublas/buffer/CudaUInt32DataBuffer.java @@ -1,18 +1,22 @@ -/******************************************************************************* - * Copyright (c) 2015-2018 Skymind, Inc. - * - * This program and the accompanying materials are made available under the - * terms of the Apache License, Version 2.0 which is available at - * https://www.apache.org/licenses/LICENSE-2.0. 
- * - * Unless required by applicable law or agreed to in writing, software - * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * License for the specific language governing permissions and limitations - * under the License. - * - * SPDX-License-Identifier: Apache-2.0 - ******************************************************************************/ +/* + * ****************************************************************************** + * * + * * + * * This program and the accompanying materials are made available under the + * * terms of the Apache License, Version 2.0 which is available at + * * https://www.apache.org/licenses/LICENSE-2.0. + * * + * * See the NOTICE file distributed with this work for additional + * * information regarding copyright ownership. + * * Unless required by applicable law or agreed to in writing, software + * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT + * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the + * * License for the specific language governing permissions and limitations + * * under the License. 
+ * * + * * SPDX-License-Identifier: Apache-2.0 + * ***************************************************************************** + */ package org.nd4j.linalg.jcublas.buffer; diff --git a/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-cuda/src/main/java/org/nd4j/linalg/jcublas/buffer/CudaUInt64DataBuffer.java b/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-cuda/src/main/java/org/nd4j/linalg/jcublas/buffer/CudaUInt64DataBuffer.java index 242d1f482..454109b2f 100644 --- a/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-cuda/src/main/java/org/nd4j/linalg/jcublas/buffer/CudaUInt64DataBuffer.java +++ b/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-cuda/src/main/java/org/nd4j/linalg/jcublas/buffer/CudaUInt64DataBuffer.java @@ -1,18 +1,22 @@ -/******************************************************************************* - * Copyright (c) 2015-2018 Skymind, Inc. - * - * This program and the accompanying materials are made available under the - * terms of the Apache License, Version 2.0 which is available at - * https://www.apache.org/licenses/LICENSE-2.0. - * - * Unless required by applicable law or agreed to in writing, software - * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * License for the specific language governing permissions and limitations - * under the License. - * - * SPDX-License-Identifier: Apache-2.0 - ******************************************************************************/ +/* + * ****************************************************************************** + * * + * * + * * This program and the accompanying materials are made available under the + * * terms of the Apache License, Version 2.0 which is available at + * * https://www.apache.org/licenses/LICENSE-2.0. + * * + * * See the NOTICE file distributed with this work for additional + * * information regarding copyright ownership. 
+ * * Unless required by applicable law or agreed to in writing, software + * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT + * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the + * * License for the specific language governing permissions and limitations + * * under the License. + * * + * * SPDX-License-Identifier: Apache-2.0 + * ***************************************************************************** + */ package org.nd4j.linalg.jcublas.buffer; diff --git a/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-cuda/src/main/java/org/nd4j/linalg/jcublas/buffer/CudaUtf8Buffer.java b/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-cuda/src/main/java/org/nd4j/linalg/jcublas/buffer/CudaUtf8Buffer.java index 4b2e160b7..096646312 100644 --- a/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-cuda/src/main/java/org/nd4j/linalg/jcublas/buffer/CudaUtf8Buffer.java +++ b/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-cuda/src/main/java/org/nd4j/linalg/jcublas/buffer/CudaUtf8Buffer.java @@ -1,18 +1,22 @@ -/******************************************************************************* - * Copyright (c) 2015-2018 Skymind, Inc. - * - * This program and the accompanying materials are made available under the - * terms of the Apache License, Version 2.0 which is available at - * https://www.apache.org/licenses/LICENSE-2.0. - * - * Unless required by applicable law or agreed to in writing, software - * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * License for the specific language governing permissions and limitations - * under the License. 
- * - * SPDX-License-Identifier: Apache-2.0 - ******************************************************************************/ +/* + * ****************************************************************************** + * * + * * + * * This program and the accompanying materials are made available under the + * * terms of the Apache License, Version 2.0 which is available at + * * https://www.apache.org/licenses/LICENSE-2.0. + * * + * * See the NOTICE file distributed with this work for additional + * * information regarding copyright ownership. + * * Unless required by applicable law or agreed to in writing, software + * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT + * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the + * * License for the specific language governing permissions and limitations + * * under the License. + * * + * * SPDX-License-Identifier: Apache-2.0 + * ***************************************************************************** + */ package org.nd4j.linalg.jcublas.buffer; diff --git a/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-cuda/src/main/java/org/nd4j/linalg/jcublas/buffer/DevicePointerInfo.java b/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-cuda/src/main/java/org/nd4j/linalg/jcublas/buffer/DevicePointerInfo.java index 77fd338c7..9c3eef8e6 100644 --- a/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-cuda/src/main/java/org/nd4j/linalg/jcublas/buffer/DevicePointerInfo.java +++ b/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-cuda/src/main/java/org/nd4j/linalg/jcublas/buffer/DevicePointerInfo.java @@ -1,18 +1,22 @@ -/******************************************************************************* - * Copyright (c) 2015-2018 Skymind, Inc. - * - * This program and the accompanying materials are made available under the - * terms of the Apache License, Version 2.0 which is available at - * https://www.apache.org/licenses/LICENSE-2.0. 
- * - * Unless required by applicable law or agreed to in writing, software - * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * License for the specific language governing permissions and limitations - * under the License. - * - * SPDX-License-Identifier: Apache-2.0 - ******************************************************************************/ +/* + * ****************************************************************************** + * * + * * + * * This program and the accompanying materials are made available under the + * * terms of the Apache License, Version 2.0 which is available at + * * https://www.apache.org/licenses/LICENSE-2.0. + * * + * * See the NOTICE file distributed with this work for additional + * * information regarding copyright ownership. + * * Unless required by applicable law or agreed to in writing, software + * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT + * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the + * * License for the specific language governing permissions and limitations + * * under the License. 
+ * * + * * SPDX-License-Identifier: Apache-2.0 + * ***************************************************************************** + */ package org.nd4j.linalg.jcublas.buffer; diff --git a/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-cuda/src/main/java/org/nd4j/linalg/jcublas/buffer/JCudaBuffer.java b/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-cuda/src/main/java/org/nd4j/linalg/jcublas/buffer/JCudaBuffer.java index 9908ac0af..af66d3dbb 100644 --- a/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-cuda/src/main/java/org/nd4j/linalg/jcublas/buffer/JCudaBuffer.java +++ b/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-cuda/src/main/java/org/nd4j/linalg/jcublas/buffer/JCudaBuffer.java @@ -1,18 +1,22 @@ -/******************************************************************************* - * Copyright (c) 2015-2018 Skymind, Inc. - * - * This program and the accompanying materials are made available under the - * terms of the Apache License, Version 2.0 which is available at - * https://www.apache.org/licenses/LICENSE-2.0. - * - * Unless required by applicable law or agreed to in writing, software - * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * License for the specific language governing permissions and limitations - * under the License. - * - * SPDX-License-Identifier: Apache-2.0 - ******************************************************************************/ +/* + * ****************************************************************************** + * * + * * + * * This program and the accompanying materials are made available under the + * * terms of the Apache License, Version 2.0 which is available at + * * https://www.apache.org/licenses/LICENSE-2.0. + * * + * * See the NOTICE file distributed with this work for additional + * * information regarding copyright ownership. 
+ * * Unless required by applicable law or agreed to in writing, software + * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT + * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the + * * License for the specific language governing permissions and limitations + * * under the License. + * * + * * SPDX-License-Identifier: Apache-2.0 + * ***************************************************************************** + */ package org.nd4j.linalg.jcublas.buffer; diff --git a/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-cuda/src/main/java/org/nd4j/linalg/jcublas/buffer/factory/CudaDataBufferFactory.java b/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-cuda/src/main/java/org/nd4j/linalg/jcublas/buffer/factory/CudaDataBufferFactory.java index b30fa4652..9a0f82d81 100644 --- a/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-cuda/src/main/java/org/nd4j/linalg/jcublas/buffer/factory/CudaDataBufferFactory.java +++ b/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-cuda/src/main/java/org/nd4j/linalg/jcublas/buffer/factory/CudaDataBufferFactory.java @@ -1,18 +1,22 @@ -/******************************************************************************* - * Copyright (c) 2015-2018 Skymind, Inc. - * - * This program and the accompanying materials are made available under the - * terms of the Apache License, Version 2.0 which is available at - * https://www.apache.org/licenses/LICENSE-2.0. - * - * Unless required by applicable law or agreed to in writing, software - * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * License for the specific language governing permissions and limitations - * under the License. 
- * - * SPDX-License-Identifier: Apache-2.0 - ******************************************************************************/ +/* + * ****************************************************************************** + * * + * * + * * This program and the accompanying materials are made available under the + * * terms of the Apache License, Version 2.0 which is available at + * * https://www.apache.org/licenses/LICENSE-2.0. + * * + * * See the NOTICE file distributed with this work for additional + * * information regarding copyright ownership. + * * Unless required by applicable law or agreed to in writing, software + * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT + * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the + * * License for the specific language governing permissions and limitations + * * under the License. + * * + * * SPDX-License-Identifier: Apache-2.0 + * ***************************************************************************** + */ package org.nd4j.linalg.jcublas.buffer.factory; diff --git a/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-cuda/src/main/java/org/nd4j/linalg/jcublas/context/ContextHolder.java b/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-cuda/src/main/java/org/nd4j/linalg/jcublas/context/ContextHolder.java index cab875700..7ecbaceb9 100644 --- a/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-cuda/src/main/java/org/nd4j/linalg/jcublas/context/ContextHolder.java +++ b/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-cuda/src/main/java/org/nd4j/linalg/jcublas/context/ContextHolder.java @@ -1,18 +1,22 @@ -/******************************************************************************* - * Copyright (c) 2015-2018 Skymind, Inc. - * - * This program and the accompanying materials are made available under the - * terms of the Apache License, Version 2.0 which is available at - * https://www.apache.org/licenses/LICENSE-2.0. 
- * - * Unless required by applicable law or agreed to in writing, software - * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * License for the specific language governing permissions and limitations - * under the License. - * - * SPDX-License-Identifier: Apache-2.0 - ******************************************************************************/ +/* + * ****************************************************************************** + * * + * * + * * This program and the accompanying materials are made available under the + * * terms of the Apache License, Version 2.0 which is available at + * * https://www.apache.org/licenses/LICENSE-2.0. + * * + * * See the NOTICE file distributed with this work for additional + * * information regarding copyright ownership. + * * Unless required by applicable law or agreed to in writing, software + * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT + * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the + * * License for the specific language governing permissions and limitations + * * under the License. 
+ * * + * * SPDX-License-Identifier: Apache-2.0 + * ***************************************************************************** + */ package org.nd4j.linalg.jcublas.context; diff --git a/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-cuda/src/main/java/org/nd4j/linalg/jcublas/context/CudaContext.java b/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-cuda/src/main/java/org/nd4j/linalg/jcublas/context/CudaContext.java index 826bb0797..00cdcfe15 100644 --- a/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-cuda/src/main/java/org/nd4j/linalg/jcublas/context/CudaContext.java +++ b/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-cuda/src/main/java/org/nd4j/linalg/jcublas/context/CudaContext.java @@ -1,18 +1,22 @@ -/******************************************************************************* - * Copyright (c) 2015-2018 Skymind, Inc. - * - * This program and the accompanying materials are made available under the - * terms of the Apache License, Version 2.0 which is available at - * https://www.apache.org/licenses/LICENSE-2.0. - * - * Unless required by applicable law or agreed to in writing, software - * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * License for the specific language governing permissions and limitations - * under the License. - * - * SPDX-License-Identifier: Apache-2.0 - ******************************************************************************/ +/* + * ****************************************************************************** + * * + * * + * * This program and the accompanying materials are made available under the + * * terms of the Apache License, Version 2.0 which is available at + * * https://www.apache.org/licenses/LICENSE-2.0. + * * + * * See the NOTICE file distributed with this work for additional + * * information regarding copyright ownership. 
+ * * Unless required by applicable law or agreed to in writing, software + * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT + * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the + * * License for the specific language governing permissions and limitations + * * under the License. + * * + * * SPDX-License-Identifier: Apache-2.0 + * ***************************************************************************** + */ package org.nd4j.linalg.jcublas.context; diff --git a/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-cuda/src/main/java/org/nd4j/linalg/jcublas/ops/executioner/CudaExecutioner.java b/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-cuda/src/main/java/org/nd4j/linalg/jcublas/ops/executioner/CudaExecutioner.java index 1a2c019de..58b6fcb2b 100644 --- a/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-cuda/src/main/java/org/nd4j/linalg/jcublas/ops/executioner/CudaExecutioner.java +++ b/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-cuda/src/main/java/org/nd4j/linalg/jcublas/ops/executioner/CudaExecutioner.java @@ -1,18 +1,22 @@ -/******************************************************************************* - * Copyright (c) 2015-2018 Skymind, Inc. - * - * This program and the accompanying materials are made available under the - * terms of the Apache License, Version 2.0 which is available at - * https://www.apache.org/licenses/LICENSE-2.0. - * - * Unless required by applicable law or agreed to in writing, software - * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * License for the specific language governing permissions and limitations - * under the License. 
- * - * SPDX-License-Identifier: Apache-2.0 - ******************************************************************************/ +/* + * ****************************************************************************** + * * + * * + * * This program and the accompanying materials are made available under the + * * terms of the Apache License, Version 2.0 which is available at + * * https://www.apache.org/licenses/LICENSE-2.0. + * * + * * See the NOTICE file distributed with this work for additional + * * information regarding copyright ownership. + * * Unless required by applicable law or agreed to in writing, software + * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT + * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the + * * License for the specific language governing permissions and limitations + * * under the License. + * * + * * SPDX-License-Identifier: Apache-2.0 + * ***************************************************************************** + */ package org.nd4j.linalg.jcublas.ops.executioner; diff --git a/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-cuda/src/main/java/org/nd4j/linalg/jcublas/ops/executioner/CudaGridExecutioner.java b/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-cuda/src/main/java/org/nd4j/linalg/jcublas/ops/executioner/CudaGridExecutioner.java index 77d264519..6b5f9b175 100644 --- a/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-cuda/src/main/java/org/nd4j/linalg/jcublas/ops/executioner/CudaGridExecutioner.java +++ b/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-cuda/src/main/java/org/nd4j/linalg/jcublas/ops/executioner/CudaGridExecutioner.java @@ -1,18 +1,22 @@ -/******************************************************************************* - * Copyright (c) 2015-2018 Skymind, Inc. - * - * This program and the accompanying materials are made available under the - * terms of the Apache License, Version 2.0 which is available at - * https://www.apache.org/licenses/LICENSE-2.0. 
- * - * Unless required by applicable law or agreed to in writing, software - * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * License for the specific language governing permissions and limitations - * under the License. - * - * SPDX-License-Identifier: Apache-2.0 - ******************************************************************************/ +/* + * ****************************************************************************** + * * + * * + * * This program and the accompanying materials are made available under the + * * terms of the Apache License, Version 2.0 which is available at + * * https://www.apache.org/licenses/LICENSE-2.0. + * * + * * See the NOTICE file distributed with this work for additional + * * information regarding copyright ownership. + * * Unless required by applicable law or agreed to in writing, software + * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT + * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the + * * License for the specific language governing permissions and limitations + * * under the License. 
+ * * + * * SPDX-License-Identifier: Apache-2.0 + * ***************************************************************************** + */ package org.nd4j.linalg.jcublas.ops.executioner; diff --git a/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-cuda/src/main/java/org/nd4j/linalg/jcublas/ops/executioner/CudaOpContext.java b/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-cuda/src/main/java/org/nd4j/linalg/jcublas/ops/executioner/CudaOpContext.java index 23e96ee64..98bd1fb60 100644 --- a/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-cuda/src/main/java/org/nd4j/linalg/jcublas/ops/executioner/CudaOpContext.java +++ b/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-cuda/src/main/java/org/nd4j/linalg/jcublas/ops/executioner/CudaOpContext.java @@ -1,18 +1,22 @@ -/******************************************************************************* - * Copyright (c) 2015-2018 Skymind, Inc. - * - * This program and the accompanying materials are made available under the - * terms of the Apache License, Version 2.0 which is available at - * https://www.apache.org/licenses/LICENSE-2.0. - * - * Unless required by applicable law or agreed to in writing, software - * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * License for the specific language governing permissions and limitations - * under the License. - * - * SPDX-License-Identifier: Apache-2.0 - ******************************************************************************/ +/* + * ****************************************************************************** + * * + * * + * * This program and the accompanying materials are made available under the + * * terms of the Apache License, Version 2.0 which is available at + * * https://www.apache.org/licenses/LICENSE-2.0. + * * + * * See the NOTICE file distributed with this work for additional + * * information regarding copyright ownership. 
+ * * Unless required by applicable law or agreed to in writing, software + * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT + * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the + * * License for the specific language governing permissions and limitations + * * under the License. + * * + * * SPDX-License-Identifier: Apache-2.0 + * ***************************************************************************** + */ package org.nd4j.linalg.jcublas.ops.executioner; diff --git a/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-cuda/src/main/java/org/nd4j/linalg/jcublas/ops/executioner/aggregates/AggregateDescriptor.java b/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-cuda/src/main/java/org/nd4j/linalg/jcublas/ops/executioner/aggregates/AggregateDescriptor.java index 9969b5a44..99d6bb38e 100644 --- a/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-cuda/src/main/java/org/nd4j/linalg/jcublas/ops/executioner/aggregates/AggregateDescriptor.java +++ b/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-cuda/src/main/java/org/nd4j/linalg/jcublas/ops/executioner/aggregates/AggregateDescriptor.java @@ -1,18 +1,22 @@ -/******************************************************************************* - * Copyright (c) 2015-2018 Skymind, Inc. - * - * This program and the accompanying materials are made available under the - * terms of the Apache License, Version 2.0 which is available at - * https://www.apache.org/licenses/LICENSE-2.0. - * - * Unless required by applicable law or agreed to in writing, software - * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * License for the specific language governing permissions and limitations - * under the License. 
- * - * SPDX-License-Identifier: Apache-2.0 - ******************************************************************************/ +/* + * ****************************************************************************** + * * + * * + * * This program and the accompanying materials are made available under the + * * terms of the Apache License, Version 2.0 which is available at + * * https://www.apache.org/licenses/LICENSE-2.0. + * * + * * See the NOTICE file distributed with this work for additional + * * information regarding copyright ownership. + * * Unless required by applicable law or agreed to in writing, software + * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT + * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the + * * License for the specific language governing permissions and limitations + * * under the License. + * * + * * SPDX-License-Identifier: Apache-2.0 + * ***************************************************************************** + */ package org.nd4j.linalg.jcublas.ops.executioner.aggregates; diff --git a/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-cuda/src/main/java/org/nd4j/linalg/jcublas/rng/CudaNativeRandom.java b/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-cuda/src/main/java/org/nd4j/linalg/jcublas/rng/CudaNativeRandom.java index e5067c9c9..0401c35bd 100644 --- a/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-cuda/src/main/java/org/nd4j/linalg/jcublas/rng/CudaNativeRandom.java +++ b/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-cuda/src/main/java/org/nd4j/linalg/jcublas/rng/CudaNativeRandom.java @@ -1,18 +1,22 @@ -/******************************************************************************* - * Copyright (c) 2015-2018 Skymind, Inc. - * - * This program and the accompanying materials are made available under the - * terms of the Apache License, Version 2.0 which is available at - * https://www.apache.org/licenses/LICENSE-2.0. 
- * - * Unless required by applicable law or agreed to in writing, software - * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * License for the specific language governing permissions and limitations - * under the License. - * - * SPDX-License-Identifier: Apache-2.0 - ******************************************************************************/ +/* + * ****************************************************************************** + * * + * * + * * This program and the accompanying materials are made available under the + * * terms of the Apache License, Version 2.0 which is available at + * * https://www.apache.org/licenses/LICENSE-2.0. + * * + * * See the NOTICE file distributed with this work for additional + * * information regarding copyright ownership. + * * Unless required by applicable law or agreed to in writing, software + * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT + * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the + * * License for the specific language governing permissions and limitations + * * under the License. 
+ * * + * * SPDX-License-Identifier: Apache-2.0 + * ***************************************************************************** + */ package org.nd4j.linalg.jcublas.rng; diff --git a/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-cuda/src/main/java/org/nd4j/linalg/jcublas/util/CudaArgs.java b/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-cuda/src/main/java/org/nd4j/linalg/jcublas/util/CudaArgs.java index 1922d9ced..a520c842e 100644 --- a/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-cuda/src/main/java/org/nd4j/linalg/jcublas/util/CudaArgs.java +++ b/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-cuda/src/main/java/org/nd4j/linalg/jcublas/util/CudaArgs.java @@ -1,18 +1,22 @@ -/******************************************************************************* - * Copyright (c) 2015-2018 Skymind, Inc. - * - * This program and the accompanying materials are made available under the - * terms of the Apache License, Version 2.0 which is available at - * https://www.apache.org/licenses/LICENSE-2.0. - * - * Unless required by applicable law or agreed to in writing, software - * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * License for the specific language governing permissions and limitations - * under the License. - * - * SPDX-License-Identifier: Apache-2.0 - ******************************************************************************/ +/* + * ****************************************************************************** + * * + * * + * * This program and the accompanying materials are made available under the + * * terms of the Apache License, Version 2.0 which is available at + * * https://www.apache.org/licenses/LICENSE-2.0. + * * + * * See the NOTICE file distributed with this work for additional + * * information regarding copyright ownership. 
+ * * Unless required by applicable law or agreed to in writing, software + * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT + * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the + * * License for the specific language governing permissions and limitations + * * under the License. + * * + * * SPDX-License-Identifier: Apache-2.0 + * ***************************************************************************** + */ package org.nd4j.linalg.jcublas.util; diff --git a/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-cuda/src/main/java/org/nd4j/linalg/jcublas/util/FFTUtils.java b/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-cuda/src/main/java/org/nd4j/linalg/jcublas/util/FFTUtils.java index 0f26bf948..b991dc42b 100644 --- a/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-cuda/src/main/java/org/nd4j/linalg/jcublas/util/FFTUtils.java +++ b/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-cuda/src/main/java/org/nd4j/linalg/jcublas/util/FFTUtils.java @@ -1,18 +1,22 @@ -/******************************************************************************* - * Copyright (c) 2015-2018 Skymind, Inc. - * - * This program and the accompanying materials are made available under the - * terms of the Apache License, Version 2.0 which is available at - * https://www.apache.org/licenses/LICENSE-2.0. - * - * Unless required by applicable law or agreed to in writing, software - * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * License for the specific language governing permissions and limitations - * under the License. 
- * - * SPDX-License-Identifier: Apache-2.0 - ******************************************************************************/ +/* + * ****************************************************************************** + * * + * * + * * This program and the accompanying materials are made available under the + * * terms of the Apache License, Version 2.0 which is available at + * * https://www.apache.org/licenses/LICENSE-2.0. + * * + * * See the NOTICE file distributed with this work for additional + * * information regarding copyright ownership. + * * Unless required by applicable law or agreed to in writing, software + * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT + * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the + * * License for the specific language governing permissions and limitations + * * under the License. + * * + * * SPDX-License-Identifier: Apache-2.0 + * ***************************************************************************** + */ package org.nd4j.linalg.jcublas.util; diff --git a/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-cuda/src/main/java/org/nd4j/linalg/jcublas/util/OpUtil.java b/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-cuda/src/main/java/org/nd4j/linalg/jcublas/util/OpUtil.java index f1b020116..ca118e262 100644 --- a/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-cuda/src/main/java/org/nd4j/linalg/jcublas/util/OpUtil.java +++ b/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-cuda/src/main/java/org/nd4j/linalg/jcublas/util/OpUtil.java @@ -1,18 +1,22 @@ -/******************************************************************************* - * Copyright (c) 2015-2018 Skymind, Inc. - * - * This program and the accompanying materials are made available under the - * terms of the Apache License, Version 2.0 which is available at - * https://www.apache.org/licenses/LICENSE-2.0. 
- * - * Unless required by applicable law or agreed to in writing, software - * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * License for the specific language governing permissions and limitations - * under the License. - * - * SPDX-License-Identifier: Apache-2.0 - ******************************************************************************/ +/* + * ****************************************************************************** + * * + * * + * * This program and the accompanying materials are made available under the + * * terms of the Apache License, Version 2.0 which is available at + * * https://www.apache.org/licenses/LICENSE-2.0. + * * + * * See the NOTICE file distributed with this work for additional + * * information regarding copyright ownership. + * * Unless required by applicable law or agreed to in writing, software + * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT + * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the + * * License for the specific language governing permissions and limitations + * * under the License. 
+ * * + * * SPDX-License-Identifier: Apache-2.0 + * ***************************************************************************** + */ package org.nd4j.linalg.jcublas.util; diff --git a/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-cuda/src/main/java/org/nd4j/nativeblas/Nd4jCudaPresets.java b/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-cuda/src/main/java/org/nd4j/nativeblas/Nd4jCudaPresets.java index ebf508e2b..ab2cc0550 100644 --- a/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-cuda/src/main/java/org/nd4j/nativeblas/Nd4jCudaPresets.java +++ b/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-cuda/src/main/java/org/nd4j/nativeblas/Nd4jCudaPresets.java @@ -1,18 +1,22 @@ -/******************************************************************************* - * Copyright (c) 2015-2018 Skymind, Inc. - * - * This program and the accompanying materials are made available under the - * terms of the Apache License, Version 2.0 which is available at - * https://www.apache.org/licenses/LICENSE-2.0. - * - * Unless required by applicable law or agreed to in writing, software - * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * License for the specific language governing permissions and limitations - * under the License. - * - * SPDX-License-Identifier: Apache-2.0 - ******************************************************************************/ +/* + * ****************************************************************************** + * * + * * + * * This program and the accompanying materials are made available under the + * * terms of the Apache License, Version 2.0 which is available at + * * https://www.apache.org/licenses/LICENSE-2.0. + * * + * * See the NOTICE file distributed with this work for additional + * * information regarding copyright ownership. 
+ * * Unless required by applicable law or agreed to in writing, software + * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT + * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the + * * License for the specific language governing permissions and limitations + * * under the License. + * * + * * SPDX-License-Identifier: Apache-2.0 + * ***************************************************************************** + */ package org.nd4j.nativeblas; diff --git a/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-cuda/src/main/resources/META-INF/services/org.nd4j.linalg.compression.NDArrayCompressor b/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-cuda/src/main/resources/META-INF/services/org.nd4j.linalg.compression.NDArrayCompressor index 6d9782eec..d5f00cfeb 100644 --- a/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-cuda/src/main/resources/META-INF/services/org.nd4j.linalg.compression.NDArrayCompressor +++ b/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-cuda/src/main/resources/META-INF/services/org.nd4j.linalg.compression.NDArrayCompressor @@ -1,3 +1,86 @@ +# +# /* +# * ****************************************************************************** +# * * +# * * +# * * This program and the accompanying materials are made available under the +# * * terms of the Apache License, Version 2.0 which is available at +# * * https://www.apache.org/licenses/LICENSE-2.0. +# * * +# * * See the NOTICE file distributed with this work for additional +# * * information regarding copyright ownership. +# * * Unless required by applicable law or agreed to in writing, software +# * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT +# * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the +# * * License for the specific language governing permissions and limitations +# * * under the License. 
+# * * +# * * SPDX-License-Identifier: Apache-2.0 +# * ***************************************************************************** +# */ +# +# + +# +# /* ****************************************************************************** +# * +# * +# * This program and the accompanying materials are made available under the +# * terms of the Apache License, Version 2.0 which is available at +# * https://www.apache.org/licenses/LICENSE-2.0. +# * +# * See the NOTICE file distributed with this work for additional +# * information regarding copyright ownership. +# * Unless required by applicable law or agreed to in writing, software +# * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT +# * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the +# * License for the specific language governing permissions and limitations +# * under the License. +# * +# * SPDX-License-Identifier: Apache-2.0 +# ******************************************************************************/ +# + +# +# /* ****************************************************************************** +# * +# * +# * This program and the accompanying materials are made available under the +# * terms of the Apache License, Version 2.0 which is available at +# * https://www.apache.org/licenses/LICENSE-2.0. +# * +# * See the NOTICE file distributed with this work for additional +# * information regarding copyright ownership. +# * Unless required by applicable law or agreed to in writing, software +# * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT +# * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the +# * License for the specific language governing permissions and limitations +# * under the License. 
+# * +# * SPDX-License-Identifier: Apache-2.0 +# ******************************************************************************/ +# + +# +# /* ****************************************************************************** +# * +# * +# * This program and the accompanying materials are made available under the +# * terms of the Apache License, Version 2.0 which is available at +# * https://www.apache.org/licenses/LICENSE-2.0. +# * +# * See the NOTICE file distributed with this work for additional +# * information regarding copyright ownership. +# * Unless required by applicable law or agreed to in writing, software +# * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT +# * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the +# * License for the specific language governing permissions and limitations +# * under the License. +# * +# * SPDX-License-Identifier: Apache-2.0 +# ******************************************************************************/ +# + # # /* ****************************************************************************** # * diff --git a/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-cuda/src/main/resources/META-INF/services/org.nd4j.linalg.factory.Nd4jBackend b/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-cuda/src/main/resources/META-INF/services/org.nd4j.linalg.factory.Nd4jBackend index 0e77690d5..6f4d184d1 100644 --- a/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-cuda/src/main/resources/META-INF/services/org.nd4j.linalg.factory.Nd4jBackend +++ b/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-cuda/src/main/resources/META-INF/services/org.nd4j.linalg.factory.Nd4jBackend @@ -1,3 +1,86 @@ +# +# /* +# * ****************************************************************************** +# * * +# * * +# * * This program and the accompanying materials are made available under the +# * * terms of the Apache License, Version 2.0 which is available at +# * * https://www.apache.org/licenses/LICENSE-2.0. 
+# * * +# * * See the NOTICE file distributed with this work for additional +# * * information regarding copyright ownership. +# * * Unless required by applicable law or agreed to in writing, software +# * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT +# * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the +# * * License for the specific language governing permissions and limitations +# * * under the License. +# * * +# * * SPDX-License-Identifier: Apache-2.0 +# * ***************************************************************************** +# */ +# +# + +# +# /* ****************************************************************************** +# * +# * +# * This program and the accompanying materials are made available under the +# * terms of the Apache License, Version 2.0 which is available at +# * https://www.apache.org/licenses/LICENSE-2.0. +# * +# * See the NOTICE file distributed with this work for additional +# * information regarding copyright ownership. +# * Unless required by applicable law or agreed to in writing, software +# * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT +# * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the +# * License for the specific language governing permissions and limitations +# * under the License. +# * +# * SPDX-License-Identifier: Apache-2.0 +# ******************************************************************************/ +# + +# +# /* ****************************************************************************** +# * +# * +# * This program and the accompanying materials are made available under the +# * terms of the Apache License, Version 2.0 which is available at +# * https://www.apache.org/licenses/LICENSE-2.0. +# * +# * See the NOTICE file distributed with this work for additional +# * information regarding copyright ownership. 
+# * Unless required by applicable law or agreed to in writing, software +# * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT +# * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the +# * License for the specific language governing permissions and limitations +# * under the License. +# * +# * SPDX-License-Identifier: Apache-2.0 +# ******************************************************************************/ +# + +# +# /* ****************************************************************************** +# * +# * +# * This program and the accompanying materials are made available under the +# * terms of the Apache License, Version 2.0 which is available at +# * https://www.apache.org/licenses/LICENSE-2.0. +# * +# * See the NOTICE file distributed with this work for additional +# * information regarding copyright ownership. +# * Unless required by applicable law or agreed to in writing, software +# * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT +# * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the +# * License for the specific language governing permissions and limitations +# * under the License. 
+# * +# * SPDX-License-Identifier: Apache-2.0 +# ******************************************************************************/ +# + # # /* ****************************************************************************** # * diff --git a/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-cuda/src/test/java/org/nd4j/linalg/jcublas/buffer/BaseCudaDataBufferTest.java b/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-cuda/src/test/java/org/nd4j/linalg/jcublas/buffer/BaseCudaDataBufferTest.java index 99ebc725b..fec115300 100644 --- a/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-cuda/src/test/java/org/nd4j/linalg/jcublas/buffer/BaseCudaDataBufferTest.java +++ b/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-cuda/src/test/java/org/nd4j/linalg/jcublas/buffer/BaseCudaDataBufferTest.java @@ -1,3 +1,23 @@ +/* + * ****************************************************************************** + * * + * * + * * This program and the accompanying materials are made available under the + * * terms of the Apache License, Version 2.0 which is available at + * * https://www.apache.org/licenses/LICENSE-2.0. + * * + * * See the NOTICE file distributed with this work for additional + * * information regarding copyright ownership. + * * Unless required by applicable law or agreed to in writing, software + * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT + * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the + * * License for the specific language governing permissions and limitations + * * under the License. 
+ * * + * * SPDX-License-Identifier: Apache-2.0 + * ***************************************************************************** + */ + package org.nd4j.linalg.jcublas.buffer; import lombok.extern.slf4j.Slf4j; diff --git a/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-native/src/main/java/org/nd4j/linalg/cpu/nativecpu/ops/NativeOpExecutioner.java b/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-native/src/main/java/org/nd4j/linalg/cpu/nativecpu/ops/NativeOpExecutioner.java index 737b44b5f..384e9ae15 100644 --- a/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-native/src/main/java/org/nd4j/linalg/cpu/nativecpu/ops/NativeOpExecutioner.java +++ b/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-native/src/main/java/org/nd4j/linalg/cpu/nativecpu/ops/NativeOpExecutioner.java @@ -6,8 +6,8 @@ * * terms of the Apache License, Version 2.0 which is available at * * https://www.apache.org/licenses/LICENSE-2.0. * * - * * See the NOTICE file distributed with this work for additional - * * information regarding copyright ownership. + * * See the NOTICE file distributed with this work for additional + * * information regarding copyright ownership. * * Unless required by applicable law or agreed to in writing, software * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. 
See the diff --git a/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-native/src/main/java/org/nd4j/nativeblas/Nd4jCpu.java b/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-native/src/main/java/org/nd4j/nativeblas/Nd4jCpu.java index 457e31eff..23818981c 100644 --- a/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-native/src/main/java/org/nd4j/nativeblas/Nd4jCpu.java +++ b/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-native/src/main/java/org/nd4j/nativeblas/Nd4jCpu.java @@ -1,3 +1,23 @@ +/* + * ****************************************************************************** + * * + * * + * * This program and the accompanying materials are made available under the + * * terms of the Apache License, Version 2.0 which is available at + * * https://www.apache.org/licenses/LICENSE-2.0. + * * + * * See the NOTICE file distributed with this work for additional + * * information regarding copyright ownership. + * * Unless required by applicable law or agreed to in writing, software + * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT + * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the + * * License for the specific language governing permissions and limitations + * * under the License. 
+ * * + * * SPDX-License-Identifier: Apache-2.0 + * ***************************************************************************** + */ + // Targeted by JavaCPP version 1.5.4: DO NOT EDIT THIS FILE package org.nd4j.nativeblas; diff --git a/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-native/src/main/resources/META-INF/services/org.nd4j.linalg.compression.NDArrayCompressor b/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-native/src/main/resources/META-INF/services/org.nd4j.linalg.compression.NDArrayCompressor index 3e484dda4..513a2ec82 100644 --- a/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-native/src/main/resources/META-INF/services/org.nd4j.linalg.compression.NDArrayCompressor +++ b/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-native/src/main/resources/META-INF/services/org.nd4j.linalg.compression.NDArrayCompressor @@ -1,3 +1,66 @@ +# +# /* +# * ****************************************************************************** +# * * +# * * +# * * This program and the accompanying materials are made available under the +# * * terms of the Apache License, Version 2.0 which is available at +# * * https://www.apache.org/licenses/LICENSE-2.0. +# * * +# * * See the NOTICE file distributed with this work for additional +# * * information regarding copyright ownership. +# * * Unless required by applicable law or agreed to in writing, software +# * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT +# * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the +# * * License for the specific language governing permissions and limitations +# * * under the License. 
+# * * +# * * SPDX-License-Identifier: Apache-2.0 +# * ***************************************************************************** +# */ +# +# + +# +# /* ****************************************************************************** +# * +# * +# * This program and the accompanying materials are made available under the +# * terms of the Apache License, Version 2.0 which is available at +# * https://www.apache.org/licenses/LICENSE-2.0. +# * +# * See the NOTICE file distributed with this work for additional +# * information regarding copyright ownership. +# * Unless required by applicable law or agreed to in writing, software +# * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT +# * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the +# * License for the specific language governing permissions and limitations +# * under the License. +# * +# * SPDX-License-Identifier: Apache-2.0 +# ******************************************************************************/ +# + +# +# /* ****************************************************************************** +# * +# * +# * This program and the accompanying materials are made available under the +# * terms of the Apache License, Version 2.0 which is available at +# * https://www.apache.org/licenses/LICENSE-2.0. +# * +# * See the NOTICE file distributed with this work for additional +# * information regarding copyright ownership. +# * Unless required by applicable law or agreed to in writing, software +# * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT +# * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the +# * License for the specific language governing permissions and limitations +# * under the License. 
+# * +# * SPDX-License-Identifier: Apache-2.0 +# ******************************************************************************/ +# + # # /* ****************************************************************************** # * diff --git a/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-native/src/main/resources/META-INF/services/org.nd4j.linalg.factory.Nd4jBackend b/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-native/src/main/resources/META-INF/services/org.nd4j.linalg.factory.Nd4jBackend index bfa3567ae..0f7564a44 100644 --- a/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-native/src/main/resources/META-INF/services/org.nd4j.linalg.factory.Nd4jBackend +++ b/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-native/src/main/resources/META-INF/services/org.nd4j.linalg.factory.Nd4jBackend @@ -1,3 +1,66 @@ +# +# /* +# * ****************************************************************************** +# * * +# * * +# * * This program and the accompanying materials are made available under the +# * * terms of the Apache License, Version 2.0 which is available at +# * * https://www.apache.org/licenses/LICENSE-2.0. +# * * +# * * See the NOTICE file distributed with this work for additional +# * * information regarding copyright ownership. +# * * Unless required by applicable law or agreed to in writing, software +# * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT +# * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the +# * * License for the specific language governing permissions and limitations +# * * under the License. 
+# * * +# * * SPDX-License-Identifier: Apache-2.0 +# * ***************************************************************************** +# */ +# +# + +# +# /* ****************************************************************************** +# * +# * +# * This program and the accompanying materials are made available under the +# * terms of the Apache License, Version 2.0 which is available at +# * https://www.apache.org/licenses/LICENSE-2.0. +# * +# * See the NOTICE file distributed with this work for additional +# * information regarding copyright ownership. +# * Unless required by applicable law or agreed to in writing, software +# * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT +# * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the +# * License for the specific language governing permissions and limitations +# * under the License. +# * +# * SPDX-License-Identifier: Apache-2.0 +# ******************************************************************************/ +# + +# +# /* ****************************************************************************** +# * +# * +# * This program and the accompanying materials are made available under the +# * terms of the Apache License, Version 2.0 which is available at +# * https://www.apache.org/licenses/LICENSE-2.0. +# * +# * See the NOTICE file distributed with this work for additional +# * information regarding copyright ownership. +# * Unless required by applicable law or agreed to in writing, software +# * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT +# * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the +# * License for the specific language governing permissions and limitations +# * under the License. 
+# * +# * SPDX-License-Identifier: Apache-2.0 +# ******************************************************************************/ +# + # # /* ****************************************************************************** # * diff --git a/nd4j/nd4j-backends/nd4j-backend-impls/pom.xml b/nd4j/nd4j-backends/nd4j-backend-impls/pom.xml index e2a452c8c..cc14a9520 100644 --- a/nd4j/nd4j-backends/nd4j-backend-impls/pom.xml +++ b/nd4j/nd4j-backends/nd4j-backend-impls/pom.xml @@ -36,6 +36,11 @@ nd4j-backend-impls + + nd4j-cuda + nd4j-cuda-platform + + ${project.groupId} @@ -214,7 +219,7 @@ nd4j-native-platform - + javacpp-platform-default diff --git a/nd4j/nd4j-backends/nd4j-tests-tensorflow/src/test/cpujava/org/nd4j/tensorflow/conversion/TensorflowConversionTest.java b/nd4j/nd4j-backends/nd4j-tests-tensorflow/src/test/cpujava/org/nd4j/tensorflow/conversion/TensorflowConversionTest.java index b0e70dc3d..1aabfc691 100644 --- a/nd4j/nd4j-backends/nd4j-tests-tensorflow/src/test/cpujava/org/nd4j/tensorflow/conversion/TensorflowConversionTest.java +++ b/nd4j/nd4j-backends/nd4j-tests-tensorflow/src/test/cpujava/org/nd4j/tensorflow/conversion/TensorflowConversionTest.java @@ -1,18 +1,22 @@ -/******************************************************************************* - * Copyright (c) 2015-2018 Skymind, Inc. - * - * This program and the accompanying materials are made available under the - * terms of the Apache License, Version 2.0 which is available at - * https://www.apache.org/licenses/LICENSE-2.0. - * - * Unless required by applicable law or agreed to in writing, software - * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT - * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the - * License for the specific language governing permissions and limitations - * under the License. 
- * - * SPDX-License-Identifier: Apache-2.0 - ******************************************************************************/ +/* + * ****************************************************************************** + * * + * * + * * This program and the accompanying materials are made available under the + * * terms of the Apache License, Version 2.0 which is available at + * * https://www.apache.org/licenses/LICENSE-2.0. + * * + * * See the NOTICE file distributed with this work for additional + * * information regarding copyright ownership. + * * Unless required by applicable law or agreed to in writing, software + * * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT + * * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the + * * License for the specific language governing permissions and limitations + * * under the License. + * * + * * SPDX-License-Identifier: Apache-2.0 + * ***************************************************************************** + */ package org.nd4j.tensorflow.conversion; diff --git a/nd4j/nd4j-parameter-server-parent/nd4j-parameter-server-node/src/main/resources/META-INF/services/org.nd4j.parameterserver.distributed.training.TrainingDriver b/nd4j/nd4j-parameter-server-parent/nd4j-parameter-server-node/src/main/resources/META-INF/services/org.nd4j.parameterserver.distributed.training.TrainingDriver index c3539599d..026671baf 100644 --- a/nd4j/nd4j-parameter-server-parent/nd4j-parameter-server-node/src/main/resources/META-INF/services/org.nd4j.parameterserver.distributed.training.TrainingDriver +++ b/nd4j/nd4j-parameter-server-parent/nd4j-parameter-server-node/src/main/resources/META-INF/services/org.nd4j.parameterserver.distributed.training.TrainingDriver @@ -1,3 +1,43 @@ +# +# /* ****************************************************************************** +# * +# * +# * This program and the accompanying materials are made available under the +# * terms of the Apache License, Version 2.0 which 
is available at +# * https://www.apache.org/licenses/LICENSE-2.0. +# * +# * See the NOTICE file distributed with this work for additional +# * information regarding copyright ownership. +# * Unless required by applicable law or agreed to in writing, software +# * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT +# * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the +# * License for the specific language governing permissions and limitations +# * under the License. +# * +# * SPDX-License-Identifier: Apache-2.0 +# ******************************************************************************/ +# + +# +# /* ****************************************************************************** +# * +# * +# * This program and the accompanying materials are made available under the +# * terms of the Apache License, Version 2.0 which is available at +# * https://www.apache.org/licenses/LICENSE-2.0. +# * +# * See the NOTICE file distributed with this work for additional +# * information regarding copyright ownership. +# * Unless required by applicable law or agreed to in writing, software +# * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT +# * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the +# * License for the specific language governing permissions and limitations +# * under the License. 
+# * +# * SPDX-License-Identifier: Apache-2.0 +# ******************************************************************************/ +# + # # /* ****************************************************************************** # * diff --git a/nd4j/nd4j-tensorflow/src/main/resources/META-INF/services/org.nd4j.TFGraphRunnerService b/nd4j/nd4j-tensorflow/src/main/resources/META-INF/services/org.nd4j.TFGraphRunnerService index 8b82c9e7c..3031cd0d7 100644 --- a/nd4j/nd4j-tensorflow/src/main/resources/META-INF/services/org.nd4j.TFGraphRunnerService +++ b/nd4j/nd4j-tensorflow/src/main/resources/META-INF/services/org.nd4j.TFGraphRunnerService @@ -1,3 +1,43 @@ +# +# /* ****************************************************************************** +# * +# * +# * This program and the accompanying materials are made available under the +# * terms of the Apache License, Version 2.0 which is available at +# * https://www.apache.org/licenses/LICENSE-2.0. +# * +# * See the NOTICE file distributed with this work for additional +# * information regarding copyright ownership. +# * Unless required by applicable law or agreed to in writing, software +# * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT +# * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the +# * License for the specific language governing permissions and limitations +# * under the License. +# * +# * SPDX-License-Identifier: Apache-2.0 +# ******************************************************************************/ +# + +# +# /* ****************************************************************************** +# * +# * +# * This program and the accompanying materials are made available under the +# * terms of the Apache License, Version 2.0 which is available at +# * https://www.apache.org/licenses/LICENSE-2.0. +# * +# * See the NOTICE file distributed with this work for additional +# * information regarding copyright ownership. 
+# * Unless required by applicable law or agreed to in writing, software +# * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT +# * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the +# * License for the specific language governing permissions and limitations +# * under the License. +# * +# * SPDX-License-Identifier: Apache-2.0 +# ******************************************************************************/ +# + # # /* ****************************************************************************** # * diff --git a/nd4j/samediff-import/samediff-import-onnx/src/main/resources/META-INF/services/org.nd4j.samediff.frameworkimport.ImportGraphHolder b/nd4j/samediff-import/samediff-import-onnx/src/main/resources/META-INF/services/org.nd4j.samediff.frameworkimport.ImportGraphHolder index 92b5e53c0..11463f910 100644 --- a/nd4j/samediff-import/samediff-import-onnx/src/main/resources/META-INF/services/org.nd4j.samediff.frameworkimport.ImportGraphHolder +++ b/nd4j/samediff-import/samediff-import-onnx/src/main/resources/META-INF/services/org.nd4j.samediff.frameworkimport.ImportGraphHolder @@ -1,3 +1,43 @@ +# +# /* ****************************************************************************** +# * +# * +# * This program and the accompanying materials are made available under the +# * terms of the Apache License, Version 2.0 which is available at +# * https://www.apache.org/licenses/LICENSE-2.0. +# * +# * See the NOTICE file distributed with this work for additional +# * information regarding copyright ownership. +# * Unless required by applicable law or agreed to in writing, software +# * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT +# * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the +# * License for the specific language governing permissions and limitations +# * under the License. 
+# * +# * SPDX-License-Identifier: Apache-2.0 +# ******************************************************************************/ +# + +# +# /* ****************************************************************************** +# * +# * +# * This program and the accompanying materials are made available under the +# * terms of the Apache License, Version 2.0 which is available at +# * https://www.apache.org/licenses/LICENSE-2.0. +# * +# * See the NOTICE file distributed with this work for additional +# * information regarding copyright ownership. +# * Unless required by applicable law or agreed to in writing, software +# * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT +# * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the +# * License for the specific language governing permissions and limitations +# * under the License. +# * +# * SPDX-License-Identifier: Apache-2.0 +# ******************************************************************************/ +# + # # /* ****************************************************************************** # * diff --git a/nd4j/samediff-import/samediff-import-onnx/src/main/resources/META-INF/services/org.nd4j.samediff.frameworkimport.opdefs.OpDescriptorLoader b/nd4j/samediff-import/samediff-import-onnx/src/main/resources/META-INF/services/org.nd4j.samediff.frameworkimport.opdefs.OpDescriptorLoader index 203fef5e1..45eb504a8 100644 --- a/nd4j/samediff-import/samediff-import-onnx/src/main/resources/META-INF/services/org.nd4j.samediff.frameworkimport.opdefs.OpDescriptorLoader +++ b/nd4j/samediff-import/samediff-import-onnx/src/main/resources/META-INF/services/org.nd4j.samediff.frameworkimport.opdefs.OpDescriptorLoader @@ -1,3 +1,43 @@ +# +# /* ****************************************************************************** +# * +# * +# * This program and the accompanying materials are made available under the +# * terms of the Apache License, Version 2.0 which is available at +# * 
https://www.apache.org/licenses/LICENSE-2.0. +# * +# * See the NOTICE file distributed with this work for additional +# * information regarding copyright ownership. +# * Unless required by applicable law or agreed to in writing, software +# * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT +# * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the +# * License for the specific language governing permissions and limitations +# * under the License. +# * +# * SPDX-License-Identifier: Apache-2.0 +# ******************************************************************************/ +# + +# +# /* ****************************************************************************** +# * +# * +# * This program and the accompanying materials are made available under the +# * terms of the Apache License, Version 2.0 which is available at +# * https://www.apache.org/licenses/LICENSE-2.0. +# * +# * See the NOTICE file distributed with this work for additional +# * information regarding copyright ownership. +# * Unless required by applicable law or agreed to in writing, software +# * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT +# * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the +# * License for the specific language governing permissions and limitations +# * under the License. 
+# * +# * SPDX-License-Identifier: Apache-2.0 +# ******************************************************************************/ +# + # # /* ****************************************************************************** # * diff --git a/nd4j/samediff-import/samediff-import-tensorflow/src/main/resources/META-INF/services/org.nd4j.samediff.frameworkimport.ImportGraphHolder b/nd4j/samediff-import/samediff-import-tensorflow/src/main/resources/META-INF/services/org.nd4j.samediff.frameworkimport.ImportGraphHolder index a0d05a3c7..2e2411e3c 100644 --- a/nd4j/samediff-import/samediff-import-tensorflow/src/main/resources/META-INF/services/org.nd4j.samediff.frameworkimport.ImportGraphHolder +++ b/nd4j/samediff-import/samediff-import-tensorflow/src/main/resources/META-INF/services/org.nd4j.samediff.frameworkimport.ImportGraphHolder @@ -1,3 +1,43 @@ +# +# /* ****************************************************************************** +# * +# * +# * This program and the accompanying materials are made available under the +# * terms of the Apache License, Version 2.0 which is available at +# * https://www.apache.org/licenses/LICENSE-2.0. +# * +# * See the NOTICE file distributed with this work for additional +# * information regarding copyright ownership. +# * Unless required by applicable law or agreed to in writing, software +# * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT +# * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the +# * License for the specific language governing permissions and limitations +# * under the License. 
+# * +# * SPDX-License-Identifier: Apache-2.0 +# ******************************************************************************/ +# + +# +# /* ****************************************************************************** +# * +# * +# * This program and the accompanying materials are made available under the +# * terms of the Apache License, Version 2.0 which is available at +# * https://www.apache.org/licenses/LICENSE-2.0. +# * +# * See the NOTICE file distributed with this work for additional +# * information regarding copyright ownership. +# * Unless required by applicable law or agreed to in writing, software +# * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT +# * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the +# * License for the specific language governing permissions and limitations +# * under the License. +# * +# * SPDX-License-Identifier: Apache-2.0 +# ******************************************************************************/ +# + # # /* ****************************************************************************** # * diff --git a/nd4j/samediff-import/samediff-import-tensorflow/src/main/resources/META-INF/services/org.nd4j.samediff.frameworkimport.opdefs.OpDescriptorLoader b/nd4j/samediff-import/samediff-import-tensorflow/src/main/resources/META-INF/services/org.nd4j.samediff.frameworkimport.opdefs.OpDescriptorLoader index fc9550f45..899adce3b 100644 --- a/nd4j/samediff-import/samediff-import-tensorflow/src/main/resources/META-INF/services/org.nd4j.samediff.frameworkimport.opdefs.OpDescriptorLoader +++ b/nd4j/samediff-import/samediff-import-tensorflow/src/main/resources/META-INF/services/org.nd4j.samediff.frameworkimport.opdefs.OpDescriptorLoader @@ -1,3 +1,43 @@ +# +# /* ****************************************************************************** +# * +# * +# * This program and the accompanying materials are made available under the +# * terms of the Apache License, Version 2.0 which is 
available at +# * https://www.apache.org/licenses/LICENSE-2.0. +# * +# * See the NOTICE file distributed with this work for additional +# * information regarding copyright ownership. +# * Unless required by applicable law or agreed to in writing, software +# * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT +# * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the +# * License for the specific language governing permissions and limitations +# * under the License. +# * +# * SPDX-License-Identifier: Apache-2.0 +# ******************************************************************************/ +# + +# +# /* ****************************************************************************** +# * +# * +# * This program and the accompanying materials are made available under the +# * terms of the Apache License, Version 2.0 which is available at +# * https://www.apache.org/licenses/LICENSE-2.0. +# * +# * See the NOTICE file distributed with this work for additional +# * information regarding copyright ownership. +# * Unless required by applicable law or agreed to in writing, software +# * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT +# * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the +# * License for the specific language governing permissions and limitations +# * under the License. 
+# * +# * SPDX-License-Identifier: Apache-2.0 +# ******************************************************************************/ +# + # # /* ****************************************************************************** # * diff --git a/pom.xml b/pom.xml index 8bed8934f..df4047219 100644 --- a/pom.xml +++ b/pom.xml @@ -322,9 +322,25 @@ 0.9.1 1.0.0 2.2.0 + 1.4.30 + + + org.jetbrains.kotlin + kotlin-stdlib-jdk8 + ${kotlin.version} + + + org.jetbrains.kotlin + kotlin-test + ${kotlin.version} + test + + + + contrib @@ -456,10 +472,6 @@ - - org.apache.maven.plugins - maven-compiler-plugin - org.apache.maven.plugins maven-source-plugin @@ -506,6 +518,58 @@ true + + org.jetbrains.kotlin + kotlin-maven-plugin + ${kotlin.version} + + + compile + compile + + compile + + + + test-compile + test-compile + + test-compile + + + + + 1.8 + + + + org.apache.maven.plugins + maven-compiler-plugin + + + default-compile + none + + + default-testCompile + none + + + compile + compile + + compile + + + + testCompile + test-compile + + testCompile + + + + diff --git a/python4j/python4j-numpy/src/main/resources/META-INF/services/org.nd4j.python4j.PythonType b/python4j/python4j-numpy/src/main/resources/META-INF/services/org.nd4j.python4j.PythonType index 41ac0979f..52597b8b5 100644 --- a/python4j/python4j-numpy/src/main/resources/META-INF/services/org.nd4j.python4j.PythonType +++ b/python4j/python4j-numpy/src/main/resources/META-INF/services/org.nd4j.python4j.PythonType @@ -1,3 +1,43 @@ +# +# /* ****************************************************************************** +# * +# * +# * This program and the accompanying materials are made available under the +# * terms of the Apache License, Version 2.0 which is available at +# * https://www.apache.org/licenses/LICENSE-2.0. +# * +# * See the NOTICE file distributed with this work for additional +# * information regarding copyright ownership. 
+# * Unless required by applicable law or agreed to in writing, software +# * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT +# * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the +# * License for the specific language governing permissions and limitations +# * under the License. +# * +# * SPDX-License-Identifier: Apache-2.0 +# ******************************************************************************/ +# + +# +# /* ****************************************************************************** +# * +# * +# * This program and the accompanying materials are made available under the +# * terms of the Apache License, Version 2.0 which is available at +# * https://www.apache.org/licenses/LICENSE-2.0. +# * +# * See the NOTICE file distributed with this work for additional +# * information regarding copyright ownership. +# * Unless required by applicable law or agreed to in writing, software +# * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT +# * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the +# * License for the specific language governing permissions and limitations +# * under the License. +# * +# * SPDX-License-Identifier: Apache-2.0 +# ******************************************************************************/ +# + # # /* ****************************************************************************** # *