Conversation

@memoryz
Contributor

@memoryz memoryz commented Nov 12, 2023

Related Issues/PRs

#1996

What changes are proposed in this pull request?

Adding support for Boolean input in ONNX model transform
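
For illustration (not part of this PR's description), a minimal PySpark sketch of what boolean input support enables — it mirrors the Scala test later in this thread; the model location is a placeholder and an active `spark` session with SynapseML installed is assumed:

from synapse.ml.onnx import ONNXModel

# assumes an active SparkSession `spark` with SynapseML on the classpath
df = spark.createDataFrame([(True, True), (True, False), (False, False)], ["i1", "i2"])

model = (ONNXModel()
         .setModelLocation("/path/to/a/model/with/bool/inputs.onnx")  # placeholder path
         .setFeedDict({"A": "i1", "B": "i2"})    # model inputs  -> DataFrame columns
         .setFetchDict({"Output": "Y"}))         # output column -> model output

model.transform(df).show()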

How is this patch tested?

  • I have written tests (not required for typo or doc fix) and confirmed the proposed feature/bug-fix/change works.

Does this PR change any dependencies?

  • No. You can skip this section.
  • Yes. Make sure the dependencies are resolved correctly, and list changes here.

Does this PR add a new feature? If so, have you added samples on the website?

  • No. You can skip this section.
  • Yes. Make sure you have added samples following the steps below.
  1. Find the corresponding markdown file for your new feature in the website/docs/documentation folder.
    Make sure you choose the correct class (estimators/transformers) and namespace.
  2. Follow the pattern in the markdown file and add another section for your new API, including pyspark, scala (and potentially .NET) samples.
  3. Make sure the DocTable points to the correct API link.
  4. Navigate to the website folder, and run yarn run start to make sure the website renders correctly.
  5. Don't forget to add <!--pytest-codeblocks:cont--> before each Python code block to enable auto-tests for Python samples.
  6. Make sure the WebsiteSamplesTests job passes in the pipeline.

@github-actions

Hey @memoryz 👋!
Thank you so much for contributing to our repository 🙌.
Someone from SynapseML Team will be reviewing this pull request soon.

We use semantic commit messages to streamline the release process.
Before your pull request can be merged, you should make sure your first commit and PR title start with a semantic prefix.
This helps us to create release messages and credit you for your hard work!

Examples of commit messages with semantic prefixes:

  • fix: Fix LightGBM crashes with empty partitions
  • feat: Make HTTP on Spark back-offs configurable
  • docs: Update Spark Serving usage
  • build: Add codecov support
  • perf: improve LightGBM memory usage
  • refactor: make python code generation rely on classes
  • style: Remove nulls from CNTKModel
  • test: Add test coverage for CNTKModel

To test your commit locally, please follow our guide on building from source.
Check out the developer guide for additional guidance on testing your change.

@memoryz memoryz linked an issue Nov 12, 2023 that may be closed by this pull request
@memoryz memoryz marked this pull request as ready for review November 12, 2023 20:55
@memoryz
Contributor Author

memoryz commented Nov 12, 2023

To generate a test model:

from onnx import TensorProto
from onnx.helper import (make_node, make_graph, make_model_gen_version, make_opsetid, make_tensor_value_info)
from onnx.checker import check_model

# Single-node graph computing Y = A AND B, with boolean inputs and output of dynamic length
A = make_tensor_value_info('A', TensorProto.BOOL, [None])
B = make_tensor_value_info('B', TensorProto.BOOL, [None])
Y = make_tensor_value_info('Y', TensorProto.BOOL, [None])
node = make_node('And', ['A', 'B'], ['Y'])
graph = make_graph([node], 'And', [A, B], [Y])
onnx_model = make_model_gen_version(graph, opset_imports=[make_opsetid("", 7)])
check_model(onnx_model)

# Serialize the model; it is then uploaded to storage for the Spark-side test below
with open("GH1996.onnx", "wb") as f:
    f.write(onnx_model.SerializeToString())
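
The generated model can also be sanity-checked locally with onnxruntime before uploading (a quick check, not part of this PR):

import numpy as np
import onnxruntime as ort

# Run the And model on boolean numpy arrays; the output should be the element-wise AND
sess = ort.InferenceSession("GH1996.onnx")
(result,) = sess.run(None, {"A": np.array([True, True, False]), "B": np.array([True, False, False])})
print(result)  # expected: [ True False False]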

To test:

%%configure -f
{
  "conf": {
      "spark.jars.packages": "com.microsoft.azure:synapseml_2.12:1.0.1-4-f7d05dab-SNAPSHOT",
      "spark.jars.repositories": "https://mmlspark.azureedge.net/maven",
      "spark.jars.excludes": "org.scala-lang:scala-reflect,org.apache.spark:spark-tags_2.12,org.scalactic:scalactic_2.12,org.scalatest:scalatest_2.12,com.fasterxml.jackson.core:jackson-databind",
      "spark.yarn.user.classpath.first": "true",
      "spark.sql.parquet.enableVectorizedReader": "false"
  }
}
import com.microsoft.azure.synapse.ml.onnx.ONNXModel
import spark.implicits._ // for toDF; usually already in scope in notebook sessions

val df = Seq((true, true), (true, false), (false, false)).toDF("i1", "i2")

// Feed boolean columns i1/i2 to model inputs A/B; expose model output Y as column "Output"
val model: ONNXModel = new ONNXModel()
  .setModelLocation("wasbs://[email protected]/ONNXModels/GH1996.onnx")
  .setFeedDict(Map("A" -> "i1", "B" -> "i2"))
  .setFetchDict(Map("Output" -> "Y"))

display(model.transform(df))

Expected output:

i1      i2      Output
true    true    true
true    false   false
false   false   false

@memoryz
Contributor Author

memoryz commented Nov 12, 2023

/azp run

@azure-pipelines

Supported commands
  • help:
    • Get descriptions, examples and documentation about supported commands
    • Example: help "command_name"
  • list:
    • List all pipelines for this repository using a comment.
    • Example: "list"
  • run:
    • Run all pipelines or specific pipelines for this repository using a comment. Use this command by itself to trigger all related pipelines, or list specific pipelines to run.
    • Example: "run" or "run pipeline_name, pipeline_name, pipeline_name"
  • where:
    • Report back the Azure DevOps orgs that are related to this repository and org
    • Example: "where"

See additional documentation.

@azure-pipelines

Azure Pipelines successfully started running 1 pipeline(s).

@codecov-commenter

codecov-commenter commented Nov 12, 2023

Codecov Report

Merging #2130 (f7d05da) into master (90ded80) will decrease coverage by 0.08%.
The diff coverage is 87.50%.

@@            Coverage Diff             @@
##           master    #2130      +/-   ##
==========================================
- Coverage   85.84%   85.76%   -0.08%     
==========================================
  Files         312      312              
  Lines       16470    16477       +7     
  Branches     1458     1460       +2     
==========================================
- Hits        14138    14132       -6     
- Misses       2332     2345      +13     
Files                                                     Coverage Δ
...om/microsoft/azure/synapse/ml/onnx/ONNXUtils.scala     77.10% <87.50%> (+1.63%) ⬆️

... and 4 files with indirect coverage changes

@mhamilton723 mhamilton723 merged commit 9195dee into microsoft:master Nov 14, 2023


Development

Successfully merging this pull request may close these issues.

Support to Bool input for Onnx models
