This repository was archived by the owner on Sep 5, 2023. It is now read-only.

Comparing changes

base repository: googleapis/python-datalabeling
base: v0.4.1
head repository: googleapis/python-datalabeling
compare: v1.0.0
  • 2 commits
  • 89 files changed
  • 3 contributors

Commits on Aug 12, 2020

  1. feat!: migrate to use microgen (#34)

    * feat!: migrate to use microgen
    
    * update sample
    
    * update sample
    
    * update sample
    
    * Update UPGRADING.md
    
    Co-authored-by: Bu Sun Kim <8822365+busunkim96@users.noreply.github.com>
    
    arithmetic1728 and busunkim96 authored Aug 12, 2020

    Partially verified: created on GitHub.com and signed with GitHub's verified signature (the key has expired); co-author signatures could not be verified.
    Commit 465eb36
  2. chore: release 1.0.0 (#35)

    * chore: updated CHANGELOG.md [ci skip]
    
    * chore: updated setup.cfg [ci skip]
    
    * chore: updated setup.py
    
    Co-authored-by: release-please[bot] <55107282+release-please[bot]@users.noreply.github.com>
    release-please[bot] authored Aug 12, 2020

    Verified: created on GitHub.com and signed with GitHub's verified signature (the key has expired).
    Commit d884e11
Showing with 26,538 additions and 24,937 deletions.
  1. +6 −7 .coveragerc
  2. +11 −0 CHANGELOG.md
  3. +3 −4 README.rst
  4. +169 −0 UPGRADING.md
  5. +1 −0 docs/UPGRADING.md
  6. +6 −0 docs/datalabeling_v1beta1/services.rst
  7. +5 −0 docs/datalabeling_v1beta1/types.rst
  8. +0 −6 docs/gapic/v1beta1/api.rst
  9. +0 −5 docs/gapic/v1beta1/types.rst
  10. +13 −3 docs/index.rst
  11. +0 −29 google/cloud/datalabeling.py
  12. +460 −0 google/cloud/datalabeling/__init__.py
  13. +2 −0 google/cloud/datalabeling/py.typed
  14. +278 −14 google/cloud/datalabeling_v1beta1/__init__.py
  15. 0 google/cloud/datalabeling_v1beta1/gapic/__init__.py
  16. +0 −3,380 google/cloud/datalabeling_v1beta1/gapic/data_labeling_service_client.py
  17. +0 −212 google/cloud/datalabeling_v1beta1/gapic/data_labeling_service_client_config.py
  18. +0 −228 google/cloud/datalabeling_v1beta1/gapic/enums.py
  19. 0 google/cloud/datalabeling_v1beta1/gapic/transports/__init__.py
  20. +0 −580 google/cloud/datalabeling_v1beta1/gapic/transports/data_labeling_service_grpc_transport.py
  21. 0 google/cloud/datalabeling_v1beta1/proto/__init__.py
  22. +0 −2,583 google/cloud/datalabeling_v1beta1/proto/annotation_pb2.py
  23. +0 −3 google/cloud/datalabeling_v1beta1/proto/annotation_pb2_grpc.py
  24. +0 −278 google/cloud/datalabeling_v1beta1/proto/annotation_spec_set_pb2.py
  25. +0 −3 google/cloud/datalabeling_v1beta1/proto/annotation_spec_set_pb2_grpc.py
  26. +0 −5,074 google/cloud/datalabeling_v1beta1/proto/data_labeling_service_pb2.py
  27. +0 −1,571 google/cloud/datalabeling_v1beta1/proto/data_labeling_service_pb2_grpc.py
  28. +0 −448 google/cloud/datalabeling_v1beta1/proto/data_payloads_pb2.py
  29. +0 −3 google/cloud/datalabeling_v1beta1/proto/data_payloads_pb2_grpc.py
  30. +0 −2,278 google/cloud/datalabeling_v1beta1/proto/dataset_pb2.py
  31. +0 −3 google/cloud/datalabeling_v1beta1/proto/dataset_pb2_grpc.py
  32. +0 −1,036 google/cloud/datalabeling_v1beta1/proto/evaluation_job_pb2.py
  33. +0 −3 google/cloud/datalabeling_v1beta1/proto/evaluation_job_pb2_grpc.py
  34. +0 −1,280 google/cloud/datalabeling_v1beta1/proto/evaluation_pb2.py
  35. +0 −3 google/cloud/datalabeling_v1beta1/proto/evaluation_pb2_grpc.py
  36. +0 −1,326 google/cloud/datalabeling_v1beta1/proto/human_annotation_config_pb2.py
  37. +0 −3 google/cloud/datalabeling_v1beta1/proto/human_annotation_config_pb2_grpc.py
  38. +0 −414 google/cloud/datalabeling_v1beta1/proto/instruction_pb2.py
  39. +0 −3 google/cloud/datalabeling_v1beta1/proto/instruction_pb2_grpc.py
  40. +0 −1,918 google/cloud/datalabeling_v1beta1/proto/operations_pb2.py
  41. +0 −3 google/cloud/datalabeling_v1beta1/proto/operations_pb2_grpc.py
  42. +2 −0 google/cloud/datalabeling_v1beta1/py.typed
  43. +3 −11 google/{ → cloud/datalabeling_v1beta1/services}/__init__.py
  44. +9 −9 google/cloud/{ → datalabeling_v1beta1/services/data_labeling_service}/__init__.py
  45. +3,140 −0 google/cloud/datalabeling_v1beta1/services/data_labeling_service/async_client.py
  46. +3,227 −0 google/cloud/datalabeling_v1beta1/services/data_labeling_service/client.py
  47. +1,209 −0 google/cloud/datalabeling_v1beta1/services/data_labeling_service/pagers.py
  48. +38 −0 google/cloud/datalabeling_v1beta1/services/data_labeling_service/transports/__init__.py
  49. +800 −0 google/cloud/datalabeling_v1beta1/services/data_labeling_service/transports/base.py
  50. +1,201 −0 google/cloud/datalabeling_v1beta1/services/data_labeling_service/transports/grpc.py
  51. +1,231 −0 google/cloud/datalabeling_v1beta1/services/data_labeling_service/transports/grpc_asyncio.py
  52. +0 −80 google/cloud/datalabeling_v1beta1/types.py
  53. +301 −0 google/cloud/datalabeling_v1beta1/types/__init__.py
  54. +588 −0 google/cloud/datalabeling_v1beta1/types/annotation.py
  55. +92 −0 google/cloud/datalabeling_v1beta1/types/annotation_spec_set.py
  56. +1,168 −0 google/cloud/datalabeling_v1beta1/types/data_labeling_service.py
  57. +112 −0 google/cloud/datalabeling_v1beta1/types/data_payloads.py
  58. +555 −0 google/cloud/datalabeling_v1beta1/types/dataset.py
  59. +334 −0 google/cloud/datalabeling_v1beta1/types/evaluation.py
  60. +339 −0 google/cloud/datalabeling_v1beta1/types/evaluation_job.py
  61. +326 −0 google/cloud/datalabeling_v1beta1/types/human_annotation_config.py
  62. +115 −0 google/cloud/datalabeling_v1beta1/types/instruction.py
  63. +494 −0 google/cloud/datalabeling_v1beta1/types/operations.py
  64. +3 −0 mypy.ini
  65. +5 −3 noxfile.py
  66. +25 −28 samples/snippets/create_annotation_spec_set.py
  67. +5 −5 samples/snippets/create_annotation_spec_set_test.py
  68. +33 −40 samples/snippets/create_instruction.py
  69. +9 −8 samples/snippets/create_instruction_test.py
  70. +33 −29 samples/snippets/export_data.py
  71. +24 −20 samples/snippets/import_data.py
  72. +10 −7 samples/snippets/import_data_test.py
  73. +34 −30 samples/snippets/label_image.py
  74. +18 −16 samples/snippets/label_image_test.py
  75. +35 −30 samples/snippets/label_text.py
  76. +16 −14 samples/snippets/label_text_test.py
  77. +33 −28 samples/snippets/label_video.py
  78. +18 −16 samples/snippets/label_video_test.py
  79. +70 −61 samples/snippets/manage_dataset.py
  80. +10 −8 samples/snippets/manage_dataset_test.py
  81. +12 −14 samples/snippets/noxfile.py
  82. +8 −8 samples/snippets/testing_lib.py
  83. +211 −0 scripts/fixup_datalabeling_v1beta1_keywords.py
  84. +10 −7 setup.py
  85. +3 −3 synth.metadata
  86. +7 −49 synth.py
  87. +1 −0 tests/unit/gapic/datalabeling_v1beta1/__init__.py
  88. +9,667 −0 tests/unit/gapic/datalabeling_v1beta1/test_data_labeling_service.py
  89. +0 −1,712 tests/unit/gapic/v1beta1/test_data_labeling_service_client_v1beta1.py
13 changes: 6 additions & 7 deletions .coveragerc
@@ -21,15 +21,14 @@ branch = True
[report]
fail_under = 100
show_missing = True
omit = google/cloud/datalabeling/__init__.py
exclude_lines =
# Re-enable the standard pragma
pragma: NO COVER
# Ignore debug-only repr
def __repr__
# Ignore abstract methods
raise NotImplementedError
omit =
*/gapic/*.py
*/proto/*.py
*/core/*.py
*/site-packages/*.py
# Ignore pkg_resources exceptions.
# This is added at the module level as a safeguard for if someone
# generates the code and tries to run it without pip installing. This
# makes it virtually impossible to test properly.
except pkg_resources.DistributionNotFound
11 changes: 11 additions & 0 deletions CHANGELOG.md
@@ -4,6 +4,17 @@

[1]: https://pypi.org/project/google-cloud-datalabeling/#history

## [1.0.0](https://www.github.com/googleapis/python-datalabeling/compare/v0.4.1...v1.0.0) (2020-08-12)


### ⚠ BREAKING CHANGES

* migrate to use microgen (#34)

### Features

* migrate to use microgen ([#34](https://www.github.com/googleapis/python-datalabeling/issues/34)) ([465eb36](https://www.github.com/googleapis/python-datalabeling/commit/465eb361d39d08029f30b36c769252c9f83e7949))

### [0.4.1](https://www.github.com/googleapis/python-datalabeling/compare/v0.4.0...v0.4.1) (2020-08-07)


7 changes: 3 additions & 4 deletions README.rst
@@ -44,14 +44,13 @@ dependencies.

Supported Python Versions
^^^^^^^^^^^^^^^^^^^^^^^^^

Python >= 3.5

Python >= 3.6

Deprecated Python Versions
^^^^^^^^^^^^^^^^^^^^^^^^^^
Python == 2.7.

Python == 2.7. Python 2.7 support will be removed on January 1, 2020.
The last version of this library compatible with Python 2.7 is google-cloud-datalabeling==0.4.1.


Mac/Linux
169 changes: 169 additions & 0 deletions UPGRADING.md
@@ -0,0 +1,169 @@
# 1.0.0 Migration Guide

The 1.0 release of the `google-cloud-datalabeling` client is a significant upgrade based on a [next-gen code generator](https://github.com/googleapis/gapic-generator-python), and includes substantial interface changes. Existing code written for earlier versions of this library will likely require updates to use this version. This document describes the changes that have been made, and what you need to do to update your usage.

If you experience issues or have questions, please file an [issue](https://github.com/googleapis/python-datalabeling/issues).

## Supported Python Versions

> **WARNING**: Breaking change
The 1.0.0 release requires Python 3.6+.


## Method Calls

> **WARNING**: Breaking change
Methods expect request objects. We provide a script that will convert most common use cases.

* Install the library

```sh
python3 -m pip install google-cloud-datalabeling
```

* The script `fixup_datalabeling_v1beta1_keywords.py` is shipped with the library. It expects an input directory (with the code to convert) and an empty destination directory.

```sh
$ fixup_datalabeling_v1beta1_keywords.py --input-directory .samples/ --output-directory samples/
```

**Before:**
```py
from google.cloud import datalabeling

client = datalabeling.DataLabelingServiceClient()

datasets = client.list_datasets(parent="projects/project")
```


**After:**
```py
from google.cloud import datalabeling

client = datalabeling.DataLabelingServiceClient()

datasets = client.list_datasets(request={"parent": "projects/project"})
```

### More Details

In `google-cloud-datalabeling<1.0.0`, parameters required by the API were positional parameters and optional parameters were keyword parameters.

**Before:**
```py
def create_dataset(
self,
parent,
dataset,
retry=google.api_core.gapic_v1.method.DEFAULT,
timeout=google.api_core.gapic_v1.method.DEFAULT,
metadata=None,
):
```

In the 1.0.0 release, all methods have a single positional parameter `request`. Method docstrings indicate whether a parameter is required or optional.

Some methods have additional keyword-only parameters. The available parameters depend on the [`google.api.method_signature` annotation](https://github.com/googleapis/googleapis/blob/master/google/cloud/datalabeling/v1beta1/data_labeling_service.proto#L48) specified by the API producer.


**After:**
```py
def create_dataset(
self,
request: data_labeling_service.CreateDatasetRequest = None,
*,
parent: str = None,
dataset: gcd_dataset.Dataset = None,
retry: retries.Retry = gapic_v1.method.DEFAULT,
timeout: float = None,
metadata: Sequence[Tuple[str, str]] = (),
) -> gcd_dataset.Dataset:
```

> **NOTE:** The `request` parameter and flattened keyword parameters for the API are mutually exclusive.
> Passing both will result in an error.

Both of these calls are valid:

```py
response = client.create_dataset(
request={
"parent": parent,
"dataset": dataset
}
)
```

```py
response = client.create_dataset(
parent=parent,
dataset=dataset
)
```

This call is invalid because it mixes `request` with a keyword argument `dataset`. Executing this code
will result in an error.

```py
response = client.create_dataset(
request={
"parent": parent
},
dataset=dataset
)
```



## Enums and Types


> **WARNING**: Breaking change
The submodules `enums` and `types` have been removed.

**Before:**
```py
from google.cloud import datalabeling

data_type = datalabeling.enums.DataType.IMAGE
dataset = datalabeling.types.Dataset(display_name="name")
```


**After:**
```py
from google.cloud import datalabeling

data_type = datalabeling.DataType.IMAGE
dataset = datalabeling.Dataset(display_name="name")
```

## Path Helper Methods
The following path helper methods have been removed. Please construct the paths manually.

```py
project="project"
dataset="dataset"
annotated_dataset="annotated_dataset"
annotation_spec_set="annotation_spec_set"
data_item="data_item"
evaluation="evaluation"
evaluation_job="evaluation_job"
example="example"
instruction="instruction"

annotated_dataset_path = f'projects/{project}/datasets/{dataset}/annotatedDatasets/{annotated_dataset}'
annotation_spec_set_path = f'projects/{project}/annotationSpecSets/{annotation_spec_set}'
data_item_path=f'projects/{project}/datasets/{dataset}/dataItems/{data_item}'
dataset_path=f'projects/{project}/datasets/{dataset}'
evaluation_path=f'projects/{project}/datasets/{dataset}/evaluations/{evaluation}'
evaluation_job_path=f'projects/{project}/evaluationJobs/{evaluation_job}'
example_path=f'projects/{project}/datasets/{dataset}/annotatedDatasets/{annotated_dataset}/examples/{example}'
instruction_path=f'projects/{project}/instructions/{instruction}'
project_path=f'projects/{project}'
```
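
For example, a manually built path can be passed wherever the old helper's return value was used, such as the `name` of a request (a minimal sketch, assuming the request-object calling style shown above and placeholder IDs):

```py
from google.cloud import datalabeling

client = datalabeling.DataLabelingServiceClient()

project = "project"
dataset_id = "dataset"
dataset_path = f"projects/{project}/datasets/{dataset_id}"

dataset = client.get_dataset(request={"name": dataset_path})
```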
1 change: 1 addition & 0 deletions docs/UPGRADING.md
6 changes: 6 additions & 0 deletions docs/datalabeling_v1beta1/services.rst
@@ -0,0 +1,6 @@
Services for Google Cloud Datalabeling v1beta1 API
==================================================

.. automodule:: google.cloud.datalabeling_v1beta1.services.data_labeling_service
:members:
:inherited-members:
5 changes: 5 additions & 0 deletions docs/datalabeling_v1beta1/types.rst
@@ -0,0 +1,5 @@
Types for Google Cloud Datalabeling v1beta1 API
===============================================

.. automodule:: google.cloud.datalabeling_v1beta1.types
:members:
6 changes: 0 additions & 6 deletions docs/gapic/v1beta1/api.rst

This file was deleted.

5 changes: 0 additions & 5 deletions docs/gapic/v1beta1/types.rst

This file was deleted.

16 changes: 13 additions & 3 deletions docs/index.rst
@@ -7,9 +7,19 @@ Api Reference
.. toctree::
:maxdepth: 2

gapic/v1beta1/api
gapic/v1beta1/types

datalabeling_v1beta1/services
datalabeling_v1beta1/types

Migration Guide
---------------

See the guide below for instructions on migrating to the 1.x release of this library.

.. toctree::
:maxdepth: 2

UPGRADING

Changelog
---------

29 changes: 0 additions & 29 deletions google/cloud/datalabeling.py

This file was deleted.

460 changes: 460 additions & 0 deletions google/cloud/datalabeling/__init__.py

Large diffs are not rendered by default.

2 changes: 2 additions & 0 deletions google/cloud/datalabeling/py.typed
@@ -0,0 +1,2 @@
# Marker file for PEP 561.
# The google-cloud-datalabeling package uses inline types.
292 changes: 278 additions & 14 deletions google/cloud/datalabeling_v1beta1/__init__.py
@@ -1,29 +1,293 @@
# -*- coding: utf-8 -*-
#
# Copyright 2019 Google LLC

# Copyright 2020 Google LLC
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# https://www.apache.org/licenses/LICENSE-2.0
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
#

from __future__ import absolute_import

from google.cloud.datalabeling_v1beta1 import types
from google.cloud.datalabeling_v1beta1.gapic import data_labeling_service_client
from google.cloud.datalabeling_v1beta1.gapic import enums


class DataLabelingServiceClient(data_labeling_service_client.DataLabelingServiceClient):
__doc__ = data_labeling_service_client.DataLabelingServiceClient.__doc__
enums = enums
from .services.data_labeling_service import DataLabelingServiceClient
from .types.annotation import Annotation
from .types.annotation import AnnotationMetadata
from .types.annotation import AnnotationSentiment
from .types.annotation import AnnotationSource
from .types.annotation import AnnotationType
from .types.annotation import AnnotationValue
from .types.annotation import BoundingPoly
from .types.annotation import ImageBoundingPolyAnnotation
from .types.annotation import ImageClassificationAnnotation
from .types.annotation import ImagePolylineAnnotation
from .types.annotation import ImageSegmentationAnnotation
from .types.annotation import NormalizedBoundingPoly
from .types.annotation import NormalizedPolyline
from .types.annotation import NormalizedVertex
from .types.annotation import ObjectTrackingFrame
from .types.annotation import OperatorMetadata
from .types.annotation import Polyline
from .types.annotation import SequentialSegment
from .types.annotation import TextClassificationAnnotation
from .types.annotation import TextEntityExtractionAnnotation
from .types.annotation import TimeSegment
from .types.annotation import Vertex
from .types.annotation import VideoClassificationAnnotation
from .types.annotation import VideoEventAnnotation
from .types.annotation import VideoObjectTrackingAnnotation
from .types.annotation_spec_set import AnnotationSpec
from .types.annotation_spec_set import AnnotationSpecSet
from .types.data_labeling_service import CreateAnnotationSpecSetRequest
from .types.data_labeling_service import CreateDatasetRequest
from .types.data_labeling_service import CreateEvaluationJobRequest
from .types.data_labeling_service import CreateInstructionRequest
from .types.data_labeling_service import DeleteAnnotatedDatasetRequest
from .types.data_labeling_service import DeleteAnnotationSpecSetRequest
from .types.data_labeling_service import DeleteDatasetRequest
from .types.data_labeling_service import DeleteEvaluationJobRequest
from .types.data_labeling_service import DeleteInstructionRequest
from .types.data_labeling_service import ExportDataRequest
from .types.data_labeling_service import GetAnnotatedDatasetRequest
from .types.data_labeling_service import GetAnnotationSpecSetRequest
from .types.data_labeling_service import GetDataItemRequest
from .types.data_labeling_service import GetDatasetRequest
from .types.data_labeling_service import GetEvaluationJobRequest
from .types.data_labeling_service import GetEvaluationRequest
from .types.data_labeling_service import GetExampleRequest
from .types.data_labeling_service import GetInstructionRequest
from .types.data_labeling_service import ImportDataRequest
from .types.data_labeling_service import LabelImageRequest
from .types.data_labeling_service import LabelTextRequest
from .types.data_labeling_service import LabelVideoRequest
from .types.data_labeling_service import ListAnnotatedDatasetsRequest
from .types.data_labeling_service import ListAnnotatedDatasetsResponse
from .types.data_labeling_service import ListAnnotationSpecSetsRequest
from .types.data_labeling_service import ListAnnotationSpecSetsResponse
from .types.data_labeling_service import ListDataItemsRequest
from .types.data_labeling_service import ListDataItemsResponse
from .types.data_labeling_service import ListDatasetsRequest
from .types.data_labeling_service import ListDatasetsResponse
from .types.data_labeling_service import ListEvaluationJobsRequest
from .types.data_labeling_service import ListEvaluationJobsResponse
from .types.data_labeling_service import ListExamplesRequest
from .types.data_labeling_service import ListExamplesResponse
from .types.data_labeling_service import ListInstructionsRequest
from .types.data_labeling_service import ListInstructionsResponse
from .types.data_labeling_service import PauseEvaluationJobRequest
from .types.data_labeling_service import ResumeEvaluationJobRequest
from .types.data_labeling_service import SearchEvaluationsRequest
from .types.data_labeling_service import SearchEvaluationsResponse
from .types.data_labeling_service import SearchExampleComparisonsRequest
from .types.data_labeling_service import SearchExampleComparisonsResponse
from .types.data_labeling_service import UpdateEvaluationJobRequest
from .types.data_payloads import ImagePayload
from .types.data_payloads import TextPayload
from .types.data_payloads import VideoPayload
from .types.data_payloads import VideoThumbnail
from .types.dataset import AnnotatedDataset
from .types.dataset import AnnotatedDatasetMetadata
from .types.dataset import BigQuerySource
from .types.dataset import ClassificationMetadata
from .types.dataset import DataItem
from .types.dataset import DataType
from .types.dataset import Dataset
from .types.dataset import Example
from .types.dataset import GcsDestination
from .types.dataset import GcsFolderDestination
from .types.dataset import GcsSource
from .types.dataset import InputConfig
from .types.dataset import LabelStats
from .types.dataset import OutputConfig
from .types.dataset import TextMetadata
from .types.evaluation import BoundingBoxEvaluationOptions
from .types.evaluation import ClassificationMetrics
from .types.evaluation import ConfusionMatrix
from .types.evaluation import Evaluation
from .types.evaluation import EvaluationConfig
from .types.evaluation import EvaluationMetrics
from .types.evaluation import ObjectDetectionMetrics
from .types.evaluation import PrCurve
from .types.evaluation_job import Attempt
from .types.evaluation_job import EvaluationJob
from .types.evaluation_job import EvaluationJobAlertConfig
from .types.evaluation_job import EvaluationJobConfig
from .types.human_annotation_config import BoundingPolyConfig
from .types.human_annotation_config import EventConfig
from .types.human_annotation_config import HumanAnnotationConfig
from .types.human_annotation_config import ImageClassificationConfig
from .types.human_annotation_config import ObjectDetectionConfig
from .types.human_annotation_config import ObjectTrackingConfig
from .types.human_annotation_config import PolylineConfig
from .types.human_annotation_config import SegmentationConfig
from .types.human_annotation_config import SentimentConfig
from .types.human_annotation_config import StringAggregationType
from .types.human_annotation_config import TextClassificationConfig
from .types.human_annotation_config import TextEntityExtractionConfig
from .types.human_annotation_config import VideoClassificationConfig
from .types.instruction import CsvInstruction
from .types.instruction import Instruction
from .types.instruction import PdfInstruction
from .types.operations import CreateInstructionMetadata
from .types.operations import ExportDataOperationMetadata
from .types.operations import ExportDataOperationResponse
from .types.operations import ImportDataOperationMetadata
from .types.operations import ImportDataOperationResponse
from .types.operations import LabelImageBoundingBoxOperationMetadata
from .types.operations import LabelImageBoundingPolyOperationMetadata
from .types.operations import LabelImageClassificationOperationMetadata
from .types.operations import LabelImageOrientedBoundingBoxOperationMetadata
from .types.operations import LabelImagePolylineOperationMetadata
from .types.operations import LabelImageSegmentationOperationMetadata
from .types.operations import LabelOperationMetadata
from .types.operations import LabelTextClassificationOperationMetadata
from .types.operations import LabelTextEntityExtractionOperationMetadata
from .types.operations import LabelVideoClassificationOperationMetadata
from .types.operations import LabelVideoEventOperationMetadata
from .types.operations import LabelVideoObjectDetectionOperationMetadata
from .types.operations import LabelVideoObjectTrackingOperationMetadata


__all__ = ("enums", "types", "DataLabelingServiceClient")
__all__ = (
"AnnotatedDataset",
"AnnotatedDatasetMetadata",
"Annotation",
"AnnotationMetadata",
"AnnotationSentiment",
"AnnotationSource",
"AnnotationSpec",
"AnnotationSpecSet",
"AnnotationType",
"AnnotationValue",
"Attempt",
"BigQuerySource",
"BoundingBoxEvaluationOptions",
"BoundingPoly",
"BoundingPolyConfig",
"ClassificationMetadata",
"ClassificationMetrics",
"ConfusionMatrix",
"CreateAnnotationSpecSetRequest",
"CreateDatasetRequest",
"CreateEvaluationJobRequest",
"CreateInstructionMetadata",
"CreateInstructionRequest",
"CsvInstruction",
"DataItem",
"DataType",
"Dataset",
"DeleteAnnotatedDatasetRequest",
"DeleteAnnotationSpecSetRequest",
"DeleteDatasetRequest",
"DeleteEvaluationJobRequest",
"DeleteInstructionRequest",
"Evaluation",
"EvaluationConfig",
"EvaluationJob",
"EvaluationJobAlertConfig",
"EvaluationJobConfig",
"EvaluationMetrics",
"EventConfig",
"Example",
"ExportDataOperationMetadata",
"ExportDataOperationResponse",
"ExportDataRequest",
"GcsDestination",
"GcsFolderDestination",
"GcsSource",
"GetAnnotatedDatasetRequest",
"GetAnnotationSpecSetRequest",
"GetDataItemRequest",
"GetDatasetRequest",
"GetEvaluationJobRequest",
"GetEvaluationRequest",
"GetExampleRequest",
"GetInstructionRequest",
"HumanAnnotationConfig",
"ImageBoundingPolyAnnotation",
"ImageClassificationAnnotation",
"ImageClassificationConfig",
"ImagePayload",
"ImagePolylineAnnotation",
"ImageSegmentationAnnotation",
"ImportDataOperationMetadata",
"ImportDataOperationResponse",
"ImportDataRequest",
"InputConfig",
"Instruction",
"LabelImageBoundingBoxOperationMetadata",
"LabelImageBoundingPolyOperationMetadata",
"LabelImageClassificationOperationMetadata",
"LabelImageOrientedBoundingBoxOperationMetadata",
"LabelImagePolylineOperationMetadata",
"LabelImageRequest",
"LabelImageSegmentationOperationMetadata",
"LabelOperationMetadata",
"LabelStats",
"LabelTextClassificationOperationMetadata",
"LabelTextEntityExtractionOperationMetadata",
"LabelTextRequest",
"LabelVideoClassificationOperationMetadata",
"LabelVideoEventOperationMetadata",
"LabelVideoObjectDetectionOperationMetadata",
"LabelVideoObjectTrackingOperationMetadata",
"LabelVideoRequest",
"ListAnnotatedDatasetsRequest",
"ListAnnotatedDatasetsResponse",
"ListAnnotationSpecSetsRequest",
"ListAnnotationSpecSetsResponse",
"ListDataItemsRequest",
"ListDataItemsResponse",
"ListDatasetsRequest",
"ListDatasetsResponse",
"ListEvaluationJobsRequest",
"ListEvaluationJobsResponse",
"ListExamplesRequest",
"ListExamplesResponse",
"ListInstructionsRequest",
"ListInstructionsResponse",
"NormalizedBoundingPoly",
"NormalizedPolyline",
"NormalizedVertex",
"ObjectDetectionConfig",
"ObjectDetectionMetrics",
"ObjectTrackingConfig",
"ObjectTrackingFrame",
"OperatorMetadata",
"OutputConfig",
"PauseEvaluationJobRequest",
"PdfInstruction",
"Polyline",
"PolylineConfig",
"PrCurve",
"ResumeEvaluationJobRequest",
"SearchEvaluationsRequest",
"SearchEvaluationsResponse",
"SearchExampleComparisonsRequest",
"SearchExampleComparisonsResponse",
"SegmentationConfig",
"SentimentConfig",
"SequentialSegment",
"StringAggregationType",
"TextClassificationAnnotation",
"TextClassificationConfig",
"TextEntityExtractionAnnotation",
"TextEntityExtractionConfig",
"TextMetadata",
"TextPayload",
"TimeSegment",
"UpdateEvaluationJobRequest",
"Vertex",
"VideoClassificationAnnotation",
"VideoClassificationConfig",
"VideoEventAnnotation",
"VideoObjectTrackingAnnotation",
"VideoPayload",
"VideoThumbnail",
"DataLabelingServiceClient",
)
Empty file.
3,380 changes: 0 additions & 3,380 deletions google/cloud/datalabeling_v1beta1/gapic/data_labeling_service_client.py

This file was deleted.

212 changes: 0 additions & 212 deletions google/cloud/datalabeling_v1beta1/gapic/data_labeling_service_client_config.py

This file was deleted.

228 changes: 0 additions & 228 deletions google/cloud/datalabeling_v1beta1/gapic/enums.py

This file was deleted.

Empty file.

580 changes: 0 additions & 580 deletions google/cloud/datalabeling_v1beta1/gapic/transports/data_labeling_service_grpc_transport.py

This file was deleted.

Empty file.
2,583 changes: 0 additions & 2,583 deletions google/cloud/datalabeling_v1beta1/proto/annotation_pb2.py

This file was deleted.

3 changes: 0 additions & 3 deletions google/cloud/datalabeling_v1beta1/proto/annotation_pb2_grpc.py

This file was deleted.

278 changes: 0 additions & 278 deletions google/cloud/datalabeling_v1beta1/proto/annotation_spec_set_pb2.py

This file was deleted.

3 changes: 0 additions & 3 deletions google/cloud/datalabeling_v1beta1/proto/annotation_spec_set_pb2_grpc.py

This file was deleted.

5,074 changes: 0 additions & 5,074 deletions google/cloud/datalabeling_v1beta1/proto/data_labeling_service_pb2.py

This file was deleted.

1,571 changes: 0 additions & 1,571 deletions google/cloud/datalabeling_v1beta1/proto/data_labeling_service_pb2_grpc.py

This file was deleted.

448 changes: 0 additions & 448 deletions google/cloud/datalabeling_v1beta1/proto/data_payloads_pb2.py

This file was deleted.

3 changes: 0 additions & 3 deletions google/cloud/datalabeling_v1beta1/proto/data_payloads_pb2_grpc.py

This file was deleted.

2,278 changes: 0 additions & 2,278 deletions google/cloud/datalabeling_v1beta1/proto/dataset_pb2.py

This file was deleted.

3 changes: 0 additions & 3 deletions google/cloud/datalabeling_v1beta1/proto/dataset_pb2_grpc.py

This file was deleted.

1,036 changes: 0 additions & 1,036 deletions google/cloud/datalabeling_v1beta1/proto/evaluation_job_pb2.py

This file was deleted.

3 changes: 0 additions & 3 deletions google/cloud/datalabeling_v1beta1/proto/evaluation_job_pb2_grpc.py

This file was deleted.

1,280 changes: 0 additions & 1,280 deletions google/cloud/datalabeling_v1beta1/proto/evaluation_pb2.py

This file was deleted.

3 changes: 0 additions & 3 deletions google/cloud/datalabeling_v1beta1/proto/evaluation_pb2_grpc.py

This file was deleted.

1,326 changes: 0 additions & 1,326 deletions google/cloud/datalabeling_v1beta1/proto/human_annotation_config_pb2.py

This file was deleted.

3 changes: 0 additions & 3 deletions google/cloud/datalabeling_v1beta1/proto/human_annotation_config_pb2_grpc.py

This file was deleted.

414 changes: 0 additions & 414 deletions google/cloud/datalabeling_v1beta1/proto/instruction_pb2.py

This file was deleted.

3 changes: 0 additions & 3 deletions google/cloud/datalabeling_v1beta1/proto/instruction_pb2_grpc.py

This file was deleted.

1,918 changes: 0 additions & 1,918 deletions google/cloud/datalabeling_v1beta1/proto/operations_pb2.py

This file was deleted.

3 changes: 0 additions & 3 deletions google/cloud/datalabeling_v1beta1/proto/operations_pb2_grpc.py

This file was deleted.

2 changes: 2 additions & 0 deletions google/cloud/datalabeling_v1beta1/py.typed
@@ -0,0 +1,2 @@
# Marker file for PEP 561.
# The google-cloud-datalabeling package uses inline types.
14 changes: 3 additions & 11 deletions google/{ → cloud/datalabeling_v1beta1/services}/__init__.py
@@ -1,24 +1,16 @@
# -*- coding: utf-8 -*-
#

# Copyright 2020 Google LLC
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# https://www.apache.org/licenses/LICENSE-2.0
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.

try:
import pkg_resources

pkg_resources.declare_namespace(__name__)
except ImportError:
import pkgutil

__path__ = pkgutil.extend_path(__path__, __name__)
#
18 changes: 9 additions & 9 deletions google/cloud/{ → datalabeling_v1beta1/services/data_labeling_service}/__init__.py
@@ -1,24 +1,24 @@
# -*- coding: utf-8 -*-
#

# Copyright 2020 Google LLC
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# https://www.apache.org/licenses/LICENSE-2.0
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
#

try:
import pkg_resources

pkg_resources.declare_namespace(__name__)
except ImportError:
import pkgutil
from .client import DataLabelingServiceClient
from .async_client import DataLabelingServiceAsyncClient

__path__ = pkgutil.extend_path(__path__, __name__)
__all__ = (
"DataLabelingServiceClient",
"DataLabelingServiceAsyncClient",
)
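
The new asynchronous surface mirrors the synchronous client. A minimal sketch of using it, assuming application-default credentials and the request-object style described in UPGRADING.md; the parent value is a placeholder:

```py
import asyncio

from google.cloud.datalabeling_v1beta1.services.data_labeling_service import (
    DataLabelingServiceAsyncClient,
)


async def main():
    client = DataLabelingServiceAsyncClient()
    # List datasets under a placeholder project parent.
    pager = await client.list_datasets(request={"parent": "projects/project"})
    async for dataset in pager:
        print(dataset.name)


asyncio.run(main())
```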

3,140 changes: 3,140 additions & 0 deletions google/cloud/datalabeling_v1beta1/services/data_labeling_service/async_client.py

Large diffs are not rendered by default.

3,227 changes: 3,227 additions & 0 deletions google/cloud/datalabeling_v1beta1/services/data_labeling_service/client.py

Large diffs are not rendered by default.

1,209 changes: 1,209 additions & 0 deletions google/cloud/datalabeling_v1beta1/services/data_labeling_service/pagers.py

Large diffs are not rendered by default.

38 changes: 38 additions & 0 deletions google/cloud/datalabeling_v1beta1/services/data_labeling_service/transports/__init__.py
@@ -0,0 +1,38 @@
# -*- coding: utf-8 -*-

# Copyright 2020 Google LLC
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
#

from collections import OrderedDict
from typing import Dict, Type

from .base import DataLabelingServiceTransport
from .grpc import DataLabelingServiceGrpcTransport
from .grpc_asyncio import DataLabelingServiceGrpcAsyncIOTransport


# Compile a registry of transports.
_transport_registry = (
OrderedDict()
) # type: Dict[str, Type[DataLabelingServiceTransport]]
_transport_registry["grpc"] = DataLabelingServiceGrpcTransport
_transport_registry["grpc_asyncio"] = DataLabelingServiceGrpcAsyncIOTransport


__all__ = (
"DataLabelingServiceTransport",
"DataLabelingServiceGrpcTransport",
"DataLabelingServiceGrpcAsyncIOTransport",
)
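
The registry above keys the available gRPC transports by name. A minimal sketch of choosing one explicitly, assuming the client accepts a `transport` argument (a registry key or a transport instance), as is conventional for these generated clients:

```py
from google.cloud import datalabeling_v1beta1

# gRPC is the default; passing the registry key just makes the choice explicit.
client = datalabeling_v1beta1.DataLabelingServiceClient(transport="grpc")

# The async client pairs with the "grpc_asyncio" entry
# (DataLabelingServiceGrpcAsyncIOTransport).
```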

800 changes: 800 additions & 0 deletions google/cloud/datalabeling_v1beta1/services/data_labeling_service/transports/base.py

Large diffs are not rendered by default.

1,201 changes: 1,201 additions & 0 deletions google/cloud/datalabeling_v1beta1/services/data_labeling_service/transports/grpc.py

Large diffs are not rendered by default.

1,231 changes: 1,231 additions & 0 deletions google/cloud/datalabeling_v1beta1/services/data_labeling_service/transports/grpc_asyncio.py

Large diffs are not rendered by default.

80 changes: 0 additions & 80 deletions google/cloud/datalabeling_v1beta1/types.py

This file was deleted.

301 changes: 301 additions & 0 deletions google/cloud/datalabeling_v1beta1/types/__init__.py
@@ -0,0 +1,301 @@
# -*- coding: utf-8 -*-

# Copyright 2020 Google LLC
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
#

from .annotation_spec_set import (
AnnotationSpecSet,
AnnotationSpec,
)
from .annotation import (
Annotation,
AnnotationValue,
ImageClassificationAnnotation,
Vertex,
NormalizedVertex,
BoundingPoly,
NormalizedBoundingPoly,
ImageBoundingPolyAnnotation,
Polyline,
NormalizedPolyline,
ImagePolylineAnnotation,
ImageSegmentationAnnotation,
TextClassificationAnnotation,
TextEntityExtractionAnnotation,
SequentialSegment,
TimeSegment,
VideoClassificationAnnotation,
ObjectTrackingFrame,
VideoObjectTrackingAnnotation,
VideoEventAnnotation,
AnnotationMetadata,
OperatorMetadata,
)
from .data_payloads import (
ImagePayload,
TextPayload,
VideoThumbnail,
VideoPayload,
)
from .human_annotation_config import (
HumanAnnotationConfig,
ImageClassificationConfig,
BoundingPolyConfig,
PolylineConfig,
SegmentationConfig,
VideoClassificationConfig,
ObjectDetectionConfig,
ObjectTrackingConfig,
EventConfig,
TextClassificationConfig,
SentimentConfig,
TextEntityExtractionConfig,
)
from .dataset import (
Dataset,
InputConfig,
TextMetadata,
ClassificationMetadata,
GcsSource,
BigQuerySource,
OutputConfig,
GcsDestination,
GcsFolderDestination,
DataItem,
AnnotatedDataset,
LabelStats,
AnnotatedDatasetMetadata,
Example,
)
from .evaluation import (
Evaluation,
EvaluationConfig,
BoundingBoxEvaluationOptions,
EvaluationMetrics,
ClassificationMetrics,
ObjectDetectionMetrics,
PrCurve,
ConfusionMatrix,
)
from .evaluation_job import (
EvaluationJob,
EvaluationJobConfig,
EvaluationJobAlertConfig,
Attempt,
)
from .instruction import (
Instruction,
CsvInstruction,
PdfInstruction,
)
from .data_labeling_service import (
CreateDatasetRequest,
GetDatasetRequest,
ListDatasetsRequest,
ListDatasetsResponse,
DeleteDatasetRequest,
ImportDataRequest,
ExportDataRequest,
GetDataItemRequest,
ListDataItemsRequest,
ListDataItemsResponse,
GetAnnotatedDatasetRequest,
ListAnnotatedDatasetsRequest,
ListAnnotatedDatasetsResponse,
DeleteAnnotatedDatasetRequest,
LabelImageRequest,
LabelVideoRequest,
LabelTextRequest,
GetExampleRequest,
ListExamplesRequest,
ListExamplesResponse,
CreateAnnotationSpecSetRequest,
GetAnnotationSpecSetRequest,
ListAnnotationSpecSetsRequest,
ListAnnotationSpecSetsResponse,
DeleteAnnotationSpecSetRequest,
CreateInstructionRequest,
GetInstructionRequest,
DeleteInstructionRequest,
ListInstructionsRequest,
ListInstructionsResponse,
GetEvaluationRequest,
SearchEvaluationsRequest,
SearchEvaluationsResponse,
SearchExampleComparisonsRequest,
SearchExampleComparisonsResponse,
CreateEvaluationJobRequest,
UpdateEvaluationJobRequest,
GetEvaluationJobRequest,
PauseEvaluationJobRequest,
ResumeEvaluationJobRequest,
DeleteEvaluationJobRequest,
ListEvaluationJobsRequest,
ListEvaluationJobsResponse,
)
from .operations import (
ImportDataOperationResponse,
ExportDataOperationResponse,
ImportDataOperationMetadata,
ExportDataOperationMetadata,
LabelOperationMetadata,
LabelImageClassificationOperationMetadata,
LabelImageBoundingBoxOperationMetadata,
LabelImageOrientedBoundingBoxOperationMetadata,
LabelImageBoundingPolyOperationMetadata,
LabelImagePolylineOperationMetadata,
LabelImageSegmentationOperationMetadata,
LabelVideoClassificationOperationMetadata,
LabelVideoObjectDetectionOperationMetadata,
LabelVideoObjectTrackingOperationMetadata,
LabelVideoEventOperationMetadata,
LabelTextClassificationOperationMetadata,
LabelTextEntityExtractionOperationMetadata,
CreateInstructionMetadata,
)


__all__ = (
"AnnotationSpecSet",
"AnnotationSpec",
"Annotation",
"AnnotationValue",
"ImageClassificationAnnotation",
"Vertex",
"NormalizedVertex",
"BoundingPoly",
"NormalizedBoundingPoly",
"ImageBoundingPolyAnnotation",
"Polyline",
"NormalizedPolyline",
"ImagePolylineAnnotation",
"ImageSegmentationAnnotation",
"TextClassificationAnnotation",
"TextEntityExtractionAnnotation",
"SequentialSegment",
"TimeSegment",
"VideoClassificationAnnotation",
"ObjectTrackingFrame",
"VideoObjectTrackingAnnotation",
"VideoEventAnnotation",
"AnnotationMetadata",
"OperatorMetadata",
"ImagePayload",
"TextPayload",
"VideoThumbnail",
"VideoPayload",
"HumanAnnotationConfig",
"ImageClassificationConfig",
"BoundingPolyConfig",
"PolylineConfig",
"SegmentationConfig",
"VideoClassificationConfig",
"ObjectDetectionConfig",
"ObjectTrackingConfig",
"EventConfig",
"TextClassificationConfig",
"SentimentConfig",
"TextEntityExtractionConfig",
"Dataset",
"InputConfig",
"TextMetadata",
"ClassificationMetadata",
"GcsSource",
"BigQuerySource",
"OutputConfig",
"GcsDestination",
"GcsFolderDestination",
"DataItem",
"AnnotatedDataset",
"LabelStats",
"AnnotatedDatasetMetadata",
"Example",
"Evaluation",
"EvaluationConfig",
"BoundingBoxEvaluationOptions",
"EvaluationMetrics",
"ClassificationMetrics",
"ObjectDetectionMetrics",
"PrCurve",
"ConfusionMatrix",
"EvaluationJob",
"EvaluationJobConfig",
"EvaluationJobAlertConfig",
"Attempt",
"Instruction",
"CsvInstruction",
"PdfInstruction",
"CreateDatasetRequest",
"GetDatasetRequest",
"ListDatasetsRequest",
"ListDatasetsResponse",
"DeleteDatasetRequest",
"ImportDataRequest",
"ExportDataRequest",
"GetDataItemRequest",
"ListDataItemsRequest",
"ListDataItemsResponse",
"GetAnnotatedDatasetRequest",
"ListAnnotatedDatasetsRequest",
"ListAnnotatedDatasetsResponse",
"DeleteAnnotatedDatasetRequest",
"LabelImageRequest",
"LabelVideoRequest",
"LabelTextRequest",
"GetExampleRequest",
"ListExamplesRequest",
"ListExamplesResponse",
"CreateAnnotationSpecSetRequest",
"GetAnnotationSpecSetRequest",
"ListAnnotationSpecSetsRequest",
"ListAnnotationSpecSetsResponse",
"DeleteAnnotationSpecSetRequest",
"CreateInstructionRequest",
"GetInstructionRequest",
"DeleteInstructionRequest",
"ListInstructionsRequest",
"ListInstructionsResponse",
"GetEvaluationRequest",
"SearchEvaluationsRequest",
"SearchEvaluationsResponse",
"SearchExampleComparisonsRequest",
"SearchExampleComparisonsResponse",
"CreateEvaluationJobRequest",
"UpdateEvaluationJobRequest",
"GetEvaluationJobRequest",
"PauseEvaluationJobRequest",
"ResumeEvaluationJobRequest",
"DeleteEvaluationJobRequest",
"ListEvaluationJobsRequest",
"ListEvaluationJobsResponse",
"ImportDataOperationResponse",
"ExportDataOperationResponse",
"ImportDataOperationMetadata",
"ExportDataOperationMetadata",
"LabelOperationMetadata",
"LabelImageClassificationOperationMetadata",
"LabelImageBoundingBoxOperationMetadata",
"LabelImageOrientedBoundingBoxOperationMetadata",
"LabelImageBoundingPolyOperationMetadata",
"LabelImagePolylineOperationMetadata",
"LabelImageSegmentationOperationMetadata",
"LabelVideoClassificationOperationMetadata",
"LabelVideoObjectDetectionOperationMetadata",
"LabelVideoObjectTrackingOperationMetadata",
"LabelVideoEventOperationMetadata",
"LabelTextClassificationOperationMetadata",
"LabelTextEntityExtractionOperationMetadata",
"CreateInstructionMetadata",
)
588 changes: 588 additions & 0 deletions google/cloud/datalabeling_v1beta1/types/annotation.py

Large diffs are not rendered by default.

92 changes: 92 additions & 0 deletions google/cloud/datalabeling_v1beta1/types/annotation_spec_set.py
@@ -0,0 +1,92 @@
# -*- coding: utf-8 -*-

# Copyright 2020 Google LLC
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
#

import proto # type: ignore


__protobuf__ = proto.module(
package="google.cloud.datalabeling.v1beta1",
manifest={"AnnotationSpecSet", "AnnotationSpec",},
)


class AnnotationSpecSet(proto.Message):
r"""An AnnotationSpecSet is a collection of label definitions.
For example, in image classification tasks, you define a set of
possible labels for images as an AnnotationSpecSet. An
AnnotationSpecSet is immutable upon creation.
Attributes:
name (str):
Output only. The AnnotationSpecSet resource name in the
following format:
"projects/{project_id}/annotationSpecSets/{annotation_spec_set_id}".
display_name (str):
Required. The display name for
AnnotationSpecSet that you define when you
create it. Maximum of 64 characters.
description (str):
Optional. User-provided description of the
annotation specification set. The description
can be up to 10,000 characters long.
annotation_specs (Sequence[~.annotation_spec_set.AnnotationSpec]):
Required. The array of AnnotationSpecs that
you define when you create the
AnnotationSpecSet. These are the possible labels
for the labeling task.
blocking_resources (Sequence[str]):
Output only. The names of any related
resources that are blocking changes to the
annotation spec set.
"""

name = proto.Field(proto.STRING, number=1)

display_name = proto.Field(proto.STRING, number=2)

description = proto.Field(proto.STRING, number=3)

annotation_specs = proto.RepeatedField(
proto.MESSAGE, number=4, message="AnnotationSpec",
)

blocking_resources = proto.RepeatedField(proto.STRING, number=5)


class AnnotationSpec(proto.Message):
r"""Container of information related to one possible annotation that can
be used in a labeling task. For example, an image classification
task where images are labeled as ``dog`` or ``cat`` must reference
an AnnotationSpec for ``dog`` and an AnnotationSpec for ``cat``.
Attributes:
display_name (str):
Required. The display name of the
AnnotationSpec. Maximum of 64 characters.
description (str):
Optional. User-provided description of the
annotation specification. The description can be
up to 10,000 characters long.
"""

display_name = proto.Field(proto.STRING, number=1)

description = proto.Field(proto.STRING, number=2)


__all__ = tuple(sorted(__protobuf__.manifest))
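
As a quick illustration of the two messages defined above (a minimal sketch using placeholder names, relying on the top-level re-exports added elsewhere in this diff):

```py
from google.cloud import datalabeling_v1beta1

spec_set = datalabeling_v1beta1.AnnotationSpecSet(
    display_name="animals",  # placeholder display name
    description="Possible labels for an image classification task",
    annotation_specs=[
        datalabeling_v1beta1.AnnotationSpec(display_name="dog"),
        datalabeling_v1beta1.AnnotationSpec(display_name="cat"),
    ],
)
print(spec_set.annotation_specs[0].display_name)  # -> "dog"
```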
1,168 changes: 1,168 additions & 0 deletions google/cloud/datalabeling_v1beta1/types/data_labeling_service.py

Large diffs are not rendered by default.

112 changes: 112 additions & 0 deletions google/cloud/datalabeling_v1beta1/types/data_payloads.py
@@ -0,0 +1,112 @@
# -*- coding: utf-8 -*-

# Copyright 2020 Google LLC
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
#

import proto # type: ignore


from google.protobuf import duration_pb2 as duration # type: ignore


__protobuf__ = proto.module(
package="google.cloud.datalabeling.v1beta1",
manifest={"ImagePayload", "TextPayload", "VideoThumbnail", "VideoPayload",},
)


class ImagePayload(proto.Message):
r"""Container of information about an image.
Attributes:
mime_type (str):
Image format.
image_thumbnail (bytes):
A byte string of a thumbnail image.
image_uri (str):
Image uri from the user bucket.
signed_uri (str):
Signed uri of the image file in the service
bucket.
"""

mime_type = proto.Field(proto.STRING, number=1)

image_thumbnail = proto.Field(proto.BYTES, number=2)

image_uri = proto.Field(proto.STRING, number=3)

signed_uri = proto.Field(proto.STRING, number=4)


class TextPayload(proto.Message):
r"""Container of information about a piece of text.
Attributes:
text_content (str):
Text content.
"""

text_content = proto.Field(proto.STRING, number=1)


class VideoThumbnail(proto.Message):
r"""Container of information of a video thumbnail.
Attributes:
thumbnail (bytes):
A byte string of the video frame.
time_offset (~.duration.Duration):
Time offset relative to the beginning of the
video, corresponding to the video frame where
the thumbnail has been extracted from.
"""

thumbnail = proto.Field(proto.BYTES, number=1)

time_offset = proto.Field(proto.MESSAGE, number=2, message=duration.Duration,)


class VideoPayload(proto.Message):
r"""Container of information of a video.
Attributes:
mime_type (str):
Video format.
video_uri (str):
Video uri from the user bucket.
video_thumbnails (Sequence[~.data_payloads.VideoThumbnail]):
The list of video thumbnails.
frame_rate (float):
FPS of the video.
signed_uri (str):
Signed uri of the video file in the service
bucket.
"""

mime_type = proto.Field(proto.STRING, number=1)

video_uri = proto.Field(proto.STRING, number=2)

video_thumbnails = proto.RepeatedField(
proto.MESSAGE, number=3, message=VideoThumbnail,
)

frame_rate = proto.Field(proto.FLOAT, number=4)

signed_uri = proto.Field(proto.STRING, number=5)


__all__ = tuple(sorted(__protobuf__.manifest))
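
These payload messages are plain proto-plus types, so they can be constructed and inspected directly (a minimal sketch with placeholder values; real payloads are normally populated by the service):

```py
from google.cloud import datalabeling_v1beta1

payload = datalabeling_v1beta1.ImagePayload(
    mime_type="image/jpeg",
    image_uri="gs://example-bucket/image.jpg",  # hypothetical Cloud Storage URI
)
print(payload.mime_type, payload.image_uri)
```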
555 changes: 555 additions & 0 deletions google/cloud/datalabeling_v1beta1/types/dataset.py

Large diffs are not rendered by default.

334 changes: 334 additions & 0 deletions google/cloud/datalabeling_v1beta1/types/evaluation.py
@@ -0,0 +1,334 @@
# -*- coding: utf-8 -*-

# Copyright 2020 Google LLC
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
#

import proto # type: ignore


from google.cloud.datalabeling_v1beta1.types import annotation
from google.cloud.datalabeling_v1beta1.types import annotation_spec_set
from google.protobuf import timestamp_pb2 as timestamp # type: ignore


__protobuf__ = proto.module(
package="google.cloud.datalabeling.v1beta1",
manifest={
"Evaluation",
"EvaluationConfig",
"BoundingBoxEvaluationOptions",
"EvaluationMetrics",
"ClassificationMetrics",
"ObjectDetectionMetrics",
"PrCurve",
"ConfusionMatrix",
},
)


class Evaluation(proto.Message):
r"""Describes an evaluation between a machine learning model's
predictions and ground truth labels. Created when an
[EvaluationJob][google.cloud.datalabeling.v1beta1.EvaluationJob]
runs successfully.
Attributes:
name (str):
Output only. Resource name of an evaluation. The name has
the following format:
"projects/{project_id}/datasets/{dataset_id}/evaluations/{evaluation_id}'
config (~.evaluation.EvaluationConfig):
Output only. Options used in the evaluation
job that created this evaluation.
evaluation_job_run_time (~.timestamp.Timestamp):
Output only. Timestamp for when the
evaluation job that created this evaluation ran.
create_time (~.timestamp.Timestamp):
Output only. Timestamp for when this
evaluation was created.
evaluation_metrics (~.evaluation.EvaluationMetrics):
Output only. Metrics comparing predictions to
ground truth labels.
annotation_type (~.annotation.AnnotationType):
Output only. Type of task that the model version being
evaluated performs, as defined in the
[evaluationJobConfig.inputConfig.annotationType][google.cloud.datalabeling.v1beta1.EvaluationJobConfig.input_config]
field of the evaluation job that created this evaluation.
evaluated_item_count (int):
Output only. The number of items in the
ground truth dataset that were used for this
evaluation. Only populated when the evaluation
is for certain AnnotationTypes.
"""

name = proto.Field(proto.STRING, number=1)

config = proto.Field(proto.MESSAGE, number=2, message="EvaluationConfig",)

evaluation_job_run_time = proto.Field(
proto.MESSAGE, number=3, message=timestamp.Timestamp,
)

create_time = proto.Field(proto.MESSAGE, number=4, message=timestamp.Timestamp,)

evaluation_metrics = proto.Field(
proto.MESSAGE, number=5, message="EvaluationMetrics",
)

annotation_type = proto.Field(proto.ENUM, number=6, enum=annotation.AnnotationType,)

evaluated_item_count = proto.Field(proto.INT64, number=7)


class EvaluationConfig(proto.Message):
r"""Configuration details used for calculating evaluation metrics and
creating an
[Evaluation][google.cloud.datalabeling.v1beta1.Evaluation].
Attributes:
bounding_box_evaluation_options (~.evaluation.BoundingBoxEvaluationOptions):
Only specify this field if the related model performs image
object detection (``IMAGE_BOUNDING_BOX_ANNOTATION``).
Describes how to evaluate bounding boxes.
"""

bounding_box_evaluation_options = proto.Field(
proto.MESSAGE,
number=1,
oneof="vertical_option",
message="BoundingBoxEvaluationOptions",
)


class BoundingBoxEvaluationOptions(proto.Message):
r"""Options regarding evaluation between bounding boxes.
Attributes:
iou_threshold (float):
Minimum [intersection-over-union
(IOU)](/vision/automl/object-detection/docs/evaluate#intersection-over-union)
required for 2 bounding boxes to be considered a match. This
must be a number between 0 and 1.
"""

iou_threshold = proto.Field(proto.FLOAT, number=1)


class EvaluationMetrics(proto.Message):
r"""
Attributes:
classification_metrics (~.evaluation.ClassificationMetrics):
object_detection_metrics (~.evaluation.ObjectDetectionMetrics):
"""

classification_metrics = proto.Field(
proto.MESSAGE, number=1, oneof="metrics", message="ClassificationMetrics",
)

object_detection_metrics = proto.Field(
proto.MESSAGE, number=2, oneof="metrics", message="ObjectDetectionMetrics",
)


class ClassificationMetrics(proto.Message):
r"""Metrics calculated for a classification model.
Attributes:
pr_curve (~.evaluation.PrCurve):
Precision-recall curve based on ground truth
labels, predicted labels, and scores for the
predicted labels.
confusion_matrix (~.evaluation.ConfusionMatrix):
Confusion matrix of predicted labels vs.
ground truth labels.
"""

pr_curve = proto.Field(proto.MESSAGE, number=1, message="PrCurve",)

confusion_matrix = proto.Field(proto.MESSAGE, number=2, message="ConfusionMatrix",)


class ObjectDetectionMetrics(proto.Message):
r"""Metrics calculated for an image object detection (bounding
box) model.
Attributes:
pr_curve (~.evaluation.PrCurve):
Precision-recall curve.
"""

pr_curve = proto.Field(proto.MESSAGE, number=1, message="PrCurve",)


class PrCurve(proto.Message):
r"""
Attributes:
annotation_spec (~.annotation_spec_set.AnnotationSpec):
The annotation spec of the label for which
the precision-recall curve is calculated. If this
field is empty, that means the precision-recall
curve is an aggregate curve for all labels.
area_under_curve (float):
Area under the precision-recall curve. Not to
be confused with area under a receiver operating
characteristic (ROC) curve.
confidence_metrics_entries (Sequence[~.evaluation.PrCurve.ConfidenceMetricsEntry]):
Entries that make up the precision-recall graph. Each entry
is a "point" on the graph drawn for a different
``confidence_threshold``.
mean_average_precision (float):
Mean average precision of this curve.
"""

class ConfidenceMetricsEntry(proto.Message):
r"""
Attributes:
confidence_threshold (float):
Threshold used for this entry.
For classification tasks, this is a classification
threshold: a predicted label is categorized as positive or
negative (in the context of this point on the PR curve)
based on whether the label's score meets this threshold.
For image object detection (bounding box) tasks, this is the
[intersection-over-union
(IOU)](/vision/automl/object-detection/docs/evaluate#intersection-over-union)
threshold for the context of this point on the PR curve.
recall (float):
Recall value.
precision (float):
Precision value.
f1_score (float):
Harmonic mean of recall and precision.
recall_at1 (float):
Recall value for entries with label that has
highest score.
precision_at1 (float):
Precision value for entries with label that
has highest score.
f1_score_at1 (float):
The harmonic mean of
[recall_at1][google.cloud.datalabeling.v1beta1.PrCurve.ConfidenceMetricsEntry.recall_at1]
and
[precision_at1][google.cloud.datalabeling.v1beta1.PrCurve.ConfidenceMetricsEntry.precision_at1].
recall_at5 (float):
Recall value for entries with label that has
highest 5 scores.
precision_at5 (float):
Precision value for entries with label that
has highest 5 scores.
f1_score_at5 (float):
The harmonic mean of
[recall_at5][google.cloud.datalabeling.v1beta1.PrCurve.ConfidenceMetricsEntry.recall_at5]
and
[precision_at5][google.cloud.datalabeling.v1beta1.PrCurve.ConfidenceMetricsEntry.precision_at5].
"""

confidence_threshold = proto.Field(proto.FLOAT, number=1)

recall = proto.Field(proto.FLOAT, number=2)

precision = proto.Field(proto.FLOAT, number=3)

f1_score = proto.Field(proto.FLOAT, number=4)

recall_at1 = proto.Field(proto.FLOAT, number=5)

precision_at1 = proto.Field(proto.FLOAT, number=6)

f1_score_at1 = proto.Field(proto.FLOAT, number=7)

recall_at5 = proto.Field(proto.FLOAT, number=8)

precision_at5 = proto.Field(proto.FLOAT, number=9)

f1_score_at5 = proto.Field(proto.FLOAT, number=10)

annotation_spec = proto.Field(
proto.MESSAGE, number=1, message=annotation_spec_set.AnnotationSpec,
)

area_under_curve = proto.Field(proto.FLOAT, number=2)

confidence_metrics_entries = proto.RepeatedField(
proto.MESSAGE, number=3, message=ConfidenceMetricsEntry,
)

mean_average_precision = proto.Field(proto.FLOAT, number=4)
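# Illustrative sketch (not part of the generated code): each
# ConfidenceMetricsEntry is one point on the curve, and its ``f1_score`` is the
# harmonic mean of precision and recall at that threshold::
#
#     entry = PrCurve.ConfidenceMetricsEntry(
#         confidence_threshold=0.5, precision=0.8, recall=0.6,
#     )
#     # harmonic mean: 2 * 0.8 * 0.6 / (0.8 + 0.6) ≈ 0.686
#     f1 = 2 * entry.precision * entry.recall / (entry.precision + entry.recall)
#     curve = PrCurve(confidence_metrics_entries=[entry])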


class ConfusionMatrix(proto.Message):
r"""Confusion matrix of the model running the classification.
Only applicable when the metrics entry aggregates multiple
labels. Not applicable when the entry is for a single label.
Attributes:
row (Sequence[~.evaluation.ConfusionMatrix.Row]):
"""

class ConfusionMatrixEntry(proto.Message):
r"""
Attributes:
annotation_spec (~.annotation_spec_set.AnnotationSpec):
The annotation spec of a predicted label.
item_count (int):
Number of items predicted to have this label. (The ground
truth label for these items is the ``Row.annotationSpec`` of
this entry's parent.)
"""

annotation_spec = proto.Field(
proto.MESSAGE, number=1, message=annotation_spec_set.AnnotationSpec,
)

item_count = proto.Field(proto.INT32, number=2)

class Row(proto.Message):
r"""A row in the confusion matrix. Each entry in this row has the
same ground truth label.
Attributes:
annotation_spec (~.annotation_spec_set.AnnotationSpec):
The annotation spec of the ground truth label
for this row.
entries (Sequence[~.evaluation.ConfusionMatrix.ConfusionMatrixEntry]):
A list of the confusion matrix entries. One
entry for each possible predicted label.
"""

annotation_spec = proto.Field(
proto.MESSAGE, number=1, message=annotation_spec_set.AnnotationSpec,
)

entries = proto.RepeatedField(
proto.MESSAGE, number=2, message="ConfusionMatrix.ConfusionMatrixEntry",
)

row = proto.RepeatedField(proto.MESSAGE, number=1, message=Row,)
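# Illustrative sketch (not part of the generated code): rows are keyed by
# ground truth label and entries by predicted label, so a matrix can be
# flattened into (ground_truth, predicted, count) triples. ``display_name``
# comes from the AnnotationSpec message imported above::
#
#     for matrix_row in confusion_matrix.row:
#         truth = matrix_row.annotation_spec.display_name
#         for matrix_entry in matrix_row.entries:
#             predicted = matrix_entry.annotation_spec.display_name
#             print(truth, predicted, matrix_entry.item_count)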


__all__ = tuple(sorted(__protobuf__.manifest))
339 changes: 339 additions & 0 deletions google/cloud/datalabeling_v1beta1/types/evaluation_job.py
@@ -0,0 +1,339 @@
# -*- coding: utf-8 -*-

# Copyright 2020 Google LLC
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
#

import proto # type: ignore


from google.cloud.datalabeling_v1beta1.types import dataset
from google.cloud.datalabeling_v1beta1.types import evaluation
from google.cloud.datalabeling_v1beta1.types import (
human_annotation_config as gcd_human_annotation_config,
)
from google.protobuf import timestamp_pb2 as timestamp # type: ignore
from google.rpc import status_pb2 as status # type: ignore


__protobuf__ = proto.module(
package="google.cloud.datalabeling.v1beta1",
manifest={
"EvaluationJob",
"EvaluationJobConfig",
"EvaluationJobAlertConfig",
"Attempt",
},
)


class EvaluationJob(proto.Message):
r"""Defines an evaluation job that runs periodically to generate
[Evaluations][google.cloud.datalabeling.v1beta1.Evaluation].
`Creating an evaluation
job </ml-engine/docs/continuous-evaluation/create-job>`__ is the
starting point for using continuous evaluation.
Attributes:
name (str):
Output only. After you create a job, Data Labeling Service
assigns a name to the job with the following format:
"projects/{project_id}/evaluationJobs/{evaluation_job_id}".
description (str):
Required. Description of the job. The
description can be up to 25,000 characters long.
state (~.evaluation_job.EvaluationJob.State):
Output only. Describes the current state of
the job.
schedule (str):
Required. Describes the interval at which the job runs. This
interval must be at least 1 day, and it is rounded to the
nearest day. For example, if you specify a 50-hour interval,
the job runs every 2 days.
You can provide the schedule in `crontab
format </scheduler/docs/configuring/cron-job-schedules>`__
or in an `English-like
format </appengine/docs/standard/python/config/cronref#schedule_format>`__.
Regardless of what you specify, the job will run at 10:00 AM
UTC. Only the interval from this schedule is used, not the
specific time of day.
model_version (str):
Required. The `AI Platform Prediction model
version </ml-engine/docs/prediction-overview>`__ to be
evaluated. Prediction input and output is sampled from this
model version. When creating an evaluation job, specify the
model version in the following format:
"projects/{project_id}/models/{model_name}/versions/{version_name}"
There can only be one evaluation job per model version.
evaluation_job_config (~.evaluation_job.EvaluationJobConfig):
Required. Configuration details for the
evaluation job.
annotation_spec_set (str):
Required. Name of the
[AnnotationSpecSet][google.cloud.datalabeling.v1beta1.AnnotationSpecSet]
describing all the labels that your machine learning model
outputs. You must create this resource before you create an
evaluation job and provide its name in the following format:
"projects/{project_id}/annotationSpecSets/{annotation_spec_set_id}".
label_missing_ground_truth (bool):
Required. Whether you want Data Labeling Service to provide
ground truth labels for prediction input. If you want the
service to assign human labelers to annotate your data, set
this to ``true``. If you want to provide your own ground
truth labels in the evaluation job's BigQuery table, set
this to ``false``.
attempts (Sequence[~.evaluation_job.Attempt]):
Output only. Every time the evaluation job
runs and an error occurs, the failed attempt is
appended to this array.
create_time (~.timestamp.Timestamp):
Output only. Timestamp of when this
evaluation job was created.
"""

class State(proto.Enum):
r"""State of the job."""
STATE_UNSPECIFIED = 0
SCHEDULED = 1
RUNNING = 2
PAUSED = 3
STOPPED = 4

name = proto.Field(proto.STRING, number=1)

description = proto.Field(proto.STRING, number=2)

state = proto.Field(proto.ENUM, number=3, enum=State,)

schedule = proto.Field(proto.STRING, number=4)

model_version = proto.Field(proto.STRING, number=5)

evaluation_job_config = proto.Field(
proto.MESSAGE, number=6, message="EvaluationJobConfig",
)

annotation_spec_set = proto.Field(proto.STRING, number=7)

label_missing_ground_truth = proto.Field(proto.BOOL, number=8)

attempts = proto.RepeatedField(proto.MESSAGE, number=9, message="Attempt",)

create_time = proto.Field(proto.MESSAGE, number=10, message=timestamp.Timestamp,)
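# Illustrative sketch (not part of the generated code): a minimal job message
# as a caller might build it, using the resource-name formats documented above
# (project, model, and spec-set IDs are placeholders)::
#
#     job = EvaluationJob(
#         description="Nightly evaluation of my classifier",
#         schedule="every 24 hours",
#         model_version="projects/my-project/models/my_model/versions/v1",
#         annotation_spec_set="projects/my-project/annotationSpecSets/my_specs",
#         label_missing_ground_truth=False,
#         evaluation_job_config=EvaluationJobConfig(),
#     )
#
# Output-only fields such as ``name``, ``state``, ``attempts``, and
# ``create_time`` are populated by the service, not by the caller.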


class EvaluationJobConfig(proto.Message):
r"""Configures specific details of how a continuous evaluation
job works. Provide this configuration when you create an
EvaluationJob.
Attributes:
image_classification_config (~.gcd_human_annotation_config.ImageClassificationConfig):
Specify this field if your model version performs image
classification or general classification.
``annotationSpecSet`` in this configuration must match
[EvaluationJob.annotationSpecSet][google.cloud.datalabeling.v1beta1.EvaluationJob.annotation_spec_set].
``allowMultiLabel`` in this configuration must match
``classificationMetadata.isMultiLabel`` in
[input_config][google.cloud.datalabeling.v1beta1.EvaluationJobConfig.input_config].
bounding_poly_config (~.gcd_human_annotation_config.BoundingPolyConfig):
Specify this field if your model version performs image
object detection (bounding box detection).
``annotationSpecSet`` in this configuration must match
[EvaluationJob.annotationSpecSet][google.cloud.datalabeling.v1beta1.EvaluationJob.annotation_spec_set].
text_classification_config (~.gcd_human_annotation_config.TextClassificationConfig):
Specify this field if your model version performs text
classification.
``annotationSpecSet`` in this configuration must match
[EvaluationJob.annotationSpecSet][google.cloud.datalabeling.v1beta1.EvaluationJob.annotation_spec_set].
``allowMultiLabel`` in this configuration must match
``classificationMetadata.isMultiLabel`` in
[input_config][google.cloud.datalabeling.v1beta1.EvaluationJobConfig.input_config].
input_config (~.dataset.InputConfig):
            Required. Details for the sampled prediction input. Within
this configuration, there are requirements for several
fields:
- ``dataType`` must be one of ``IMAGE``, ``TEXT``, or
``GENERAL_DATA``.
- ``annotationType`` must be one of
``IMAGE_CLASSIFICATION_ANNOTATION``,
``TEXT_CLASSIFICATION_ANNOTATION``,
``GENERAL_CLASSIFICATION_ANNOTATION``, or
``IMAGE_BOUNDING_BOX_ANNOTATION`` (image object
detection).
- If your machine learning model performs classification,
you must specify ``classificationMetadata.isMultiLabel``.
- You must specify ``bigquerySource`` (not ``gcsSource``).
evaluation_config (~.evaluation.EvaluationConfig):
Required. Details for calculating evaluation metrics and
creating
            [Evaluations][google.cloud.datalabeling.v1beta1.Evaluation].
If your model version performs image object detection, you
must specify the ``boundingBoxEvaluationOptions`` field
within this configuration. Otherwise, provide an empty
object for this configuration.
human_annotation_config (~.gcd_human_annotation_config.HumanAnnotationConfig):
Optional. Details for human annotation of your data. If you
set
[labelMissingGroundTruth][google.cloud.datalabeling.v1beta1.EvaluationJob.label_missing_ground_truth]
to ``true`` for this evaluation job, then you must specify
this field. If you plan to provide your own ground truth
labels, then omit this field.
Note that you must create an
[Instruction][google.cloud.datalabeling.v1beta1.Instruction]
resource before you can specify this field. Provide the name
of the instruction resource in the ``instruction`` field
within this configuration.
bigquery_import_keys (Sequence[~.evaluation_job.EvaluationJobConfig.BigqueryImportKeysEntry]):
Required. Prediction keys that tell Data Labeling Service
where to find the data for evaluation in your BigQuery
table. When the service samples prediction input and output
from your model version and saves it to BigQuery, the data
gets stored as JSON strings in the BigQuery table. These
keys tell Data Labeling Service how to parse the JSON.
You can provide the following entries in this field:
- ``data_json_key``: the data key for prediction input. You
must provide either this key or ``reference_json_key``.
- ``reference_json_key``: the data reference key for
prediction input. You must provide either this key or
``data_json_key``.
- ``label_json_key``: the label key for prediction output.
Required.
- ``label_score_json_key``: the score key for prediction
output. Required.
- ``bounding_box_json_key``: the bounding box key for
              prediction output. Required if your model version performs
image object detection.
Learn `how to configure prediction
keys </ml-engine/docs/continuous-evaluation/create-job#prediction-keys>`__.
example_count (int):
Required. The maximum number of predictions to sample and
save to BigQuery during each [evaluation
interval][google.cloud.datalabeling.v1beta1.EvaluationJob.schedule].
This limit overrides ``example_sample_percentage``: even if
the service has not sampled enough predictions to fulfill
            ``example_sample_percentage`` during an interval, it stops
sampling predictions when it meets this limit.
example_sample_percentage (float):
Required. Fraction of predictions to sample and save to
BigQuery during each [evaluation
interval][google.cloud.datalabeling.v1beta1.EvaluationJob.schedule].
For example, 0.1 means 10% of predictions served by your
model version get saved to BigQuery.
evaluation_job_alert_config (~.evaluation_job.EvaluationJobAlertConfig):
Optional. Configuration details for
evaluation job alerts. Specify this field if you
want to receive email alerts if the evaluation
job finds that your predictions have low mean
average precision during a run.
"""

image_classification_config = proto.Field(
proto.MESSAGE,
number=4,
oneof="human_annotation_request_config",
message=gcd_human_annotation_config.ImageClassificationConfig,
)

bounding_poly_config = proto.Field(
proto.MESSAGE,
number=5,
oneof="human_annotation_request_config",
message=gcd_human_annotation_config.BoundingPolyConfig,
)

text_classification_config = proto.Field(
proto.MESSAGE,
number=8,
oneof="human_annotation_request_config",
message=gcd_human_annotation_config.TextClassificationConfig,
)

input_config = proto.Field(proto.MESSAGE, number=1, message=dataset.InputConfig,)

evaluation_config = proto.Field(
proto.MESSAGE, number=2, message=evaluation.EvaluationConfig,
)

human_annotation_config = proto.Field(
proto.MESSAGE,
number=3,
message=gcd_human_annotation_config.HumanAnnotationConfig,
)

bigquery_import_keys = proto.MapField(proto.STRING, proto.STRING, number=9)

example_count = proto.Field(proto.INT32, number=10)

example_sample_percentage = proto.Field(proto.DOUBLE, number=11)

evaluation_job_alert_config = proto.Field(
proto.MESSAGE, number=13, message="EvaluationJobAlertConfig",
)
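# Illustrative sketch (not part of the generated code): a text-classification
# configuration with the BigQuery keys described above. All values are
# placeholders, and ``input_config``/``evaluation_config`` are left mostly
# empty here for brevity::
#
#     config = EvaluationJobConfig(
#         text_classification_config=gcd_human_annotation_config.TextClassificationConfig(
#             annotation_spec_set="projects/my-project/annotationSpecSets/my_specs",
#         ),
#         input_config=dataset.InputConfig(),
#         evaluation_config=evaluation.EvaluationConfig(),
#         bigquery_import_keys={
#             "data_json_key": "text",
#             "label_json_key": "label",
#             "label_score_json_key": "score",
#         },
#         example_count=1000,
#         example_sample_percentage=0.1,
#     )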


class EvaluationJobAlertConfig(proto.Message):
r"""Provides details for how an evaluation job sends email alerts
based on the results of a run.
Attributes:
email (str):
Required. An email address to send alerts to.
min_acceptable_mean_average_precision (float):
Required. A number between 0 and 1 that describes a minimum
mean average precision threshold. When the evaluation job
runs, if it calculates that your model version's predictions
from the recent interval have
[meanAveragePrecision][google.cloud.datalabeling.v1beta1.PrCurve.mean_average_precision]
below this threshold, then it sends an alert to your
specified email.
"""

email = proto.Field(proto.STRING, number=1)

min_acceptable_mean_average_precision = proto.Field(proto.DOUBLE, number=2)
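# Illustrative sketch (not part of the generated code): continuing the
# ``config`` sketch above, attach an alert so that a run whose mean average
# precision drops below 0.7 emails the given address::
#
#     config.evaluation_job_alert_config = EvaluationJobAlertConfig(
#         email="alerts@example.com",
#         min_acceptable_mean_average_precision=0.7,
#     )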


class Attempt(proto.Message):
r"""Records a failed evaluation job run.
Attributes:
        attempt_time (~.timestamp.Timestamp):
            Timestamp of when the evaluation job run
            was attempted.
partial_failures (Sequence[~.status.Status]):
Details of errors that occurred.
"""

attempt_time = proto.Field(proto.MESSAGE, number=1, message=timestamp.Timestamp,)

partial_failures = proto.RepeatedField(
proto.MESSAGE, number=2, message=status.Status,
)


__all__ = tuple(sorted(__protobuf__.manifest))