Commit 6cedede

Merge pull request #809 from Labelbox/kkim/test-dr-limits
[AL-4416] Increase DataRow limits to 150000
2 parents f8e612d + 45f2f1a commit 6cedede

File tree: 2 files changed (+4, -11 lines)


CHANGELOG.md

Lines changed: 3 additions & 0 deletions
@@ -4,6 +4,9 @@
 ### Added
 * Added `get_by_name()` method to MetadataOntology object to access both custom and reserved metadata by name.
 
+### Changed
+* `Dataset.create_data_rows()` max limit of DataRows increased to 150,000
+
 # Version 3.33.1 (2022-12-14)
 ### Fixed
 * Fixed where batch creation limit was still limiting # of data rows. SDK should now support creating batches with up to 100k data rows
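
For context, the `get_by_name()` addition noted in the changelog above can be exercised roughly as follows. This is a hypothetical sketch rather than code from the commit: the API key and the field name "tag" are placeholders, and it assumes the usual `Client.get_data_row_metadata_ontology()` entry point for the metadata ontology.

import labelbox

# Placeholder credentials; not part of the commit.
client = labelbox.Client(api_key="YOUR_API_KEY")

# The data-row metadata ontology exposes both reserved and custom fields.
mdo = client.get_data_row_metadata_ontology()

# New in this release: look up a metadata schema by name, whether it is a
# reserved field (e.g. "tag") or a custom one defined by the organization.
tag_schema = mdo.get_by_name("tag")
print(tag_schema.uid, tag_schema.name)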

labelbox/schema/dataset.py

Lines changed: 1 addition & 11 deletions
@@ -23,8 +23,7 @@
 
 logger = logging.getLogger(__name__)
 
-MAX_DATAROW_PER_API_OPERATION = 150000
-MAX_DATAROW_WITH_METADATA = 30000
+MAX_DATAROW_PER_API_OPERATION = 150_000
 
 
 class Dataset(DbObject, Updateable, Deletable):
@@ -426,15 +425,6 @@ def convert_item(item):
                 f"Cannot create more than {MAX_DATAROW_PER_API_OPERATION} DataRows per function call."
             )
 
-        # TODO: If any datarows contain metadata, we're limiting max # of datarows
-        # until we address performance issues with datarow create with metadata
-        if len(items) > MAX_DATAROW_WITH_METADATA:
-            for row in items:
-                if 'metadata_fields' in row:
-                    raise MalformedQueryException(
-                        f"Cannot create more than {MAX_DATAROW_WITH_METADATA} DataRows, if any DataRows contain metadata"
-                    )
-
         with ThreadPoolExecutor(file_upload_thread_count) as executor:
             futures = [executor.submit(convert_item, item) for item in items]
             items = [future.result() for future in as_completed(futures)]
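
With the separate `MAX_DATAROW_WITH_METADATA` check removed, data rows that include `metadata_fields` fall under the same 150,000-per-call ceiling as plain rows. A minimal usage sketch, not taken from the commit: the API key, dataset ID, image URLs, and the dict-style metadata field are illustrative assumptions.

import labelbox

client = labelbox.Client(api_key="YOUR_API_KEY")   # placeholder key
dataset = client.get_dataset("DATASET_ID")         # placeholder dataset ID

# Rows carrying metadata_fields are no longer capped by the old 30,000-row
# metadata limit; only MAX_DATAROW_PER_API_OPERATION (150_000) applies per call.
items = [
    {
        "row_data": f"https://example.com/images/{i}.jpg",   # illustrative URLs
        "external_id": f"image-{i}",
        "metadata_fields": [{"name": "tag", "value": "batch-2022-12"}],
    }
    for i in range(1_000)
]

task = dataset.create_data_rows(items)  # returns an async Task
task.wait_till_done()
print(task.status)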
