2 files changed: +4 −11 lines

 ### Added
 * Added `get_by_name()` method to MetadataOntology object to access both custom and reserved metadata by name.

+### Changed
+* `Dataset.create_data_rows()` max limit of DataRows increased to 150,000
+
 # Version 3.33.1 (2022-12-14)
 ### Fixed
 * Fixed where batch creation limit was still limiting # of data rows. SDK should now support creating batches with up to 100k data rows
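Since the 150,000-row cap applies per function call, a caller with more rows would need to split the input into multiple calls. A minimal client-side sketch (the `chunked` helper and its parameter names are ours, not part of the SDK):

```python
def chunked(items, size=150_000):
    """Yield successive slices of `items`, each no larger than `size`."""
    for start in range(0, len(items), size):
        yield items[start:start + size]


# Each chunk could then be passed to a separate create_data_rows() call.
chunks = list(chunked(list(range(350_000))))
print([len(c) for c in chunks])  # [150000, 150000, 50000]
```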
 
 logger = logging.getLogger(__name__)

-MAX_DATAROW_PER_API_OPERATION = 150000
-MAX_DATAROW_WITH_METADATA = 30000
+MAX_DATAROW_PER_API_OPERATION = 150_000


 class Dataset(DbObject, Updateable, Deletable):
@@ -426,15 +425,6 @@ def convert_item(item):
                 f"Cannot create more than {MAX_DATAROW_PER_API_OPERATION} DataRows per function call."
             )

-        # TODO: If any datarows contain metadata, we're limiting max # of datarows
-        # until we address performance issues with datarow create with metadata
-        if len(items) > MAX_DATAROW_WITH_METADATA:
-            for row in items:
-                if 'metadata_fields' in row:
-                    raise MalformedQueryException(
-                        f"Cannot create more than {MAX_DATAROW_WITH_METADATA} DataRows, if any DataRows contain metadata"
-                    )
-
         with ThreadPoolExecutor(file_upload_thread_count) as executor:
             futures = [executor.submit(convert_item, item) for item in items]
             items = [future.result() for future in as_completed(futures)]
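After this change, only the single per-call size guard remains before the parallel upload. The remaining check can be sketched standalone as follows (the function name `check_batch_size` and the local `MalformedQueryException` stand-in are illustrative, not the SDK's API):

```python
MAX_DATAROW_PER_API_OPERATION = 150_000  # per-call ceiling after this change


class MalformedQueryException(Exception):
    """Stand-in for the SDK's exception type."""


def check_batch_size(items):
    # Reject the whole call up front rather than failing mid-upload.
    if len(items) > MAX_DATAROW_PER_API_OPERATION:
        raise MalformedQueryException(
            f"Cannot create more than {MAX_DATAROW_PER_API_OPERATION} DataRows per function call."
        )
```

Note that the guard no longer inspects individual rows for `metadata_fields`; rows with and without metadata share the same 150,000 limit.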