Compound Dataset Support for TermSetWrapper #1061

Merged (53 commits, Apr 4, 2024)
909b372  concept (mavaylon1, Mar 3, 2024)
799e555  ruff (mavaylon1, Mar 3, 2024)
51b5cd7  test (mavaylon1, Mar 3, 2024)
3b0a6d8  clean up (mavaylon1, Mar 3, 2024)
b32aeb1  doc (mavaylon1, Mar 3, 2024)
81c4ad1  ruff (mavaylon1, Mar 3, 2024)
3fa54c8  Update CHANGELOG.md (mavaylon1, Mar 11, 2024)
db3a832  Update term_set.py (mavaylon1, Mar 11, 2024)
5595573  Update term_set.py (mavaylon1, Mar 11, 2024)
a54e358  Update CHANGELOG.md (mavaylon1, Mar 11, 2024)
ed76f63  Update CHANGELOG.md (rly, Mar 11, 2024)
ac2e10c  Update CHANGELOG.md (rly, Mar 11, 2024)
0f5c1e7  Update docs/gallery/plot_term_set.py (rly, Mar 11, 2024)
71936fb  Update src/hdmf/term_set.py (mavaylon1, Mar 11, 2024)
5d4fcd9  checkpoint (mavaylon1, Mar 12, 2024)
a313bb9  Merge branch 'dev' into compound (mavaylon1, Mar 14, 2024)
efc30f1  [pre-commit.ci] auto fixes from pre-commit.com hooks (pre-commit-ci[bot], Mar 14, 2024)
0964adc  tests (mavaylon1, Mar 14, 2024)
86ef625  tests (mavaylon1, Mar 14, 2024)
cb26e66  test (mavaylon1, Mar 14, 2024)
2720a2c  Update plot_term_set.py (mavaylon1, Mar 18, 2024)
b11d57d  Update term_set.py (mavaylon1, Mar 18, 2024)
6281c23  Merge branch 'dev' into compound (rly, Mar 19, 2024)
c16c542  Update tests/unit/common/test_table.py (mavaylon1, Mar 19, 2024)
01ba322  Update tests/unit/common/test_table.py (mavaylon1, Mar 19, 2024)
c798509  data (mavaylon1, Mar 21, 2024)
cdd8b3c  Merge branch 'dev' into compound (mavaylon1, Mar 21, 2024)
ae06d8d  data (mavaylon1, Mar 21, 2024)
b85eb59  method (mavaylon1, Mar 21, 2024)
121ac4e  Merge branch 'dev' into compound (mavaylon1, Mar 25, 2024)
f0f9e76  tests (mavaylon1, Mar 25, 2024)
09ce20e  tests (mavaylon1, Mar 25, 2024)
1248150  Update CHANGELOG.md (mavaylon1, Mar 26, 2024)
cba22ea  Update CHANGELOG.md (mavaylon1, Mar 26, 2024)
d34742a  Update data_utils.py (mavaylon1, Mar 27, 2024)
48889a5  Update data_utils.py (mavaylon1, Mar 27, 2024)
f025398  Update data_utils.py (mavaylon1, Mar 28, 2024)
1f3f813  Update test_table.py (mavaylon1, Mar 28, 2024)
c9b98c7  [pre-commit.ci] auto fixes from pre-commit.com hooks (pre-commit-ci[bot], Mar 28, 2024)
fc7a398  Update data_utils.py (mavaylon1, Mar 28, 2024)
3e3e6bb  Update data_utils.py (mavaylon1, Mar 28, 2024)
d16be50  test (mavaylon1, Mar 28, 2024)
e83257d  test (mavaylon1, Mar 28, 2024)
a0569db  Merge branch 'dev' into compound (mavaylon1, Mar 28, 2024)
bcf554c  weird name change (mavaylon1, Mar 28, 2024)
fe3779c  weird name change (mavaylon1, Mar 28, 2024)
277f3ed  Update term_set.py (mavaylon1, Apr 3, 2024)
0a2ef5e  Update test_table.py (mavaylon1, Apr 3, 2024)
fa00832  Update term_set.py (mavaylon1, Apr 3, 2024)
3232824  Update term_set.py (mavaylon1, Apr 3, 2024)
6a3e1ec  Update term_set.py (mavaylon1, Apr 3, 2024)
73d36c9  coverage (mavaylon1, Apr 4, 2024)
da0dcd3  Update tests/unit/common/test_table.py (rly, Apr 4, 2024)
CHANGELOG.md (27 changes: 14 additions & 13 deletions)

@@ -4,6 +4,7 @@

### Enhancements
- Added `TermSetConfigurator` to automatically wrap fields with `TermSetWrapper` according to a configuration file. @mavaylon1 [#1016](https://github.com/hdmf-dev/hdmf/pull/1016)
- Updated `TermSetWrapper` to support validating a single field within a compound array. @mavaylon1 [#1061](https://github.com/hdmf-dev/hdmf/pull/1061)

## HDMF 3.13.0 (March 20, 2024)

@@ -138,8 +139,8 @@ will increase the minor version number to 3.10.0. See the 3.9.1 release notes be
## HDMF 3.6.0 (May 12, 2023)

### New features and minor improvements
- Updated `ExternalResources` to have `FileTable` and new methods to query data. the `ResourceTable` has been removed along with methods relating to `Resource`. @mavaylon [#850](https://github.com/hdmf-dev/hdmf/pull/850)
- Updated hdmf-common-schema version to 1.6.0. @mavaylon [#850](https://github.com/hdmf-dev/hdmf/pull/850)
- Updated `ExternalResources` to have `FileTable` and new methods to query data. the `ResourceTable` has been removed along with methods relating to `Resource`. @mavaylon1 [#850](https://github.com/hdmf-dev/hdmf/pull/850)
- Updated hdmf-common-schema version to 1.6.0. @mavaylon1 [#850](https://github.com/hdmf-dev/hdmf/pull/850)
- Added testing of HDMF-Zarr on PR and nightly. @rly [#859](https://github.com/hdmf-dev/hdmf/pull/859)
- Replaced `setup.py` with `pyproject.toml`. @rly [#844](https://github.com/hdmf-dev/hdmf/pull/844)
- Use `ruff` instead of `flake8`. @rly [#844](https://github.com/hdmf-dev/hdmf/pull/844)
@@ -153,7 +154,7 @@ will increase the minor version number to 3.10.0. See the 3.9.1 release notes be
[#853](https://github.com/hdmf-dev/hdmf/pull/853)

### Documentation and tutorial enhancements:
- Updated `ExternalResources` how to tutorial to include the new features. @mavaylon [#850](https://github.com/hdmf-dev/hdmf/pull/850)
- Updated `ExternalResources` how to tutorial to include the new features. @mavaylon1 [#850](https://github.com/hdmf-dev/hdmf/pull/850)

## HDMF 3.5.6 (April 28, 2023)

@@ -193,13 +194,13 @@ will increase the minor version number to 3.10.0. See the 3.9.1 release notes be

### Bug fixes
- Fixed issue with conda CI. @rly [#823](https://github.com/hdmf-dev/hdmf/pull/823)
- Fixed issue with deprecated `pkg_resources`. @mavaylon [#822](https://github.com/hdmf-dev/hdmf/pull/822)
- Fixed `hdmf.common` deprecation warning. @mavaylon [#826]((https://github.com/hdmf-dev/hdmf/pull/826)
- Fixed issue with deprecated `pkg_resources`. @mavaylon1 [#822](https://github.com/hdmf-dev/hdmf/pull/822)
- Fixed `hdmf.common` deprecation warning. @mavaylon1 [#826](https://github.com/hdmf-dev/hdmf/pull/826)

### Internal improvements
- Fixed a number of typos and added a GitHub Action running codespell to ensure that no typos sneak in. [#825](https://github.com/hdmf-dev/hdmf/pull/825)
- Added additional documentation for `__fields__` in `AbstactContainer`. @mavaylon [#827](https://github.com/hdmf-dev/hdmf/pull/827)
- Updated warning message for broken links. @mavaylon [#829](https://github.com/hdmf-dev/hdmf/pull/829)
- Added additional documentation for `__fields__` in `AbstractContainer`. @mavaylon1 [#827](https://github.com/hdmf-dev/hdmf/pull/827)
- Updated warning message for broken links. @mavaylon1 [#829](https://github.com/hdmf-dev/hdmf/pull/829)

## HDMF 3.5.1 (January 26, 2023)

@@ -218,9 +219,9 @@ will increase the minor version number to 3.10.0. See the 3.9.1 release notes be
- Added ``HDMFIO.__del__`` to ensure that I/O objects are being closed on delete. @oruebel [#811](https://github.com/hdmf-dev/hdmf/pull/811)

### Minor improvements
- Added support for reading and writing `ExternalResources` to and from denormalized TSV files. @mavaylon [#799](https://github.com/hdmf-dev/hdmf/pull/799)
- Changed the name of `ExternalResources.export_to_sqlite` to `ExternalResources.to_sqlite`. @mavaylon [#799](https://github.com/hdmf-dev/hdmf/pull/799)
- Updated the tutorial for `ExternalResources`. @mavaylon [#799](https://github.com/hdmf-dev/hdmf/pull/799)
- Added support for reading and writing `ExternalResources` to and from denormalized TSV files. @mavaylon1 [#799](https://github.com/hdmf-dev/hdmf/pull/799)
- Changed the name of `ExternalResources.export_to_sqlite` to `ExternalResources.to_sqlite`. @mavaylon1 [#799](https://github.com/hdmf-dev/hdmf/pull/799)
- Updated the tutorial for `ExternalResources`. @mavaylon1 [#799](https://github.com/hdmf-dev/hdmf/pull/799)
- Added `message` argument for assert methods defined by `hdmf.testing.TestCase` to allow developers to include custom error messages with asserts. @oruebel [#812](https://github.com/hdmf-dev/hdmf/pull/812)
- Clarify the expected chunk shape behavior for `DataChunkIterator`. @oruebel [#813](https://github.com/hdmf-dev/hdmf/pull/813)

@@ -361,7 +362,7 @@ the fields (i.e., when the constructor sets some fields to fixed values). @rly
- Plotted results in external resources tutorial. @oruebel (#667)
- Added support for Python 3.10. @rly (#679)
- Updated requirements. @rly @TheChymera (#681)
- Improved testing for `ExternalResources`. @mavaylon (#673)
- Improved testing for `ExternalResources`. @mavaylon1 (#673)
- Improved docs for export. @rly (#674)
- Enhanced data chunk iteration speeds through new ``GenericDataChunkIterator`` class. @CodyCBakerPhD (#672)
- Enhanced issue template forms on GitHub. @CodyCBakerPHD (#700)
@@ -437,7 +438,7 @@ the fields (i.e., when the constructor sets some fields to fixed values). @rly
- Allow passing ``index=True`` to ``DynamicTable.to_dataframe()`` to support returning `DynamicTableRegion` columns
as indices or Pandas DataFrame. @rly (#579)
- Improve ``DynamicTable`` documentation. @rly (#639)
- Updated external resources tutorial. @mavaylon (#611)
- Updated external resources tutorial. @mavaylon1 (#611)

### Breaking changes and deprecations
- Previously, when using ``DynamicTable.__getitem__`` or ``DynamicTable.get`` to access a selection of a
Expand Down Expand Up @@ -522,7 +523,7 @@ the fields (i.e., when the constructor sets some fields to fixed values). @rly
- Add experimental namespace to HDMF common schema. New data types should go in the experimental namespace
(hdmf-experimental) prior to being added to the core (hdmf-common) namespace. The purpose of this is to provide
a place to test new data types that may break backward compatibility as they are refined. @ajtritt (#545)
- `ExternalResources` was changed to support storing both names and URIs for resources. @mavaylon (#517, #548)
- `ExternalResources` was changed to support storing both names and URIs for resources. @mavaylon1 (#517, #548)
- The `VocabData` data type was replaced by `EnumData` to provide more flexible support for data from a set of
fixed values.
- Added `AlignedDynamicTable`, which defines a `DynamicTable` that supports storing a collection of sub-tables.
docs/gallery/plot_term_set.py (14 changes: 14 additions & 0 deletions)

@@ -67,6 +67,7 @@
"""
from hdmf.common import DynamicTable, VectorData
import os
import numpy as np

try:
import linkml_runtime # noqa: F401
@@ -129,6 +130,19 @@
data=TermSetWrapper(value=['Homo sapiens'], termset=terms)
)

######################################################
# Validate Compound Data with TermSetWrapper
# ----------------------------------------------------
# :py:class:`~hdmf.term_set.TermSetWrapper` can also wrap compound data.
# The user sets ``field`` to the name of the field within the compound data
# type that is to be validated against the termset.
c_data = np.array([('Homo sapiens', 24)], dtype=[('species', 'U50'), ('age', 'i4')])
data = VectorData(
name='species',
description='...',
data=TermSetWrapper(value=c_data, termset=terms, field='species')
)

######################################################
# Validate Attributes with TermSetWrapper
# ----------------------------------------------------
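The field-selection step underlying the gallery example can be sketched without hdmf at all. Below, a plain Python set stands in for the LinkML-backed `TermSet` (an assumption for illustration; the real class validates via `TermSet.validate`): indexing a structured array by field name yields a 1-D array of only that field's values, so only those values are checked against the vocabulary.

```python
import numpy as np

# Hypothetical stand-in for a TermSet vocabulary; the real TermSet is loaded
# from a LinkML schema file.
valid_terms = {'Homo sapiens', 'Mus musculus'}

c_data = np.array([('Homo sapiens', 24), ('Mus musculus', 4)],
                  dtype=[('species', 'U50'), ('age', 'i4')])

# Indexing a structured array by field name returns only that field's column,
# so the integer ages are never compared against the species vocabulary.
species_values = c_data['species']
bad_values = [v for v in species_values if v not in valid_terms]
```

With `field='species'`, an all-valid column leaves `bad_values` empty; any out-of-vocabulary species would land in `bad_values` and trigger a `ValueError` in the wrapper.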
src/hdmf/data_utils.py (5 changes: 4 additions & 1 deletion)

@@ -20,7 +20,10 @@ def append_data(data, arg):
data.append(arg)
return data
elif isinstance(data, np.ndarray):
return np.append(data, np.expand_dims(arg, axis=0), axis=0)
if len(data.dtype) > 0:  # data is a structured array
return np.append(data, arg)
else: # arg is a scalar or row vector
return np.append(data, np.expand_dims(arg, axis=0), axis=0)
elif isinstance(data, h5py.Dataset):
shape = list(data.shape)
shape[0] += 1
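The branch added to `append_data` keys off `len(data.dtype)`: a structured (compound) dtype has named fields, so its length is the field count, while a plain dtype has length 0. A minimal numpy-only sketch of the two branches:

```python
import numpy as np

# Structured ("compound") arrays have named fields, so len(data.dtype) > 0
# and the plain np.append branch runs: arg is already a one-row structured
# array, so no expand_dims is needed.
c_data = np.array([('Homo sapiens', 24)], dtype=[('species', 'U50'), ('age', 'i4')])
new_row = np.array([('Mus musculus', 4)], dtype=c_data.dtype)
combined = np.append(c_data, new_row)

# A plain 1-D array has len(dtype) == 0 and takes the expand_dims branch,
# which treats arg as a single new row along axis 0.
plain = np.array(['Homo sapiens'])
plain_combined = np.append(plain, np.expand_dims('Mus musculus', axis=0), axis=0)
```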
src/hdmf/term_set.py (64 changes: 52 additions & 12 deletions)

@@ -216,19 +216,26 @@
{'name': 'value',
'type': (list, np.ndarray, dict, str, tuple),
'doc': 'The target item that is wrapped, either data or attribute.'},
{'name': 'field', 'type': str, 'default': None,
'doc': 'The field within a compound array.'}
)
def __init__(self, **kwargs):
self.__value = kwargs['value']
self.__termset = kwargs['termset']
self.__field = kwargs['field']

self.__validate()

def __validate(self):
# check if list, tuple, array
if isinstance(self.__value, (list, np.ndarray, tuple)): # TODO: Future ticket on DataIO support
values = self.__value
# create list if none of those -> mostly for attributes
if self.__field is not None:
values = self.__value[self.__field]

else:
values = [self.__value]
# check if list, tuple, array
if isinstance(self.__value, (list, np.ndarray, tuple)):
values = self.__value

# create list if none of those -> mostly for scalar attributes
else:
values = [self.__value]


# iteratively validate
bad_values = []
for term in values:
@@ -243,6 +250,10 @@
def value(self):
return self.__value

@property
def field(self):
return self.__field


@property
def termset(self):
return self.__termset
@@ -273,26 +284,55 @@
"""
return self.__value.__iter__()

def __multi_validation(self, data):
"""
append_data also handles numpy arrays, whose append behaves like list extend
rather than list append. Appending an array (e.g., for compound data) can
therefore add multiple items at once, so this method provides an internal
bulk validation check for numpy arrays and for extend.
"""
bad_values = []

for item in data:
if not self.termset.validate(term=item):
bad_values.append(item)
return bad_values


def append(self, arg):
"""
This append validates the new value(s) and then resolves the wrapper to use
the append of the underlying container.
"""
if self.termset.validate(term=arg):
self.__value = append_data(self.__value, arg)
if isinstance(arg, np.ndarray):
if self.__field is not None: # compound array
values = arg[self.__field]

else:
msg = "Array needs to be a structured array with compound dtype. If this does not apply, use extend."
raise ValueError(msg)

else:
msg = ('"%s" is not in the term set.' % arg)
values = [arg]


bad_values = self.__multi_validation(values)


if len(bad_values) != 0:
msg = ('"%s" is not in the term set.' % ', '.join([str(value) for value in bad_values]))
raise ValueError(msg)

self.__value = append_data(self.__value, arg)


def extend(self, arg):
"""
This extend validates the new values and then resolves the wrapper to use
the extend of the underlying container.
"""
bad_data = []
for item in arg:
if not self.termset.validate(term=item):
bad_data.append(item)
if isinstance(arg, np.ndarray):
if self.__field is not None: # compound array
values = arg[self.__field]

else:
values = arg

else:
values = arg


bad_data = self.__multi_validation(values)


if len(bad_data) == 0:
self.__value = extend_data(self.__value, arg)
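The append-time control flow in the diff can be mocked in a few lines of dependency-free Python. Here `VALID` and `append_check` are illustrative stand-ins (not hdmf API): `VALID` replaces the real `TermSet`, and `append_check` mirrors the validation branches of `TermSetWrapper.append`, including its two error messages.

```python
import numpy as np

# Hypothetical vocabulary standing in for a TermSet.
VALID = {'Homo sapiens', 'Mus musculus'}

def multi_validation(values):
    # Bulk check corresponding to TermSetWrapper.__multi_validation.
    return [v for v in values if v not in VALID]

def append_check(arg, field=None):
    # Mirrors TermSetWrapper.append: arrays must be structured and carry a
    # field name; non-array values are wrapped in a one-item list.
    if isinstance(arg, np.ndarray):
        if field is None:
            raise ValueError("Array needs to be a structured array with "
                             "compound dtype. If this does not apply, use extend.")
        values = arg[field]
    else:
        values = [arg]
    bad_values = multi_validation(values)
    if bad_values:
        raise ValueError('"%s" is not in the term set.'
                         % ', '.join(str(v) for v in bad_values))

row = np.array([('Mus musculus', 4)], dtype=[('species', 'U50'), ('age', 'i4')])
append_check(row, field='species')  # only the 'species' field is validated
```

Passing a plain (non-compound) array raises the "use extend" error, and any out-of-vocabulary term raises the "not in the term set" error, matching the two `msg` strings in the diff.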
tests/unit/common/test_table.py (95 changes: 95 additions & 0 deletions)

@@ -220,6 +220,101 @@ def test_add_row_validate_bad_data_all_col(self):
with self.assertRaises(ValueError):
species.add_row(Species_1='bad data', Species_2='bad data')

def test_compound_data_append(self):
c_data = np.array([('Homo sapiens', 24)], dtype=[('species', 'U50'), ('age', 'i4')])
c_data2 = np.array([('Mus musculus', 24)], dtype=[('species', 'U50'), ('age', 'i4')])
compound_vector_data = VectorData(
name='Species_1',
description='...',
data=c_data
)
compound_vector_data.append(c_data2)

np.testing.assert_array_equal(compound_vector_data.data, np.append(c_data, c_data2))

@unittest.skipIf(not REQUIREMENTS_INSTALLED, "optional LinkML module is not installed")
def test_array_append_error(self):
c_data = np.array(['Homo sapiens'])
c_data2 = np.array(['Mus musculus'])

terms = TermSet(term_schema_path='tests/unit/example_test_term_set.yaml')
vectordata_termset = VectorData(
name='Species_1',
description='...',
data=TermSetWrapper(value=c_data, termset=terms)
)

with self.assertRaises(ValueError):
vectordata_termset.append(c_data2)

def test_compound_data_extend(self):
c_data = np.array([('Homo sapiens', 24)], dtype=[('species', 'U50'), ('age', 'i4')])
c_data2 = np.array([('Mus musculus', 24)], dtype=[('species', 'U50'), ('age', 'i4')])
compound_vector_data = VectorData(
name='Species_1',
description='...',
data=c_data
)
compound_vector_data.extend(c_data2)

np.testing.assert_array_equal(compound_vector_data.data, np.vstack((c_data, c_data2)))

@unittest.skipIf(not REQUIREMENTS_INSTALLED, "optional LinkML module is not installed")
def test_add_ref_wrapped_array_append(self):
data = np.array(['Homo sapiens'])
data2 = 'Mus musculus'
terms = TermSet(term_schema_path='tests/unit/example_test_term_set.yaml')
vector_data = VectorData(
name='Species_1',
description='...',
data=TermSetWrapper(value=data, termset=terms)
)
vector_data.append(data2)

np.testing.assert_array_equal(vector_data.data.data, np.append(data, data2))

@unittest.skipIf(not REQUIREMENTS_INSTALLED, "optional LinkML module is not installed")
def test_add_ref_wrapped_array_extend(self):
data = np.array(['Homo sapiens'])
data2 = np.array(['Mus musculus'])
terms = TermSet(term_schema_path='tests/unit/example_test_term_set.yaml')
vector_data = VectorData(
name='Species_1',
description='...',
data=TermSetWrapper(value=data, termset=terms)
)
vector_data.extend(data2)

np.testing.assert_array_equal(vector_data.data.data, np.vstack((data, data2)))

@unittest.skipIf(not REQUIREMENTS_INSTALLED, "optional LinkML module is not installed")
def test_add_ref_wrapped_compound_data_append(self):
c_data = np.array([('Homo sapiens', 24)], dtype=[('species', 'U50'), ('age', 'i4')])
c_data2 = np.array([('Mus musculus', 24)], dtype=[('species', 'U50'), ('age', 'i4')])
terms = TermSet(term_schema_path='tests/unit/example_test_term_set.yaml')
compound_vector_data = VectorData(
name='Species_1',
description='...',
data=TermSetWrapper(value=c_data, field='species', termset=terms)
)
compound_vector_data.append(c_data2)

np.testing.assert_array_equal(compound_vector_data.data.data, np.append(c_data, c_data2))

@unittest.skipIf(not REQUIREMENTS_INSTALLED, "optional LinkML module is not installed")
def test_add_ref_wrapped_compound_data_extend(self):
c_data = np.array([('Homo sapiens', 24)], dtype=[('species', 'U50'), ('age', 'i4')])
c_data2 = np.array([('Mus musculus', 24)], dtype=[('species', 'U50'), ('age', 'i4')])
terms = TermSet(term_schema_path='tests/unit/example_test_term_set.yaml')
compound_vector_data = VectorData(
name='Species_1',
description='...',
data=TermSetWrapper(value=c_data, field='species', termset=terms)
)
compound_vector_data.extend(c_data2)

np.testing.assert_array_equal(compound_vector_data.data.data, np.vstack((c_data, c_data2)))

def test_constructor_bad_columns(self):
columns = ['bad_column']
msg = "'columns' must be a list of dict, VectorData, DynamicTableRegion, or VectorIndex"
tests/unit/test_term_set.py (11 changes: 6 additions & 5 deletions)

@@ -155,21 +155,22 @@ def setUp(self):
self.wrapped_array = TermSetWrapper(value=np.array(['Homo sapiens']), termset=self.termset)
self.wrapped_list = TermSetWrapper(value=['Homo sapiens'], termset=self.termset)

c_data = np.array([('Homo sapiens', 24)], dtype=[('species', 'U50'), ('age', 'i4')])
self.wrapped_comp_array = TermSetWrapper(value=c_data,
termset=self.termset,
field='species')

self.np_data = VectorData(
name='Species_1',
description='...',
data=self.wrapped_array
)
self.list_data = VectorData(
name='Species_1',
description='...',
data=self.wrapped_list
)

def test_properties(self):
self.assertEqual(self.wrapped_array.value, ['Homo sapiens'])
self.assertEqual(self.wrapped_array.termset.view_set, self.termset.view_set)
self.assertEqual(self.wrapped_array.dtype, 'U12') # this covers __getattr__
self.assertEqual(self.wrapped_comp_array.field, 'species')

def test_get_item(self):
self.assertEqual(self.np_data.data[0], 'Homo sapiens')