CircleCI fixes

Summary:
Misc fixes.

- Most important: the macOS Xcode 12.0 image used for the wheel builds is gone from CircleCI, so switch to a newer one (13.4.1).
- torch.concat is a recent alias of torch.cat and was used accidentally; revert to torch.cat so older PyTorch releases keep working.
- Remove lpips from the test requirements in meta.yaml, as it is breaking the conda test. It is better to leave the relevant tests failing in OSS.
- TypedDict is only available in typing from Python 3.8, so importing it unconditionally is breaking implicitron on Python 3.7; restrict it to TYPE_CHECKING.

Reviewed By: patricklabatut

Differential Revision: D38458164

fbshipit-source-id: b16c26453a743b9a771e2a6787b9a4d2a52e41c2
Jeremy Reizenstein, 2022-08-05 08:58:17 -07:00, committed by Facebook GitHub Bot
parent 5b8a9b34a0
commit da9584357e
5 changed files with 13 additions and 8 deletions


@@ -159,7 +159,7 @@ jobs:
   binary_macos_wheel:
     <<: *binary_common
     macos:
-      xcode: "12.0"
+      xcode: "13.4.1"
     steps:
       - checkout
       - run:


@@ -159,7 +159,7 @@ jobs:
   binary_macos_wheel:
     <<: *binary_common
     macos:
-      xcode: "12.0"
+      xcode: "13.4.1"
     steps:
       - checkout
       - run:


@@ -47,7 +47,6 @@ test:
    - imageio
    - hydra-core
    - accelerate
-    - lpips
  commands:
    #pytest .
    python -m unittest discover -v -s tests -t .
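With lpips removed from the requirements above, the conda test run exercises the suite without lpips installed and the lpips-dependent tests fail there, as the summary accepts. Purely as an illustration of the alternative, not what this commit does, a test could guard the optional dependency and skip cleanly when it is missing; the test class and method below are hypothetical:

import importlib.util
import unittest

# Hypothetical guard: skip perceptual-metric tests when lpips is not installed.
HAS_LPIPS = importlib.util.find_spec("lpips") is not None


class TestPerceptualMetrics(unittest.TestCase):
    @unittest.skipUnless(HAS_LPIPS, "lpips is not installed")
    def test_lpips_identity(self) -> None:
        import lpips
        import torch

        loss_fn = lpips.LPIPS(net="vgg")
        image = torch.rand(1, 3, 64, 64) * 2 - 1  # lpips expects inputs in [-1, 1]
        # The perceptual distance between an image and itself should be ~0.
        self.assertLess(float(loss_fn(image, image)), 1e-4)


if __name__ == "__main__":
    unittest.main()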


@@ -199,7 +199,7 @@ class DoublePoolBatchSampler(Sampler[List[int]]):
                 torch.randperm(len(self.first_indices), generator=self.generator)
                 for _ in range(n_copies)
             ]
-            i_first = torch.concat(raw_indices)[:num_batches]
+            i_first = torch.cat(raw_indices)[:num_batches]
         else:
             i_first = torch.randperm(len(self.first_indices), generator=self.generator)
         first_indices = [self.first_indices[i] for i in i_first]
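For reference, a self-contained sketch of the line fixed above, with made-up sizes standing in for the sampler's real state. torch.cat takes the same sequence of tensors as the newer torch.concat alias, so the behaviour is identical on recent PyTorch while older releases keep working:

import torch

generator = torch.Generator().manual_seed(0)

# Illustrative stand-ins for the sampler's state (not the real values).
pool_size = 5     # plays the role of len(self.first_indices)
n_copies = 3      # number of shuffled copies of the pool
num_batches = 12  # how many indices are actually consumed

raw_indices = [
    torch.randperm(pool_size, generator=generator) for _ in range(n_copies)
]

# torch.cat exists in every PyTorch release; torch.concat is only a recent alias.
i_first = torch.cat(raw_indices)[:num_batches]
print(i_first.shape)  # torch.Size([12])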


@@ -24,7 +24,7 @@ from typing import (
     Sequence,
     Tuple,
     Type,
-    TypedDict,
+    TYPE_CHECKING,
     Union,
 )
@@ -45,9 +45,15 @@ from .utils import is_known_frame_scalar
 logger = logging.getLogger(__name__)
-class FrameAnnotsEntry(TypedDict):
-    subset: Optional[str]
-    frame_annotation: types.FrameAnnotation
+if TYPE_CHECKING:
+    from typing import TypedDict
+
+    class FrameAnnotsEntry(TypedDict):
+        subset: Optional[str]
+        frame_annotation: types.FrameAnnotation
+else:
+    FrameAnnotsEntry = dict
 @registry.register
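The same guard shown in isolation as a minimal, runnable sketch; the entry and field names here are made up. typing.TypedDict only exists on Python 3.8+, so the import and the class body live under TYPE_CHECKING for the type checker, while at runtime the name is simply bound to dict, which any Python version accepts:

from typing import TYPE_CHECKING, List, Optional

if TYPE_CHECKING:
    # Only evaluated by type checkers, never at runtime, so Python 3.7
    # (whose typing module lacks TypedDict) is unaffected.
    from typing import TypedDict

    class ExampleEntry(TypedDict):
        subset: Optional[str]
        frame_number: int
else:
    # At runtime the name is just a plain dict, available on every version.
    ExampleEntry = dict


def collect(subsets: List[Optional[str]]) -> List["ExampleEntry"]:
    # Plain dict literals satisfy the TypedDict annotation for the checker.
    return [{"subset": s, "frame_number": i} for i, s in enumerate(subsets)]


print(collect(["train", None]))

Nothing inside the TYPE_CHECKING branch executes at runtime, so Python 3.7 never touches typing.TypedDict, yet mypy and other checkers still see the precise field types.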