11 Commits

Edward Yang
f6c2ca6bfc Prepare for "Fix type-safety of torch.nn.Module instances": wave 2
Summary: See D52890934

Reviewed By: malfet, r-barnes

Differential Revision: D66245100

fbshipit-source-id: 019058106ac7eaacf29c1c55912922ea55894d23
2024-11-21 11:08:51 -08:00
Jeremy Reizenstein
e20cbe9b0e test fixes and lints
Summary:
- follow-up to the recent pyre change D63415925
- make tests remove temporary files
- weights_only=True in torch.load (see the sketch after this list)
- lint fixes

3 test fixes from VRehnberg in https://github.com/facebookresearch/pytorch3d/issues/1914
- imageio channels fix
- frozen decorator in test_config
- load_blobs positional
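
For context, a minimal sketch of the `torch.load` change mentioned above (the file name here is hypothetical, not one of the actual test call sites):

```python
import torch

# weights_only=True restricts unpickling to tensor data, guarding
# against arbitrary code execution from untrusted checkpoint files.
state = torch.load("checkpoint.pth", weights_only=True)
```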

Reviewed By: MichaelRamamonjisoa

Differential Revision: D66162167

fbshipit-source-id: 7737e174691b62f1708443a4fae07343cec5bfeb
2024-11-20 09:15:51 -08:00
Conner Nilsen
a27755db41 Pyre Configurationless migration [batch:85/112] [shard:6/N]
Reviewed By: inseokhwang

Differential Revision: D54438157

fbshipit-source-id: a6acfe146ed29fff82123b5e458906d4b4cee6a2
2024-03-04 18:30:37 -08:00
generatedunixname89002005307016
f74fc450e8 suppress errors in vision/fair/pytorch3d
Differential Revision: D51645956

fbshipit-source-id: 1ae7279efa0a27bb9bc5255527bafebb84fdafd0
2023-11-28 19:10:06 -08:00
David Novotny
35f8cb9430 Downgrade "Assigning param_group " msg to DEBUG
Summary: <See title>

Reviewed By: bottler

Differential Revision: D41534524

fbshipit-source-id: 9c39198b9b8d5fc95f857b03ad39bfe0bd720cbb
2022-11-28 02:58:15 -08:00
Jeremy Reizenstein
7be49bf46f allow dots in param_groups
Summary:
Allow a module's `param_groups` member to specify overrides to the param groups of its members or their members.
Also adds logging for param group assignments.

This allows defining `params.basis_matrix` in the param_groups of a voxel_grid.
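
A minimal sketch of what such an override could look like (group names hypothetical): the dotted key reaches through the voxel grid's `params` member to a single nested parameter.

```python
# param_groups on a voxel grid module; "self" sets the group for the
# whole module, while the dotted key overrides one nested parameter.
param_groups = {
    "self": "voxel_grids",
    "params.basis_matrix": "basis",
}
```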

Reviewed By: shapovalov

Differential Revision: D41080667

fbshipit-source-id: 49f3b0e5b36e496f78701db0699cbb8a7e20c51e
2022-11-07 06:41:40 -08:00
Jeremy Reizenstein
fe5bdb2fb5 different learning rate for different parts
Summary:
Adds the ability to have different learning rates for different parts of the model. The trainable parts of Implicitron have a new member:

       param_groups: dictionary where keys are names of individual parameters
            or of a module's members, and values are the parameter group to
            which the parameter/member will be assigned. The "self" key denotes
            the parameter group at the module level. No key, including "self",
            has to be defined. By default all parameters are put into the
            "default" parameter group and have the learning rate defined in the
            optimizer; this can be overridden at the:
                - module level with the "self" key: all the parameters and
                    child modules' parameters will be put in that parameter
                    group
                - member level, which is the same as if `param_groups` in that
                    member had key="self" and value equal to that parameter
                    group. This is useful for members which do not have
                    `param_groups`, for example torch.nn.Linear.
                - parameter level: the parameter with the same name as the key
                    will be put in that parameter group.

In the optimizer factory, parameters and their learning rates are then gathered recursively.
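
A minimal sketch of that recursive gathering, assuming the `param_groups` semantics documented above; this is an illustration, not Implicitron's actual implementation:

```python
import torch

def gather_param_groups(module, group="default", groups=None):
    # Walk the module tree, assigning every parameter to a named group.
    # A module's `param_groups` dict may override the inherited group
    # for itself ("self"), for a child member, or for one parameter.
    if groups is None:
        groups = {}
    overrides = getattr(module, "param_groups", {})
    group = overrides.get("self", group)
    for name, param in module.named_parameters(recurse=False):
        groups.setdefault(overrides.get(name, group), []).append(param)
    for name, child in module.named_children():
        gather_param_groups(child, overrides.get(name, group), groups)
    return groups

# Each group can then receive its own learning rate, e.g.:
# lrs = {"default": 1e-3, "basis": 1e-1}
# torch.optim.Adam([{"params": ps, "lr": lrs[g]}
#                   for g, ps in gather_param_groups(model).items()])
```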

Reviewed By: shapovalov

Differential Revision: D40145802

fbshipit-source-id: 631c02b8d79ee1c0eb4c31e6e42dbd3d2882078a
2022-10-18 15:58:18 -07:00
Jeremy Reizenstein
209c160a20 foreach optimizers
Summary: Allow using the new `foreach` option on optimizers.
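
For reference, a minimal example of the flag (standard PyTorch API, independent of Implicitron):

```python
import torch

model = torch.nn.Linear(8, 2)
# foreach=True batches the per-parameter update math into
# multi-tensor kernels, which is usually faster on GPU.
opt = torch.optim.Adam(model.parameters(), lr=1e-3, foreach=True)
```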

Reviewed By: shapovalov

Differential Revision: D39694843

fbshipit-source-id: 97109c245b669bc6edff0f246893f95b7ae71f90
2022-09-22 05:11:56 -07:00
Jeremy Reizenstein
a39cad40f4 LinearExponential LR
Summary: Linear followed by exponential LR progression. Needed for making Blender scenes converge.
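
A minimal sketch of such a schedule via `LambdaLR` (the warmup length and decay rate here are hypothetical, not Implicitron's defaults):

```python
import torch

def linear_exponential(step, warmup=200, gamma=0.9995):
    # Linear ramp from 0 to 1 over `warmup` steps, then
    # exponential decay at rate `gamma` per step.
    if step < warmup:
        return step / warmup
    return gamma ** (step - warmup)

model = torch.nn.Linear(4, 4)
opt = torch.optim.SGD(model.parameters(), lr=0.1)
sched = torch.optim.lr_scheduler.LambdaLR(opt, linear_exponential)
```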

Reviewed By: kjchalup

Differential Revision: D38557007

fbshipit-source-id: ad630dbc5b8fabcb33eeb5bdeed5e4f31360bac2
2022-08-09 18:18:46 -07:00
David Novotny
c3f8dad55c Move load_stats to TrainingLoop
Summary:
Stats are logically connected to the training loop, not to the model; hence they are moved to the training loop.

Also removes resume_epoch from OptimizerFactory in favor of a single place, ModelFactory. This removes the need for config consistency checks, etc.

Reviewed By: kjchalup

Differential Revision: D38313475

fbshipit-source-id: a1d188a63e28459df381ff98ad8acdcdb14887b7
2022-08-02 15:40:53 -07:00
Krzysztof Chalupka
1b0584f7bd Replace pluggable components to create a proper Configurable hierarchy.
Summary:
This large diff rewrites a significant portion of Implicitron's config hierarchy. The new hierarchy, and some of the default implementation classes, are as follows:
```
Experiment
    data_source: ImplicitronDataSource
        dataset_map_provider
        data_loader_map_provider
    model_factory: ImplicitronModelFactory
        model: GenericModel
    optimizer_factory: ImplicitronOptimizerFactory
    training_loop: ImplicitronTrainingLoop
        evaluator: ImplicitronEvaluator
```

1) Experiment (formerly ExperimentConfig) is now a top-level Configurable whose members are mainly (mostly new) high-level factory Configurables.
2) Experiment's job is to run the factories, do some accelerate setup, and then pass the results to the main training loop.
3) ImplicitronOptimizerFactory and ImplicitronModelFactory are new high-level factories that create the optimizer, scheduler, model, and stats objects.
4) TrainingLoop is a new configurable that runs the main training loop and the inner train-validate step.
5) Evaluator is a new configurable that TrainingLoop uses to run validation/test steps.
6) GenericModel is no longer the only model choice. Instead, ImplicitronModelBase (instantiated with GenericModel by default) is a member of Experiment and can easily be replaced by a user's custom implementation.

All the new Configurables are children of ReplaceableBase, and can be easily replaced with custom implementations.
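
A minimal sketch of such a replacement, assuming the registry decorator from Implicitron's config system (the class body is hypothetical):

```python
from pytorch3d.implicitron.tools.config import registry
from pytorch3d.implicitron.models.base_model import ImplicitronModelBase

@registry.register
class MyModel(ImplicitronModelBase):
    # Once registered, this class can be selected in a config,
    # e.g. with model_class_type: MyModel under the model factory.
    ...
```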

In addition, I added support for the exponential LR schedule, updated the config files and the test, and added a config file that reproduces NeRF results together with a test to run the repro experiment.

Reviewed By: bottler

Differential Revision: D37723227

fbshipit-source-id: b36bee880d6aa53efdd2abfaae4489d8ab1e8a27
2022-07-29 17:32:51 -07:00