When creating an AI application on the AI application management page, make sure that any meta model imported from OBS complies with the specifications described in this section.
The model package must contain a directory named model, which stores the model file, the model configuration file, and the model inference code file.
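The model configuration file follows a fixed schema defined by the ModelArts model configuration file specification. Purely as an illustration of the kind of fields such a file declares (algorithm, engine type, runtime, dependencies), the following Python sketch writes a minimal config.json; treat every field name and value as a placeholder and check them against that specification.

# Illustrative sketch only: writes a minimal config.json into the model directory.
# All field names and values below are placeholders; consult the ModelArts model
# configuration file specification for the authoritative schema.
import json
import os

config = {
    "model_algorithm": "image_classification",   # assumed algorithm label
    "model_type": "TensorFlow",                   # AI engine of the meta model
    "runtime": "python3.6",                       # assumed runtime name
    "dependencies": [
        {
            "installer": "pip",
            "packages": [
                {"package_name": "numpy", "package_version": "1.17.0", "restraint": "EXACT"}
            ],
        }
    ],
}

os.makedirs("model", exist_ok=True)
with open("model/config.json", "w") as f:
    json.dump(config, f, indent=4)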
ModelArts also provides custom script examples of common AI engines. For details, see Examples of Custom Scripts.
When publishing the model, you only need to specify the ocr directory.
OBS bucket/directory name
|── ocr
|   ├── model                     (Mandatory) Name of a fixed subdirectory, which is used to store model-related files
|   │   ├── <<Custom Python package>>          (Optional) Your Python package, which can be directly referenced in the model inference code
|   │   ├── saved_model.pb                     (Mandatory) Protocol buffer file, which contains the graph description of the model
|   │   ├── variables                          Name of a fixed subdirectory, which contains the weights and biases of the model. Mandatory when the main model file is in *.pb format.
|   │   │   ├── variables.index                Mandatory
|   │   │   ├── variables.data-00000-of-00001  Mandatory
|   │   ├── config.json                        (Mandatory) Model configuration file. The file name is fixed to config.json. Only one model configuration file is allowed.
|   │   ├── customize_service.py               (Mandatory) Model inference code. The file name is fixed to customize_service.py. Only one model inference code file is allowed. The files on which customize_service.py depends can be stored directly in the model directory.
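For the inference code, customize_service.py for a TensorFlow model typically subclasses the TfServingBaseService class provided by the ModelArts inference environment and overrides its pre- and post-processing hooks. The following is only a minimal sketch, assuming that base class and the _preprocess/_postprocess hook names used in the custom script examples; verify the details against Examples of Custom Scripts.

# Minimal sketch of customize_service.py for the TensorFlow package above.
# TfServingBaseService and the _preprocess/_postprocess hook names are assumed
# from the ModelArts custom script examples; verify against Examples of Custom Scripts.
import numpy as np
from PIL import Image
from model_service.tfserving_model_service import TfServingBaseService


class OcrService(TfServingBaseService):

    def _preprocess(self, data):
        # Convert each uploaded image file into the input tensor the SavedModel expects.
        preprocessed = {}
        for _, files in data.items():
            for _, file_content in files.items():
                image = Image.open(file_content).convert("L").resize((32, 32))
                preprocessed["images"] = np.asarray(image, dtype=np.float32)[np.newaxis, ...]
        return preprocessed

    def _postprocess(self, data):
        # Map the raw model outputs to a JSON-serializable response.
        return {name: np.asarray(output).tolist() for name, output in data.items()}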
When publishing the model, you only need to specify the resnet directory.
OBS bucket/directory name
|── resnet
|   ├── model                     (Mandatory) Name of a fixed subdirectory, which is used to store model-related files
|   │   ├── <<Custom Python package>>          (Optional) Your Python package, which can be directly referenced in the model inference code
|   │   ├── checkpoint_lenet_1-1_1875.ckpt     (Mandatory) Model file in ckpt format trained using MindSpore
|   │   ├── config.json                        (Mandatory) Model configuration file. The file name is fixed to config.json. Only one model configuration file is allowed.
|   │   ├── customize_service.py               (Mandatory) Model inference code. The file name is fixed to customize_service.py. Only one model inference code file is allowed. The files on which customize_service.py depends can be stored directly in the model directory.
|   │   ├── tmp.om                             (Mandatory) An empty .om file that enables the model package to be imported
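The tmp.om entry only has to exist; an empty (zero-byte) file is enough to allow the package to be imported. A small helper along the lines of the sketch below, with paths taken from the example tree above, can lay out the directory and create that placeholder.

# Sketch: assemble the package layout shown above.
# Directory and file names mirror the example tree; adjust them to your own package.
import os

package_root = "resnet/model"
os.makedirs(package_root, exist_ok=True)

# The .om placeholder only needs to exist; an empty file is sufficient.
open(os.path.join(package_root, "tmp.om"), "w").close()

# The remaining mandatory files are produced elsewhere and copied in:
#   checkpoint_lenet_1-1_1875.ckpt  - checkpoint saved by MindSpore training
#   config.json                     - model configuration file
#   customize_service.py            - model inference code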
When publishing the model, you only need to specify the resnet directory.
OBS bucket/directory name
|── resnet
|   ├── model                     (Mandatory) Name of a fixed subdirectory, which is used to store model-related files
|   │   ├── config.json           (Mandatory) Model configuration file (the address of the SWR image must be configured). The file name is fixed to config.json. Only one model configuration file is allowed.
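In this custom-image case, config.json is the only mandatory file and it must carry the address of the image in SWR. The sketch below illustrates this; the field used for the SWR address (shown here as swr_location) and the model_type value are assumptions, so confirm the exact field names in the ModelArts model configuration file specification.

# Hypothetical sketch of a config.json for a custom image (SWR) based model package.
# "model_type": "Image" and "swr_location" are assumptions; confirm the exact field
# names in the ModelArts model configuration file specification.
import json
import os

config = {
    "model_algorithm": "image_classification",                            # assumed algorithm label
    "model_type": "Image",                                                # assumed value for image-based models
    "swr_location": "swr.<region>.<domain>/<organization>/resnet:<tag>",  # placeholder SWR image address
}

os.makedirs("resnet/model", exist_ok=True)
with open("resnet/model/config.json", "w") as f:
    json.dump(config, f, indent=4)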
When publishing the model, you only need to specify the resnet directory.
OBS bucket/directory name
|── resnet
|   ├── model                     (Mandatory) Name of a fixed subdirectory, which is used to store model-related files
|   │   ├── <<Custom Python package>>          (Optional) Your Python package, which can be directly referenced in the model inference code
|   │   ├── resnet50.pth                       (Mandatory) PyTorch model file, which contains the variable and weight information and is saved as a state_dict
|   │   ├── config.json                        (Mandatory) Model configuration file. The file name is fixed to config.json. Only one model configuration file is allowed.
|   │   ├── customize_service.py               (Mandatory) Model inference code. The file name is fixed to customize_service.py. Only one model inference code file is allowed. The files on which customize_service.py depends can be stored directly in the model directory.
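Here, customize_service.py typically subclasses PTServingBaseService, rebuilds the network, and loads the state_dict stored in resnet50.pth. The following is only a minimal sketch, assuming that base class and the _preprocess/_inference/_postprocess hooks used in the custom script examples; treat it as an outline rather than a drop-in file.

# Minimal sketch of customize_service.py for the PyTorch package above.
# PTServingBaseService and the hook names are assumed from the ModelArts custom
# script examples; verify against Examples of Custom Scripts.
import torch
import torchvision.models as models
import torchvision.transforms as transforms
from PIL import Image
from model_service.pytorch_model_service import PTServingBaseService


class ResnetService(PTServingBaseService):

    def __init__(self, model_name, model_path):
        super().__init__(model_name, model_path)
        # resnet50.pth stores a state_dict, so rebuild the network and load the weights.
        self.model = models.resnet50()
        self.model.load_state_dict(torch.load(model_path, map_location="cpu"))
        self.model.eval()
        self.transform = transforms.Compose([
            transforms.Resize(256),
            transforms.CenterCrop(224),
            transforms.ToTensor(),
        ])

    def _preprocess(self, data):
        # Turn each uploaded image file into a tensor batch.
        tensors = []
        for _, files in data.items():
            for _, file_content in files.items():
                image = Image.open(file_content).convert("RGB")
                tensors.append(self.transform(image))
        return torch.stack(tensors)

    def _inference(self, data):
        # Run the forward pass without tracking gradients.
        with torch.no_grad():
            return self.model(data)

    def _postprocess(self, data):
        # Return the predicted class index for each input image.
        return {"predicted_label": data.argmax(dim=1).tolist()}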