<a name="EN-US_TOPIC_0000001943974161"></a>
<h1 class="topictitle1">Introduction</h1>
<div id="body0000001164998752"><p id="EN-US_TOPIC_0000001943974161__en-us_topic_0172466148_p102525456382">When creating an AI application on the AI application management page, make sure that any meta model imported from OBS complies with certain specifications.</p>
<div class="note" id="EN-US_TOPIC_0000001943974161__en-us_topic_0172466148_note1783914168169"><img src="public_sys-resources/note_3.0-en-us.png"><span class="notetitle"> </span><div class="notebody"><ul id="EN-US_TOPIC_0000001943974161__ul9875201117424"><li id="EN-US_TOPIC_0000001943974161__li78761811154216">These model package specifications apply when you import a single model. To import multiple models (for example, a package that contains multiple model files), use a custom image instead.</li><li id="EN-US_TOPIC_0000001943974161__li1017614114218">If you want to use an AI engine that is not supported by ModelArts, use a custom image.</li><li id="EN-US_TOPIC_0000001943974161__li3966657164312">To create a custom image, see <a href="modelarts_23_0219.html">Custom Image Specifications for Creating an AI Application</a>.</li><li id="EN-US_TOPIC_0000001943974161__li376043812187">For more examples of custom scripts, see <a href="inference-modelarts-0078.html">Examples of Custom Scripts</a>.</li></ul>
</div></div>
<p id="EN-US_TOPIC_0000001943974161__en-us_topic_0172466148_p796583820131">The model package must contain the <strong id="EN-US_TOPIC_0000001943974161__b188241553462">model</strong> directory. The <strong id="EN-US_TOPIC_0000001943974161__b58240512467">model</strong> directory stores the model file, model configuration file, and model inference code file.</p>
<ul id="EN-US_TOPIC_0000001943974161__en-us_topic_0172466148_ul1225294533816"><li id="EN-US_TOPIC_0000001943974161__en-us_topic_0172466148_li1923644611319"><strong id="EN-US_TOPIC_0000001943974161__b8633101219461">Model files:</strong> The requirements for model files vary according to the model package structure. For details, see <a href="#EN-US_TOPIC_0000001943974161__en-us_topic_0172466148_section828936173917">Model Package Example</a>.</li><li id="EN-US_TOPIC_0000001943974161__en-us_topic_0172466148_li2253114515389"><strong id="EN-US_TOPIC_0000001943974161__b453042614612">Model configuration file</strong>: The model configuration file is mandatory and must be named <strong id="EN-US_TOPIC_0000001943974161__b9530122694611">config.json</strong>. There must be only one model configuration file. For details about how to edit a model configuration file, see <a href="inference-modelarts-0056.html">Specifications for Editing a Model Configuration File</a>.</li><li id="EN-US_TOPIC_0000001943974161__en-us_topic_0172466148_li17160121145519"><strong id="EN-US_TOPIC_0000001943974161__b1124333413486">Model inference code file</strong>: The model inference code file is mandatory and must be named <strong id="EN-US_TOPIC_0000001943974161__b1656538164815">customize_service.py</strong>. There must be only one model inference code file. For details about how to edit model inference code, see <a href="inference-modelarts-0057.html">Specifications for Writing Model Inference Code</a>.<ul id="EN-US_TOPIC_0000001943974161__en-us_topic_0172466148_ul111278413157"><li id="EN-US_TOPIC_0000001943974161__en-us_topic_0172466148_li171271741181511">The .py files on which <strong id="EN-US_TOPIC_0000001943974161__b2349649194915">customize_service.py</strong> depends can be directly stored in the <strong id="EN-US_TOPIC_0000001943974161__b1334974911499">model</strong> directory. Use relative imports to reference such a custom package.</li><li id="EN-US_TOPIC_0000001943974161__en-us_topic_0172466148_li13127144141518">The other files on which <strong id="EN-US_TOPIC_0000001943974161__b12622165434913">customize_service.py</strong> depends can also be stored in the <strong id="EN-US_TOPIC_0000001943974161__b14622954184910">model</strong> directory, but only absolute paths can be used to access them. For details, see <a href="inference-modelarts-0057.html#EN-US_TOPIC_0000001910014882__en-us_topic_0172466150_li135956421288">Obtaining an Absolute Path</a>.</li></ul>
</li></ul>
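<p>The two dependency rules above (relative imports for .py dependencies, absolute paths for other files) can be sketched as follows. This is an illustrative fragment only, not part of the ModelArts specification; the names <code>my_utils</code> and <code>vocab.txt</code> are hypothetical.</p>

```python
# Illustrative sketch of how a customize_service.py might reference its
# dependencies stored alongside it in the "model" directory.
# "my_utils" and "vocab.txt" are hypothetical example names.
import os

# A dependent .py module in the model directory is referenced with a
# relative import, for example:
# from . import my_utils

# Other dependent files must be accessed via an absolute path derived
# from the location of customize_service.py itself:
MODEL_DIR = os.path.dirname(os.path.realpath(__file__))

def resource_path(name):
    """Return the absolute path of a file stored in the model directory."""
    return os.path.join(MODEL_DIR, name)

# Example use: label_file = resource_path("vocab.txt")  # hypothetical file
```

<p>Deriving the path from <code>__file__</code> keeps the code independent of the service's working directory, which is why a plain relative path such as <code>"vocab.txt"</code> is not reliable.</p>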
<p id="EN-US_TOPIC_0000001943974161__en-us_topic_0172466148_p1161310181139">ModelArts also provides custom script examples for common AI engines. For details, see <a href="inference-modelarts-0079.html">Examples of Custom Scripts</a>.</p>
<div class="section" id="EN-US_TOPIC_0000001943974161__en-us_topic_0172466148_section828936173917"><a name="EN-US_TOPIC_0000001943974161__en-us_topic_0172466148_section828936173917"></a><a name="en-us_topic_0172466148_section828936173917"></a><h4 class="sectiontitle">Model Package Example</h4><ul id="EN-US_TOPIC_0000001943974161__en-us_topic_0172466148_ul5738112214391"><li id="EN-US_TOPIC_0000001943974161__en-us_topic_0172466148_li167381221395">Structure of the TensorFlow-based model package<p id="EN-US_TOPIC_0000001943974161__en-us_topic_0172466148_p550623114397"><a name="EN-US_TOPIC_0000001943974161__en-us_topic_0172466148_li167381221395"></a><a name="en-us_topic_0172466148_li167381221395"></a>When publishing the model, you only need to specify the <span class="filepath" id="EN-US_TOPIC_0000001943974161__filepath6196172395212"><b>ocr</b></span> directory.</p>
<pre class="screen" id="EN-US_TOPIC_0000001943974161__en-us_topic_0172466148_screen11341145213397">OBS bucket/directory name
|── ocr
| ├── model (Mandatory) Fixed name of the subdirectory that stores model-related files
| │ ├── &lt;&lt;Custom Python package&gt;&gt; (Optional) Your Python package, which can be directly referenced in the model inference code
| │ ├── saved_model.pb (Mandatory) Protocol buffer file, which contains the graph definition of the model
| │ ├── variables Fixed name of the subdirectory that contains the weights and biases of the model. It is mandatory when the main model file is in *.pb format.
| │ │ ├── variables.index Mandatory
| │ │ ├── variables.data-00000-of-00001 Mandatory
| │ ├── config.json (Mandatory) Model configuration file. The file name is fixed to <strong id="EN-US_TOPIC_0000001943974161__b1550442095511">config.json</strong>. Only one model configuration file is allowed.
| │ ├── customize_service.py (Mandatory) Model inference code. The file name is fixed to <strong id="EN-US_TOPIC_0000001943974161__b9970727165511">customize_service.py</strong>. Only one model inference code file is allowed. The files on which <strong id="EN-US_TOPIC_0000001943974161__b1497172745513">customize_service.py</strong> depends can be directly stored in the model directory.</pre>
</li><li id="EN-US_TOPIC_0000001943974161__en-us_topic_0172466148_li1823332471014">Structure of the MindSpore-based model package<pre class="screen" id="EN-US_TOPIC_0000001943974161__en-us_topic_0172466148_screen7345164431016">OBS bucket/directory name
|── resnet
| ├── model (Mandatory) Fixed name of the subdirectory that stores model-related files
| │ ├── &lt;&lt;Custom Python package&gt;&gt; (Optional) Your Python package, which can be directly referenced in the model inference code
| │ ├── checkpoint_lenet_1-1_1875.ckpt (Mandatory) Model file in ckpt format trained using MindSpore
| │ ├── config.json (Mandatory) Model configuration file. The file name is fixed to <strong id="EN-US_TOPIC_0000001943974161__b1198729105613">config.json</strong>. Only one model configuration file is allowed.
| │ ├── customize_service.py (Mandatory) Model inference code. The file name is fixed to <strong id="EN-US_TOPIC_0000001943974161__b1567261810564">customize_service.py</strong>. Only one model inference code file is allowed. The files on which <strong id="EN-US_TOPIC_0000001943974161__b146731918145613">customize_service.py</strong> depends can be directly stored in the model directory.
| │ ├── tmp.om (Mandatory) An empty .om file that enables the model package to be imported</pre>
<p id="EN-US_TOPIC_0000001943974161__en-us_topic_0172466148_p1227283611013"></p>
</li><li id="EN-US_TOPIC_0000001943974161__en-us_topic_0172466148_li737020312406">Structure of the image-based model package<p id="EN-US_TOPIC_0000001943974161__en-us_topic_0172466148_p6955191510401"><a name="EN-US_TOPIC_0000001943974161__en-us_topic_0172466148_li737020312406"></a><a name="en-us_topic_0172466148_li737020312406"></a>When publishing the model, you only need to specify the <span class="filepath" id="EN-US_TOPIC_0000001943974161__filepath1576020817573"><b>resnet</b></span> directory.</p>
<pre class="screen" id="EN-US_TOPIC_0000001943974161__en-us_topic_0172466148_screen1919154744017">OBS bucket/directory name
|── resnet
| ├── model (Mandatory) Fixed name of the subdirectory that stores model-related files
| │ ├── config.json (Mandatory) Model configuration file, in which the SWR image address must be configured. The file name is fixed to <strong id="EN-US_TOPIC_0000001943974161__b1067745505720">config.json</strong>. Only one model configuration file is allowed.</pre>
</li><li id="EN-US_TOPIC_0000001943974161__en-us_topic_0172466148_li610313145402">Structure of the PyTorch-based model package<p id="EN-US_TOPIC_0000001943974161__en-us_topic_0172466148_p164232524010"><a name="EN-US_TOPIC_0000001943974161__en-us_topic_0172466148_li610313145402"></a><a name="en-us_topic_0172466148_li610313145402"></a>When publishing the model, you only need to specify the <span class="filepath" id="EN-US_TOPIC_0000001943974161__filepath777217814574"><b>resnet</b></span> directory.</p>
<pre class="screen" id="EN-US_TOPIC_0000001943974161__en-us_topic_0172466148_screen19421103115403">OBS bucket/directory name
|── resnet
| ├── model (Mandatory) Fixed name of the subdirectory that stores model-related files
| │ ├── &lt;&lt;Custom Python package&gt;&gt; (Optional) Your Python package, which can be directly referenced in the model inference code
| │ ├── resnet50.pth (Mandatory) PyTorch model file, which contains variable and weight information and is saved as a <strong id="EN-US_TOPIC_0000001943974161__b16459720175920">state_dict</strong>
| │ ├── config.json (Mandatory) Model configuration file. The file name is fixed to <strong id="EN-US_TOPIC_0000001943974161__b628810332594">config.json</strong>. Only one model configuration file is allowed.
| │ ├── customize_service.py (Mandatory) Model inference code. The file name is fixed to <strong id="EN-US_TOPIC_0000001943974161__b17146173675916">customize_service.py</strong>. Only one model inference code file is allowed. The files on which <strong id="EN-US_TOPIC_0000001943974161__b12147113613597">customize_service.py</strong> depends can be directly stored in the <strong id="EN-US_TOPIC_0000001943974161__b1814753685918">model</strong> directory.</pre>
</li></ul>
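<p>All the layouts above share the same mandatory core: a <strong>model</strong> subdirectory containing exactly one <strong>config.json</strong>, plus one <strong>customize_service.py</strong> for every package type except the image-based one. As a minimal sketch, a local pre-upload check could look like this; <code>validate_model_package</code> is a hypothetical helper, not a ModelArts tool.</p>

```python
# Hypothetical pre-upload check for the package layouts described above.
# It verifies only the mandatory files common to all package types.
import os

def validate_model_package(package_dir, needs_inference_code=True):
    """Return a list of problems with an OBS model package layout.

    package_dir is the directory you specify when publishing the model
    (for example "ocr" or "resnet"); it must contain a "model" subdirectory.
    Pass needs_inference_code=False for image-based packages, which require
    only config.json (with the SWR image address configured).
    """
    model_dir = os.path.join(package_dir, "model")
    if not os.path.isdir(model_dir):
        return ["missing mandatory 'model' subdirectory"]

    problems = []
    if not os.path.isfile(os.path.join(model_dir, "config.json")):
        problems.append("missing mandatory 'model/config.json'")
    if needs_inference_code and not os.path.isfile(
            os.path.join(model_dir, "customize_service.py")):
        problems.append("missing mandatory 'model/customize_service.py'")
    return problems
```

<p>Framework-specific requirements (such as <strong>saved_model.pb</strong> with its <strong>variables</strong> subdirectory, or the empty <strong>tmp.om</strong> file for MindSpore) are not checked here; the listings above remain the authoritative reference.</p>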
</div>
</div>
<div>
<div class="familylinks">
<div class="parentlink"><strong>Parent topic:</strong> <a href="inference-modelarts-0054.html">Model Package Specifications</a></div>
</div>
</div>