OpenVINO Part 4: Converting TensorFlow Models

西特张 2022-08-06



2 TensorFlow Models Supported by OpenVINO

  • Inception v1, Inception v2, Inception v3, Inception v4, Inception ResNet v2
  • MobileNet v1 128, MobileNet v1 160, MobileNet v1 224
  • NasNet Large, NasNet Mobile
  • ResidualNet-50, ResidualNet-101, ResidualNet-152
  • VGG-16, VGG-19
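
The post does not show the conversion command itself, so here is a minimal sketch of how one of the models above is typically converted with the Model Optimizer. It assumes a classic OpenVINO release that ships mo_tf.py and a frozen Inception v3 graph named inception_v3_frozen.pb; the file name, input shape, and output directory are illustrative assumptions, while --input_model, --input_shape, --data_type, and --output_dir are standard Model Optimizer options.

```python
import subprocess

# Convert a frozen TensorFlow graph to OpenVINO IR with the Model Optimizer.
# The model file name, input shape, and output directory below are assumptions
# used only for illustration.
cmd = [
    "python3", "mo_tf.py",
    "--input_model", "inception_v3_frozen.pb",  # frozen .pb from the list above
    "--input_shape", "[1,299,299,3]",           # NHWC shape expected by Inception v3
    "--data_type", "FP16",                      # generate half-precision IR
    "--output_dir", "./ir",                     # .xml / .bin files are written here
]
subprocess.run(cmd, check=True)
```

If the conversion succeeds, the Model Optimizer writes an .xml topology file and a .bin weights file, which together form the Intermediate Representation described in the next section.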

3 TensorFlow Layers Supported by OpenVINO and Their Counterparts in the Intermediate Representation (IR)

| Number | Operation Name in TensorFlow | Layer Name in the Intermediate Representation |
|---|---|---|
| 1 | Transpose | Permute |
| 2 | LRN | Norm |
| 3 | Split | Split |
| 4 | SplitV | Split |
| 5 | FusedBatchNorm | ScaleShift (can be fused into Convolution or FullyConnected) |
| 6 | Relu6 | Clamp |
| 7 | DepthwiseConv2dNative | Convolution |
| 8 | ExpandDims | Constant propagation |
| 9 | Slice | Split |
| 10 | ConcatV2 | Concat |
| 11 | MatMul | FullyConnected |
| 12 | Pack | Reshapes and Concat |
| 13 | StridedSlice | Constant propagation, and several cases when StridedSlice can be expressed with Splits |
| 14 | Prod | Constant propagation |
| 15 | Const | Constant propagation |
| 16 | Tile | Tile |
| 17 | Placeholder | Input |
| 18 | Pad | Fused into Convolution or Pooling layers (not supported as a single operation) |
| 19 | Conv2D | Convolution |
| 20 | Conv2DBackpropInput | Deconvolution |
| 21 | Identity | Ignored, does not appear in the IR |
| 22 | Add | Eltwise (operation = sum) |
| 23 | Mul | Eltwise (operation = mul) |
| 24 | Maximum | Eltwise (operation = max) |
| 25 | Rsqrt | Power (power = -0.5) |
| 26 | Neg | Power (scale = -1) |
| 27 | Sub | Eltwise (operation = sum) + Power (scale = -1) |
| 28 | Relu | ReLU |
| 29 | AvgPool | Pooling (pool_method = avg) |
| 30 | MaxPool | Pooling (pool_method = max) |
| 31 | Mean | Pooling (pool_method = avg); only spatial dimensions are supported |
| 32 | RandomUniform | Not supported |
| 33 | BiasAdd | Fused, or converted to ScaleShift |
| 34 | Reshape | Reshape |
| 35 | Squeeze | Reshape |
| 36 | Shape | Constant propagation (or layer generation if the "--keep_shape_ops" command-line parameter has been specified) |
| 37 | Softmax | SoftMax |
| 38 | SpaceToBatchND | Supported in a pattern when converted to the Convolution layer dilation attribute; Constant propagation |
| 39 | BatchToSpaceND | Supported in a pattern when converted to the Convolution layer dilation attribute; Constant propagation |
| 40 | StopGradient | Ignored, does not appear in the IR |
| 41 | Square | Constant propagation |
| 42 | Sum | Pooling (pool_method = avg) + Eltwise (operation = mul) |
| 43 | Range | Constant propagation |
| 44 | CropAndResize | ROIPooling (if the method is 'bilinear') |
| 45 | ArgMax | ArgMax |
| 46 | DepthToSpace | Reshape + Permute + Reshape (works for CPU only because of 6D tensors) |
| 47 | ExtractImagePatches | ReorgYolo |
| 48 | ResizeBilinear | Interp |
| 49 | ResizeNearestNeighbor | Resample |
| 50 | Unpack | Split + Reshape (removes the dimension being unpacked) if the number of parts equals the size along the given axis |
| 51 | AddN | Several Eltwises |
| 52 | Concat | Concat |
| 53 | Minimum | Power (scale = -1) + Eltwise (operation = max) + Power (scale = -1) |
| 54 | Unsqueeze | Reshape |
| 55 | RealDiv | Power (power = -1) and Eltwise (operation = mul) |
| 56 | SquaredDifference | Power (scale = -1) + Eltwise (operation = sum) + Power (power = 2) |
| 57 | Gather | Gather |
| 58 | GatherV2 | Gather |
| 59 | ResourceGather | Gather |
| 60 | Sqrt | Power (power = 0.5) |
| 61 | Square | Power (power = 2) |
| 62 | Pad | Pad |
| 63 | PadV2 | Pad |
| 64 | MirrorPad | Pad |
| 65 | ReverseSequence | ReverseSequence |
| 66 | ZerosLike | Constant propagation |
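
These mappings can be checked on a converted model by loading the IR and inspecting its inputs and outputs. The sketch below assumes the classic openvino.inference_engine Python API and the file names from the conversion example in section 2; for instance, the TensorFlow Placeholder shows up as the network input (row 17 above).

```python
import numpy as np
from openvino.inference_engine import IECore

# Load the IR produced by the Model Optimizer (file names are assumptions).
ie = IECore()
net = ie.read_network(model="ir/inception_v3_frozen.xml",
                      weights="ir/inception_v3_frozen.bin")

# The TensorFlow Placeholder becomes the network Input (row 17 above);
# the IR stores image inputs in NCHW layout.
input_name = next(iter(net.input_info))
_, c, h, w = net.input_info[input_name].input_data.shape

# Run inference on random data on CPU just to confirm the IR is usable.
exec_net = ie.load_network(network=net, device_name="CPU")
dummy = np.random.rand(1, c, h, w).astype(np.float32)
result = exec_net.infer(inputs={input_name: dummy})
print({name: out.shape for name, out in result.items()})
```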

References:
1. Converting a TensorFlow* Model
2. Supported Framework Layers

