Notes on TVM/Relay's PartitionGraph()(mod) function
A question about TVM/Relay's graph-partitioning feature, with a simple example and the resulting error message.
The PartitionGraph() pass expects the graph to already be annotated with a target via the AnnotateTarget(["target"]) pass. The example below tries to split the "add" operator into a separate Relay function (either through the Relay pattern language or by traversing the AST), in order to understand how PartitionGraph() behaves in a simple case.
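The annotation step only marks operators whose registered predicate returns True. The registration pattern behind `_register_external_op_helper` can be sketched without TVM using a plain decorator-based registry (the names `OP_REGISTRY`, `register_op`, and `is_supported` below are illustrative stand-ins, not part of the TVM API):

```python
# A minimal sketch of the decorator-based op-registration pattern used by
# tvm.ir.register_op_attr. OP_REGISTRY / register_op / is_supported are
# hypothetical names for illustration, not TVM APIs.
OP_REGISTRY = {}

def register_op(op_name, attr_key):
    """Store a predicate under (op_name, attr_key), like register_op_attr."""
    def decorator(func):
        OP_REGISTRY[(op_name, attr_key)] = func
        return func
    return decorator

def register_external_op_helper(op_name, supported=True):
    """Mirror of _register_external_op_helper: mark op as supported."""
    @register_op(op_name, "target.special")
    def _func_wrapper(attrs, args):
        return supported
    return _func_wrapper

register_external_op_helper("add")
register_external_op_helper("subtract")

def is_supported(op_name, attrs=None, args=None):
    """What an annotation pass would ask per call node."""
    pred = OP_REGISTRY.get((op_name, "target.special"))
    return bool(pred and pred(attrs, args))

print(is_supported("add"))       # True  -> would be annotated
print(is_supported("multiply"))  # False -> stays on the default target
```

In the real pass, calls whose predicate returns True are wrapped in compiler_begin/compiler_end annotations, which PartitionGraph() later uses to cut out subfunctions.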
Here is the code:
import tvm
from tvm import relay

graph_type = 1

def _register_external_op_helper(op_name, supported=True):
    @tvm.ir.register_op_attr(op_name, "target.special")
    def _func_wrapper(attrs, args):
        return supported
    return _func_wrapper

_register_external_op_helper("add")
_register_external_op_helper("subtract")

if graph_type == 1:
    # Test case for graph type 1
    print("Graph type 1")
    # graph 1: true branch
    x1 = relay.var('x', shape=(10, 1))
    y1 = relay.var('y', shape=(10, 1))
    # graph 2: false branch
    x2 = relay.var('x', shape=(10, 1))
    y2 = relay.var('y', shape=(10, 1))
    f1 = relay.op.add(x1, y1)
    f2 = relay.op.multiply(x2, y2)
    cond = relay.var('c')
    result = relay.If(cond, true_branch=f1, false_branch=f2)
    f = relay.Function([], result)
    mod = tvm.IRModule({"main": f})
    mod = relay.transform.AnnotateTarget(["special"])(mod)  # ==> it GIVES the ERROR here
    mod = relay.transform.PartitionGraph()(mod)
Here is the error message:
Graph type 1
Traceback (most recent call last):
  File "C:\Program Files\JetBrains\PyCharm 2020.1.2\plugins\python\helpers\pydev\pydevd.py", line 1438, in _exec
    pydev_imports.execfile(file, globals, locals)  # execute the script
  File "C:\Program Files\JetBrains\PyCharm 2020.1.2\plugins\python\helpers\pydev\_pydev_imps\_pydev_execfile.py", line 18, in execfile
    exec(compile(contents + "\n", file, 'exec'), glob, loc)
  File "C:/repos/tvm23/tvm/graph_opt/subgraph/PartitionGraphTry.py", line 48, in <module>
    mod = relay.transform.AnnotateTarget(["special"])(mod)
  File "C:\repos\tvm23\tvm\python\tvm\ir\transform.py", line 127, in __call__
    return _ffi_transform_api.RunPass(self, mod)
  File "C:\repos\tvm23\tvm\python\tvm\_ffi\_ctypes\packed_func.py", line 237, in __call__
    raise get_last_ffi_error()
tvm._ffi.base.TVMError: Traceback (most recent call last):
  File "C:\repos\tvm23\tvm\src\ir\module.cc", line 192
TVMError: Check failed: fv.size() == 0 (5 vs. 0) : There are free variables: [Var(c, ty=TensorType([], bool)), Var(x, ty=TensorType([10, 1], float32)), Var(y, ty=TensorType([10, 1], float32)), Var(x, ty=TensorType([10, 1], float32)), Var(y, ty=TensorType([10, 1], float32))] in function: #[version = "0.0.5"]
fn () -> Tensor[(10, 1), float32] {
  free_var %c: bool;
  if (%c) {
    free_var %x: Tensor[(10, 1), float32];
    free_var %y: Tensor[(10, 1), float32];
    add(%x, %y) /* ty=Tensor[(10, 1), float32] */
  } else {
    free_var %x1: Tensor[(10, 1), float32];
    free_var %y1: Tensor[(10, 1), float32];
    multiply(%x1, %y1) /* ty=Tensor[(10, 1), float32] */
  }
}
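The check that fires here (`fv.size() == 0`) is a standard free-variable analysis: every Var referenced in the body must be bound by a function parameter (or a let). The idea can be sketched without TVM on a toy AST; the classes `Var`, `Call`, `If`, and `Function` below are stand-ins for illustration, not the Relay classes:

```python
# Toy free-variable analysis mirroring the check in src/ir/module.cc.
# Var/Call/If/Function are hypothetical stand-ins, not Relay classes.
class Var:
    def __init__(self, name): self.name = name

class Call:
    def __init__(self, op, args): self.op, self.args = op, args

class If:
    def __init__(self, cond, t, f): self.cond, self.t, self.f = cond, t, f

class Function:
    def __init__(self, params, body): self.params, self.body = params, body

def free_vars(expr, bound=frozenset()):
    """Collect Vars not bound by an enclosing Function's parameters."""
    if isinstance(expr, Var):
        return [] if expr in bound else [expr]
    if isinstance(expr, Call):
        return [v for a in expr.args for v in free_vars(a, bound)]
    if isinstance(expr, If):
        return (free_vars(expr.cond, bound)
                + free_vars(expr.t, bound)
                + free_vars(expr.f, bound))
    if isinstance(expr, Function):
        return free_vars(expr.body, bound | frozenset(expr.params))
    return []

# Rebuild the failing graph: two separate (x, y) pairs plus a condition.
x1, y1, x2, y2, c = Var("x"), Var("y"), Var("x"), Var("y"), Var("c")
body = If(c, Call("add", [x1, y1]), Call("multiply", [x2, y2]))

bad = Function([], body)                    # no parameters declared
print(len(free_vars(bad)))                  # 5, matching "(5 vs. 0)"

good = Function([c, x1, y1, x2, y2], body)  # bind every variable
print(len(free_vars(good)))                 # 0
```

This reproduces the "5 vs. 0" count from the error: the five Vars (c plus the two x/y pairs) are used in the body but never declared as parameters of the zero-argument function.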
Possible causes of the error
1) The if/else handling in this pass might not be correct.
2) Compare with apache/incubator-tvm/blob/main/tests/python/relay/test_pass_annotate_target.py, where the annotated functions bind all their variables as parameters:

    f = relay.Function([x], out)
    mod = tvm.IRModule.from_expr(f)
    return mod

mod = transform.AnnotateTarget("A")(before())
mod = transform.AnnotateTarget("B")(mod)
expected = transform.AnnotateTarget(["A", "B"])(before())
assert tvm.ir.structural_equal(expected, mod)

def test_if_else():
    target = "test_if_else"

    @tvm.ir.register_op_attr("equal", "target." + target)
    def relu(attrs, args):  # pylint: disable=unused-variable
        return True

    @tvm.ir.register_op_attr("tanh", "target." + target)
    def tanh(attrs, args):  # pylint: disable=unused-variable
        return True
3) Isn't it simply a problem of free variables? I suggest replacing

    f = relay.Function([], result)

with

    f = relay.Function(relay.analysis.free_vars(result), result)

so that every variable used in the body becomes a bound function parameter.