Bump to depend on iree 3.0.0 packages. #275
base: main
Conversation
Seems there are some failures which might be due to the update :/

Ack. Fixed a few, but it looks like there are more.
```diff
 def_func_op.attributes["translation_info"] = Attribute.parse(
-    f"#iree_codegen.translation_info<None "
-    f"workgroup_size=[{','.join(str(x) for x in workgroup_size)}]"
+    f"#iree_codegen.translation_info<pipeline = None "
+    f"workgroup_size=[{','.join(str(x) for x in workgroup_size)}] "
+    f"subgroup_size={subgroup_size}>"
```
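For reference, the string assembly on its own can be sketched as a standalone snippet (no MLIR required; the example sizes match the failing test reported later in this thread). Note that actually parsing the result still requires an IREE build new enough to accept the `pipeline =` key syntax.

```python
# Standalone sketch of how the translation_info attribute text is built.
# This only exercises the f-string assembly, not MLIR attribute parsing.
workgroup_size = [128, 2, 1]  # example values from the failing test
subgroup_size = 64

attr_text = (
    f"#iree_codegen.translation_info<pipeline = None "
    f"workgroup_size=[{','.join(str(x) for x in workgroup_size)}] "
    f"subgroup_size={subgroup_size}>"
)
print(attr_text)
# → #iree_codegen.translation_info<pipeline = None workgroup_size=[128,2,1] subgroup_size=64>
```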
@qedawkins @kuhar I'm still getting errors here in some tests. Not sure what I'm doing wrong
```
    with InsertionPoint(self.def_module.body_block):
        def_func_op = func_d.FuncOp(name, def_ftype)
        def_func_block = def_func_op.add_entry_block()
        def_func_args = list(def_func_block.arguments)
        if workgroup_size is not None and subgroup_size is not None:
>           def_func_op.attributes["translation_info"] = Attribute.parse(
                f"#iree_codegen.translation_info<pipeline = None "
                f"workgroup_size=[{','.join(str(x) for x in workgroup_size)}] "
                f"subgroup_size={subgroup_size}>"
            )
E   iree.compiler._mlir_libs._site_initialize.<locals>.MLIRError: Unable to parse attribute:
E   error: "#iree_codegen.translation_info<pipeline = None workgroup_size=[128,2,1] subgroup_size=64>":1:32: expected ::mlir::iree_compiler::IREE::Codegen::DispatchLoweringPassPipeline to be one of: CPUDefault, CPUDoubleTilingExpert, CPUConvTileAndDecomposeExpert, Mmt4dTilingExpert, CPUBufferOpsTileAndVectorize, CPUDataTiling, CPULinalgExtTileAndVectorize, LLVMGPUDefault, LLVMGPUBaseLowering, LLVMGPUDistribute, LLVMGPUVectorize, LLVMGPUMatmulSimt, LLVMGPUMatmulTensorCore, LLVMGPUTransposeSharedMem, LLVMGPUWarpReduction, LLVMGPUPackUnPack, LLVMGPUMatmulTensorCoreMmaSync, LLVMGPUVectorDistribute, LLVMGPUPadAndVectorDistribute, LLVMGPUWinogradVectorize, LLVMGPUTileAndFuse, SPIRVBaseLowering, SPIRVBaseDistribute, SPIRVBaseVectorize, SPIRVSubgroupReduce, SPIRVMatmulPromoteVectorize, SPIRVCooperativeMatrixVectorize, SPIRVWinogradVectorize, VMVXDefault, TransformDialectCodegen, Custom, None
E   error: "#iree_codegen.translation_info<pipeline = None workgroup_size=[128,2,1] subgroup_size=64>":1:41: failed to parse DispatchLoweringPassPipelineAttr parameter 'value' which is to be a `::mlir::iree_compiler::IREE::Codegen::DispatchLoweringPassPipeline`
E   error: "#iree_codegen.translation_info<pipeline = None workgroup_size=[128,2,1] subgroup_size=64>":1:41: failed to parse IREECodegen_TranslationInfoAttr parameter 'passPipeline' which is to be a `IREE::Codegen::DispatchLoweringPassPipelineAttr`

iree/turbine/kernel/compiler/dispatch_codegen.py:167: MLIRError
```
Ah, I think those workflows still installed the older versions. The failures should go away after pushing a stable release, or I can have them install the nightly releases too.
Passing `--upgrade` in the workflows helped. Looks like some of the runners are persistent and don't clear out their venvs between runs. FYI @harsh-nod
See the release tracker at iree-org/iree#19063.
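A hypothetical sketch of the kind of workflow step the `--upgrade` fix describes, so a persistent runner's venv replaces stale packages instead of reusing them. The package names and the find-links index below are illustrative (taken from IREE's public release instructions); the repo's actual workflows may install from a requirements file instead.

```yaml
# Hypothetical GitHub Actions step; adjust package names and index
# to match what the repository's workflows actually install.
- name: Install IREE packages
  run: |
    python -m pip install --upgrade \
      --find-links https://iree.dev/pip-release-links.html \
      iree-base-compiler iree-base-runtime
```

Without `--upgrade`, pip considers an already-installed older version satisfactory and skips it, which is why persistent runners kept seeing the old attribute syntax.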