flow.tensor_split
Summary

A crash is triggered when dim=0; with any other dim the call behaves normally.

Code to reproduce bug
import oneflow as flow

input_tensor = flow.tensor([[1, 2, 3], [4, 5, 6]])
invalid_indices = [0, 3]  # 3 is out of range for dim 0, which has size 2

output = flow.tensor_split(input_tensor, invalid_indices, dim=0)
print("Output tensors:", output)
output:
Traceback (most recent call last):
  File "test.py", line 6, in <module>
F20241205 10:03:01.176000 2474655 slice_kernel.cpp:260] Check failed: large_slice_param.elem_cnt() == small_slice_param.elem_cnt() (6 vs. 9)
*** Check failure stack trace: ***
    output = flow.tensor_split(input_tensor, invalid_indices, dim=0)
RuntimeError: Slice start must be less or equal to stop
  File "/home/ci-user/runners/release/_work/oneflow/oneflow/oneflow/core/functional/impl/math_functor.cpp", line 2420, in operator()
    Slice(input, start, stop, step, false)
  File "/home/ci-user/runners/release/_work/oneflow/oneflow/oneflow/core/framework/op_interpreter/op_interpreter_util.cpp", line 144, in Dispatch<oneflow::one::Tensor>
    Dispatch<TensorTuple>(op_expr, inputs, ctx)
  File "/home/ci-user/runners/release/_work/oneflow/oneflow/oneflow/core/framework/op_interpreter/op_interpreter_util.cpp", line 135, in Dispatch<oneflow::one::TensorTuple>
    Dispatch(op_expr, inputs, outputs.get(), ctx)
  File "/home/ci-user/runners/release/_work/oneflow/oneflow/oneflow/core/framework/op_interpreter/op_interpreter.cpp", line 96, in Apply
    internal_->Apply(op_expr, inputs, outputs, ctx)
  File "/home/ci-user/runners/release/_work/oneflow/oneflow/oneflow/core/framework/op_interpreter/eager_local_op_interpreter.cpp", line 84, in NaiveInterpret
    [&]() -> Maybe<const LocalTensorInferResult> { LocalTensorMetaInferArgs ... mut_local_tensor_infer_cache()->GetOrInfer(infer_args)); }()
  File "/home/ci-user/runners/release/_work/oneflow/oneflow/oneflow/core/framework/op_interpreter/eager_local_op_interpreter.cpp", line 84, in operator()
    user_op_expr.mut_local_tensor_infer_cache()->GetOrInfer(infer_args)
  File "/home/ci-user/runners/release/_work/oneflow/oneflow/oneflow/core/framework/local_tensor_infer_cache.cpp", line 207, in GetOrInfer
    Infer(*user_op_expr, infer_args)
  File "/home/ci-user/runners/release/_work/oneflow/oneflow/oneflow/core/framework/local_tensor_infer_cache.cpp", line 178, in Infer
    user_op_expr.InferPhysicalTensorDesc( infer_args.attrs ... ) -> TensorMeta* { return &output_mut_metas.at(i); })
  File "/home/ci-user/runners/release/_work/oneflow/oneflow/oneflow/core/framework/op_expr.cpp", line 571, in InferPhysicalTensorDesc
    physical_tensor_desc_infer_fn_(&infer_ctx)
  File "/home/ci-user/runners/release/_work/oneflow/oneflow/oneflow/user/ops/slice_op.cpp", line 195, in InferPhysicalTensorDesc
Error Type: oneflow.ErrorProto.check_failed_error
    @     0x7f52549d09ca  google::LogMessage::Fail()
    @     0x7f52549d0cb2  google::LogMessage::SendToLog()
    @     0x7f52549d0537  google::LogMessage::Flush()
    @     0x7f52549d30a9  google::LogMessageFatal::~LogMessageFatal()
    @     0x7f52505e1c91  oneflow::WriteSlice<>()
    @     0x7f52505eeeb1  oneflow::SliceKernel<>::Compute()
    @     0x7f525074e536  oneflow::one::StatefulOpKernel::Compute()
    @     0x7f524e9e8cab  oneflow::vm::OpCallInstructionUtil::Compute()
    @     0x7f524e9e6787  oneflow::vm::OpCallInstructionPolicy::Compute()
    @     0x7f524e9e25bc  oneflow::vm::Instruction::Compute()
    @     0x7f524e9e0a6f  oneflow::vm::EpStreamPolicyBase::Run()
    @     0x7f524e9ec086  oneflow::vm::StreamPolicy::RunIf()
    @     0x7f524e9f36de  oneflow::vm::ThreadCtx::TryReceiveAndRun()
    @     0x7f524e9f5d2d  oneflow::(anonymous namespace)::WorkerLoop()
    @     0x7f524e9f611f  _ZNSt6thread11_State_implINS_8_InvokerISt5tupleIJPFvPN7oneflow2vm9ThreadCtxERKSt8functionIFvS6_EEES6_ZNS3_14VirtualMachine15CreateThreadCtxENS3_6SymbolINS3_6DeviceEEENS3_10StreamTypeEmEUlS6_E3_EEEEE6_M_runEv
    @     0x7f52549e540f  execute_native_thread_routine
    @     0x7f533c4beb43  (unknown)
    @     0x7f533c550a00  (unknown)
Aborted (core dumped)
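For contrast with the failing dim=0 case, the same index list along dim=1 does not crash here, which matches the "otherwise normal" note above. The snippet below is only a sketch of that check; the shapes in the comment are what ordinary slicing semantics would give, not output copied from a run.

import oneflow as flow

input_tensor = flow.tensor([[1, 2, 3], [4, 5, 6]])
indices = [0, 3]

# Splitting along dim=1 (size 3) with the same indices returns normally.
# By slicing semantics the chunks should be input[:, 0:0], input[:, 0:3],
# input[:, 3:], i.e. shapes (2, 0), (2, 3), (2, 0).
chunks = flow.tensor_split(input_tensor, indices, dim=1)
print([c.shape for c in chunks])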
System Information

python3 -m oneflow --doctor
path: ['/home/miniconda3/envs/oneflow/lib/python3.9/site-packages/oneflow']
version: 0.9.0
git_commit: 381b12c
cmake_build_type: Release
rdma: True
mlir: True
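Until tensor_split validates its indices against the target dimension, a possible user-side workaround is to clamp the split points before calling the op. clamped_tensor_split below is a hypothetical helper, not a OneFlow API; I have only reasoned that the clamped indices stay in range, so whether it sidesteps the fatal check in every case is not verified.

import oneflow as flow

def clamped_tensor_split(t, indices, dim=0):
    # Hypothetical helper: clamp every split index into [0, t.shape[dim]] so
    # the slices generated by tensor_split stay inside the tensor and the
    # CHECK in slice_kernel.cpp should not be reached.
    size = t.shape[dim]
    safe = [min(max(int(i), 0), size) for i in indices]
    return flow.tensor_split(t, safe, dim=dim)

input_tensor = flow.tensor([[1, 2, 3], [4, 5, 6]])
chunks = clamped_tensor_split(input_tensor, [0, 3], dim=0)  # [0, 3] becomes [0, 2]
print([c.shape for c in chunks])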