author    Benjamin Kramer <kramerb@google.com>  2018-07-27 11:04:02 -0700
committer TensorFlower Gardener <gardener@tensorflow.org>  2018-07-27 11:09:45 -0700
commit    85a265031bbd61f5924cdbfdc316df6979883581 (patch)
tree      e0f28efca4c3c3d0529f65d22bd361bc9900ee46 /tensorflow/stream_executor
parent    22107ea509d16803f33f723d54d313b9be5622cc (diff)
[XLA:GPU] Only add the cubin if it is available
It's only non-empty if we were able to run ptxas. If the PTX is going to be
JIT'ed by the driver it won't be around. Loading an empty cubin will result in
a fatal error.
PiperOrigin-RevId: 206341931
Diffstat (limited to 'tensorflow/stream_executor')
 tensorflow/stream_executor/module_spec.h | 1 +
 1 file changed, 1 insertion(+), 0 deletions(-)
diff --git a/tensorflow/stream_executor/module_spec.h b/tensorflow/stream_executor/module_spec.h
index 212ae7ba9c..75bdfed2d7 100644
--- a/tensorflow/stream_executor/module_spec.h
+++ b/tensorflow/stream_executor/module_spec.h
@@ -43,6 +43,7 @@ class MultiModuleLoaderSpec {
   }

   void AddCudaCubinInMemory(port::ArraySlice<const uint8> cubin_bytes) {
+    CHECK(!cubin_bytes.empty());
     has_cuda_cubin_in_memory_ = true;
     cuda_cubin_in_memory_ = cubin_bytes;
   }
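The caller-side pattern implied by this check can be sketched as below. This is a minimal, self-contained illustration with hypothetical names (ModuleSpec, AddCubin, BuildSpec are invented for this sketch, not the real StreamExecutor API): only register a cubin when ptxas actually produced one, and otherwise leave only the PTX so the driver can JIT-compile it.

```cpp
#include <cassert>
#include <cstdint>
#include <string>
#include <vector>

// Hypothetical stand-in for MultiModuleLoaderSpec: records at most one of
// a precompiled cubin or PTX text to be loaded later.
class ModuleSpec {
 public:
  void AddCubin(const std::vector<uint8_t>& cubin) {
    // Mirrors the CHECK in the patch: an empty cubin must never be
    // registered, since loading it would be a fatal error.
    assert(!cubin.empty());
    has_cubin_ = true;
    cubin_ = cubin;
  }
  void AddPtx(const std::string& ptx) { ptx_ = ptx; }
  bool has_cubin() const { return has_cubin_; }
  const std::string& ptx() const { return ptx_; }

 private:
  bool has_cubin_ = false;
  std::vector<uint8_t> cubin_;
  std::string ptx_;
};

// Caller side: the cubin is only non-empty if ptxas ran successfully, so
// add it conditionally; the PTX is always available as a fallback for the
// driver's JIT compiler.
ModuleSpec BuildSpec(const std::vector<uint8_t>& maybe_cubin,
                     const std::string& ptx) {
  ModuleSpec spec;
  spec.AddPtx(ptx);
  if (!maybe_cubin.empty()) {
    spec.AddCubin(maybe_cubin);
  }
  return spec;
}
```

With this shape, the guard in AddCubin and the emptiness test at the call site agree: an empty cubin never reaches the loader, and the spec still carries PTX for driver JIT'ing.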