path: root/tensorflow/compiler/xla/service/llvm_compiler.cc
* [XLA] Convert XLA to use xla::se as a namespace alias for ::stream_executor.
  (Justin Lebar, 2018-04-17)

  PiperOrigin-RevId: 193301997
* Reset the DAZ bit when entering the XLA CPU/GPU compiler
  (Sanjoy Das, 2018-02-16)

  In an ideal world this won't make a difference since the compiler should be
  disciplined about not leaking host-level optimization artifacts into
  generated code. However, I think this provides some defense-in-depth in
  preventing non-obvious denormal behavior on the host side from messing up
  floating point constants etc. we want to embed into generated code.

  PiperOrigin-RevId: 186061140
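  A hedged sketch of what resetting the DAZ bit involves. XLA's real helper
  lives in TensorFlow's platform code; the class and function names below are
  made up for illustration. On x86, the MXCSR register's FTZ (flush-to-zero)
  and DAZ (denormals-are-zero) bits change float semantics thread-wide, so a
  compiler entry point can clear them on entry and restore the caller's mode
  on exit:

  ```cpp
  // Hypothetical RAII guard: clear FTZ/DAZ on construction, restore the
  // previous MXCSR modes on destruction. x86-only (uses SSE intrinsics).
  #include <cfloat>
  #include <pmmintrin.h>  // _MM_{GET,SET}_DENORMALS_ZERO_MODE
  #include <xmmintrin.h>  // _MM_{GET,SET}_FLUSH_ZERO_MODE

  class ScopedPreserveDenormals {
   public:
    ScopedPreserveDenormals()
        : saved_ftz_(_MM_GET_FLUSH_ZERO_MODE()),
          saved_daz_(_MM_GET_DENORMALS_ZERO_MODE()) {
      _MM_SET_FLUSH_ZERO_MODE(_MM_FLUSH_ZERO_OFF);
      _MM_SET_DENORMALS_ZERO_MODE(_MM_DENORMALS_ZERO_OFF);
    }
    ~ScopedPreserveDenormals() {
      _MM_SET_FLUSH_ZERO_MODE(saved_ftz_);
      _MM_SET_DENORMALS_ZERO_MODE(saved_daz_);
    }

   private:
    unsigned int saved_ftz_;
    unsigned int saved_daz_;
  };

  // With FTZ/DAZ cleared, halving the smallest normal float yields a nonzero
  // subnormal instead of being flushed to zero.
  float HalveSmallestNormal() {
    ScopedPreserveDenormals guard;
    volatile float tiny = FLT_MIN;
    return tiny / 2.0f;
  }
  ```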
* Error out when building XLA's CPU and GPU backends with fast-math
  (Sanjoy Das, 2018-02-15)

  In an ideal world this won't make a difference since the compiler should be
  disciplined about not leaking host-level optimization artifacts into
  generated code. However, I think this provides some defense-in-depth in
  preventing fast-math optimization on the host side from messing up floating
  point constants etc. we want to embed into generated code.

  PiperOrigin-RevId: 185941549
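  A hedged sketch of the build-time guard this commit describes: GCC and
  Clang define `__FAST_MATH__` under `-ffast-math`, so an `#error` in the
  source makes the misconfiguration fail loudly at compile time. The exact
  message in llvm_compiler.cc may differ from this one.

  ```cpp
  // Fail the build if this translation unit is compiled with -ffast-math.
  #ifdef __FAST_MATH__
  #error "XLA's compiler sources must not be built with -ffast-math."
  #endif

  #include <cmath>

  // Without fast-math, IEEE 754 semantics hold: NaN compares unequal to
  // itself. Under -ffast-math the compiler may assume NaN never occurs and
  // fold this comparison to false -- the kind of host-side artifact the
  // guard above keeps out of the compiler's own code.
  bool NanComparesUnequal() {
    volatile double x = std::nan("");
    return x != x;
  }
  ```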
* [XLA] Add a DeviceAllocator* argument to compilation.
  (Justin Lebar, 2018-01-26)

  In a later change, the GPU backend will use this allocator to reserve
  scratch memory when trying out different convolution algorithms during
  compilation.

  PiperOrigin-RevId: 183469579
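  A hedged sketch of threading an allocator argument through compilation. All
  names here are stand-ins, not XLA's actual API; the point is the ownership
  model: the caller supplies the allocator, the backend only borrows it for
  scratch memory (e.g. while autotuning convolution algorithms), and a null
  allocator is tolerated.

  ```cpp
  #include <cstddef>
  #include <cstdlib>

  class DeviceAllocator {  // stand-in for a real device-memory allocator
   public:
    virtual ~DeviceAllocator() = default;
    virtual void* Allocate(std::size_t bytes) = 0;
    virtual void Deallocate(void* ptr) = 0;
  };

  class CountingAllocator : public DeviceAllocator {  // test helper
   public:
    void* Allocate(std::size_t bytes) override {
      ++allocations;
      return std::malloc(bytes);
    }
    void Deallocate(void* ptr) override { std::free(ptr); }
    int allocations = 0;
  };

  struct Executable {
    int autotuned_algorithm = -1;  // -1 means "default, not autotuned"
  };

  // Compile borrows scratch_allocator; if it is null, scratch-hungry
  // autotuning is skipped and the default algorithm is kept.
  Executable Compile(/* HloModule&, */ DeviceAllocator* scratch_allocator) {
    Executable exe;
    if (scratch_allocator != nullptr) {
      void* scratch = scratch_allocator->Allocate(1 << 20);  // 1 MiB scratch
      exe.autotuned_algorithm = 0;  // pretend benchmarking picked algo 0
      scratch_allocator->Deallocate(scratch);
    }
    return exe;
  }
  ```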
* Add a Compiler::BuildExecutable interface that compiles the given Hlo
  module without optimizations.
  (A. Unique TensorFlower, 2017-11-17)

  PiperOrigin-RevId: 176158846
* [XLA:CPU/GPU] Implement multi-module compilation for the CPU and GPU
  backends
  (Sanjoy Das, 2017-11-13)

  For CPU and GPU this is a simple wrapper around the single-module Compile
  method since the CPU and GPU backends do not perform cross-module
  optimizations and analyses.

  PiperOrigin-RevId: 175631791
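  A hedged sketch of the "simple wrapper" this commit describes: compiling a
  group of modules by invoking the single-module path once per module, since
  these backends do no cross-module analysis. The types and function names
  are stand-ins for XLA's, not its actual signatures.

  ```cpp
  #include <memory>
  #include <vector>

  struct HloModule { /* ... */ };    // stand-in for xla::HloModule
  struct Executable { /* ... */ };   // stand-in for xla::Executable

  // Single-module compilation (the pre-existing path, stubbed out here).
  std::unique_ptr<Executable> CompileSingle(const HloModule& module) {
    return std::make_unique<Executable>();
  }

  // Multi-module compilation: just the single-module path in a loop, one
  // executable per module, with no cross-module optimization.
  std::vector<std::unique_ptr<Executable>> CompileMany(
      const std::vector<HloModule>& modules) {
    std::vector<std::unique_ptr<Executable>> result;
    result.reserve(modules.size());
    for (const HloModule& m : modules) {
      result.push_back(CompileSingle(m));
    }
    return result;
  }
  ```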