INT8 quantization proposal

Good point, @tqchen.

I have added two separate threads; this thread can serve as the background post:

  1. Quantizing models (INT8 quantization - Quantizing models)
  2. Code generation for backends (INT8 Quantization - Code generation for backends)