gpu: Make glTexSubImage2D work with GL_SRGB_ALPHA on OpenGL

Make glTexSubImage2D work when client passes GL_SRGB_ALPHA as the
format when running on OpenGL.

For a texture with internal format GL_SRGB_ALPHA:
In OpenGL ES 2.0, the Tex*Image2D format must be GL_SRGB_ALPHA (the same
as the texture internal format).
In OpenGL and OpenGL ES 3.0, the format must not be GL_SRGB_ALPHA.
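A minimal sketch of that adjustment rule (the helper signature here is
simplified; the real TextureManager::AdjustTexFormat queries the GL
implementation itself instead of taking a bool):

  #include <GLES2/gl2.h>
  #include <GLES2/gl2ext.h>

  GLenum AdjustTexFormatSketch(GLenum format, bool is_desktop_gl) {
    if (is_desktop_gl) {
      // Desktop GL rejects GL_SRGB/GL_SRGB_ALPHA as the pixel transfer
      // format; pass the matching linear format instead.
      if (format == GL_SRGB_EXT)
        return GL_RGB;
      if (format == GL_SRGB_ALPHA_EXT)
        return GL_RGBA;
    }
    // On OpenGL ES 2.0 (EXT_sRGB) the format must stay GL_SRGB_ALPHA,
    // matching the internal format.
    return format;
  }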

The format was handled correctly for glTexImage2D, but not for
glTexSubImage2D. The actual fix is to apply
TextureManager::AdjustTexFormat to the glTexSubImage2D format.
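Roughly where the fix lands (an illustrative wrapper reusing the
AdjustTexFormatSketch helper above, not the exact decoder or texture
manager code):

  #include <GLES2/gl2.h>

  // Illustrative wrapper: the only behavioral change is that the
  // client-supplied format is adjusted before it reaches the driver.
  void TexSubImage2DWithAdjustedFormat(GLenum target, GLint level,
                                       GLint xoffset, GLint yoffset,
                                       GLsizei width, GLsizei height,
                                       GLenum format, GLenum type,
                                       const void* pixels,
                                       bool is_desktop_gl) {
    glTexSubImage2D(target, level, xoffset, yoffset, width, height,
                    AdjustTexFormatSketch(format, is_desktop_gl), type,
                    pixels);
  }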

Moves the glTexSubImage2D code from the decoder to the texture manager
and tries to follow the structure of glTexImage2D, in the hope of
reducing the number of similar inconsistency bugs. The glTexSubImage3D,
CopySubTexture* and compressed call variants are still potential
sources of inconsistencies. The code duplication between glTexImage2D
and glTexSubImage2D is not addressed either.

Changes the texture size in
Service/GLES2DecoderTest.TexSubImage2DBadArgs. The check for the
"!pixels -> kOutOfBounds" behavior was moved before argument
validation. The test contains a subtest that uses an invalid type,
GL_UNSIGNED_INT; with the old texture size this made the texture data
bigger than the SHM buffer, so the unexpected kOutOfBounds would have
been returned instead of the normal GL error for the invalid type.

BUG=skia:2992

Committed: https://crrev.com/60da545176aed90d91874a456da2bac8b822c67d
Cr-Commit-Position: refs/heads/master@{#356787}

Review URL: https://codereview.chromium.org/1426903002

Cr-Commit-Position: refs/heads/master@{#358578}
10 files changed