This article presents a robotic chess player and a process workflow toward collaborative robotic manipulation intelligence. The chess-piece detection system consists of a collaborative robot manipulator (UR5e), a 2D vision camera mounted at the end-effector of the manipulator, a physical chessboard, and a deep-learning neural network. We describe the proposed robotic workflow and test the impact of mechanical and data-collection imperfections on the machine-learning algorithms. In particular, we quantify the impact of 1) the size, color, and texture of the chessboard, 2) data augmentation, and 3) the environmental lighting condition on the classification and localization of the chess pieces. Experimental results compare performance under different parameter and environmental configurations, yielding classification accuracies ranging from 75.67 percent to 97 percent.
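To illustrate the kind of data augmentation whose impact the article evaluates, the sketch below applies a random horizontal flip and brightness jitter to a grayscale image represented as nested lists. This is a minimal, dependency-free illustration; the function names, parameter ranges, and image representation are assumptions for exposition, not the authors' implementation.

```python
import random

def horizontal_flip(image):
    """Mirror each row of the image left-to-right."""
    return [row[::-1] for row in image]

def adjust_brightness(image, factor):
    """Scale pixel intensities by `factor`, clamping to [0, 255]."""
    return [[min(255, max(0, int(p * factor))) for p in row] for row in image]

def augment(image, rng=random):
    """Randomly flip the image, then apply a random brightness jitter."""
    if rng.random() < 0.5:
        image = horizontal_flip(image)
    return adjust_brightness(image, rng.uniform(0.7, 1.3))

if __name__ == "__main__":
    img = [[0, 128, 255],
           [64, 192, 32]]
    out = augment(img)
    # Augmentation preserves the image dimensions.
    print(len(out), len(out[0]))  # prints "2 3"
```

In practice such transforms would be applied on-the-fly during training (e.g. via a framework's augmentation pipeline) to expose the classifier to lighting and viewpoint variations like those studied in the experiments.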