2016/02/06 Python Tokai #29 Study Session
Shunsuke Goto (@antimon2)
(For installation details, see the official documentation.)
Environment:
$ julia
julia> Pkg.update()
julia> Pkg.add("MXNet")
↑ This also builds the dynamic-link library libmxnet.so.
$ pyenv virtualenv 2.7.11 MXNet
$ pyenv shell MXNet
$ pyenv version
MXNet (set by PYENV_VERSION environment variable)
$ pip install -U pip
$ pip install -U numpy matplotlib jupyter
$ cd ~/.julia/v0.4/MXNet/deps/src/mxnet/python
$ python setup.py install
$ python -m ipykernel install --user --name mxnet --display-name "MXNet (Python 2)"
Installed kernelspec mxnet in [path/to/JUPYTER_DIR]/kernels/mxnet
import mxnet as mx
OpenCV is unavailable.
import numpy as np
a = mx.nd.empty((2, 3))
a[:] = np.random.uniform(-0.1, 0.1, a.shape)
a.asnumpy()
array([[-0.01725694, -0.0075838 , -0.03712081],
       [-0.00217939,  0.07316095,  0.05397278]], dtype=float32)
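The NDArray API mirrors NumPy closely. As an illustration, here is the same allocate-fill-inspect pattern written in pure NumPy (no MXNet required), showing the shape and dtype that `asnumpy()` returns above:

```python
import numpy as np

# Allocate an uninitialized 2x3 float32 buffer, fill it in place with
# uniform noise in [-0.1, 0.1), then inspect shape and dtype.
a = np.empty((2, 3), dtype=np.float32)
a[:] = np.random.uniform(-0.1, 0.1, a.shape)

print(a.shape)  # (2, 3)
print(a.dtype)  # float32
```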
def mnist_iterator(batch_size, input_shape):
    """Return train and val iterators for MNIST."""
    # download data
    # get_data.GetMNIST_ubyte()
    # flatten images unless the input shape is 3-D (channel, height, width)
    flat = len(input_shape) != 3
    train_dataiter = mx.io.MNISTIter(
        image="data/train-images-idx3-ubyte",
        label="data/train-labels-idx1-ubyte",
        input_shape=input_shape,
        batch_size=batch_size,
        shuffle=True,
        flat=flat)
    val_dataiter = mx.io.MNISTIter(
        image="data/t10k-images-idx3-ubyte",
        label="data/t10k-labels-idx1-ubyte",
        input_shape=input_shape,
        batch_size=batch_size,
        flat=flat)
    return (train_dataiter, val_dataiter)
train, val = mnist_iterator(batch_size=100, input_shape=(784,))
%matplotlib inline
import matplotlib
import matplotlib.pyplot as plt
import matplotlib.cm as cm
images = train.getdata().asnumpy()
labels = train.getlabel().asnumpy()
f, axs = plt.subplots(1, 3, sharey=True, figsize=(12, 4))
for i in range(3):
    image = np.reshape(images[i], [28, 28])
    axs[i].imshow(image, cmap=cm.Greys)
    axs[i].set_title("Sample Data: %d" % labels[i])
# define mlp (3-layer perceptron)
data = mx.symbol.Variable('data')
fc1 = mx.symbol.FullyConnected(data=data, name='fc1', num_hidden=128)
act1 = mx.symbol.Activation(data=fc1, name='relu1', act_type="relu")
fc2 = mx.symbol.FullyConnected(data=act1, name='fc2', num_hidden=64)
act2 = mx.symbol.Activation(data=fc2, name='relu2', act_type="relu")
fc3 = mx.symbol.FullyConnected(data=act2, name='fc3', num_hidden=10)
mlp = mx.symbol.SoftmaxOutput(data=fc3, name='softmax')
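To make the network concrete, here is a pure-NumPy sketch of the same 784 → 128 → 64 → 10 forward pass (fc1/fc2/fc3 with ReLU and softmax). The weights here are hypothetical random initializations, used only to trace shapes; MXNet learns the real ones during `fit`:

```python
import numpy as np

def relu(x):
    return np.maximum(x, 0)

def softmax(x):
    # subtract the row max for numerical stability
    e = np.exp(x - x.max(axis=1, keepdims=True))
    return e / e.sum(axis=1, keepdims=True)

rng = np.random.RandomState(0)
# Hypothetical randomly initialized weights, matching fc1/fc2/fc3 above.
W1, b1 = rng.randn(784, 128) * 0.01, np.zeros(128)
W2, b2 = rng.randn(128, 64) * 0.01, np.zeros(64)
W3, b3 = rng.randn(64, 10) * 0.01, np.zeros(10)

x = rng.rand(100, 784)            # one batch of flattened images
h1 = relu(x.dot(W1) + b1)         # fc1 + relu1
h2 = relu(h1.dot(W2) + b2)        # fc2 + relu2
probs = softmax(h2.dot(W3) + b3)  # fc3 + softmax

print(probs.shape)  # (100, 10): one probability row per image
```

Each row of `probs` sums to 1, which is exactly what `SoftmaxOutput` produces at prediction time.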
# train
model = mx.model.FeedForward(
    ctx=mx.cpu(), symbol=mlp, num_epoch=20,
    learning_rate=0.1, momentum=0.9, wd=0.00001)
model.fit(X=train, eval_data=val)
# predict
probs = model.predict(val)
# collect all labels from eval data
val.reset()
labels = np.concatenate(tuple(val.getlabel().asnumpy() for _ in val))
# Now compute the accuracy
correct = 0
for i in range(len(labels)):
    if np.argmax(probs[i, :]) == labels[i]:
        correct += 1
accuracy = 100.0 * correct / len(labels)
print("Accuracy on eval set: %.2f%%" % accuracy)
Accuracy on eval set: 97.60%
# show first-3 eval data
val.reset()
val.next()
images = val.getdata().asnumpy()
f, axs = plt.subplots(1, 3, sharey=True, figsize=(12, 4))
for i in range(3):
    image = np.reshape(images[i], [28, 28])
    axs[i].imshow(image, cmap=cm.Greys)
    axs[i].set_title("Classified as: %d" % np.argmax(probs[i, :]))
val.reset()
Thank you for your attention.