[Machine Learning - TensorFlow] Lec-07: Implementing multi-variable linear regression in TensorFlow
감자형 · 2018. 3. 11. 16:51

1. Hypothesis without using a matrix
import tensorflow as tf
tf.set_random_seed(777)  # for reproducibility

x1_data = [73., 93., 89., 96., 73.]
x2_data = [80., 88., 91., 98., 66.]
x3_data = [75., 93., 90., 100., 70.]
y_data = [152., 185., 180., 196., 142.]

# placeholders for a tensor that will always be fed.
x1 = tf.placeholder(tf.float32)
x2 = tf.placeholder(tf.float32)
x3 = tf.placeholder(tf.float32)
Y = tf.placeholder(tf.float32)

w1 = tf.Variable(tf.random_normal([1]), name='weight1')
w2 = tf.Variable(tf.random_normal([1]), name='weight2')
w3 = tf.Variable(tf.random_normal([1]), name='weight3')
b = tf.Variable(tf.random_normal([1]), name='bias')

# Hypothesis: one weight per input variable
hypothesis = x1 * w1 + x2 * w2 + x3 * w3 + b
print(hypothesis)  # prints the Tensor object, not its values

# cost/loss function
cost = tf.reduce_mean(tf.square(hypothesis - Y))

# Minimize. Need a very small learning rate for this data set
optimizer = tf.train.GradientDescentOptimizer(learning_rate=1e-5)
train = optimizer.minimize(cost)

# Launch the graph in a session.
sess = tf.Session()
# Initializes global variables in the graph.
sess.run(tf.global_variables_initializer())

for step in range(2001):
    cost_val, hy_val, _ = sess.run([cost, hypothesis, train],
                                   feed_dict={x1: x1_data, x2: x2_data, x3: x3_data, Y: y_data})
    if step % 10 == 0:
        print(step, "Cost: ", cost_val, "\nPrediction:\n", hy_val)
2. Hypothesis using a matrix
import tensorflow as tf
tf.set_random_seed(777)  # for reproducibility

x_data = [[73., 80., 75.],
          [93., 88., 93.],
          [89., 91., 90.],
          [96., 98., 100.],
          [73., 66., 70.]]
y_data = [[152.],
          [185.],
          [180.],
          [196.],
          [142.]]

# placeholders for a tensor that will always be fed.
# None means any number of instances can be fed at once;
# 3 is the number of input variables (x1, x2, x3) per instance,
# and 1 is the number of output values (y) per instance.
X = tf.placeholder(tf.float32, shape=[None, 3])
Y = tf.placeholder(tf.float32, shape=[None, 1])

W = tf.Variable(tf.random_normal([3, 1]), name='weight')
b = tf.Variable(tf.random_normal([1]), name='bias')

# Hypothesis
hypothesis = tf.matmul(X, W) + b

# Simplified cost/loss function
cost = tf.reduce_mean(tf.square(hypothesis - Y))

# Minimize
optimizer = tf.train.GradientDescentOptimizer(learning_rate=1e-5)
train = optimizer.minimize(cost)

# Launch the graph in a session.
sess = tf.Session()
# Initializes global variables in the graph.
sess.run(tf.global_variables_initializer())

for step in range(2001):
    cost_val, hy_val, _ = sess.run(
        [cost, hypothesis, train], feed_dict={X: x_data, Y: y_data})
    if step % 10 == 0:
        print(step, "Cost: ", cost_val, "\nPrediction:\n", hy_val)
3. Output
(0, 'Cost: ', 752.86066, '\nPrediction:\n', array([[133.76956],
[151.47867],
[154.42642],
[165.63803],
[114.93678]], dtype=float32))
(10, 'Cost: ', 17.752771, '\nPrediction:\n', array([[157.73633],
[180.31572],
[182.82404],
[196.56433],
[136.93816]], dtype=float32))
(20, 'Cost: ', 17.652256, '\nPrediction:\n', array([[157.79259],
[180.41405],
[182.90495],
[196.65446],
[137.01912]], dtype=float32))
(30, 'Cost: ', 17.558985, '\nPrediction:\n', array([[157.77658],
[180.42542],
[182.9002 ],
[196.65134],
[137.03374]], dtype=float32))
(40, 'Cost: ', 17.466183, '\nPrediction:\n', array([[157.76039],
[180.4365 ],
[182.8952 ],
[196.64792],
[137.04811]], dtype=float32))
(50, 'Cost: ', 17.373915, '\nPrediction:\n', array([[157.74423],
[180.44754],
[182.89023],
[196.64452],
[137.06242]], dtype=float32))
(60, 'Cost: ', 17.28209, '\nPrediction:\n', array([[157.72815],
[180.45859],
[182.88528],
[196.64116],
[137.07675]], dtype=float32))
(70, 'Cost: ', 17.190739, '\nPrediction:\n', array([[157.71205],
[180.46956],
[182.88033],
[196.63776],
[137.091 ]], dtype=float32))
(80, 'Cost: ', 17.09996, '\nPrediction:\n', array([[157.69604],
[180.48053],
[182.87541],
[196.63438],
[137.10522]], dtype=float32))
(90, 'Cost: ', 17.00963, '\nPrediction:\n', array([[157.68007],
[180.49147],
[182.87048],
[196.63103],
[137.1194 ]], dtype=float32))
(100, 'Cost: ', 16.919806, '\nPrediction:\n', array([[157.66414],
[180.50237],
[182.86557],
[196.62767],
[137.13354]], dtype=float32))
(110, 'Cost: ', 16.830479, '\nPrediction:\n', array([[157.64824],
[180.51323],
[182.86069],
[196.62433],
[137.14764]], dtype=float32))
(120, 'Cost: ', 16.741568, '\nPrediction:\n', array([[157.63239],
[180.52408],
[182.85579],
[196.621 ],
[137.16171]], dtype=float32))
(130, 'Cost: ', 16.65318, '\nPrediction:\n', array([[157.61658],
[180.5349 ],
[182.85092],
[196.61768],
[137.17574]], dtype=float32))
(140, 'Cost: ', 16.565327, '\nPrediction:\n', array([[157.60083],
    ... (output for the remaining steps, up to step 2000, omitted)