TF-Coder: TensorFlow's official tool for automatically generating code
Source: 机器之心
Project: https://github.com/google-research/tensorflow-coder
Try it in Google Colab: https://colab.research.google.com/github/google-research/tensorflow-coder/blob/master/TF-Coder_Colab.ipynb
Paper: https://arxiv.org/pdf/2003.09040.pdf
TF-Coder works from input-output examples: you describe a task by giving sample input tensors and the desired output tensor, and the tool searches for a TensorFlow expression that transforms the inputs into that output. The first example combines a vector of row values and a vector of column values into an addition table:
inputs = {
    'rows': [10, 20, 30],
    'cols': [1, 2, 3, 4],
}
output = [[11, 12, 13, 14],
          [21, 22, 23, 24],
          [31, 32, 33, 34]]
tf.add(cols, tf.expand_dims(rows, 1))
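As a quick sanity check (a minimal sketch assuming TensorFlow 2.x with eager execution; the constants simply mirror the example and are not part of TF-Coder itself), the synthesized expression reproduces the expected output:

import tensorflow as tf

rows = tf.constant([10, 20, 30])
cols = tf.constant([1, 2, 3, 4])
# Expanding rows to shape (3, 1) lets broadcasting pair every row value with every col value.
result = tf.add(cols, tf.expand_dims(rows, 1))
print(result.numpy())
# [[11 12 13 14]
#  [21 22 23 24]
#  [31 32 33 34]]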
# Input tensors
boundaries = [10, 50, 100, 1000]
prices = [15, 3, 50, 90, 100, 1001]
# Output tensor
bucketed_prices = [1, 0, 2, 2, 3, 4]
# Input-output example
inputs = {
    'boundaries': [10, 50, 100, 1000],
    'prices': [15, 3, 50, 90, 100, 1001],
}
output = [1, 0, 2, 2, 3, 4]
tf.searchsorted(boundaries, prices, side='right')
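To verify the suggestion (a minimal sketch assuming TensorFlow 2.x; tensor names follow the example), tf.searchsorted with side='right' assigns each price the number of boundaries that are less than or equal to it, which is exactly the bucket index:

import tensorflow as tf

boundaries = tf.constant([10, 50, 100, 1000])
prices = tf.constant([15, 3, 50, 90, 100, 1001])
# side='right' puts a price equal to a boundary (e.g. 50 or 100) into the higher bucket.
bucketed_prices = tf.searchsorted(boundaries, prices, side='right')
print(bucketed_prices.numpy())  # [1 0 2 2 3 4]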
# Input tensor
scores = [[0.7, 0.2, 0.1],
          [0.4, 0.5, 0.1],
          [0.4, 0.4, 0.2],
          [0.3, 0.4, 0.3],
          [0.0, 0.0, 1.0]]
# Output tensor
top_scores = [[1, 0, 0],
              [0, 1, 0],
              [1, 0, 0],
              [0, 1, 0],
              [0, 0, 1]]
# Input-output example
inputs = {
    'scores': [[0.7, 0.2, 0.1],
               [0.4, 0.5, 0.1],
               [0.4, 0.4, 0.2],
               [0.3, 0.4, 0.3],
               [0.0, 0.0, 1.0]],
}
output = [[1, 0, 0],
          [0, 1, 0],
          [1, 0, 0],
          [0, 1, 0],
          [0, 0, 1]]
tf.cast(tf.one_hot(tf.argmax(scores, axis=1), 3), tf.int32)
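To confirm the behavior (a minimal sketch assuming TensorFlow 2.x), argmax picks the winning column in each row and one_hot expands that index back into a 0/1 indicator row; a tie such as [0.4, 0.4, 0.2] goes to the first maximum, matching the expected output:

import tensorflow as tf

scores = tf.constant([[0.7, 0.2, 0.1],
                      [0.4, 0.5, 0.1],
                      [0.4, 0.4, 0.2],
                      [0.3, 0.4, 0.3],
                      [0.0, 0.0, 1.0]])
# argmax returns the index of the largest score per row; one_hot turns it into a width-3 indicator.
top_scores = tf.cast(tf.one_hot(tf.argmax(scores, axis=1), 3), tf.int32)
print(top_scores.numpy())
# [[1 0 0]
#  [0 1 0]
#  [1 0 0]
#  [0 1 0]
#  [0 0 1]]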
# Input tensor
counts = [[0, 1, 0, 0],
          [0, 1, 1, 0],
          [1, 1, 1, 1]]
# Output tensor
normalized = [[0.0, 1.0, 0.0, 0.0],
              [0.0, 0.5, 0.5, 0.0],
              [0.25, 0.25, 0.25, 0.25]]
# First attempt
normalized = tf.divide(counts, tf.reduce_sum(counts, axis=1))
This first attempt immediately raises a series of questions:
Is the value of axis in the code correct, or should it be axis=0?
Are the shapes of counts and tf.reduce_sum(counts, axis=1) compatible for division, or does one of them need to be reshaped or transposed? (A quick shape check after these questions illustrates this.)
counts and tf.reduce_sum(counts, axis=1) are both tf.int32 tensors. Can tf.int32 tensors be divided, or do they need to be cast to a float dtype first?
Is the order of the two arguments correct, or should they be swapped?
Should the output dtype be tf.int32, tf.float32, or something else?
Is there a simpler or better way to do this?
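A quick shape check (a minimal sketch assuming TensorFlow 2.x) answers the broadcasting question: the row sums have shape (3,), which is not broadcast-compatible with the (3, 4) counts, so the first attempt fails until the sums are reshaped into a column:

import tensorflow as tf

counts = tf.constant([[0, 1, 0, 0],
                      [0, 1, 1, 0],
                      [1, 1, 1, 1]])
row_sums = tf.reduce_sum(counts, axis=1)
print(counts.shape, row_sums.shape)  # (3, 4) (3,) -- trailing dimensions 4 and 3 do not match
# tf.divide(counts, row_sums) therefore raises a shape error; the sums need
# tf.expand_dims(row_sums, axis=1) to become a (3, 1) column before dividing.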
# Input-output example
inputs = {
    'counts': [[0, 1, 0, 0],
               [0, 1, 1, 0],
               [1, 1, 1, 1]],
}
output = [[0.0, 1.0, 0.0, 0.0],
          [0.0, 0.5, 0.5, 0.0],
          [0.25, 0.25, 0.25, 0.25]]
tf.cast(tf.divide(counts, tf.expand_dims(tf.reduce_sum(counts, axis=1), axis=1)), tf.float32)
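Running the suggested expression (a minimal sketch assuming TensorFlow 2.x) shows that it handles the reshaping, the integer-to-float division, and the final dtype in one line:

import tensorflow as tf

counts = tf.constant([[0, 1, 0, 0],
                      [0, 1, 1, 0],
                      [1, 1, 1, 1]])
# The row sums become a (3, 1) column so each row is divided by its own total;
# tf.divide promotes the integers to floats, and the cast fixes the result to float32.
normalized = tf.cast(
    tf.divide(counts, tf.expand_dims(tf.reduce_sum(counts, axis=1), axis=1)),
    tf.float32)
print(normalized.numpy())
# [[0.   1.   0.   0.  ]
#  [0.   0.5  0.5  0.  ]
#  [0.25 0.25 0.25 0.25]]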