Incubator-mxnet: module 'mxnet.symbol' has no attribute 'SoftmaxOutput'

Created on 18 Sep 2020 · 5 comments · Source: apache/incubator-mxnet

Description

I built MXNet with GPU support on Ubuntu 16.04 and hit the issue in the title. I built from the master branch and installed MXNet locally; the build commands are as follows:

git clone --recursive https://github.com/apache/incubator-mxnet.git mxnet
cd mxnet
cp config/linux_gpu.cmake config.cmake 
rm -rf build
mkdir -p build && cd build
cmake ..
cmake --build . --parallel 16
cd ../python
python setup.py bdist_wheel
cp ../build/libmxnet.so ~/miniconda3/lib/python3.8/site-packages/mxnet/ 
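
Since the wheel is built in the source tree and libmxnet.so is copied by hand, it is easy to end up importing a different copy of MXNet than the one just built. A quick sanity check (a minimal sketch; the runtime feature API is assumed to be available in this build) is:

# Confirm which MXNet build Python actually imports and whether CUDA is enabled.
import mxnet as mx
print(mx.__version__)                              # expect 2.0.0 for the master branch
print(mx.__file__)                                 # should point at the intended install location
print(mx.runtime.Features().is_enabled('CUDA'))    # True if the GPU build was picked up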

Error Message

(Paste the complete error message. Please also include stack trace by setting environment variable DMLC_LOG_STACK_TRACE_DEPTH=10 before running your script.)


AttributeError                            Traceback (most recent call last)
<ipython-input> in <module>
3 net = mx.sym.Activation(net, name='relu1', act_type="relu")
4 net = mx.sym.FullyConnected(net, name='fc2', num_hidden=26)
----> 5 net = mx.symbol.SoftmaxOutput(data=net, label = "output",\
6 multi_output=true, use_ignore=true,\
7 ignore_label=false)

AttributeError: module 'mxnet.symbol' has no attribute 'SoftmaxOutput'

To Reproduce

(If you developed your own code, please provide a short script that reproduces the error. For existing examples, please provide link.)

import logging
logging.getLogger().setLevel(logging.INFO)
import mxnet as mx
import numpy as np

mx.random.seed(1234)
fname = mx.test_utils.download('https://s3.us-east-2.amazonaws.com/mxnet-public/letter_recognition/letter-recognition.data')
data = np.genfromtxt(fname, delimiter=',')[:,1:]
label = np.array([ord(l.split(',')[0])-ord('A') for l in open(fname, 'r')])

batch_size = 32
ntrain = int(data.shape[0]*0.8)
train_iter = mx.io.NDArrayIter(data[:ntrain, :], label[:ntrain], batch_size, shuffle=True)
val_iter = mx.io.NDArrayIter(data[ntrain:, :], label[ntrain:], batch_size)
net = mx.sym.Variable('data')
net = mx.sym.FullyConnected(net, name='fc1', num_hidden=64)
net = mx.sym.Activation(net, name='relu1', act_type="relu")
net = mx.sym.FullyConnected(net, name='fc2', num_hidden=26)
net = mx.sym.SoftmaxOutput(data=net, label = "output")
mx.viz.plot_network(net)

Steps to reproduce

(Paste the commands you ran that produced the error.)

  1. Just run the code above.

Environment

We recommend using our script for collecting the diagnostic information. Run the following command and paste the outputs below:

curl --retry 10 -s https://raw.githubusercontent.com/apache/incubator-mxnet/master/tools/diagnose.py | python

# paste outputs here
----------Python Info----------
Version      : 3.8.2
Compiler     : GCC 7.3.0
Build        : ('default', 'Mar 26 2020 15:53:00')
Arch         : ('64bit', 'ELF')
------------Pip Info-----------
Version      : 20.2.2
Directory    : /home/dluser1/miniconda3/lib/python3.8/site-packages/pip
----------MXNet Info-----------
Version      : 2.0.0
Directory    : /home/dluser1/dl-frameworks/mxnet/python/mxnet
Commit hash file "/home/dluser1/dl-frameworks/mxnet/python/mxnet/COMMIT_HASH" not found. Not installed from pre-built package or built from source.
Library      : ['/home/dluser1/dl-frameworks/mxnet/python/mxnet/../../build/libmxnet.so']
Build features:
✔ CUDA
✔ CUDNN
✔ NCCL
✖ TENSORRT
✔ CPU_SSE
✔ CPU_SSE2
✔ CPU_SSE3
✔ CPU_SSE4_1
✔ CPU_SSE4_2
✖ CPU_SSE4A
✔ CPU_AVX
✖ CPU_AVX2
✔ OPENMP
✖ SSE
✔ F16C
✖ JEMALLOC
✔ BLAS_OPEN
✖ BLAS_ATLAS
✖ BLAS_MKL
✖ BLAS_APPLE
✔ LAPACK
✔ MKLDNN
✔ OPENCV
✖ DIST_KVSTORE
✖ INT64_TENSOR_SIZE
✔ SIGNAL_HANDLER
✖ DEBUG
✖ TVM_OP
----------System Info----------
Platform     : Linux-4.4.0-187-generic-x86_64-with-glibc2.10
system       : Linux
node         : cnpvgl903653
release      : 4.4.0-187-generic
version      : #217-Ubuntu SMP Tue Jul 21 04:18:15 UTC 2020
----------Hardware Info----------
machine      : x86_64
processor    : x86_64
Architecture:          x86_64
CPU op-mode(s):        32-bit, 64-bit
Byte Order:            Little Endian
CPU(s):                32
On-line CPU(s) list:   0-31
Thread(s) per core:    2
Core(s) per socket:    8
Socket(s):             2
NUMA node(s):          2
Vendor ID:             GenuineIntel
CPU family:            6
Model:                 63
Model name:            Intel(R) Xeon(R) CPU E5-2630 v3 @ 2.40GHz
Stepping:              2
CPU MHz:               1295.156
CPU max MHz:           3200.0000
CPU min MHz:           1200.0000
BogoMIPS:              4795.58
Virtualization:        VT-x
L1d cache:             32K
L1i cache:             32K
L2 cache:              256K
L3 cache:              20480K
NUMA node0 CPU(s):     0,2,4,6,8,10,12,14,16,18,20,22,24,26,28,30
NUMA node1 CPU(s):     1,3,5,7,9,11,13,15,17,19,21,23,25,27,29,31
Flags:                 fpu vme de pse tsc msr pae mce cx8 apic sep mtrr pge mca cmov pat pse36 clflush dts acpi mmx fxsr sse sse2 ss ht tm pbe syscall nx pdpe1gb rdtscp lm constant_tsc arch_perfmon pebs bts rep_good nopl xtopology nonstop_tsc aperfmperf pni pclmulqdq dtes64 monitor ds_cpl vmx smx est tm2 ssse3 sdbg fma cx16 xtpr pdcm pcid dca sse4_1 sse4_2 x2apic movbe popcnt tsc_deadline_timer aes xsave avx f16c rdrand lahf_lm abm epb invpcid_single ssbd ibrs ibpb stibp kaiser tpr_shadow vnmi flexpriority ept vpid fsgsbase tsc_adjust bmi1 avx2 smep bmi2 erms invpcid cqm xsaveopt cqm_llc cqm_occup_llc dtherm ida arat pln pts md_clear flush_l1d
----------Network Test----------
Setting timeout: 10
Timing for MXNet: https://github.com/apache/incubator-mxnet, DNS: 0.0050 sec, LOAD: 1.0113 sec.
Timing for Gluon Tutorial(en): http://gluon.mxnet.io, DNS: 0.2449 sec, LOAD: 2.2304 sec.
Error open Gluon Tutorial(cn): https://zh.gluon.ai, <urlopen error [SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed: certificate has expired (_ssl.c:1108)>, DNS finished in 0.18293499946594238 sec.
Timing for FashionMNIST: https://apache-mxnet.s3-accelerate.dualstack.amazonaws.com/gluon/dataset/fashion-mnist/train-labels-idx1-ubyte.gz, DNS: 0.0746 sec, LOAD: 3.4065 sec.
Timing for PYPI: https://pypi.python.org/pypi/pip, DNS: 0.0740 sec, LOAD: 1.7208 sec.
Error open Conda: https://repo.continuum.io/pkgs/free/, HTTP Error 403: Forbidden, DNS finished in 0.04107522964477539 sec.
----------Environment----------
MXNET_GLUON_REPO="https://apache-mxnet.s3.cn-north-1.amazonaws.com.cn/"
KMP_DUPLICATE_LIB_OK="True"
KMP_INIT_AT_FORK="FALSE"


Bug needs triage

All 5 comments

Welcome to Apache MXNet (incubating)! We are on a mission to democratize AI, and we are glad that you are contributing to it by opening this issue.
Please make sure to include all the relevant context, and one of the @apache/mxnet-committers will be here shortly.
If you are interested in contributing to our project, let us know! Also, be sure to check out our guide on contributing to MXNet and our development guides wiki.

Hi @llv22, SoftmaxOutput has been deprecated in MXNet 2.0 (the master branch, https://github.com/apache/incubator-mxnet/pull/18531).
In order to keep using SoftmaxOutput, you can use MXNet 1.x (the v1.x branch, https://github.com/apache/incubator-mxnet/tree/v1.x).

@wkcn So if I move to 2.0, any suggestions? I mean, is there an alternative for it? Could you post a migration reference?

@llv22 you can use mx.npx.softmax for the same.
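
For reference, a minimal sketch of what that might look like on MXNet 2.0 with the numpy/npx frontend (the logits array below is a made-up placeholder, not data from the original script):

# Minimal sketch, assuming MXNet 2.0 with the np/npx frontend.
import mxnet as mx
from mxnet import np, npx
npx.set_np()                                 # enable numpy-compatible semantics

logits = np.random.uniform(size=(4, 26))     # hypothetical batch: 4 samples, 26 letter classes
probs = npx.softmax(logits, axis=-1)         # forward softmax, replacing SoftmaxOutput's output
print(probs.sum(axis=-1))                    # each row sums to ~1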

I think what you need is https://mxnet.apache.org/versions/1.6/api/python/docs/api/gluon/loss/index.html; however, as wkcn said, you should use the v1.x branch instead.
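
As a rough illustration of that suggestion, here is a minimal sketch using Gluon's SoftmaxCrossEntropyLoss (shown on MXNet 1.x); the layer sizes mirror the script above, but the random data and label tensors are hypothetical placeholders:

# Minimal sketch, assuming MXNet 1.x with the Gluon API.
import mxnet as mx
from mxnet import autograd, gluon, nd

net = gluon.nn.Sequential()
net.add(gluon.nn.Dense(64, activation='relu'),    # mirrors fc1 + relu1
        gluon.nn.Dense(26))                       # mirrors fc2: 26 letter classes
net.initialize()

loss_fn = gluon.loss.SoftmaxCrossEntropyLoss()    # softmax + cross-entropy, as SoftmaxOutput did
data = nd.random.uniform(shape=(32, 16))          # hypothetical batch: 32 samples, 16 features
label = nd.random.randint(0, 26, shape=(32,)).astype('float32')
with autograd.record():
    loss = loss_fn(net(data), label)              # per-sample losses, shape (32,)
loss.backward()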
