Skin Cancer Diagnosis and Detection Using Deep Learning

Abstract— If early diagnosis and early detection of skin cancer are achieved, many patients can survive. The traditional method available to the public has always suffered from problems such as imprecision and biased results. This study therefore presents a method that can be introduced into a comprehensive workflow to build a healthcare system with artificial intelligence; more exactly, an intelligent system that can detect and diagnose skin cancer. While most work using deep learning techniques focuses on either detection or diagnosis of skin cancer, we have used these techniques to develop a model that performs both tasks at once. Transfer learning was applied to different models of object detection and diagnosis using a dataset from Kaggle with the TensorFlow API. The dataset is a total set of 3297 dermatoscopic images: 2637 images were used to train our model, and 660 images to test it. VGG16 was used as the object recognition backbone network (for diagnosis) and Yolo as the object detection framework. The evaluation accuracy of the model was more than 83%, which is promising.

Keywords— skin cancer detection, skin cancer diagnosis, deep learning, yolo.

I. INTRODUCTION

Over the past decade, the number of cases of malignant skin cancer has increased significantly, as reported by the World Health Organization (WHO). It is crucial to detect skin cancer early: this makes it possible to classify symptoms, and specialists can decide the best arrangements for the patient [1]. The process of diagnosing skin cancer has repeatedly been shown to lead to misdiagnosis due to the doctor's subjective mistakes. Advances in deep learning technology have made it possible to classify and detect skin lesions by adopting deep learning neural network models for object detection [2].

Existing methods such as manual examination according to the ABCDE criteria have various limitations, due to the different levels of experience of dermatologists and the irregularity of malignant skin lesions; they are subjective and inaccurate [1]. Our work is therefore a contribution offering a reliable model able to identify and classify skin lesions in real time. When malignant skin lesions are detected earlier, this is associated with improved patient survival. The developed system allows the ordinary user to self-diagnose malignant lesions, in addition to the opinion of the specialist, so that the subjectivity of the conventional diagnostic method (the ABCDE criteria) is reduced [3]. In order to detect skin lesions, deep neural networks are used to classify and segment them. To achieve this, the deep learning models employed, such as ResNet, are typically complex and difficult to realize, which hinders public access to these processes. Additionally, the skin cancer self-diagnosis method based on the ABCDE indications [4] is not practiced widely by the general public due to its limitations [5].

This study intends to alter the conventional method of treating skin cancer by employing the ABCDE indications together with object recognition techniques and deep learning models.

Abnormal growth of skin cells is a sign of skin cancer; it is caused either by high sun exposure or by other factors [6]. The main types of skin cancer are divided into two categories, melanoma and non-melanoma. Melanoma skin cancer is considered the most severe of all skin cancers [7].

In some people we can see "moles", called "nevi". They are non-cancerous skin lesions, but they can develop like skin cancer, and they manifest under different types [8]. In their work, Lodde et al. [8] reported that giant moles can develop into malignant skin lesions, with an incidence of 2-13%. A mole appears harmless (benign), but over time it can develop into a malignant skin lesion.

When skin cancer is detected early, this helps save lives. Five-year cancer survival has been shown to be strongly associated with the time of cancer diagnosis [9]; early detection also increases the survival rate of patients [10].

Melanoma grows horizontally early on, then vertically over time [11]. Distinguishing melanoma from benign skin lesions at an early stage is a daunting task, even for skin specialists with high experience [12].
Authorized licensed use limited to: Zhejiang University. Downloaded on February 27,2024 at 05:41:35 UTC from IEEE Xplore. Restrictions apply.
The first International Conference on Electrical Engineering and Advanced Technologies, ICEEAT23
05-07 November 2023, BATNA, Algeria
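The criterion-counting heuristic behind ABCDE self-examination ("the more criteria met, the higher the suspicion") can be sketched in a few lines. This is a hypothetical illustration only: the criterion names come from the ABCDE literature, but the tallying function and its use here are invented for clarity and are not the system developed in this paper.

```python
# Hypothetical ABCDE tally: each criterion a lesion exhibits adds one point.
# Rule of thumb from the literature: more criteria met => higher suspicion.
ABCDE = ("Asymmetry", "Border irregularity", "Color variation",
         "Diameter > 6 mm", "Evolving")

def abcde_score(findings):
    """Count how many ABCDE criteria are present in `findings`."""
    return sum(1 for criterion in ABCDE if criterion in findings)

lesion = {"Asymmetry", "Color variation", "Evolving"}
print(f"{abcde_score(lesion)}/5 criteria met")  # 3/5 criteria met
```

A higher tally would prompt a professional examination; the limitations discussed below explain why laypersons often cannot judge the individual criteria reliably.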
Currently, there is a common method of skin self-examination for detecting skin lesions at an early stage: the method of the "ABCDE" indications, first cited by [13]. At first, only the "ABCD" tests were considered; the criterion "E" was added in an improved version in [14]. The "ABCDE" indications for the early diagnosis of melanoma are shown in Fig. 1. Based on these indications, the more criteria that are met, the higher the probability of skin cancer [5].

Fig. 1. Skin Cancer Diagnosis by ABCDE Indications.

Along with self-examination by the general public or beginners, some researchers have argued that the ABCDE criteria can be used by dermatologists or doctors to obtain detailed information on skin cancer, especially for early-stage patients [4], [13], [10], [5]. The criteria were proposed by Friedman et al. [13]. Abbasi et al., who proposed the E criterion, concluded that this method can diagnose melanoma earlier and can also improve the ability of non-professionals to judge the malignancy of lesions. But after examining the criteria, some researchers point out limitations of this technique and doubt its efficiency [15]. It was revealed that respondents could not distinguish the differences between benign skin lesions, such as nevi or moles. Research on the early detection of nodular melanoma by Chamberlain et al. [16] has shown that nodular melanoma skin lesions sometimes do not follow the ABCD criteria. A review of self-diagnosed visual images of patients' skin concluded that an inexperienced person will have difficulty applying the "ABCDE" indications without the right images [17].

… it [20]. Technological advances in various fields such as industry, health and others have sparked interest in developing object detection methods [21].

Traditionally, object detection is most often performed with hand-crafted features and simple neural network architectures. Deep learning technology is emerging as an important method to overcome the limitations of traditional object detection methods [22]. A deep learning system learns high-level features from low-level features, which can lead to a better classification of objects without manually extracting features [23]. In deep learning, a well-known model in image processing is the Convolutional Neural Network (CNN) [24].

A CNN takes input data, such as image information, and passes it forward to produce other information [25]. Examples of CNN architectures are given in the studies detailed in [26], [27], [28].

Once the layers are created, the entire model must be trained using a labeled dataset to recognize the object [29]. Most often, the data is divided into a training part and a test part [30]; the two parts complement each other. Learning algorithms are used by CNNs to calibrate all parameters (biases and weights); an example of a learning algorithm is backpropagation [31]. Overfitting in training CNNs prevents the model from correctly classifying unseen data [32]. The problem of overfitting has been addressed by various methods cited in [33], [34], [35]. CNNs are often used to classify images; for example, a popular CNN model known as "VGG16" was trained by Pai and Giridharan (2019) for the purpose of classifying seven sorts of skin cancer [36]. A CNN can also be the first network in a chain of networks for object detection [21].

Several researchers have based their studies on deep learning to diagnose and detect skin cancer [37], [38], [39], [40], [41], [42], [43]. The diagnostic task is usually to classify skin cancer as malignant or benign. Some researchers use a single CNN, fine-tuned or with transfer learning techniques [44], [45], [46]. Other researchers develop their own CNN model [47], [23] or adapt an existing CNN model by modifying its elements to create another [48], [49].

In this study, we used TensorFlow as a framework. The purpose of this choice is the possibility to distribute calculations across more than one CPU or GPU with a single API [68].
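The convolutional layer at the heart of the CNN models discussed above can be illustrated with a minimal "valid" 2D convolution (strictly, cross-correlation, as implemented in most deep learning frameworks) in plain Python. This is a toy sketch of the operation itself, not the VGG16 or Yolo layers used in this study:

```python
def conv2d(image, kernel):
    """'Valid' 2D cross-correlation of 2-D lists `image` and `kernel`."""
    kh, kw = len(kernel), len(kernel[0])
    oh, ow = len(image) - kh + 1, len(image[0]) - kw + 1
    out = [[0] * ow for _ in range(oh)]
    for i in range(oh):
        for j in range(ow):
            # Sum of elementwise products over the kernel window.
            out[i][j] = sum(image[i + di][j + dj] * kernel[di][dj]
                            for di in range(kh) for dj in range(kw))
    return out

# A vertical-edge kernel applied to a tiny image with a bright right half:
image = [[0, 0, 1, 1],
         [0, 0, 1, 1],
         [0, 0, 1, 1]]
kernel = [[-1, 1]]  # responds where intensity increases left-to-right
print(conv2d(image, kernel))  # [[0, 1, 0], [0, 1, 0], [0, 1, 0]]
```

Stacking many such learned kernels, interleaved with nonlinearities and pooling, is what lets a CNN build high-level features from low-level ones without hand-crafted feature extraction.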
One-stage detectors are less accurate than two-stage detectors but are faster; two-stage detectors are more accurate but slower [61]. Both kinds of detectors were compared by Wu et al. [54], who summarized the results by showing that, with the same feature extraction backbone, each detector behaves differently on the same dataset.

Given the facts mentioned above and our need for higher detection speed, we use the one-stage detector Yolo as the object detection framework in our project.

Our general methodology is given by the scheme in Fig. 2. The input images comprise a folder of benign or malignant images (all lesions), and a folder with images of three lesion types: melanoma (MIL), nevi (NV), or basal cell carcinoma (BCC).

The recall (or sensitivity) and the precision (or positive predictive value) are given by the following formulas:

Recall = TP / (TP + FN) = TP / (all ground truths)   (2)

Precision = TP / (TP + FP) = TP / (all recognitions)   (3)

In this study, we have used the recall and the precision to evaluate the diagnosis (object classification).

V. RESULTS AND DISCUSSION

Fig. 3 shows that, among the 660 images in the test set, we obtained 251 true negatives and 299 true positives, against only 49 false positives and 61 false negatives, which means that the results are good given the high number of true positives and true negatives.
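As a sanity check, the confusion-matrix counts reported above can be turned into the evaluation metrics directly. This is an illustrative sketch (not code from the paper) using the counts for the 660 test images:

```python
# Confusion-matrix counts reported for the 660 test images.
TP, TN, FP, FN = 299, 251, 49, 61

recall = TP / (TP + FN)                      # TP / all ground truths
precision = TP / (TP + FP)                   # TP / all recognitions
accuracy = (TP + TN) / (TP + TN + FP + FN)

print(f"recall={recall:.3f} precision={precision:.3f} accuracy={accuracy:.3f}")
```

The resulting accuracy, 550/660 ≈ 0.833, matches the 83.3% reported below; the per-class precision and recall values quoted in the text are computed separately for the benign and malignant classes, so they differ somewhat from these aggregate figures.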
… a recall of 0.83 (83%), so both precision and recall are high, which means that almost all the predicted classes are correct and almost all the objects of the ground truth are recognized. Likewise for the malignant skin cancer diagnosis: the precision is 0.80 (80%) and the recall is 0.84 (84%), which means that the majority of the predicted classes are correct and the majority of the ground-truth objects are recognized, thanks to the high scores of both precision and recall. We can conclude that the results are good, with a high accuracy of 0.833 (83.3%).

Fig. 5 shows some results of the skin cancer diagnosis. The test images were selected randomly. The results are good because most of them are classified correctly (shown in green), as calculated above using the confusion matrix, precision and recall.

Fig. 6. Malignant and benign skin cancer detection learning curve.
… we have unified the two objectives, which were originally distinct, to obtain a complete system that detects and diagnoses skin lesions. We tried to use the best available technologies to develop our method, using for example version 5 of Yolo, very often cited as a perspective in previous studies.

The evaluation coefficient of our results is promising and can be improved. Among the desired improvements, we can annotate new data, i.e. use a larger database, or use other data augmentation techniques. Alternatively, we can use Yolo version 7, which can perhaps refine our results, because the Yolo version matters.

This work remains a contribution to the construction and validation of a comprehensive early detection and diagnosis system. The techniques used in this study can be applied to diagnose and detect different types of skin cancer, or even other types of cancer such as brain cancer and breast cancer. These techniques can also be improved for use in other areas such as self-driving systems; face, eye and fingerprint recognition; and defense systems in the military field.

VII. REFERENCES

[1] O. Abuzaghleh, B. D. Barkana, and M. Faezipour, "Noninvasive real-time automated skin lesion analysis system for melanoma early detection and prevention," IEEE J. Transl. Eng. Health Med., vol. 3, pp. 1-12, 2015. [DOI: 10.1109/JTEHM.2015.2419612].
[2] A. Taqi, F. Al azzo, A. Awad, and M. Milanova, "Skin lesion detection by android camera based on SSD-Mobilnet and tensorflow object detection API," Int. J. Adv. Res., vol. 3, pp. 5-11, 2019. [DOI: 10.5281/zenodo.3264022].
[3] L. Thomas, P. Tranchand, F. Berard, T. Secchi, C. Colin, and G. Moulin, "Semiological value of ABCDE criteria in the diagnosis of cutaneous pigmented tumors," Dermatology, vol. 197, pp. 11-17, 1998. [DOI: 10.1159/000017969].
[4] A. S. Farberg and D. S. Rigel, "The importance of early recognition of skin cancer," Dermatol. Clin., vol. 35, pp. xv-xvi, 2017. [DOI: 10.1016/j.det.2017.06.019].
[5] H. Tsao, J. M. Olazagasti, K. M. Cordoro, et al., "Early detection of melanoma: reviewing the ABCDEs," J. Am. Acad. Dermatol., vol. 72, pp. 717-723, 2015. [DOI: 10.1016/j.jaad.2015.01.025].
[6] "Skin Cancer Facts & Statistics," The Skin Cancer Foundation, https://www.skincancer.org/skin-cancer-information/skin-cancer-facts/, 2020.
[7] Z. Yu, X. Jiang, F. Zhou, J. Qin, D. Ni, S. Chen, B. Lei, and T. Wang, "Melanoma recognition in dermoscopy images via aggregated deep convolutional features," IEEE Trans. Biomed. Eng., vol. 66, pp. 1006-1016, 2019. [DOI: 10.1109/TBME.2018.2866166].
[8] G. Lodde, L. Zimmer, E. Livingstone, D. Schadendorf, and S. Ugurel, "Malignant melanoma," Hautarzt, vol. 71, pp. 63-77, 2020. [DOI: 10.1007/s00105-019-04514-0].
[9] A. R. Doben and D. C. MacGillivray, "Current concepts in cutaneous melanoma: malignant melanoma," Surg. Clin. North Am., vol. 89, pp. 713-725, 2009. [DOI: 10.1016/j.suc.2009.03.003].
[10] A. M. Glazer, D. S. Rigel, R. R. Winkelmann, and A. S. Farberg, "Clinical diagnosis of skin cancer: enhancing inspection and early recognition," Dermatol. Clin., vol. 35, pp. 409-416, 2017. [DOI: 10.1016/j.det.2017.06.001].
[11] W. H. Clark Jr, D. E. Elder, D. Guerry IV, et al., "Model predicting survival in stage I melanoma based on tumor progression," JNCI J. Natl. Cancer Inst., vol. 81, pp. 1893-1904, 1989. [DOI: 10.1093/jnci/81.24.1893].
[12] A. F. Jerant, J. T. Johnson, C. D. Sheridan, and T. J. Caffrey, "Early detection and treatment of skin cancer," Am. Fam. Physician, vol. 62, pp. 357-368, 375-376, 381-382, 2000.
[13] R. J. Friedman, D. S. Rigel, and A. W. Kopf, "Early detection of malignant melanoma: the role of physician examination and self-examination of the skin," CA Cancer J. Clin., vol. 35, pp. 130-151, 1985. [DOI: 10.3322/canjclin.35.3.130].
[14] N. R. Abbasi, H. M. Shaw, D. S. Rigel, R. J. Friedman, W. H. McCarthy, I. Osman, A. W. Kopf, and D. Polsky, "Early diagnosis of cutaneous melanoma," JAMA, vol. 292, pp. 2771-2776, 2004. [DOI: 10.1001/jama.292.22.2771].
[15] R. Bränström, M. A. Hedblad, I. Krakau, and H. Ullén, "Laypersons' perceptual discrimination of pigmented skin lesions," J. Am. Acad. Dermatol., vol. 46, pp. 667-673, 2002. [DOI: 10.1067/mjd.2002.120463].
[16] A. J. Chamberlain, L. Fritschi, and J. W. Kelly, "Nodular melanoma: patients' perceptions of presenting features and implications for earlier detection," J. Am. Acad. Dermatol., vol. 48, pp. 694-701, 2003. [DOI: 10.1067/mjd.2003.216].
[17] J. E. McWhirter and L. Hoffman-Goetz, "Visual images for patient skin self-examination and melanoma detection: a systematic review of published studies," J. Am. Acad. Dermatol., vol. 69, pp. 47-55.e9, 2013. [DOI: 10.1016/j.jaad.2013.01.031].
[18] K. Korotkov and R. Garcia, "Computerized analysis of pigmented skin lesions: a review," Artif. Intell. Med., vol. 56, pp. 69-90, 2012. [DOI: 10.1016/j.artmed.2012.08.002].
[19] R. Amelard, J. Glaister, A. Wong, and D. A. Clausi, "High-level intuitive features (HLIFs) for intuitive skin lesion description," IEEE Trans. Biomed. Eng., vol. 62, pp. 820-831, 2015. [DOI: 10.1109/TBME.2014.2365518].
[20] E. Nasr-Esfahani, S. Samavi, N. Karimi, et al., "Melanoma detection by analysis of clinical images using convolutional neural network," in 2016 38th Annual International Conference of the IEEE Engineering in Medicine and Biology Society (EMBC), IEEE, pp. 1373-1376, 2016.
[21] L. Jiao, F. Zhang, F. Liu, S. Yang, L. Li, Z. Feng, and R. Qu, "A survey of deep learning-based object detection," IEEE Access, vol. 7, pp. 128837-128868, 2019. [DOI: 10.1109/ACCESS.2019.2939201].
[22] Z. Q. Zhao, P. Zheng, S. T. Xu, and X. Wu, "Object detection with deep learning: a review," IEEE Trans. Neural Networks Learn. Syst., vol. 30, pp. 3212-3232, 2019. [DOI: 10.1109/TNNLS.2018.2876865].
[23] E. Nasr-Esfahani, S. Samavi, N. Karimi, et al., "Melanoma detection by analysis of clinical images using convolutional neural network," in 2016 38th Annual International Conference of the IEEE Engineering in Medicine and Biology Society (EMBC), IEEE, pp. 1373-1376, 2016.
[24] Y. LeCun, L. Bottou, Y. Bengio, and P. Haffner, "Gradient-based learning applied to document recognition," Proc. IEEE, vol. 86, pp. 2278-2324, 1998. [DOI: 10.1109/5.726791].
[25] A. Khan, A. Sohail, U. Zahoora, and A. S. Qureshi, "A survey of the recent architectures of deep convolutional neural networks," Artif. Intell. Rev., vol. 53, pp. 5455-5516, 2020. [DOI: 10.1007/s10462-020-09825-6].
[26] Y. LeCun, Y. Bengio, and G. Hinton, "Deep learning," Nature, vol. 521, pp. 436-444, 2015. [DOI: 10.1038/nature14539].
[27] W. Rawat and Z. Wang, "Deep convolutional neural networks for image classification: a comprehensive review," Neural Comput., vol. 29, pp. 2352-2449, 2017. [DOI: 10.1162/neco_a_00990].
[28] A. Krizhevsky, I. Sutskever, and G. E. Hinton, "ImageNet classification with deep convolutional neural networks," Neural Inf. Process. Syst. 25, pp. 1-9, 2012. [DOI: 10.1145/3065386].
[29] Q. Zhang, M. Zhang, T. Chen, Z. Sun, Y. Ma, and B. Yu, "Recent advances in convolutional neural network acceleration," Neurocomputing, vol. 323, pp. 37-51, 2019. [DOI: 10.1016/j.neucom.2018.09.038].
[30] H. Shin, H. R. Roth, M. Gao, et al., "Deep convolutional neural networks for computer-aided detection: CNN architectures, dataset characteristics and transfer learning," IEEE Trans. Med. Imaging, vol. 35, pp. 1285-1298, 2016. [DOI: 10.1109/TMI.2016.2528162].
[31] D. E. Rumelhart, G. E. Hinton, and R. J. Williams, "Learning representations by back-propagating errors," Nature, vol. 323, pp. 533-536, 1986. [DOI: 10.1038/323533a0].
[32] X. Wang, Y. Zhao, and F. Pourpanah, "Recent advances in deep learning," Int. J. Mach. Learn. Cybern., vol. 11, pp. 747-750, 2020. [DOI: 10.1007/s13042-020-01096-5].
[33] C. Shorten and T. M. Khoshgoftaar, "A survey on image data augmentation for deep learning," J. Big Data, vol. 6, p. 60, 2019. [DOI: 10.1186/s40537-019-0197-0].
[34] N. Srivastava, G. Hinton, A. Krizhevsky, et al., "Dropout: a simple way to prevent neural networks from overfitting," J. Mach. Learn. Res., vol. 15, pp. 1929-1958, 2014.
[35] S. Ioffe and C. Szegedy, "Batch normalization: accelerating deep network training by reducing internal covariate shift," in Proceedings of the 32nd International Conference on Machine Learning - volume 37, JMLR.org, pp. 448-456, 2015.
[36] K. Pai and A. Giridharan, "Convolutional neural networks for classifying skin lesions," in IEEE Region 10 Annual International Conference (TENCON), IEEE, pp. 1794-1796, 2019.
[37] L. Bi, D. D. Feng, M. Fulham, and J. Kim, "Multi-label classification of multi-modality skin lesion via hyper-connected convolutional neural network," Pattern Recogn., vol. 107, p. 107502, 2020. [DOI: 10.1016/j.patcog.2020.107502].
[38] I. Giotis, N. Molders, S. Land, M. Biehl, M. F. Jonkman, and N. Petkov, "MED-NODE: a computer-assisted melanoma diagnosis system using non-dermoscopic images," Expert Syst. Appl., vol. 42, 2015. [DOI: 10.1016/j.eswa.2015.04.034].
[39] N. Hameed, A. M. Shabut, M. K. Ghosh, and M. A. Hossain, "Multi-class multi-level classification algorithm for skin lesions classification using machine learning techniques," Expert Syst. Appl., vol. 141, p. 112961, 2020. [DOI: 10.1016/j.eswa.2019.112961].
[40] B. Harangi, A. Baran, and A. Hajdu, "Assisted deep learning framework for multi-class skin lesion classification considering a binary classification support," Biomed. Signal Process. Control, vol. 62, p. 102041, 2020. [DOI: 10.1016/j.bspc.2020.102041].
[41] A. Mahbod, P. Tschandl, G. Langs, R. Ecker, and I. Ellinger, "The effects of skin lesion segmentation on the performance of dermatoscopic image classification," Comput. Methods Prog. Biomed., vol. 197, p. 105725, 2020. [DOI: 10.1016/j.cmpb.2020.105725].
[42] P. M. M. Pereira, R. Fonseca-Pinto, R. P. Paiva, P. A. A. Assuncao, L. M. N. Tavora, L. A. Thomaz, and S. M. M. Faria, "Skin lesion classification enhancement using border-line features - the melanoma vs nevus problem," Biomed. Signal Process. Control, vol. 57, p. 101765, 2020. [DOI: 10.1016/j.bspc.2019.101765].
[43] A. Romero-Lopez, X. Giro-i-Nieto, J. Burdick, and O. Marques, "Skin lesion classification from dermoscopic images using deep learning techniques," in Biomedical Engineering, ACTAPRESS, Calgary, AB, Canada, pp. 49-54, 2017.
[44] M. A. Al-Masni, D. H. Kim, and T. S. Kim, "Multiple skin lesions diagnostics via integrated deep convolutional networks for segmentation and classification," Comput. Methods Prog. Biomed., vol. 190, p. 105351, 2020. [DOI: 10.1016/j.cmpb.2020.105351].
[45] L. Bi, J. Kim, E. Ahn, A. Kumar, M. Fulham, and D. Feng, "Dermoscopic image segmentation via multistage fully convolutional networks," IEEE Trans. Biomed. Eng., vol. 64, pp. 2065-2074, 2017. [DOI: 10.1109/TBME.2017.2712771].
[46] K. M. Hosny, M. A. Kassem, and M. M. Foaud, "Classification of skin lesions using transfer learning and augmentation with Alex-net," PLoS One, vol. 14, p. e0217293, 2019. [DOI: 10.1371/journal.pone.0217293].
[47] M. A. Albahar, "Skin lesion classification using convolutional neural network with novel regularizer," IEEE Access, vol. 7, pp. 38306-38313, 2019. [DOI: 10.1109/ACCESS.2019.2906241].
[48] A. A. Adegun and S. Viriri, "Deep learning-based system for automatic melanoma detection," IEEE Access, vol. 8, pp. 7160-7172, 2020. [DOI: 10.1109/ACCESS.2019.2962812].
[49] B. Harangi, "Skin lesion classification with ensembles of deep convolutional neural networks," J. Biomed. Inform., vol. 86, pp. 25-32, 2018. [DOI: 10.1016/j.jbi.2018.08.006].
[50] K. He, G. Gkioxari, P. Dollár, and R. B. Girshick, "Mask R-CNN," in 2017 IEEE International Conference on Computer Vision (ICCV), 2017, pp. 2980-2988.
[51] S. Xie, R. Girshick, P. Dollár, et al., "Aggregated residual transformations for deep neural networks," in 2017 IEEE Conference on Computer Vision and Pattern Recognition (CVPR), 2017, pp. 5987-5995.
[52] G. Ghiasi, T. Lin, and Q. V. Le, "NAS-FPN: learning scalable feature pyramid architecture for object detection," in 2019 IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR), 2019, pp. 7029-7038.
[53] S. Liu and W. Deng, "Very deep convolutional neural network based image classification using small training sample size," in 2015 3rd IAPR Asian Conference on Pattern Recognition (ACPR), 2015, pp. 730-734.
[54] X. Wu, D. Sahoo, and S. C. Hoi, "Recent advances in deep learning for object detection," Neurocomputing, vol. 396, pp. 39-64, 2020. [DOI: 10.1016/j.neucom.2020.01.085].
[55] R. Girshick, J. Donahue, T. Darrell, and J. Malik, "Rich feature hierarchies for accurate object detection and semantic segmentation," in Proceedings of the IEEE Computer Society Conference on Computer Vision and Pattern Recognition, 2014, pp. 580-587.
[56] J. R. R. Uijlings, K. E. A. van de Sande, T. Gevers, and A. W. M. Smeulders, "Selective search for object recognition," Int. J. Comput. Vis., vol. 104, pp. 154-171, 2013. [DOI: 10.1007/s11263-013-0620-5].
[57] Z. Q. Zhao, P. Zheng, S. T. Xu, and X. Wu, "Object detection with deep learning: a review," IEEE Trans. Neural Networks Learn. Syst., vol. 30, pp. 3212-3232, 2019. [DOI: 10.1109/TNNLS.2018.2876865].
[58] R. Girshick, "Fast R-CNN," in Proceedings of the IEEE International Conference on Computer Vision (ICCV), pp. 1440-1448, 2015. [DOI: 10.1109/ICCV.2015.169].
[59] S. Ren, K. He, R. Girshick, and J. Sun, "Faster R-CNN: towards real-time object detection with region proposal networks," IEEE Trans. Pattern Anal. Mach. Intell., vol. 39, pp. 1137-1149, 2017. [DOI: 10.1109/TPAMI.2016.2577031].
[60] J. Redmon, S. Divvala, R. Girshick, and A. Farhadi, "You only look once: unified, real-time object detection," in Proc. IEEE Comput. Soc. Conf. Comput. Vis. Pattern Recognit., 2016, pp. 779-788. [DOI: 10.1109/CVPR.2016.91].
[61] P. Soviany and R. T. Ionescu, "Optimizing the trade-off between single-stage and two-stage deep object detectors using image difficulty prediction," in Proceedings - 2018 20th International Symposium on Symbolic and Numeric Algorithms for Scientific Computing, SYNASC 2018, 2018, pp. 209-214.
[62] L. K. Meng, A. Khalil, M. H. A. Nizar, et al., "Carpal bone segmentation using fully convolutional neural network," Curr. Med. Imaging, vol. 15, pp. 15-989, 2019. [DOI: 10.2174/1573405615666190724101600].
[63] T. Y. Lin, M. Maire, S. Belongie, et al., "Microsoft COCO: common objects in context," Lect. Notes Comput. Sci., vol. 8693 LNCS, pp. 740-755, 2014.
[64] "Jaccard index," DeepAI, 2020. [Online]. Available: https://deepai.org/machine-learning-glossary-and-terms/jaccard-index.
[65] M. Goyal, M. H. Yap, S. Hassanpour, and M. H. Yap, "Region of interest detection in dermoscopic images for natural data-augmentation," 2018.
[66] J. Huang, V. Rathod, C. Sun, et al., "Speed/accuracy trade-offs for modern convolutional object detectors," in Proc. - 30th IEEE Conf. Comput. Vis. Pattern Recognition, CVPR 2017, 2017, pp. 3296-3305. [DOI: 10.1109/CVPR.2017.351].
[67] W. Liu, D. Anguelov, D. Erhan, et al., "SSD: single shot multibox detector," Lect. Notes Comput. Sci., vol. 9905 LNCS, pp. 21-37, 2016. [DOI: 10.1007/978-3-319-46448-0_2].
[68] M. Abadi et al., "TensorFlow: learning functions at scale," ACM SIGPLAN Notices, vol. 51, no. 9, pp. 1-1, 2016.