<document>
<page>
<par>
<line> Centro Universitário Santo Agostinho </line>
</par>
<par>
<line> www4.fsanet.com.br/revista </line>
<line> Rev. FSA, Teresina, v. 18, n. 7, art. 11, p. 173-186, jul. 2021 </line>
<line> ISSN (Print): 1806-6356 ISSN (Electronic): 2317-2983 </line>
<line> http://dx.doi.org/10.12819/2021.18.7.11 </line>
<line> Study of Pann Components in Image Treatment for Medical Diagnostic Decision-Making </line>
<line> Estudo de Componentes Pann no Tratamento de Imagem para Tomada de Decisão de Diagnóstico Médico </line>
<line> Luiz Antônio de Lima </line>
<line> PhD candidate in Production Engineering at Universidade Paulista </line>
<line> Professor at Universidade Paulista </line>
<line> E-mail: luizlima@unip.br </line>
<line> Jair Minoro Abe </line>
<line> PhD in Philosophy from the Universidade de São Paulo </line>
<line> Professor at Universidade Paulista </line>
<line> E-mail: jairabe@uol.com.br </line>
<line> Angel Antônio Gonzalez Martinez </line>
<line> PhD candidate in Production Engineering at Universidade Paulista </line>
<line> Professor at Universidade Paulista </line>
<line> E-mail: aagmartinez@gmail.com </line>
<line> Jonatas Santos de Souza </line>
<line> Master in Production Engineering from Universidade Paulista </line>
<line> E-mail: jonatas151*@gmail.com </line>
<line> Flávio Amadeu Bernardini </line>
<line> Master's student in Production Engineering at Universidade Paulista </line>
<line> Professor at SENAI </line>
<line> E-mail: fl*vioamb**nar@gmail.com </line>
<line> Nilson Amado de Souza </line>
<line> Master's student in Production Engineering at Universidade Paulista </line>
<line> E-mail: nilson.amado@gmail.com </line>
<line> Liliam Sayuri Sakamoto </line>
<line> PhD candidate in Production Engineering at Universidade Paulista </line>
<line> E-mail: liliam.sakamoto@gmail.com </line>
</par>
<par>
<line> Address: Luiz Antônio de Lima - Av. Paulista, 900 - Bela Vista, São Paulo - SP, 01310-100. Brasil. </line>
<line> Address: Jair Minoro Abe - Av. Paulista, 900 - Bela Vista, São Paulo - SP, 01310-100. Brasil. </line>
<line> Address: Angel Antônio Gonzalez Martinez - Av. Paulista, 900 - Bela Vista, São Paulo - SP, 01310-100. Brasil. </line>
<line> Address: Jonatas Santos de Souza - Av. Paulista, 900 - Bela Vista, São Paulo - SP, 01310-100. Brasil. </line>
<line> Address: Flávio Amadeu Bernardini - Av. Paulista, 900 - Bela Vista, São Paulo - SP, 01310-100. Brasil. </line>
<line> Address: Nilson Amado de Souza - Av. Paulista, 900 - Bela Vista, São Paulo - SP, 01310-100. Brasil. </line>
<line> Address: Liliam Sayuri Sakamoto - Av. Paulista, 900 - Bela Vista, São Paulo - SP, 01310-100. Brasil. </line>
</par>
<par>
<line> Editor-in-Chief: Dr. Tonny Kerley de Alencar Rodrigues </line>
<line> Article received on 14/06/2021. Last version received on 27/06/2021. Approved on 28/06/2021. </line>
<line> Evaluated by the Triple Review system: a) Desk Review by the Editor-in-Chief; and b) Double Blind Review (blind evaluation by two reviewers in the field). </line>
<line> Review: Grammar, Norms, and Formatting </line>
</par>
</page>
<page>
<par>
<line> L. A. Lima, J. M. Abe, A. A. G. Martinez, J. S. Souza, F. A. Bernardini, N. A. Souza, L. S. Sakamoto </line>
<line> 174 </line>
</par>
<par>
<line> ABSTRACT </line>
</par>
<par>
<line> The hospital sector has benefited from offering activities that use collections of imaging tests, which specialists can use for decision-making in conjunction with other clinical examinations. This work is intended to study pathologies resulting from cancer cells. This article presents the possibility of Artificial Intelligence solutions to support specialists. To this end, the objective is to use the concepts of Paraconsistent Logic and Artificial Intelligence applied in Artificial Neural Networks, and to propose the use of components of Paraconsistent Artificial Neural Networks (PANN) to support specialists in decision-making. </line>
<line> Keywords: Paraconsistent Artificial Neurons. Artificial Intelligence. Paraconsistent Logic. Paraconsistent Deep Learning. </line>
<line> RESUMO </line>
</par>
<par>
<line> O setor hospitalar se beneficiou de oferecer atividades que utilizam coleções de testes de imagem para que os especialistas possam usar para tomar decisões em conjunto com outros exames clínicos. O objetivo é estudar patologias resultantes de células cancerígenas. Neste artigo, há a possibilidade de apresentar soluções de Inteligência Artificial para apoiar os especialistas. Para isso, o objetivo é utilizar os conceitos de Lógica Paraconsistente e Inteligência Artificial aplicados em Redes Neurais Artificiais e propor o uso de componentes de Redes Neurais Artificiais Paraconsistentes (PANN) para apoiar os especialistas na tomada de decisões. </line>
<line> Palavras-chave: Neurônios Artificiais Paraconsistentes. Inteligência Artificial. Lógica Paraconsistente. Deep Learning Paraconsistente. </line>
</par>
</page>
<page>
<par>
<line> 1 INTRODUCTION </line>
</par>
<par>
<line> Research related to AI started after the Second World War, and the first work in this area was carried out by Alan Turing (RUSSELL & NORVIG, 2010); since then, much research has been carried out. Defining the concept of artificial intelligence is very difficult. For this reason, Artificial Intelligence was, and remains, a notion that has multiple interpretations, often conflicting or circular. </line>
</par>
<par>
<line> The difficulty of a clear definition may come from the fact that there are several human faculties being reproduced, from the ability to play chess to areas such as computer vision, voice analysis and synthesis, fuzzy logic, artificial neural networks, and many others. Initially, AI aimed to reproduce human thought. Artificial Intelligence embraced the idea of reproducing human faculties such as creativity, self-improvement, and the use of language. </line>
<line> Artificial Neural Networks </line>
</par>
<par>
<line> Warren McCulloch and Walter Pitts created a computational model for neural networks based on mathematics and algorithms called threshold logic. This model paved the way for research on neural networks, divided into two approaches: one focused on biological processes in the brain, while the other focused on the application of neural networks to artificial intelligence. </line>
<line> The notion of a network of neurons took its first steps in 1949, when Donald Hebb wrote The Organization of Behavior, a work that pointed to the fact that neural pathways are </line>
</par>
<par>
<line> strengthened each time they are used, a concept fundamentally essential to the way humans learn. If two nerves fire at the same time, he argued, the connection between them is improved. </line>
<line> In 1958, Frank Rosenblatt created the Perceptron (MYCIELSKI, 1972), an algorithm for pattern recognition based on a two-layer computational neural network using simple addition and subtraction. He also proposed additional layers with mathematical notations, but that work would not be done until 1975. </line>
<line> In 1959, Bernard Widrow and Marcian Hoff, from Stanford, developed models called "ADALINE" and "MADALINE". That was the first neural network applied to a real problem. </line>
</par>
</page>
<page>
<par>
<line> In the Recurrent Neural Network (RNN) architecture, the hidden neurons of the recurrent neural network receive the result of the mathematical operation they performed at the previous time step, in addition to the data from the previous layer. Thus, RNNs consider a temporal dependency between the input data. Because they have this characteristic, these networks can model problems with temporal characteristics, such as the weather forecast given the climate history in a window of the past. </line>
</par>
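The recurrence described above can be sketched as a toy Elman-style loop; the weight names (`Wx`, `Wh`) and sizes below are illustrative assumptions, not from the paper. Each hidden state mixes the current input with the result the neuron produced at the previous time step.

```python
import numpy as np

def rnn_forward(xs, Wx, Wh, b):
    """Toy recurrence: h_t = tanh(Wx @ x_t + Wh @ h_{t-1} + b)."""
    h = np.zeros(Wh.shape[0])          # initial hidden state
    states = []
    for x in xs:                       # iterate over the time window
        h = np.tanh(Wx @ x + Wh @ h + b)   # previous step's result feeds back in
        states.append(h)
    return np.stack(states)

# Example: a 5-step "climate history" window of 3 features each.
rng = np.random.default_rng(0)
xs = rng.normal(size=(5, 3))
Wx, Wh, b = rng.normal(size=(4, 3)), rng.normal(size=(4, 4)), np.zeros(4)
hs = rnn_forward(xs, Wx, Wh, b)
print(hs.shape)  # (5, 4): one hidden state per time step
```

The temporal dependency the text mentions lives entirely in the `Wh @ h` term: dropping it would turn the network into an ordinary feed-forward layer applied independently at each step.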
<par>
<line> The Convolutional Neural Network (CNN), or Deep Convolutional Network (DCN), or simply convolutional neural network, has a very different structure from those presented so far. In the convolution layers, the information passes through several filters, which in practice are numeric matrices, with the function of accentuating regular local patterns while reducing the size of the original data. The results of the various filters are summarized by pooling operations. In the deepest part of the convolutions, data in a reduced dimensional space is expected to contain enough information about these local patterns to assign a semantic value to the original data. These data then go through a classic FFN structure for the classification task. Given these characteristics, the most common application of CNNs is in the classification of images; the filters accentuate the attributes of the objects necessary for their correct classification. A CNN specialized in classifying faces, for example, recognizes contours, curves, and borders in the first layers; further on, it uses this information to recognize mouth, eyes, ears, and nose; and in the end, it recognizes the entire face. In addition to images, any information with local regularity can benefit from the use of CNNs, such as audio, for example. </line>
<line> A Paraconsistent Deep Learning Network - DLN (ABE et al., 2011), also known as a Deep Artificial Neural Network - DANN, is one in which the artificial neurons are Paraconsistent Artificial Neurons - PAN. PANs are constructed with Paraconsistent Neural Units (FILHO, ABE & TORRES, 2008) from different families. </line>
<line> The studies on Artificial Neural Networks, network components, and Paraconsistent Logic (AKAMA, ABE & NAKAMATSU, 2015) culminated in the creation of the flowchart (fig. 1) to materialize the unification of concepts. Thus, we must use the sequence that starts with the definition of the "1- Nucleus", which corresponds to the extraction of the characteristics; in the specific case of images, the "Laplacian" type was used, with a focus on edge detection. </line>
<line> Then, the "2- convolution" (ZHANG, ZHAO & LECUN, 2015) is done specifically in the treatment of images (BULTEN et al., 2019), because it was used as a model of feature detectors (lines, edges). Next, "3- normalization" is applied to standardize all inputs (text and </line>
</par>
</page>
<page>
<par>
<line> images) in the artificial neural network, which is to transform all inputs into intervals between 0 and 1 to guarantee performance. </line>
<line> According to the completion of the stage in the treatment of data (text and images), a </line>
</par>
<par>
<line> neural "4-architecture" (quantity of layers and neurons) is defined according to the complexity and the available computational capacity. </line>
</par>
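Steps 1-3 of the sequence (Laplacian nucleus, convolution for edge-like feature detection, normalization into [0, 1]) can be sketched as follows. The exact 3x3 kernel values and the function names are our assumptions for illustration; the paper does not list them.

```python
import numpy as np

# A common 3x3 Laplacian nucleus used for edge detection (assumed values).
LAPLACIAN = np.array([[0,  1, 0],
                      [1, -4, 1],
                      [0,  1, 0]], dtype=float)

def convolve2d(img, kernel):
    """Naive 'valid' 2-D sliding-window convolution with a small kernel."""
    kh, kw = kernel.shape
    H, W = img.shape
    out = np.zeros((H - kh + 1, W - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(img[i:i+kh, j:j+kw] * kernel)
    return out

def normalize01(x):
    """Min-max normalization: map all inputs into the interval [0, 1]."""
    span = x.max() - x.min()
    return np.zeros_like(x) if span == 0 else (x - x.min()) / span

img = np.zeros((8, 8)); img[:, 4:] = 1.0      # toy image with a vertical edge
edges = normalize01(convolve2d(img, LAPLACIAN))
print(edges.min(), edges.max())  # 0.0 1.0
```

The Laplacian responds only where intensity changes (the edge columns) and is zero on flat regions, which is why it works as a line/edge feature detector before the normalized values are fed to the network.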
<par>
<line> Figure 1 - Paraconsistent Artificial Neural Networks Overview </line>
</par>
<par>
<line> At this point, the study was guided by the proposal to use paraconsistent logic, and thus the rules for obtaining plausible results were defined. And finally, "5- learning" during training by the artificial neural network and "6- display" of the results for analysis. </line>
<line> The base of the CNAPp component (fig. 5) was fundamental for the creation of the other components that consolidate the paraconsistent family: CNAPpd (fig. 2), CNAPd (fig. 3), CNAPco (fig. 4). </line>
</par>
<par>
<line> Rev. FSA, Teresina PI, v. 18, n. 7, *rt. 1*, *. 173-186, jul. 2*21 </line>
<line> w*w4.fs*ne*.c*m.*r/re*i*ta </line>
</par>
</page>
<page>
<par>
<line> Figure 2 - Paraconsistent Artificial Neural Component of Passage and Decision - CNAPpd </line>
</par>
<par>
<line> This component analyzes the input evidence and outputs two possible results, V or Undefined (1, 0.5). </line>
<line> Figure 3 - Paraconsistent Artificial Neural Component of Decision - CNAPd </line>
</par>
<par>
<line> This component analyzes the input evidence and outputs three possible results: V, F, or Undefined (1, 0, 0.5). </line>
</par>
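A minimal sketch of the decision behaviour of these cells, assuming the usual Eτ certainty degree (favorable minus unfavorable evidence) and a decision tolerance factor; the function name and gating rule are ours, not the paper's.

```python
def decision_cell(mu1, mu2, ft=0.5):
    """Sketch of a three-output decision cell (CNAPd-like behaviour).

    mu1: favorable evidence degree, mu2: unfavorable evidence degree,
    ft : decision tolerance factor. Returns 1.0 (V, true), 0.0 (F, false),
    or 0.5 (Undefined). A passage-and-decision cell as described for
    CNAPpd would simply never emit the 0.0 branch.
    """
    gc = mu1 - mu2              # degree of certainty
    if gc >= ft:
        return 1.0              # V
    if gc <= -ft:
        return 0.0              # F
    return 0.5                  # Undefined

print(decision_cell(0.9, 0.1))  # 1.0
print(decision_cell(0.1, 0.9))  # 0.0
print(decision_cell(0.6, 0.5))  # 0.5
```

Raising `ft` makes the cell more conservative: more inputs fall into the Undefined band instead of forcing a V/F decision.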
</page>
<page>
<par>
<line> Figure 4 - Paraconsistent Artificial Neural Component for Complementation - CNAPco </line>
</par>
<par>
<line> This component has the function of complementing the favorable evidence, with the limits controlled by the tolerance factor. </line>
<line> In the advancement of research on artificial neural networks, it is understood that they can be combined with the concepts of paraconsistent logic, providing the viability shown in the flowchart (fig. 1), with a great capacity to be applied in systemic precepts with computational algorithms, particularly in the basic, learning, and decision-making components (CARVALHO & ABE, 2018), as it is possible to obtain the extraction of characteristics from the data made available both in historical bases and in real time, which can be proposed by viewing patterns to support specialists (BALANCIN, 2020) in their decision-making. </line>
</par>
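The complementation behaviour described above can be sketched as follows; the specific gating rule for the tolerance factor is our assumption for illustration, since the paper only states that the tolerance factor controls the limits.

```python
def complement_cell(mu, ftc=0.0):
    """Sketch of a complementation cell (CNAPco-like behaviour).

    Complements the favorable evidence degree (mu -> 1 - mu). The
    tolerance factor ftc defines a band around 0.5 inside which the
    value passes through unchanged (assumed rule for illustration).
    """
    if not 0.0 <= mu <= 1.0:
        raise ValueError("evidence degree must lie in [0, 1]")
    if abs(mu - 0.5) < ftc / 2:   # within the tolerance band: no change
        return mu
    return 1.0 - mu               # outside the band: complement

print(round(complement_cell(0.8), 3))  # 0.2
```

Such a cell is useful when a source reports evidence *against* a hypothesis and the network needs the equivalent favorable degree.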
<par>
<line> 2 METHODOLOGY </line>
</par>
<par>
<line> Initially, a bibliographic review was carried out on Artificial Intelligence and Deep </line>
</par>
<par>
<line> Learning (WANG et al., 2018), focused on the application in logistics centers, followed by research on the Paraconsistent Annotated Logic Eτ for application in Artificial Intelligence. </line>
</par>
<par>
<line> From this proposal, the programming of the Artificial Intelligence in the Python language was elaborated with the concepts of Paraconsistent Evidential Logic Eτ, through the paraconsistent algorithm, which will play a fundamental role in decision-making assistance (AKAMA, ABE </line>
</par>
</page>
<page>
<par>
<line> & NAKAMATSU, 2015). For the beginning of the development of the paraconsistent algorithm, the lattice (fig. 5) was used as a reference (ABE et al., 2011). </line>
<line> Figure 5 - Aspect of the Lattice to make a decision (ABE et al., 2011) </line>
</par>
<par>
<line> 3 DISCUSSION </line>
<line> In the advancement of research on artificial neural networks, it is understood that </line>
</par>
<par>
<line> they can be combined with the concepts of paraconsistent logic, providing the viability shown in the flowchart (fig. 1), with a great capacity to be applied in systemic precepts with computational algorithms, particularly in the basic, learning (SIMEONE, 2018), and decision-making components, as it is possible to obtain the extraction of characteristics from the data made available both in historical bases and in real time, which can be proposed by viewing patterns to support specialists in their decision-making. </line>
<line> The paraconsistent analyzer unit should reflect a set of artificial paraconsistent neurons capable of serving a particular purpose. In general, the paraconsistent artificial neuron can contain at least four possible outputs: False, True, Inconsistent, and Paracomplete. </line>
<line> Next, we propose the neuron (fig. 6) with inputs (µ1, µ2), adjustment and limit factors (Fat), and possible outputs (S), with the possibility of meeting extreme and non-extreme states. </line>
</par>
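A hedged sketch of such a neuron: inputs (µ1, µ2), adjustment/limit factors (Fat), and output S. The four extreme states follow the usual definitions of Evidential Logic Eτ; the exact semantics of the factors and of the resulting evidence degree are our assumptions for illustration.

```python
def pan(mu1, mu2, ftc=0.5, ftd=0.5):
    """Sketch of a Paraconsistent Artificial Neuron (PAN).

    mu1, mu2 : favorable / unfavorable evidence degrees in [0, 1]
    ftc, ftd : assumed contradiction and decision tolerance factors (Fat)
    Returns (state, S) where S is the resulting evidence degree.
    """
    gc  = mu1 - mu2          # degree of certainty
    gct = mu1 + mu2 - 1.0    # degree of contradiction
    if gct >= ftc:
        state = "Inconsistent"    # extreme state (both evidences high)
    elif gct <= -ftc:
        state = "Paracomplete"    # extreme state (both evidences low)
    elif gc >= ftd:
        state = "True"
    elif gc <= -ftd:
        state = "False"
    else:
        state = "Undefined"       # non-extreme state
    s = (gc + 1.0) / 2.0          # resulting evidence degree mapped to [0, 1]
    return state, s

print(pan(1.0, 1.0))  # ('Inconsistent', 0.5)
print(pan(1.0, 0.0))  # ('True', 1.0)
```

Note how contradictory inputs (1.0, 1.0) are not an error: the neuron labels them Inconsistent and still emits a usable degree S = 0.5, which is the point of using a paraconsistent rather than classical unit.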
</page>
<page>
<par>
<line> Figure 6 - (a) Paraconsistent Neuron Symbol; (b) Artificial Paraconsistent Neurons </line>
</par>
<par>
<line> Currently, the family of units is widely disseminated by preliminary studies and stands out in memory units or as pattern sensors in primary layers. We have, for example, the Basic Paraconsistent Artificial Neural Cell - CNAPba; the Paraconsistent Artificial Neural Cell of Learning - CNAPa (fig. 8), which has the function of learning and unlearning patterns that are repeatedly applied at its inputs; and the Paraconsistent Artificial Neural Cell for Decision - CNAPd, which has the function of making the paraconsistent analysis and determining a decision based on the results of that analysis. This makes possible the appearance of several new units, such as the proposed design of the Paraconsistent Artificial Neural Unit - UNAP2.0 (fig. 7), which has the function of meeting extreme and non-extreme states. </line>
</par>
</page>
<page>
<par>
<line> Figure 7 - Paraconsistent Artificial Neural Unit - UNAP2.0 </line>
</par>
<par>
<line> The Paraconsistent Artificial Neural Unit - UNAP2.0 stands out for allowing the treatment of extreme and non-extreme states (fig. 7). Thus, the analysis and the support to the specialist can be adjusted to plausible levels during the analysis. </line>
<line> The standard Paraconsistent Artificial Neural component - CNAPp performs the paraconsistent analysis through the following para-analyzer algorithm (fig. 9). </line>
</par>
</page>
<page>
<par>
<line> Figure 8 - Artificial Intelligence </line>
</par>
<par>
<line> Source: (FILHO, ABE & TORRES, 2008) </line>
<line> The para-analyzer algorithm (fig. 9) allows the application of paraconsistent logic and was represented in a modeling language to elucidate the understanding when materialized in the computational application. </line>
</par>
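Read as code, the para-analyzer computes the certainty and contradiction degrees from the favorable and unfavorable evidence and maps them onto regions of the lattice. A minimal sketch under the usual Eτ definitions; the full twelve-region refinement of the algorithm is omitted, and the control-value names are assumptions.

```python
def para_analyzer(mu, lam, vcc=0.5, vcct=0.5):
    """Sketch of the para-analyzer (after FILHO, ABE & TORRES, 2008).

    mu   : favorable evidence degree in [0, 1]
    lam  : unfavorable evidence degree in [0, 1]
    vcc  : certainty control value (tolerance for True/False)
    vcct : contradiction control value (tolerance for the extreme states)
    """
    gc  = mu - lam          # degree of certainty
    gct = mu + lam - 1.0    # degree of contradiction
    if gc >= vcc:
        out = "True (V)"
    elif gc <= -vcc:
        out = "False (F)"
    elif gct >= vcct:
        out = "Inconsistent (T)"
    elif gct <= -vcct:
        out = "Paracomplete (_|_)"
    else:
        out = "Undefined"
    return out, gc, gct

# The four corners of the lattice plus its center:
for mu, lam in [(1.0, 0.0), (0.0, 1.0), (1.0, 1.0), (0.0, 0.0), (0.5, 0.5)]:
    print((mu, lam), para_analyzer(mu, lam))
```

The five test points land on True, False, Inconsistent, Paracomplete, and Undefined respectively, mirroring the decision regions of the lattice in fig. 5.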
</page>
<page>
<par>
<line> Figure 9 - Flowchart of the para-analyzer algorithm </line>
</par>
<par>
<line> Source: adapted from (FILHO, ABE & TORRES, 2008) </line>
</par>
</page>
<page>
<par>
<line> 4 CONCLUSION </line>
</par>
<par>
<line> The set of images provides a better understanding in the analyses that involve specialists. In view of this motivation, it was proposed to unify the techniques of neural networks and paraconsistent logic, which culminated in the creation of basic steps (fig. 1) to apply Paraconsistent Artificial Neural Networks - PANN. Thus, the construction of the paraconsistent artificial neuron (fig. 8) proved feasible for the creation of a Paraconsistent Artificial Neural Network - PANN in a computer system capable of handling responses through the network and using Paraconsistent Logic to support specialists in decision-making. </line>
</par>
<par>
<line> ACKNOWLEDGMENTS </line>
<line> "This study was financed in part by the Coordenação de Aperfeiçoamento de Pessoal de Nível Superior - Brasil (CAPES) - Finance Code 001". </line>
<line> REFERENCES </line>
<line> ABE, J. M.; SILVA FILHO, J. I.; CELESTINO, U.; ARAÚJO, H. C. (2011). Lógica Paraconsistente Anotada Evidencial Eτ. Comunicar. </line>
<line> AKAMA, S.; ABE, J. M.; NAKAMATSU, K. (2015). Evidential Reasoning in Annotated </line>
</par>
<par>
<line> Logics. 2015 IIAI 4th International Congress on Advanced Applied Informatics. Anais... In: 2015 IIAI 4th. </line>
</par>
<par>
<line> BALANCIN, M. L. (2020). Relevância do perfil morfológico, molecular e imunomatricial como sinalizadores de alvos terapêuticos no mesotelioma maligno. Tese (Doutorado em Medicina) - Faculdade de Medicina da Universidade de São Paulo, São Paulo, 2020. </line>
<line> BULTEN, W.; BÁNDI, P.; HOVEN, J.; LOO, R. V.; LOTZ, J.; WEISS, N.; LAAK, J. A. D.; GINNEKEN, B. V.; KAA, C. H.; LITJENS, G. (2019). Epithelium segmentation using deep </line>
</par>
<par>
<line> learning in H&E-stained prostate specimens with immunohistochemistry as reference standard. Scientific Reports, 9(1), 864. https://doi.org/10.1038/s41598-018-37257-4. </line>
<line> CARVALHO, F. R.; ABE, J. M. (2018). A Paraconsistent Decision-Making Method, Smart Innovation, Systems and Technologies volume 87, Springer International Publishing 2018. ISSN 2190-3018, ISSN 2190-3026 (electronic), ISBN 978-3-319-74109-3, ISBN 978-3-319- </line>
</par>
<par>
<line> 74110-9 (eBook), https://doi.org/10.1007/978-3-319-74110-9, Library of Congress Control Number: 2018933003. </line>
</par>
</page>
<page>
<par>
<line> FILHO, J. I. S.; ABE, J. M.; TORRES, G. L. (2008). Inteligência Artificial com as Redes de Análises Paraconsistentes. 1. ed. Rio de Janeiro, RJ, Brasil: LTC - Livros Técnicos e Científicos S. A., 2008. </line>
</par>
<par>
<line> MYCIELSKI, J. (1972). Review: Marvin Minsky and Seymour Papert, Perceptrons, An Introduction to Computational Geometry. Bulletin of the American Mathematical Society, v. 78, n. 1, p. 12-15, jan. 1972. </line>
<line> RUSSELL, S. J.; NORVIG, P. (2010). Artificial Intelligence: A Modern Approach (3rd edition). Upper Saddle River: Prentice Hall. ISBN 9780136042594. </line>
<line> SIMEONE, O. (2018). A Brief Introduction to Machine Learning for Engineers. Foundations </line>
</par>
<par>
<line> and Trends® in Signal Processing, v. 12, n. 3-4, p. 200-431. </line>
</par>
<par>
<line> https://doi.org/10.1561/2000000102. </line>
<line> WANG, Y., LEUNG, H., GAVRILOVA, M., ZATARAIN, O., GRAVES, D., LU, J., HOWARD, N., KWONG, S., SHEU, P., & PATEL, S. (2018). A Survey and Formal Analyses on Sequence Learning Methodologies and Deep Neural Networks. 2018 IEEE 17th International Conference on Cognitive Informatics & Cognitive Computing (ICCI*CC), 6-15. </line>
<line> https://doi.org/10.1109/ICCI-CC.2018.84*20*2 </line>
<line> ZHANG, X., ZHAO, J., & LECUN, Y. (2015). Character-level Convolutional Networks for </line>
</par>
<par>
<line> Text Classification. Advances in Neural Information Processing Systems 28. NIPS 2015. https://arxiv.org/abs/1509.01626. </line>
</par>
<par>
<line> How to cite this article, according to ABNT: </line>
<line> LIMA, L. A; ABE, J. M; MARTINEZ, A. A. G; SOUZA, J. S; BERNARDINI, F. A; SOUZA, N. A; SAKAMOTO, L. S. Study of Pann Components in Image Treatment for Medical Diagnostic Decision-Making. Rev. FSA, Teresina, v.18, n. 7, art. 11, p. 173-186, jul. 2021. </line>
</par>
<par>
<line> Authors' Contributions | L. A. Lima | J. M. Abe | A. A. G. Martinez | J. S. Souza | F. A. Bernardini | N. A. Souza | L. S. Sakamoto </line>
<line> 1) conception and planning | X | X | X | X | X | X | X </line>
<line> 2) data analysis and interpretation | X | X | X | X | X | X | X </line>
<line> 3) drafting or critical revision of the content | X | X | X | X | X | X | X </line>
<line> 4) participation in the approval of the final version of the manuscript | X | X | X | X | X | X | X </line>
</par>
</page>
</document>
