In this segment, you’ll make your moderation solution more robust by adding human-in-the-loop functionality.
Add Human-in-the-loop
To implement human-in-the-loop functionality, you'll first update the analyze_text and analyze_image function code to address cases where classifying content as simply safe or unsafe for a particular category may not always be correct.
Head to the starter file that defines analyze_text and analyze_image, and replace the conditions below the comment # Check for inappropriate content with the following:
violations = {}
if hate_result and hate_result.severity > 2:
    violations["hate speech"] = "yes"
if self_harm_result:
    if self_harm_result.severity > 4:
        violations["self-harm"] = "yes"
    elif self_harm_result.severity > 3:
        violations["self-harm"] = "likely"
if sexual_result:
    if sexual_result.severity > 2:
        violations["sexual"] = "yes"
    elif sexual_result.severity > 1:
        violations["sexual"] = "likely"
if violence_result and violence_result.severity > 2:
    violations["violent references"] = "yes"
In this updated code, you introduced new code branches to capture the scenario where the classification of a certain category could be "likely" a violation, but not definitively so. For example, the self-harm check now has a new condition for when the severity level is between 3 and 4. In that case, there's a possibility that the post content may be harmful and shouldn't be allowed to be published. The same is the case with the sexual check.
For the sake of learning and time, you'll only introduce the "likely" case for text content, and not for image content. In an actual production-grade system, you might have to implement such checks for other content types as well.
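The two-threshold pattern above can be factored into a small helper. This is only a sketch, not part of the course code: classify_severity is a hypothetical name, and the thresholds shown are the self-harm ones from the snippet above.

```python
def classify_severity(severity, yes_threshold, likely_threshold=None):
    """Map a content-safety severity score to a violation flag.

    Returns "yes" above yes_threshold, "likely" above likely_threshold
    (when one is given), and None when no threshold is crossed.
    """
    if severity > yes_threshold:
        return "yes"
    if likely_threshold is not None and severity > likely_threshold:
        return "likely"
    return None

# Self-harm uses thresholds 4 ("yes") and 3 ("likely"):
print(classify_severity(5, 4, 3))  # yes
print(classify_severity(4, 4, 3))  # likely
print(classify_severity(2, 4, 3))  # None
```

With a helper like this, adding a "likely" band to another category is a one-line change instead of a new if/elif pair.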
Update check_content_safety Function
Next, head back to starter/business_logic.py and update the code below the if statement that checks whether any form of violation is found in the moderation results of the text and image analysis:
# 1
text_violation_flags = text_analysis_result.values()
image_violation_flags = image_analysis_result.values()
# 2
if "likely" in text_violation_flags or "likely" in image_violation_flags:
    return {'status': "re-evaluation needed"}
# 3
status_detail = 'Your post contains references that violate our community guidelines.'
if text_analysis_result:
    status_detail = status_detail + '\n' + f'Violation found in text: {", ".join(text_analysis_result)}'
if image_analysis_result:
    status_detail = status_detail + '\n' + f'Violation found in image: {", ".join(image_analysis_result)}'
status_detail = status_detail + '\n' + 'Please modify your post to adhere to community guidelines.'
# 4
return {'status': "violations found", 'details': status_detail}
Here, you check whether text_violation_flags or image_violation_flags contains the "likely" value. If yes, the moderation system knows that a human evaluator should step in to review the content. Hence, the function returns with the "re-evaluation needed" response, instead of approving or rejecting the moderation request. Typically, you'll write additional logic to call and notify a human reviewer to look at and evaluate the content. If required, the user is asked to wait for the content to be evaluated by humans.
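That additional reviewer-notification logic could be as simple as appending the flagged post to a review queue. Everything below is a hypothetical sketch, not part of the course project: a real system would use a database table or message broker instead of an in-memory deque.

```python
from collections import deque

# Hypothetical in-memory queue of posts awaiting human review.
review_queue = deque()

def enqueue_for_review(post_id, text_flags, image_flags):
    """Record a post that needs human evaluation and return its ticket."""
    ticket = {
        "post_id": post_id,
        "text_flags": dict(text_flags),
        "image_flags": dict(image_flags),
        "status": "pending human review",
    }
    review_queue.append(ticket)
    return ticket

ticket = enqueue_for_review(42, {"self-harm": "likely"}, {})
print(ticket["status"])  # pending human review
```

A human moderator would then pop tickets from this queue, inspect the content, and record a final verdict.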
The rest of the code stays mostly the same. You define a new variable status_detail and append the flagged categories to the string in human-readable format, so that the user can be informed and requested to update the post to ensure it conforms to community guidelines.
Finally, you return the details in the return block so that the user can be informed about the violations found in the content and requested to update the content to address the flagged concerns.
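On the caller's side, the three possible statuses map naturally to three user-facing messages. This is a hypothetical helper for illustration only; the status strings match the ones returned above, but handle_moderation_result itself is not part of the course code.

```python
def handle_moderation_result(result):
    """Turn a check_content_safety-style result dict into a user-facing message."""
    status = result.get("status")
    if status == "re-evaluation needed":
        # The "likely" branch: a human reviewer has to make the final call.
        return "Your post is awaiting human review. Please check back later."
    if status == "violations found":
        # A definite violation: surface the details to the user.
        return "Post Upload Failed.\n" + result.get("details", "")
    return "Post published successfully."

print(handle_moderation_result({"status": "re-evaluation needed"}))
```

Keeping this mapping in one place means the UI never has to interpret raw severity scores itself.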
Re-run the App to Test Moderation System
Open the terminal in VS Code and re-run the web app using the following command:
streamlit run app.py
This time, add text that is potentially harmful and may need further evaluation. Add an image as well, and try to publish the content by pressing the Submit button.
This time, you get the following message: "Post Upload Failed". At the bottom of the message, you can also find an option to submit an appeal request. You can raise an appeal request if you are confident that the post adheres to the community guidelines and should be allowed to be published without any updates.
Ig xxavozaak feji ckela, ac’c abdoyg a heix jucg fu byukobo ig Avbiam ofyeen fe lnu lyuwnomx, baz uzt uwutr vlud kvadj zxiuq gewtiyp vov jukajfos ep txomqul dz deymoju erp cwaekw go ifgojeb. Broy sodaenw bweohb zo hmac qayb fo yqo femus iwoneofolv, qli zicm eruluego ifj jafa rvi huceh juts sa eotrak ecugsa mli hubziby, uq csebeje u vevborwo melx ci lmu ajiln uqkxaizagy ra lvib gnc kje rorloct ugg’y i wuip riv ged wenfikibeay.
Dcij’g ih hib grex nafu. Vvaeyo qahxehaa wuruhq tye kagm jatpugf ve lihtnuyo rle tafviz!
This content was released on Nov 15 2024. The official support period is 6 months from this date.
This segment focuses on adding human-in-the-loop functionality to make the content moderation system more trustworthy and robust.