If the device has no camera, show a black background color on Android


I recently built a CCTV camera app. If the device has no camera, the app dies and logcat shows the following:

java.lang.RuntimeException: Unable to start activity ComponentInfo{kr.co.iosystem.blackeyeonandroid/kr.co.iosystem.blackeyeonandroid.BlackEyeActivity}: java.lang.NullPointerException
    at android.app.ActivityThread.performLaunchActivity(ActivityThread.java:2184)
    at android.app.ActivityThread.handleLaunchActivity(ActivityThread.java:2233)
    at android.app.ActivityThread.access$800(ActivityThread.java:135)
    at android.app.ActivityThread$H.handleMessage(ActivityThread.java:1196)
    at android.os.Handler.dispatchMessage(Handler.java:102)
    at android.os.Looper.loop(Looper.java:136)
    at android.app.ActivityThread.main(ActivityThread.java:5001)
    at java.lang.reflect.Method.invokeNative(Native Method)
    at java.lang.reflect.Method.invoke(Method.java:515)
    at com.android.internal.os.ZygoteInit$MethodAndArgsCaller.run(ZygoteInit.java:785)
    at com.android.internal.os.ZygoteInit.main(ZygoteInit.java:601)
    at dalvik.system.NativeStart.main(Native Method)
Caused by: java.lang.NullPointerException
    at org.webrtc.PeerConnectionFactory.createVideoSource(PeerConnectionFactory.java:111)
    at kr.co.iosystem.blackeyeonandroid.BlackEyeActivity.onCreate(MainActivity.java:235)
    at android.app.Activity.performCreate(Activity.java:5231)
    at android.app.Instrumentation.callActivityOnCreate(Instrumentation.java:1087)
    at android.app.ActivityThread.performLaunchActivity(ActivityThread.java:2148)
    at android.app.ActivityThread.handleLaunchActivity(ActivityThread.java:2233)
    at android.app.ActivityThread.access$800(ActivityThread.java:135)
    at android.app.ActivityThread$H.handleMessage(ActivityThread.java:1196)
    at android.os.Handler.dispatchMessage(Handler.java:102)
    at android.os.Looper.loop(Looper.java:136)
    at android.app.ActivityThread.main(ActivityThread.java:5001)
    at java.lang.reflect.Method.invokeNative(Native Method)
    at java.lang.reflect.Method.invoke(Method.java:515)
    at com.android.internal.os.ZygoteInit$MethodAndArgsCaller.run(ZygoteInit.java:785)
    at com.android.internal.os.ZygoteInit.main(ZygoteInit.java:601)
    at dalvik.system.NativeStart.main(Native Method)

And here is the source of MainActivity.java:

@Override
protected void onCreate(final Bundle savedInstanceState) {
    super.onCreate(savedInstanceState);
    setContentView(R.layout.activity_main);

    String nameOfFrontFacingDevice = VideoCapturerAndroid.getNameOfFrontFacingDevice();
    String nameOfBackFacingDevice = VideoCapturerAndroid.getNameOfBackFacingDevice();
    VideoCapturerAndroid capturer = VideoCapturerAndroid.create(nameOfFrontFacingDevice);
    .
    .
    .
}

If there is no camera, capturer comes back null, so I test for it like this:

if (capturer == null || capturer.equals("") == true) {
    try {
        rebootProcess = Runtime.getRuntime().exec(new String[]{"su", "-c", "reboot"});
    } catch (IOException e) {
        e.printStackTrace();
    }
}

When I run the app, the device reboots. Instead of rebooting, I want to show a black background when there is no camera.

If a camera is connected, I show a GLSurfaceView:

glView = (GLSurfaceView) findViewById(R.id.glview);
VideoRendererGui.setView(glView, null);
try {
    . . . .
} catch {
}

Perhaps I should use a Fragment? Please advise me. I am thinking of something like:

if (capturer == null || capturer.equals("") == true) {
    // show black background ??..
}

But I don't know how to implement the part that shows the black background.

Thanks.

[Update]

MainActivity.java (full onCreate):

@Override
protected void onCreate(final Bundle savedInstanceState) {
    super.onCreate(savedInstanceState);
    setContentView(R.layout.activity_main);

    String nameOfFrontFacingDevice = VideoCapturerAndroid.getNameOfFrontFacingDevice();

    if (nameOfFrontFacingDevice != null) {
        VideoCapturerAndroid capturer = VideoCapturerAndroid.create(nameOfFrontFacingDevice);

        MediaConstraints videoConstraints = new MediaConstraints();
        VideoSource videoSource = peerConnectionFactory.createVideoSource(capturer, videoConstraints);
        localVideoTrack = peerConnectionFactory.createVideoTrack(VIDEO_TRACK_ID, videoSource);

        glView = (GLSurfaceView) findViewById(R.id.showing);
        VideoRendererGui.setView(glView, null);
        try {
            renderer = VideoRendererGui.createGui(0, 0, 100, 100, VideoRendererGui.ScalingType.SCALE_ASPECT_FILL, true);
            localVideoTrack.addRenderer(renderer);
        } catch (Exception e) {
            e.printStackTrace();
        }

        mediaStream = peerConnectionFactory.createLocalMediaStream(LOCAL_MEDIA_STREAM_ID);
        mediaStream.addTrack(localVideoTrack);
    } else {
        // space
    }
}

I changed the source as advised, but a NullPointerException still occurs in the else part. How can I handle this case programmatically?

If your app requires a camera to work, it should not be installable on phones that lack that feature. In the manifest, put:

<uses-feature android:name="android.hardware.camera"
    android:required="true" />
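If you would rather let the app install everywhere and degrade gracefully at runtime (which is what the rest of this answer does), you could also check for a camera via PackageManager. A minimal sketch, assuming it lives inside your Activity:

import android.content.pm.PackageManager;

// Returns true if the device has any camera at all.
// FEATURE_CAMERA_ANY (API 17+) covers both front- and back-facing cameras;
// on older devices you could fall back to FEATURE_CAMERA (back camera only).
private boolean deviceHasCamera() {
    return getPackageManager().hasSystemFeature(PackageManager.FEATURE_CAMERA_ANY);
}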

[Edit]

According to the VideoCapturerAndroid source, the following line returns null if no camera exists:

String nameOfFrontFacingDevice = VideoCapturerAndroid.getNameOfFrontFacingDevice();

So, in your case that name must be coming back null. However, you are still passing it to create(). I think the NPE is coming from:

VideoCapturerAndroid capturer = VideoCapturerAndroid.create(nameOfFrontFacingDevice); // name might be null

So, you should put a null check here:

if (nameOfFrontFacingDevice != null) {
    VideoCapturerAndroid capturer = VideoCapturerAndroid.create(nameOfFrontFacingDevice);
} else {
    // other stuff
}
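A small extension of that check, purely as a sketch: if the device only has a rear camera, you could fall back to getNameOfBackFacingDevice() (which your code already calls) before concluding there is no camera at all:

// Sketch: prefer the front camera, fall back to the back camera, and only
// treat the device as camera-less when both names are null.
String deviceName = VideoCapturerAndroid.getNameOfFrontFacingDevice();
if (deviceName == null) {
    deviceName = VideoCapturerAndroid.getNameOfBackFacingDevice();
}

VideoCapturerAndroid capturer = null;
if (deviceName != null) {
    capturer = VideoCapturerAndroid.create(deviceName);
} else {
    // no camera at all -- take the "black background" path instead
}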

[Edit 2]

For devices with no camera, capturer still comes out null, and the problem is the call to peerConnectionFactory.createVideoSource(capturer, videoConstraints). Hence, you can make sure you only call it when you actually have a camera and avoid the NPE. You would then run into a problem with the glView.onResume() call in the activity's onResume(), so glView must be initialised before the camera check. See the code below for the suggested fix:

String nameOfFrontFacingDevice = VideoCapturerAndroid.getNameOfFrontFacingDevice();
String nameOfBackFacingDevice = VideoCapturerAndroid.getNameOfBackFacingDevice();
Log.i(TAG, "VideoCapturerAndroid.getNameOfFrontFacingDevice() = " + nameOfFrontFacingDevice);
Log.i(TAG, "VideoCapturerAndroid.getNameOfBackFacingDevice() = " + nameOfBackFacingDevice);
VideoCapturerAndroid capturer = VideoCapturerAndroid.create(nameOfFrontFacingDevice);

// Initialising glView here
glView = (GLSurfaceView) findViewById(R.id.glview);
VideoRendererGui.setView(glView, null);

MediaConstraints videoConstraints = new MediaConstraints();
if (capturer == null || capturer.equals("")) {
    Log.d(TAG, "not camera");
}
// Doing further processing only if capturer is not null
else {
    VideoSource videoSource = peerConnectionFactory.createVideoSource(capturer, videoConstraints);

    localVideoTrack = peerConnectionFactory.createVideoTrack(VIDEO_TRACK_ID, videoSource);

    try {
        renderer = VideoRendererGui.createGui(0, 0, 100, 100, VideoRendererGui.ScalingType.SCALE_ASPECT_FILL, true);
        renderer_sub = VideoRendererGui.createGui(72, 72, 25, 25, VideoRendererGui.ScalingType.SCALE_ASPECT_FILL, true);
        localVideoTrack.addRenderer(renderer_sub);
        localVideoTrack.addRenderer(renderer);
    } catch (Exception e) {
        e.printStackTrace();
    }

    mediaStream = peerConnectionFactory.createLocalMediaStream(LOCAL_MEDIA_STREAM_ID);
    mediaStream.addTrack(localVideoTrack);

    ImageButton imageButton = (ImageButton) findViewById(R.id.backbutton);
    imageButton.setOnClickListener(new View.OnClickListener() {
        @Override
        public void onClick(View v) {
            moveTaskToBack(true);
            finish();
            android.os.Process.killProcess(android.os.Process.myPid());
        }
    });
}
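To actually show a plain black screen in the "no camera" branch (the behaviour you originally asked for), one simple option, purely as a sketch and not part of your current code, is to hide the GLSurfaceView and paint the window background black:

// Sketch for the "no camera" branch: instead of rebooting or crashing,
// hide the (unused) GL surface and show a solid black background.
// Requires android.graphics.Color and android.view.View.
glView.setVisibility(View.GONE);
getWindow().getDecorView().setBackgroundColor(Color.BLACK);

This keeps the rest of the activity (for example the back button) usable, whereas swapping the whole content view for a black View would remove it as well.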
