Vision Sensor Works on Gen 1 Brain but not Gen 2 Brain

I have a vision sensor that I have set up to work on a Gen 1 Brain without an issue. I am displaying the x, y position on the brain's screen. When I use the exact same code with my Gen 2 Brain, it will never set object exists to TRUE.

Does the Gen 2 brain require any different code than the Gen 1 brain?

Thanks in advance.

It should be the same, post your code and I will take a look tomorrow.


Here is the code that I have on my Gen 1 Brain. I clicked to change the brain to Gen 2 and downloaded through the controller. For right now, I am just trying to print the x, y coordinates to the screen. Appreciate you looking at it.

// User defined function
void myblockfunction_foundVision_x_y_h_w_heading_speed_newheading_newspeed(double myblockfunction_foundVision_x_y_h_w_heading_speed_newheading_newspeed__x, double myblockfunction_foundVision_x_y_h_w_heading_speed_newheading_newspeed__y, double myblockfunction_foundVision_x_y_h_w_heading_speed_newheading_newspeed__h, double myblockfunction_foundVision_x_y_h_w_heading_speed_newheading_newspeed__w, double myblockfunction_foundVision_x_y_h_w_heading_speed_newheading_newspeed__heading, double myblockfunction_foundVision_x_y_h_w_heading_speed_newheading_newspeed__speed, double myblockfunction_foundVision_x_y_h_w_heading_speed_newheading_newspeed__newheading, double myblockfunction_foundVision_x_y_h_w_heading_speed_newheading_newspeed__newspeed);

int Brain_precision = 0, Console_precision = 0, Eye_objectIndex = 0;

float myVariable, objCount;

event purpleFoundMsg = event();
event orFoundMessage = event();
event message1 = event();

// User defined function
void myblockfunction_foundVision_x_y_h_w_heading_speed_newheading_newspeed(double myblockfunction_foundVision_x_y_h_w_heading_speed_newheading_newspeed__x, double myblockfunction_foundVision_x_y_h_w_heading_speed_newheading_newspeed__y, double myblockfunction_foundVision_x_y_h_w_heading_speed_newheading_newspeed__h, double myblockfunction_foundVision_x_y_h_w_heading_speed_newheading_newspeed__w, double myblockfunction_foundVision_x_y_h_w_heading_speed_newheading_newspeed__heading, double myblockfunction_foundVision_x_y_h_w_heading_speed_newheading_newspeed__speed, double myblockfunction_foundVision_x_y_h_w_heading_speed_newheading_newspeed__newheading, double myblockfunction_foundVision_x_y_h_w_heading_speed_newheading_newspeed__newspeed) {
  Brain_precision = 0;
  Brain.Screen.print("At ");
  Brain.Screen.print(printToBrain_numberFormat(), static_cast<float>(myblockfunction_foundVision_x_y_h_w_heading_speed_newheading_newspeed__x));
  Brain.Screen.print(" , ");
  Brain.Screen.print(printToBrain_numberFormat(), static_cast<float>(myblockfunction_foundVision_x_y_h_w_heading_speed_newheading_newspeed__y));
  Brain.Screen.newLine();
}

// "when started" hat block
int whenStarted1() {
  Brain.Screen.clearScreen();
  Brain.Screen.setCursor(1, 1);
  Brain.Screen.print("Hello World!");
  Brain.Screen.newLine();
  return 0;
}

// "when started" hat block
int whenStarted2() {
  while (true) {
    Eye.takeSnapshot(Eye__PU);
    Brain.Screen.clearLine(2);
    Brain.Screen.clearLine(3);
    Brain.Screen.clearLine(4);
    Brain.Screen.clearLine(5);
    if (Eye.objectCount > 0) {
      orFoundMessage.broadcastAndWait();
    }
    else {
      Brain.Screen.setCursor(2, 1);
      Brain.Screen.print("Who took my block\?");
      Brain.Screen.newLine();
      wait(1.0, seconds);
    }
    wait(20, msec);
  }
  return 0;
}

// "when I receive orFoundMessage" hat block
void onevent_orFoundMessage_0() {
  Brain.Screen.setCursor(2, 1);
  Brain.Screen.print("I found ");
  Brain.Screen.print(printToBrain_numberFormat(), static_cast<float>(Eye.objectCount));
  Brain.Screen.print(" block(s).");
  Brain.Screen.newLine();
  myVariable = 1.0;
  repeat(Eye.objectCount) {
    Eye_objectIndex = static_cast<int>(myVariable) - 1;
    myVariable = myVariable + 1.0;
    myblockfunction_foundVision_x_y_h_w_heading_speed_newheading_newspeed(Eye.objects[Eye_objectIndex].centerX, Eye.objects[Eye_objectIndex].centerY, Eye.objects[Eye_objectIndex].width, Eye.objects[Eye_objectIndex].height, 1.0, 1.0, 1.0, 1.0);
    wait(20, msec);
  }
  wait(1.0, seconds);
}


int main() {
  // register event handlers
  orFoundMessage(onevent_orFoundMessage_0);

  wait(15, msec);
  vex::task ws1(whenStarted2);
  whenStarted1();
}

Here is the image from the vision camera. It should clearly see the purple (PU) block and return 166, 72.

Just an FYI, I have tried running it with the Micro-USB cable both connected and disconnected. I realize that the vision sensor stops taking snapshots while the live preview is open.

Here is the Gen 2 code, just in case it changed.

// User defined function
void myblockfunction_foundVision_x_y_h_w_heading_speed_newheading_newspeed(double myblockfunction_foundVision_x_y_h_w_heading_speed_newheading_newspeed__x, double myblockfunction_foundVision_x_y_h_w_heading_speed_newheading_newspeed__y, double myblockfunction_foundVision_x_y_h_w_heading_speed_newheading_newspeed__h, double myblockfunction_foundVision_x_y_h_w_heading_speed_newheading_newspeed__w, double myblockfunction_foundVision_x_y_h_w_heading_speed_newheading_newspeed__heading, double myblockfunction_foundVision_x_y_h_w_heading_speed_newheading_newspeed__speed, double myblockfunction_foundVision_x_y_h_w_heading_speed_newheading_newspeed__newheading, double myblockfunction_foundVision_x_y_h_w_heading_speed_newheading_newspeed__newspeed);

int Brain_precision = 0, Console_precision = 0, Eye_objectIndex = 0;

float myVariable, objCount;

event purpleFoundMsg = event();
event orFoundMessage = event();
event message1 = event();

// User defined function
void myblockfunction_foundVision_x_y_h_w_heading_speed_newheading_newspeed(double myblockfunction_foundVision_x_y_h_w_heading_speed_newheading_newspeed__x, double myblockfunction_foundVision_x_y_h_w_heading_speed_newheading_newspeed__y, double myblockfunction_foundVision_x_y_h_w_heading_speed_newheading_newspeed__h, double myblockfunction_foundVision_x_y_h_w_heading_speed_newheading_newspeed__w, double myblockfunction_foundVision_x_y_h_w_heading_speed_newheading_newspeed__heading, double myblockfunction_foundVision_x_y_h_w_heading_speed_newheading_newspeed__speed, double myblockfunction_foundVision_x_y_h_w_heading_speed_newheading_newspeed__newheading, double myblockfunction_foundVision_x_y_h_w_heading_speed_newheading_newspeed__newspeed) {
  Brain_precision = 0;
  Brain.Screen.print("At ");
  Brain.Screen.print(printToBrain_numberFormat(), static_cast<float>(myblockfunction_foundVision_x_y_h_w_heading_speed_newheading_newspeed__x));
  Brain.Screen.print(" , ");
  Brain.Screen.print(printToBrain_numberFormat(), static_cast<float>(myblockfunction_foundVision_x_y_h_w_heading_speed_newheading_newspeed__y));
  Brain.Screen.newLine();
}

// "when started" hat block
int whenStarted1() {
  Brain.Screen.clearScreen();
  Brain.Screen.setCursor(1, 1);
  Brain.Screen.print("Hello World!");
  Brain.Screen.newLine();
  return 0;
}

// "when started" hat block
int whenStarted2() {
  while (true) {
    Eye.takeSnapshot(Eye__PU);
    Brain.Screen.clearLine(2);
    Brain.Screen.clearLine(3);
    Brain.Screen.clearLine(4);
    Brain.Screen.clearLine(5);
    if (Eye.objectCount > 0) {
      orFoundMessage.broadcastAndWait();
    }
    else {
      Brain.Screen.setCursor(2, 1);
      Brain.Screen.print("Who took my block\?");
      Brain.Screen.newLine();
      wait(1.0, seconds);
    }
    wait(20, msec);
  }
  return 0;
}

// "when I receive orFoundMessage" hat block
void onevent_orFoundMessage_0() {
  Brain.Screen.setCursor(2, 1);
  Brain.Screen.print("I found ");
  Brain.Screen.print(printToBrain_numberFormat(), static_cast<float>(Eye.objectCount));
  Brain.Screen.print(" block(s).");
  Brain.Screen.newLine();
  myVariable = 1.0;
  repeat(Eye.objectCount) {
    Eye_objectIndex = static_cast<int>(myVariable) - 1;
    myVariable = myVariable + 1.0;
    myblockfunction_foundVision_x_y_h_w_heading_speed_newheading_newspeed(Eye.objects[Eye_objectIndex].centerX, Eye.objects[Eye_objectIndex].centerY, Eye.objects[Eye_objectIndex].width, Eye.objects[Eye_objectIndex].height, 1.0, 1.0, 1.0, 1.0);
    wait(20, msec);
  }
  wait(1.0, seconds);
}


int main() {
  // register event handlers
  orFoundMessage(onevent_orFoundMessage_0);

  wait(15, msec);
  vex::task ws1(whenStarted2);
  whenStarted1();
}

I had a quick look at this. I did not run your code, since I don't have the project configuration, but I ran this simple test.

//----------------------------------------------------------------------------
//                                                                            
//    Module:       main.cpp                                                  
//    Author:       {author}                                                  
//    Created:      {date}                                                    
//    Description:  IQ project                                                
//                                                                            
//----------------------------------------------------------------------------

// Include the IQ Library
#include "iq_cpp.h"

vision::signature SIG_1 (1, 4699, 5807, 5252, -717, -75, -396, 3.000, 0);
vex::vision vision1 ( vex::PORT12, 50, SIG_1 );

// Allows for easier use of the VEX Library
using namespace vex;

int main() {
    this_thread::sleep_for(200);

    while(1) {
      int nObjects = vision1.takeSnapshot(SIG_1);

      if( nObjects > 0 ) {
        Brain.Screen.setCursor(1, 1);
        Brain.Screen.print("Objs %d       ", nObjects );
        Brain.Screen.setCursor(2, 1);
        Brain.Screen.print("X: %d Y: %d", vision1.largestObject.originX, vision1.largestObject.originY );
        Brain.Screen.setCursor(3, 1);
        Brain.Screen.print("W: %d H: %d", vision1.largestObject.width, vision1.largestObject.height );
      }
      else{
        Brain.Screen.setCursor(1, 1);
        Brain.Screen.print("No Objects" );
        Brain.Screen.clearLine(2);
        Brain.Screen.clearLine(3);
      }
      this_thread::sleep_for(20);
    }
}

I get essentially the same result on both IQ Generation 1 and Generation 2 brains.

[image: iq2_vision — screenshot of the test output on the IQ Generation 2 brain]


Also, be careful with this: sometimes the vision sensor will not correctly communicate (actually enumerate which brain port it is on) if it is being powered from the USB cable while connected to the brain. Perhaps try your tests using only port 1 on the brain and see if you get the same result.


Thanks for the help. I had always had the micro-USB cable connected, and that is why the sensor would not enumerate when the Gen 2 brain fired up. Gen 1 did not seem to have the same issue. My code is now working on both my Gen 1 and Gen 2 Brains. Thanks for pointing out the issue of enumerating the ports on startup/power-up.

This topic was automatically closed 365 days after the last reply. New replies are no longer allowed.