
Tanya’s plan to avoid the fallacies of crunching and bad work habits

Tanya Short gave one of the best talks I’ve heard in a long time about the fallacies of crunching and the bad work habits many people have. The video is now up for free at the GDC Vault; her talk starts at 6:00:
http://gdcvault.com/play/1024174/Indie

 

When trying to hit deadlines, she starts out by observing that most of the time we think ‘getting X done’ is our highest priority. It’s not. It’s actually #3:

Your real priorities:

  1. Don’t burn out (i.e. don’t die)
  2. Always keep in mind that you’re going to make another one, and you should be excited to make the next one even better.
  3. Get it done

That sounds great, but it also sounds a bit idealistic. She says it is not easy, but lays out these points.

Step-by-step roadmap to not dying:

  1. Believe it is possible to hold those priorities in that order
    Many great studios work and ship games without crunch. It can be done; you just have to be disciplined.
  2. Stop working ‘all the time’. Set work hours.
    It is a fallacy to think that working all the time is better, especially in creative fields. Set work hours and stick to them.
  3. Prioritize your tasks and re-prioritize as often as needed.
    In order to hit your deadlines, you need to know that what you’re working on RIGHT NOW is important, not just urgent. If you focus only on the ‘urgent’ emails and tasks, you’ll never get into the steady workflow that makes your work great.
  4. Estimate your tasks. Re-estimate when needed.
    When you finish a task, ask whether it took as long as you estimated, and get better at estimating.
  5. Cut the scope before you bleed out.
    If you’re 3 weeks out and realize you won’t make it, don’t immediately think about working more/harder/longer. Three or more 60-hour weeks in a row are scientifically shown to be less productive than the same number of 40-hour weeks. You are doing worse work, even if you think you are a special exception. Why can she say that? One study looked at 100 people who claimed they needed less than 7 hours of sleep; only 5 of the 100 actually could get by on it.
  6. Don’t give up – iterate steps 1-5 again and again
    These steps (production) are a skill. Skills can be developed, and skill development requires practice. So congratulate yourself when you do it well, and forgive and be kind to yourself when you fall short of treating yourself as you deserve.
    We are primates. Primates need to be taken care of in a way that computers and games don’t, so don’t treat yourself like a machine. It’s not about how many hours you put in, because everyone is different.

Other quotables:
A few long nights won’t kill you, but a few long months might, especially when combined with other health and life factors.

Burnout is the feeling of being dulled as layer after layer of exhaustion accumulates. Burnout is the void left behind where your career could have been.

Then she has a real Benedictine moment: The moment right now will never come again. Every one of us will die. No matter what we create, all we have is right now. Don’t use up that joy, love, and creative energy you have by burning yourself out.

Keep death always before your eyes.
—St. Benedict, The Rule, Chapter 4:47


She doesn’t cite the studies, but I found some:

http://lifehacker.com/working-over-40-hours-a-week-makes-you-less-productive-1725646811

Set up VNC on Ubuntu 14.04

Setting up VNC on Ubuntu used to be pretty painless. But recent changes in Ubuntu and X have left it kind of a mess. It took me way longer to set up VNC than it should have, and finding the documentation wasn’t super-easy either. There were lots of broken guides. So, here’s what you need to do:

  • Follow these setup instructions first:
    https://www.howtoforge.com/how-to-install-vnc-server-on-ubuntu-14.04
  • When completed, however, a known issue means the screen will come up blue-grey with few desktop controls when you connect to it. This is because (near as I can tell) the desktop/window manager currently used by Ubuntu no longer works over VNC. You need to set VNC up to use an older desktop/window manager that still does; a sample xstartup is sketched after this list.
  • To fix that problem, follow this guide:
    http://onkea.com/ubuntu-vnc-grey-screen/
  • On the host, start the vncserver; then, from your client, connect using port 5900 plus the :X display number you used when creating the server.
    • Example:
      host: vncserver :4 -geometry 800x600 (to create the server)
      client should use the ip: 10.23.47.150:5904
  • If you get an error starting the vncserver, increment the :X display number (:2, :3, :4, and so on) until you find one not already in use by another user on the server.
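
For reference, the xstartup you end up with looks roughly like the one below. This is a sketch based on the guides above, not a drop-in file: it assumes you have installed the fallback desktop packages (gnome-panel, gnome-settings-daemon, metacity, nautilus), and it lives at ~/.vnc/xstartup.

    #!/bin/sh
    # ~/.vnc/xstartup: start the classic (2D) desktop pieces instead of Unity,
    # which no longer renders over VNC on 14.04.
    export XKL_XMODMAP_DISABLE=1
    unset SESSION_MANAGER
    unset DBUS_SESSION_BUS_ADDRESS

    [ -r "$HOME/.Xresources" ] && xrdb "$HOME/.Xresources"
    xsetroot -solid grey

    gnome-panel &
    gnome-settings-daemon &
    metacity &
    nautilus &

After saving it, kill and restart the server (vncserver -kill :4, then vncserver :4 -geometry 800x600) so the new session script takes effect.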

OpenGL ES 2.0/3.0 Offscreen rendering with glRenderbuffer

Rendering to offscreen surfaces is a key component of any graphics pipeline. Post-processing effects, deferred rendering, and newer global illumination strategies all use it. Unfortunately, implementing offscreen rendering on OpenGL ES is not well documented. OpenGL ES is mostly used on embedded and mobile devices, and until recently those devices haven’t had the graphics bandwidth to keep up with new rendering techniques. Compound this with the facts that many mobile games have simple gameplay and small screens that don’t need complex lighting models, that many people now use off-the-shelf engines, and that a good amount of mobile hardware out there still doesn’t support rendering to offscreen surfaces, and it is no surprise that few people use the technique and it’s not well discussed.

In implementing offscreen rendering for OpenGL ES, I turned to the very good OpenGL ES Programming book, which has a whole chapter on framebuffer objects. When I tried the samples in the book, however, I had a lot of difficulty getting them working on my Linux-based mobile device. Most implementation examples create framebuffer objects backed by textures, but you can also back them with renderbuffers. This is good to know because many hardware vendors support very few render-to-texture formats, and you can find yourself struggling with an implementation that doesn’t work simply because the output format isn’t supported.

Thankfully, I found this article and thought I’d copy the information here, since it’s the only place I’ve seen working code that demonstrates the technique. It also includes the very important step of reading back the output format, and it uses glReadPixels() so you can validate that you were writing correctly to the offscreen renderbuffer surface.

In my case, on an Intel graphics part, I found that the format that worked (which is also the most commonly recommended one) was GL_RGB/GL_UNSIGNED_SHORT_5_6_5. Steps 1-8 are standard OpenGL ES setup code, included so you can verify your setup. Step 9 is where the framebuffer and renderbuffer objects are created.

 

    #define CONTEXT_ES20

    #ifdef CONTEXT_ES20
        EGLint ai32ContextAttribs[] = { EGL_CONTEXT_CLIENT_VERSION, 2, EGL_NONE };
    #endif

    // Step 1 - Get the default display.
    EGLDisplay eglDisplay = eglGetDisplay((EGLNativeDisplayType)0);

    // Step 2 - Initialize EGL.
    eglInitialize(eglDisplay, 0, 0);

    #ifdef CONTEXT_ES20
    // Step 3 - Make OpenGL ES the current API.
    eglBindAPI(EGL_OPENGL_ES_API);

    // Step 4 - Specify the required configuration attributes.
    EGLint pi32ConfigAttribs[5];
    pi32ConfigAttribs[0] = EGL_SURFACE_TYPE;
    pi32ConfigAttribs[1] = EGL_WINDOW_BIT;
    pi32ConfigAttribs[2] = EGL_RENDERABLE_TYPE;
    pi32ConfigAttribs[3] = EGL_OPENGL_ES2_BIT;
    pi32ConfigAttribs[4] = EGL_NONE;
    #else
    EGLint pi32ConfigAttribs[3];
    pi32ConfigAttribs[0] = EGL_SURFACE_TYPE;
    pi32ConfigAttribs[1] = EGL_WINDOW_BIT;
    pi32ConfigAttribs[2] = EGL_NONE;
    #endif

    // Step 5 - Find a config that matches all requirements.
    int iConfigs;
    EGLConfig eglConfig;
    eglChooseConfig(eglDisplay, pi32ConfigAttribs, &eglConfig, 1,
                                                    &iConfigs);

    if (iConfigs != 1) {
        printf("Error: eglChooseConfig(): config not found.n");
        exit(-1);
    }

    // Step 6 - Create a surface to draw to.
    EGLSurface eglSurface;
    eglSurface = eglCreateWindowSurface(eglDisplay, eglConfig,
                                  (EGLNativeWindowType)NULL, NULL);

    // Step 7 - Create a context.
    EGLContext eglContext;
    #ifdef CONTEXT_ES20
        eglContext = eglCreateContext(eglDisplay, eglConfig, NULL,
                                               ai32ContextAttribs);
    #else
        eglContext = eglCreateContext(eglDisplay, eglConfig, NULL, NULL);
    #endif

    // Step 8 - Bind the context to the current thread
    eglMakeCurrent(eglDisplay, eglSurface, eglSurface, eglContext);
    // end of standard gl context setup

    // Step 9 - create framebuffer object
    GLuint fboId = 0;
    GLuint renderBufferWidth = 1280;
    GLuint renderBufferHeight = 720;

    // create a framebuffer object
    glGenFramebuffers(1, &fboId);
    glBindFramebuffer(GL_FRAMEBUFFER, fboId);

    // create a texture object
    // note that this is commented out/not used in this case but is
    // included for completeness/as example
    /*  GLuint textureId;
     glGenTextures(1, &textureId);
     glBindTexture(GL_TEXTURE_2D, textureId);
     glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
     glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);                             
     //GL_LINEAR_MIPMAP_LINEAR
     glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_EDGE);
     glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_CLAMP_TO_EDGE);
     glTexParameteri(GL_TEXTURE_2D, GL_GENERATE_MIPMAP_HINT, GL_TRUE); // automatic mipmap
     glTexImage2D(GL_TEXTURE_2D, 0, GL_RGB, renderBufferWidth, renderBufferHeight, 0,
                  GL_RGB, GL_UNSIGNED_BYTE, 0);
     glBindTexture(GL_TEXTURE_2D, 0);
     // attach the texture to FBO color attachment point
     glFramebufferTexture2D(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0,
                         GL_TEXTURE_2D, textureId, 0);
     */
     qDebug() << glGetError();
     GLuint renderBuffer;
     glGenRenderbuffers(1, &renderBuffer);
     glBindRenderbuffer(GL_RENDERBUFFER, renderBuffer);
     qDebug() << glGetError();
     glRenderbufferStorage(GL_RENDERBUFFER,
                           GL_RGB565,
                           renderBufferWidth,
                           renderBufferHeight);
     qDebug() << glGetError();
     glFramebufferRenderbuffer(GL_FRAMEBUFFER,
                               GL_COLOR_ATTACHMENT0,
                               GL_RENDERBUFFER,
                               renderBuffer);

      qDebug() << glGetError();
      GLuint depthRenderbuffer;
      glGenRenderbuffers(1, &depthRenderbuffer);
      glBindRenderbuffer(GL_RENDERBUFFER, depthRenderbuffer);
      glRenderbufferStorage(GL_RENDERBUFFER, GL_DEPTH_COMPONENT16,     renderBufferWidth, renderBufferHeight);
      glFramebufferRenderbuffer(GL_FRAMEBUFFER, GL_DEPTH_ATTACHMENT, GL_RENDERBUFFER, depthRenderbuffer);

      // check FBO status
      GLenum status = glCheckFramebufferStatus(GL_FRAMEBUFFER);
      if(status != GL_FRAMEBUFFER_COMPLETE) {
          printf("Problem with OpenGL framebuffer after specifying color render buffer: n%xn", status);
      } else {
          printf("FBO creation succeddedn");
  }

  // check the output format
  // This is critical to knowing what surface format just got created
  // ES only supports 5-6-5 and other limited formats and the driver
  // might have picked another format
  GLint format = 0, type = 0;
  glGetIntegerv(GL_IMPLEMENTATION_COLOR_READ_FORMAT, &format);
  glGetIntegerv(GL_IMPLEMENTATION_COLOR_READ_TYPE, &type);
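  // Note: rather than hard-coding GL_RGB/GL_UNSIGNED_SHORT_5_6_5 in the
  // glReadPixels call further down, one option is to pass the values queried
  // here, e.g.:
  //   glReadPixels(0, 0, renderBufferWidth, renderBufferHeight,
  //                (GLenum)format, (GLenum)type, data);
  // Check them first, though: ES drivers are only required to support
  // GL_RGBA/GL_UNSIGNED_BYTE plus this one implementation-defined pair,
  // and it may not be the 5-6-5 pair on your hardware.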

  // clear the offscreen buffer
  glClearColor(1.0,0.0,1.0,1.0);
  glClear(GL_COLOR_BUFFER_BIT);

  // commit the clear to the offscreen surface
  eglSwapBuffers(eglDisplay, eglSurface);

  // You should put your own calculation code here based on format/type
  // you discovered above
  int size = 2 * renderBufferHeight * renderBufferWidth;
  unsigned char *data = new unsigned char[size];
  printf("size %d", size);

  // in my case, I got back a buffer that was RGB565
  glReadPixels(0, 0, renderBufferWidth, renderBufferHeight, GL_RGB, GL_UNSIGNED_SHORT_5_6_5, data);

  // Check output buffer to make sure you cleared it properly.
  // In 5-6-5 format, clearing to clearcolor=(1, 0, 1, 1)
  // you get 1111100000011111b = 0xF81F in hex
  if( (data[0] != 0x1F) || (data[1] != 0xF8))
      printf("Error rendering to offscreen buffern");

  QImage image(data, renderBufferWidth, renderBufferHeight, renderBufferWidth*2, QImage::Format_RGB16);
  image.save("result.png");
New AI just ‘decisively’ beat pro poker players in 7 day tourney and demonstrates mastery of imperfect information games

Developed by Carnegie Mellon University, a new AI called Libratus won the “Brains Vs. Artificial Intelligence” tournament against four poker pros by $1,766,250 in chips over 120,000 hands (games). Researchers can now say that the victory margin was large enough to count as a statistically significant win, meaning that they could be at least 99.7 percent sure that the AI victory was not due to chance.

The four human poker pros who participated in the tournament spent many extra hours each day trying to puzzle out Libratus. They teamed up at the start with a collective plan: each would probe different ranges of bet sizes, looking for weaknesses in Libratus’s strategy that they could exploit. Each night of the tournament, they gathered in their hotel rooms to analyze the day’s worth of plays and talk strategy.

The AI took a lead that it never lost. The margin see-sawed close to even mid-week and even shrank to $50,000 on the sixth day, but on the seventh day ‘the wheels came off’. By the end, Jimmy Chou was convinced that Libratus had tailored its strategy to each individual player. Dong Kim, who performed the best of the four by losing only $85,649 in chips to Libratus, believed that the humans were playing slightly different versions of the AI each day.

After Kim finished playing on the final day, he helped answer some questions for online viewers watching the poker tournament through the live-streaming service Twitch. He congratulated the Carnegie Mellon researchers on a “decisive victory.” But when asked about what went well for the poker pros, he hesitated: “I think what went well was… shit. It’s hard to say. We took such a beating.”

The victory demonstrates that the AI has likely surpassed the best humans at strategic reasoning in “imperfect information” games such as poker. But more than that, Libratus’s algorithms can take the “rules” of any imperfect-information game or scenario and come up with their own strategy. For example, the Carnegie Mellon team hopes its AI could design drugs to counter viruses that evolve resistance to certain treatments, or perform automated business negotiations. It could also power applications in cybersecurity, military robotic systems, or finance.

http://spectrum.ieee.org/automaton/robotics/artificial-intelligence/ai-learns-from-mistakes-to-defeat-human-poker-players

 

Fingerprints are not security

Jan Krissler, known in hacker circles as Starbug, was already known for his high-profile stunt of cracking Apple TouchID sensors within 24 hours of the iPhone 5S release. In this case, he used several easily taken close-range photos of German defense minister Ursula von der Leyen, including one gleaned from a press release issued by her own office and another he took himself from three meters away, to reverse-engineer her fingerprint and pass biometric scans.

At the same conference, Krissler also demonstrated a “corneal keylogger”. The idea behind the attack is simple: a hacker may have access to a user’s phone camera, but nothing else. How do you go from there to stealing all their passwords?

One way, demonstrated on stage, is to read what they’re typing by analyzing photographs of the reflections in their eyes. Smartphone cameras, even front-facing ones, are now high-resolution enough that such an attack is possible.

“Biometrics are not secrets… Ideally, they’re unique to each individual, but that’s not the same thing as being a secret.”

https://www.theguardian.com/technology/2014/dec/30/hacker-fakes-german-ministers-fingerprints-using-photos-of-her-hands

PIX for Windows is back!

PIX is a performance-tuning and debugging tool for game developers that hadn’t been updated for the desktop in years. It lived on through three generations of Xbox consoles, but there was no desktop love. No longer! Microsoft just announced that a PIX beta is now available for analyzing DirectX 12 games on Windows.

PIX on Windows provides five main modes of operation:

  • GPU captures for debugging and analyzing the performance of Direct3D 12 graphics rendering.
  • Timing captures for understanding the performance and threading of all CPU and GPU work carried out by your game.
  • Function Summary captures accumulate information about how long each function runs for and how often each is called.
  • Callgraph captures trace the execution of a single function.
  • Memory Allocation captures provide insight into the memory allocations made by your game.

Go to the Microsoft blog to download it for free.

 

scp without entering a password each time

Let’s say you want to copy files between two hosts, host_src and host_dest. host_src is the host where you run scp, ssh, or rsync, irrespective of the direction of the file copy.

  1. On host_src, run this command as the user that runs scp/ssh/rsync:

    $ ssh-keygen -t rsa

    This will prompt for a passphrase. Just press the enter key. If you assign a passphrase to the key, you’ll need to enter it each time you scp. ssh-keygen generates a private key and a public key and prints where it saved the public key, which by default is ~/.ssh/id_rsa.pub:

    Your public key has been saved in <your_home_dir>/.ssh/id_rsa.pub

  2. Transfer the id_rsa.pub file to host_dest by ftp, scp, rsync, or any other method.
  3. On host_dest, log in as the remote user you plan to use when you run scp, ssh, or rsync on host_src.
  4. Append the contents of id_rsa.pub to ~/.ssh/authorized_keys:
    $ cat id_rsa.pub >>~/.ssh/authorized_keys
    $ chmod 700 ~/.ssh/authorized_keys

If this file does not exist, the cat command above will create it. Make sure you remove permission for others to read this file via chmod. If it’s a public key, why prevent others from reading it? Probably because the owner of the key may have distributed it to a few trusted users and has not put any additional measures in place to check whether a connecting user is really trusted.
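
For reference, here is the whole flow condensed into commands. This is just a sketch: the user name and host_dest below are placeholders for your own, and on many systems the single command ssh-copy-id user@host_dest takes care of copying and appending the key for you.

    # on host_src, as the user that will run scp/ssh/rsync
    $ ssh-keygen -t rsa                  # press Enter at the passphrase prompt
    $ scp ~/.ssh/id_rsa.pub user@host_dest:/tmp/

    # on host_dest, as that remote user
    $ cat /tmp/id_rsa.pub >> ~/.ssh/authorized_keys
    $ chmod 700 ~/.ssh/authorized_keys

    # back on host_src: this should no longer ask for a password
    $ ssh user@host_dest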

Optional – allowing root to ssh:

  1. ssh by default does not allow root to log in. This has to be explicitly enabled on host_dest by editing /etc/ssh/sshd_config and changing the PermitRootLogin option from no to yes (a sketch of this and the restart follows this list).
  2. Don’t forget to restart sshd so that it reads the modified config file.
  3. Do this only if you want to use the root login.
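
A sketch of those two steps on an Ubuntu-style system (the sed one-liner assumes PermitRootLogin already appears uncommented in sshd_config; adjust by hand if yours differs):

    # on host_dest, only if you really need root logins over ssh
    $ sudo sed -i 's/^PermitRootLogin .*/PermitRootLogin yes/' /etc/ssh/sshd_config
    $ sudo service ssh restart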

That’s it. Now you can run scp, ssh, and rsync on host_src connecting to host_dest and they won’t prompt for a password. Note that you will still be prompted for a password if you run the commands on host_dest connecting to host_src. Reverse the steps above (generate the public key on host_dest and copy it to host_src) and you have a two-way setup ready!

Connecting iSpy to an Amcrest IP2M-841 webcam

This was more annoying than it should have been. When setting up my Amcrest IP2M-841B camera, I was able to use the Amcrest IP Config tool to log in and watch my camera without issue.

When using iSpy 64, however, the darn thing couldn’t figure out how to connect to it. Here’s how I did it. I left the camera on channel 1, set the encoding to plain H.264, and then did the following.

Test your camera using Amcrest IP config.

The first thing is to make sure your camera is working at all:

  1. Be sure you can open the IP config tool and see your cameras.
  2. Make sure passwords are correct, you can get a live view, and that it’s set to H.264 encoding and the channels are correct.

 

Test your rtsp line using VLC:

  1. Open VLC (install it if needed)
  2. Media->Open Network Stream
  3. Copy in your rtsp: address
    1. example without the username/password:
      1. rtsp://192.168.1.99:554/cam/realmonitor?channel=1&subtype=0
      2. VLC will ask for your username/password and you can enter it.
    2. example with the username/password:
      1. rtsp://<username>:<password>@<ipaddress>:554/cam/realmonitor?channel=1&subtype=0
      2. I left the arguments as --rtsp-caching=100 (the default)
  4. You should see your stream come up in VLC
  5. NOTE: When setting your password, if it has any special characters like %!&#$, be sure to convert them to their percent-encoded hex ASCII codes. See this chart here.
    1. Example: if your password is ‘cat&dog’, you should use ‘cat%26dog’ (the terminal sketch after this list shows the encoded password in a full URL).

Connecting to iSpy

If connecting via VLC worked, you’re 75% of the way there.

  1. Start iSpy
  2. Add->IP Camera
  3. Select the VLC Plugin tab (I have VLC installed, not sure if that’s 100% necessary)
  4. Set the VLC URL to what you had above (with the username+password):
    rtsp://<username>:<password>@<ipaddress>:554/cam/realmonitor?channel=1&subtype=0
  5. The same password rule applies here: convert any special characters like %!&#$ to their percent-encoded hex ASCII codes. See this chart here.
    1. Example: if your password is ‘cat&dog’, you should use ‘cat%26dog’
  6. I left the arguments as --rtsp-caching=100 (the default)

Once you have iSpy connected, you can set up events and connect to the cloud for full web monitoring.

 

Resources:

So, where did I get that rtsp line? Directly from the Amcrest HTTP API SDK Protocol Specification. Section 4.1.1, p14 – Get real-time stream. It’s also a handy guide on all the other parameters you can send the camera.