
Re: Converting array of Double Precision values to U16 for MODBUS


Hi RavensFan,

 

Your solution worked, but I lost some values. The machine we are monitoring sends us 29 values (temperature and pressure sensors).

My VI is attached so you can check it (this is the version from before your solution).

 

Thank you so much for your attention.


Re: ni-visa on ubuntu 18.04 (64 bit)


Same issue for me.

 

When I run the visaconf command, after a moment I get this reply in the terminal:

 

libnipalu.so failed to initialize
Verify that nipalk.ko is built and loaded.
Aborted (core dumped)

 

Has anyone solved this unexpected error?

Pass all diagram properties


Hello everyone,

 

I am trying to add a control for user-friendly diagram options. I do this using a separate VI into which I wire all the necessary data (signal, options, scale, etc.).

When I create the output, the properties set in the subVI (visibility, names) are not passed.

Is there a way to pass all the properties to a Diagram?

 

I tried passing the reference, but I do not want to connect all the properties manually.

Re: Issue with using python functions


 wrote:

 wrote:

There's at least some information on error 1663:

https://forums.ni.com/t5/LabVIEW/Labview-2018-Python-Node-and-Anaconda-Environment/td-p/3853701


yeah ... thanks....

 

They do not appear to have a clear solution either....

 

Any other ideas on how to resolve the issue I'm facing?

I am still having trouble understanding why I get this error code... perhaps if there is a good explanation of what triggers it, it might help me understand what is causing it at my end! :)


Not really.

 

Pretty sure it's something to do with the Python version.

 

Which Python did you install? And how (Anaconda, Spyder, PyCharm, etc.)? Does Python itself work? Do you have other Python versions installed? And so on...

 

Googling the hex error code gives more results:

https://www.google.com/search?q=labview+python+error+0x67F
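
For reference, a quick way to confirm which interpreter (and which bitness) you are actually pointing LabVIEW at is to run a short check with that same python.exe; the Python Node needs the Python bitness to match LabVIEW's. A minimal sketch:

# Minimal check of the interpreter the LabVIEW Python Node would call.
# Run it with the same python.exe configured in the Python Node session.
import platform
import struct
import sys

print("Python version :", sys.version)
print("Executable     :", sys.executable)
print("Bitness        :", 8 * struct.calcsize("P"), "bit")
print("Platform       :", platform.platform())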

Re: Using References with Express VIs


If you open LabVIEW, click File, New ... (the three dots are important), then choose From Template, Framework, Design Patterns, Producer/Consumer Design Pattern (Data).

 

This will show you a Producer, where data are generated: think of a DAQmx structure with a Start DAQ function to the left of the While Loop, a Read DAQ inside with the data wired to the Queue (make the Queue type the same as the DAQmx output, e.g. 2D Array of Dbl, 1D Array of Waveform, etc.), and a Stop DAQ function when the loop exits.

 

There is a slightly "bad" aspect to this Template. You should (probably) not stop a Producer/Consumer design by having the Producer kill the Queue, thereby causing an error to be generated in the Consumer (because there is no Queue and you are trying to do a Dequeue). Yes, this can be used to mean "the Producer has quit", but it means you cannot meaningfully use the Error line in the Consumer "downstream", for example to flag something "inappropriate" happening while processing the data.

 

A "better" way is to pass a "Stop" signal from the Producer to the Consumer. For example, if you are sending data from DAQmx and your Queue element is an Array, you could send an empty Array to the Consumer when the Producer exits (replacing the Release Queue). The Consumer, when it dequeues the data, would add a test for an Empty Array (which takes essentially zero time) and use the "True" case to mean "time to exit" and the "False" case to mean "process the data". Now, when the Consumer exits its While loop, it knows (a) that the Producer has finished and no more elements are coming, and therefore (b) that it is safe for the Consumer to Release the Queue.
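
A rough text-language sketch of this sentinel idea, with Python's queue.Queue standing in for the LabVIEW queue (the names and the dummy "DAQ" data are purely illustrative):

# Illustrative sketch of the "empty array as stop sentinel" pattern.
# queue.Queue stands in for a LabVIEW queue; the data are dummy values.
import queue
import threading

data_queue = queue.Queue()

def producer():
    for i in range(5):
        data_queue.put([float(i)] * 3)   # pretend DAQmx read: a small array of samples
    data_queue.put([])                   # empty array = "time to exit"

def consumer():
    while True:
        samples = data_queue.get()
        if not samples:                  # empty array -> Producer has finished
            break
        print("processing", samples)

producer_thread = threading.Thread(target=producer)
producer_thread.start()
consumer()
producer_thread.join()
# Only now, with the Consumer done, would the queue be released in LabVIEW.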

 

Bob Schor

Re: Data Types: String vs. SubString


I would suppose that the quick fix would be simple: fix whatever mess the TDMS-input flag indication for that parameter causes. There might be complications, though, as that quick fix might disable several potential optimizations. So a true fix is likely a much more complex adventure (with various potential side effects that could cause new problems elsewhere).

Using LabVIEW with Canberra DSA-1000 MCA


I need to use LabVIEW to automate control of and data access to a Canberra DSA-1000 MCA. I have created an NI-VISA driver to talk to the device, but I am having some issues getting it to respond to the command I am sending it.

 

I am sending what I believe is a properly formatted message to it.  I am using command 0x39, Ret Status.  This was the simplest command I could see in the protocol that would return something I could just read as a first test.

 

I have attached screenshots of the USB decoding on the oscilloscope. It appears I get an ACK response from the instrument, but I don't get any data back from it.
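
To rule out the transport, one option is to poke the same VISA resource from Python with PyVISA and compare against what the LabVIEW driver sends. This is only a hedged sketch: the resource string and the single 0x39 byte are placeholders, not the actual DSA-1000 framing (any required headers or checksums would still have to be added).

# Hedged sketch: exercise the raw USB/VISA transport outside LabVIEW with PyVISA.
# The resource name and command bytes are placeholders, NOT real DSA-1000 framing.
import pyvisa

rm = pyvisa.ResourceManager()
inst = rm.open_resource("USB0::0xXXXX::0xXXXX::INSTR")  # placeholder resource string
inst.timeout = 2000  # ms

inst.write_raw(bytes([0x39]))        # placeholder: the real protocol needs full framing
response = inst.read_raw()
print("Got", len(response), "bytes:", response.hex())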

 

If anyone out there has experience with this instrument and using LabVIEW with it, please let me know if you can help.

 

Thanks

Joe

Re: Multicolumn Listbox scroll bar woes


 wrote:

 

Did you try to customize it using the control editor?

 

Ben


Except for simple Booleans I tend to stay away from the control editor; too much frustration.

 

mcduff


Re: Converting array of Single Precision values to U16 for MODBUS


You need to use Swap Words, not Swap Bytes. Note that mine shows "16" on the function while yours shows "8".

 

You are grabbing 29 registers. That means 14 and a half "pairs" of registers when you combine them into 32-bit integers.

If you are trying to grab 29 single precision floats, then you need to wire in 58 so that 58 registers get paired into 29 single precision floats.

 

You wanted single precision floats, so you still need to do that typecast.

 

Pro tip: delete your 300 constant from the Wait function, then right-click on the input and use Create Constant. Now you'll have the right datatype and will get rid of that red coercion dot.
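
For reference, here is a text-language sketch of the same conversion in Python: 58 U16 Modbus registers paired into 29 single-precision floats, with the words swapped so the second register of each pair is the high word. The register values are fabricated for the example, and the actual word order depends on the device.

# Sketch: 58 Modbus holding registers (U16) -> 29 IEEE-754 single-precision floats.
# Dummy data; the low-word-first ordering is an assumption and is device-dependent.
import struct

def registers_to_floats(regs):
    """Pair 16-bit registers (low word first) into 32-bit singles."""
    floats = []
    for lo, hi in zip(regs[0::2], regs[1::2]):
        raw = (hi << 16) | lo                      # "swap words": second register is the high word
        floats.append(struct.unpack(">f", raw.to_bytes(4, "big"))[0])
    return floats

# 58 registers -> 29 floats (the 32-bit pattern 0x3FC00000 is the float 1.5)
regs = [0x0000, 0x3FC0] * 29
print(registers_to_floats(regs))  # [1.5, 1.5, ..., 1.5]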

3D representation problems with some new graphics cards


Hi to all!

We developed application software in the medical field with LV 2012 (with the related App Builder to obtain the .exe and installer) that accompanies our instruments. The standard procedure is that our customers install the software on their PCs to use our devices; over the last 4-5 years no problems were reported with the 3D representation of signals included in the software. Recently, however, we have experienced bad rendering of 3D images on some graphics cards. Since the problem arises with the "3D Graph", I prepared a test VI based on the "Native 3D graph example.vi" (National Instruments\LabVIEW 2012\examples\general\graphs\Native 3D graph example.vi).

 

Attached 3 images:

3D OK: correct representation.

3D Bad: bad representation with black background.

3D Bad with rendering window: right-clicking on the 3D graph => render window => the rendering window is OK on the same PC where the 3D Graph image is bad (black background).

 

At the moment we have seen problems on Intel 620 and Intel 630 graphics cards, while on an iMac with an AMD Radeon R9 M390 2 GB + VirtualBox for Win 10 / 64-bit the problem appears only with the 3D accelerator enabled.

 

We have also contacted Intel and sent them the test software to get some help, and we are waiting…

 

Any suggestions?

Many thanks for your support

 

Giuseppe

Re: Pass all diagram properties


My thinking was wrong. 

 

I now pass the reference from the original diagram (on the main VI) as an input, rather than trying to pass it back as an output after all the manipulation.

Re: Show C Series input module value on waveform.


I have added the Open FPGA VI Reference, but I am not able to add a FIFO Read to read the values written by the FIFO Write in the FPGA VI.

Kindly have a look at the code.

Thank you.

Re: Possible to pass objects between LV and other languages?


 wrote:

 

Python, for instance, will gladly receive a JSON string and convert it to an object. It doesn't really care about its contents...


But it can't do much with such an object! The contents are just a binary stream that has no functional meaning to Python.

Re: What setting to set to stop .NET dll being copied to build folder


 wrote:

I don't think that applies to .NET dlls, Christian.

 

IIRC, by default .NET DLLs don't get copied into executables, unless you add the .dll to a class.

 

If you added the DLL to the class, maybe not doing that will help. I'm not sure how that works with packed libraries. I've been able to avoid them so far.


lvclass and lvlib are mostly interchangeable. If you add the .NET DLL explicitly to the lvlib, it will always be copied to the lvlibp folder too. An lvlibp is simply an lvlib with the diagrams removed (by default) and full compilation applied. And an lvlibp is read-only!

Re: Possible to pass objects between LV and other languages?


 wrote:

 wrote:

 

Python, for instance, will gladly receive a JSON string and convert it to an object. It doesn't really care about its contents...


But it can't do much with such an object! The contents are just a binary stream that has no functional meaning to Python.


A JSON string or XML string isn't binary?

 

I get JSON strings from REST, and Python can work with them just fine.

 

Obviously a LabVIEW class flattened to JSON is not a class object in Python until you set the data into a Python class. But there's no reason you can't make that work.


PID loop in FPGA Issue


I am running speed and torque PID loops controlling a valve. The false case for Mode S A runs the torque PID correctly, but when the true case for Mode S A runs I get zeros across all of the indicators I slapped in there (because you can't use probes in FPGA). When it was just the speed PID by itself it was working correctly. Am I missing something small involving the case structure? But then I would think neither case would work correctly. Let me know your thoughts.

Re: Possible to pass objects between LV and other languages?


Well, if you don't convert the JSON then yes, it is basically a string. But you still can't do too much with it. A string object in Python has a number of specific methods and properties, and that's mostly it. To convert it into a Python object with functionality similar to the LabVIEW object, you would need to re-implement the object in Python and add a Serialize/Deserialize method that understands the LabVIEW JSON string. Very possible, but also very cumbersome, especially if you want to do more than one or two object classes.
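
A minimal sketch of that idea in Python: a small class with a deserialize method that accepts a JSON string flattened from a (hypothetical) LabVIEW class. The field names here are invented for the example and would have to match whatever the LabVIEW class actually flattens.

# Sketch: re-implement a (hypothetical) LabVIEW class in Python and give it
# serialize/deserialize methods for the flattened JSON. Field names are invented.
import json

class Measurement:
    def __init__(self, channel="", value=0.0, unit=""):
        self.channel = channel
        self.value = value
        self.unit = unit

    @classmethod
    def from_labview_json(cls, json_string):
        """Deserialize from a JSON string produced by LabVIEW's Flatten To JSON."""
        data = json.loads(json_string)
        return cls(data.get("channel", ""),
                   float(data.get("value", 0.0)),
                   data.get("unit", ""))

    def to_labview_json(self):
        """Serialize back to JSON that LabVIEW's Unflatten From JSON could parse."""
        return json.dumps({"channel": self.channel,
                           "value": self.value,
                           "unit": self.unit})

m = Measurement.from_labview_json('{"channel":"AI0","value":3.14,"unit":"V"}')
print(m.channel, m.value, m.unit)   # AI0 3.14 V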

Causes of Clock drift and return?


Hello,

 

I am running VIs that take timestamps on two chassis running at the same frequency. I have the 10 MHz backplane of one being sent to the other via a timing module to synchronize them. I have a trigger from a third chassis that is sent to both to take a timestamp (it is length-matched to both).

 

When I compare multiple timestamps between them, for some reason I will occasionally get a drift of a consistent amount (52 ns); other times they are exact (zero time difference). Does anyone know what could be causing this behavior, and how to fix it?

 

Thank you

Re: Show C Series input module value on waveform.


In the RT target, there should be an FPGA palette.  In there should be a Method node, or maybe even a FIFO Read node (it's been a little while since I was heavy in cRIO).

Re: 2019 SP1 disappointment


I too am interested in whether there are any actual installation problems. Windows 10 1803 reaches end of service for Home, Pro, and Pro for Workstations on November 12, so my suspicion is that the officially listed supported OSes are just trying to keep up with what Microsoft is going to support.
