
Re: 2019 SP1 disappointment


 wrote:

Did you actually try to upgrade and get an error because of that?


I did not try to upgrade, but if something goes wrong, what is my recourse? It would be simple enough for NI to say I don't meet the minimum requirements, so they can't help me. And if NIPM is still in a state where installing on top of other versions is an issue, that can be a real problem for me.


Re: MOXA, RS-485, ModBus RTU

Re: Issue with using python functions


 wrote:

 

Pretty sure it's something to do with the Python version.

 

Which did you install? And how (Anaconda, Spyder, PyCharm, etc.)? Does Python itself work? Do you have other Python versions installed? And so on...

 


 

I'm using the Python 3.6.6 Windows x86 embeddable zip file (I have the 32-bit version of LabVIEW 2018 installed on my machine), which I downloaded from https://www.python.org/downloads/release/python-366/

 

I've put the Python folder on my C: drive, and yes, Python itself works...

Re: Causes of Clock drift and return?


Maybe Windows is NOT a Real Time Operating System?

Re: How do I create a start/stop button for each separate while loop within my program, when each of them does a different task?


Hey

Thank you

 

Right now I am using a state machine to do timed logging of data, but I wanted to know how to count the number of elements in each column of an array (say a 2D array).

I want to compare the number of elements I get over a particular number of seconds, stop and start the timer, and log the data for that time.

Re: Causes of Clock drift and return?


I don't think that's the cause. Wouldn't that cause more than just occasional ~50 ns delays? Wouldn't that be on the microsecond scale?

 

Also, we're not having it interact with Windows at all. We're running the cables to the 6674T (one to the backplane to synchronize and the other to the FlexRIO). (The FlexRIO takes the timestamp via an ETI before sending it to the VI, so the Windows OS isn't involved.)

Re: Possible to pass objects between LV and other languages?


 wrote:

Well, if you don't convert the JSON then yes, it is basically a string. But you still can't do too much with it. A string object in Python has a number of specific methods and properties, and that's mostly it. To convert it into a Python object with similar functionality to the LabVIEW object, you would need to reprogram the object in Python and add a Serialize/Deserialize method for the LabVIEW JSON string. Very possible, but also very cumbersome, especially if you want more than one or two object classes.


De-serializing JSON in Python is easy. But that only converts the JSON into a plain Python object.
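
To illustrate, here is a minimal Python sketch (the JSON payload and its field names are made up, just standing in for whatever the LabVIEW flatten produces):

import json

# Hypothetical JSON string produced on the LabVIEW side; field names are assumptions
flattened = '{"class": "Waveform", "t0": 0.0, "dt": 0.001, "Y": [1.2, 3.4, 5.6]}'

obj = json.loads(flattened)   # plain dict: the data arrives fine...
print(obj["Y"])               # ...but none of the LabVIEW class's methods come with it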

 

Passing an object (data and functionality) is simply not possible. Passing data is entirely possible.

 

I agree, converting the LabVIEW object to a Python object with the same functionality will only be practical for a few objects.

 

I'm not sure what the OP wants to do with the 'objects'.

 

>Possible to pass objects between LV and other languages?

Possible, sure. Practical, maybe.

Re: Issue with using python functions


Did you do the following:

(Windows) If you install Python 3.6, add the directory containing python36.dll to the system path

 

As indicated here.
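
If you are not sure whether that directory actually made it onto the path, here is a quick sanity-check sketch from Python itself (my own check, not from the NI documentation):

import os

# Look for a folder on PATH that actually contains python36.dll
hits = [p for p in os.environ.get("PATH", "").split(os.pathsep)
        if p and os.path.exists(os.path.join(p, "python36.dll"))]
print("python36.dll found in:", hits or "nowhere on PATH")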


Trying to merge large files


Hi, 

 

I am trying to merge several large files into a single file. Each file contains several hundred megabytes of data, in several columns of a tab-delimited .txt file.

 

If I just try to open them, merge them in a concatenating For Loop, and write a new file, I constantly get a "not enough memory" error.

 

What is an easy way to merge several files into one while avoiding the memory overload issue? Each file contains 4 columns with millions of rows of data; the final file should have the same number of rows as before, but with each file's columns appended at the end.

 

Thank you for your help

Re: 3D representation problems with some new graphical cards


 wrote:

The problem occurs only with the 3D accelerator enabled.

Any suggestions?

 


Disable 3D acceleration, as I don't think LabVIEW really benefits from it anyway.

Re: Reading current of ELGAR CW2501 AC Power Source


The missing question mark is definitely suspect. I would try adding it. A SCPI-compliant device should not return any data without the question mark at the end. The manual may be a bit misleading here, since it tries to show the command syntax in a hierarchical structure. The CURR word should have a question mark added only for the standard query, not when used with the more specific queries that follow.
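
For what it's worth, the same point in PyVISA form (the resource string and the exact command spelling are placeholders; check the ELGAR manual for the real query):

import pyvisa

rm = pyvisa.ResourceManager()
inst = rm.open_resource("GPIB0::5::INSTR")   # placeholder address

# Without the trailing '?', a SCPI instrument treats this as a command and returns
# nothing, so the following read times out. With '?', it is a query and replies.
reading = inst.query("MEAS:CURR?")           # command name assumed; verify in the manual
print(reading)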

Re: Preallocate array memory without initializing it


What you *want* is something like a 2D array but where rows can have different lengths.

 

You cannot have that in LabVIEW, not directly.  You've already tried one option via queues -- each row becomes its own 1D array wrapped up inside a cluster, then an array of those clusters represents your "ragged" 2D array.   But you don't like that option.

 

Here's another, though I can't really say whether it'll work out better for you or not.  Maintain 2 distinct 1D arrays.  The data array simply keeps appending your 1D chunks into a larger 1D array.  The other array is populated with the lengths of each of those chunks.  The two together can be used to extract the data in their original chunk sizes, and your consumer code can do whatever it needs to do to handle these variable-length chunks.
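
If it helps, here's the idea as a rough Python sketch (LabVIEW itself would keep the two 1D arrays on shift registers; this is only to show the bookkeeping):

data = []       # every chunk appended end to end into one flat array
lengths = []    # the length of each appended chunk

def append_chunk(chunk):
    data.extend(chunk)
    lengths.append(len(chunk))

def get_chunk(i):
    start = sum(lengths[:i])              # offset where chunk i begins
    return data[start:start + lengths[i]]

append_chunk([1, 2, 3])
append_chunk([4, 5])
print(get_chunk(1))   # [4, 5]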

 

 

-Kevin P

Re: Trying to merge large files


Adding columns to a text file is difficult to impossible.

 

If you must use a text file, then I suggest the following (it may not work):

Open the first file and convert it to binary, close the file, open the second file, convert it to binary and append it to the binary array, then repeat for the third file.

 

Now you have a binary array of points; convert it to text in chunks, say 100k rows at a time, and append each chunk to the new text file.
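
Roughly, in Python/NumPy terms (file names are placeholders, and this assumes every file has the same number of rows):

import numpy as np

inputs = ["file1.txt", "file2.txt", "file3.txt"]

# "Convert to binary": each file becomes a numeric array, far more compact than text
arrays = [np.loadtxt(name, delimiter="\t") for name in inputs]   # each: rows x 4
merged = np.hstack(arrays)                                       # rows x 12, columns side by side

# Write the text output in chunks, say 100k rows at a time
chunk = 100_000
with open("merged.txt", "w") as out:
    for start in range(0, merged.shape[0], chunk):
        np.savetxt(out, merged[start:start + chunk], delimiter="\t")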

 

mcduff

Re: PID loop in FPGA Issue


I'm currently trying to remove the While Loop inside the Case Structure, so there is just one While Loop around the Case Structure with the code laid out directly inside it. Hopefully this solves the issue after compiling.

Re: Causes of Clock drift and return?


A 10 MHz backplane clock and a 52 ns difference? That sounds like a perfect half-period phase shift, so the trigger polarity may have been set up wrong on one side, or there may be some unintended clock inversion somewhere.


Re: Causes of Clock drift and return?


Interesting. I will dig deeper into this. What throws a wrench into this is that it's not constant; it occurs at certain times, seemingly by luck of the draw of when I take timestamps.

Re: Causes of Clock drift and return?


Just guessing, this is @kevin_price territory.

 

  1. You have sample clocks that are synced to the 10 MHz backplane; I assume they need to re-sync to that clock every now and then, and there may be some jitter involved in this.
  2. Check the cable connecting the chassis; depending on its length and condition, there may be some odd degradation of the signal that leads to jitter.

mcduff

Re: Preallocate array memory without initializing it


 wrote:

What you *want* is something like a 2D array but where rows can have different lengths.


This can also be done with Maps, starting in LabVIEW 2019. You can have the iteration index be the key and the array be the data.
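
As a rough analogy (a Python dict standing in for the LabVIEW Map, not LabVIEW code):

ragged = {}                    # key = iteration index, value = that iteration's 1D chunk

ragged[0] = [1.0, 2.0, 3.0]    # chunk from iteration 0
ragged[1] = [4.0, 5.0]         # chunk from iteration 1; a different length is fine

print(ragged[1])               # [4.0, 5.0]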

Re: 3D representation problems with some new graphical cards


Too bad you didn't send LabVIEW code that could create your "example" plot.  We could have tested it on a variety of machines -- maybe it's "fixable" by "adjusting" (fixing) the Software or tweaking the Hardware.  Oh, well ...

 

Bob Schor

Visa Read Error - Keithley 2260B-800-1 360W


Hello Everyone,

 

I know VISA read errors come up a lot on these forums, but I am really struggling to understand what's going on with our system. The problem seems to be very intermittent.

 

I am using a Keithley 2260B-800-1 360 W instrument, and I am connecting to it over TCP/IP. In NI MAX, when I open the test panel, I enable the termination character, and I get a response to an *IDN? query. I can do this either with Query or with Write and then Read. I've updated the bytes to read in the test panel to 256, like the driver does, and it still works just fine.
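
For reference, that test-panel setup corresponds to something like this PyVISA sketch (the resource string is a placeholder for whatever NI MAX shows for the 2260B):

import pyvisa

rm = pyvisa.ResourceManager()
inst = rm.open_resource("TCPIP0::192.168.0.10::inst0::INSTR")   # placeholder address

inst.read_termination = "\n"    # same as enabling the termination character in MAX
inst.write_termination = "\n"
inst.timeout = 5000             # ms; generous while debugging

print(inst.query("*IDN?"))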

 

I've used the drivers from https://www.tek.com/dc-power-supply/2260b-30-36-software-1, and I have used them successfully. Suddenly, though, there is a VISA read error. I am using the Initialize VI, and I've not changed anything in this code.

 

Why is it that everything runs fine in the NI MAX VISA test panel, but the VISA Read is giving me a problem in the driver?

 

 
