Dataset schema:
sentence1 : string (lengths 52 to 3.87M)
sentence2 : string (lengths 1 to 47.2k)
label : string (1 class: "entailment")
def makeHist(x_val, y_val, fit=spline_base.fit2d, bins=[np.linspace(-36.5,36.5,74),np.linspace(-180,180,361)]): """ Constructs a (fitted) histogram of the given data. Parameters: x_val : array The data to be histogrammed along the x-axis. y_val : array ...
Constructs a (fitted) histogram of the given data. Parameters: x_val : array The data to be histogrammed along the x-axis. y_val : array The data to be histogrammed along the y-axis. fit : function or None, optional The function to use in order to fi...
entailment
def firstSacDist(fm): """ Computes the distribution of angle and length combinations that were made as first saccades Parameters: fm : ocupy.fixmat The fixation data to be analysed """ ang, leng, ad, ld = anglendiff(fm, return_abs=True) ...
Computes the distribution of angle and length combinations that were made as first saccades Parameters: fm : ocupy.fixmat The fixation data to be analysed
entailment
def trajLenDist(fm): """ Computes the distribution of trajectory lengths, i.e. the number of saccades that were made as a part of one trajectory Parameters: fm : ocupy.fixmat The fixation data to be analysed """ trajLen = np.roll(fm.fix, 1...
Computes the distribution of trajectory lengths, i.e. the number of saccades that were made as a part of one trajectory Parameters: fm : ocupy.fixmat The fixation data to be analysed
entailment
def reshift(I): """ Transforms the given number element into a range of [-180, 180], which covers all possible angle differences. This method reshifts larger or smaller numbers that might be the output of other angular calculations into that range by adding or subtracting 360, respectively. To...
Transforms the given number element into a range of [-180, 180], which covers all possible angle differences. This method reshifts larger or smaller numbers that might be the output of other angular calculations into that range by adding or subtracting 360, respectively. To make sure that angular data...
entailment
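The reshift behaviour described above (folding angle differences into [-180, 180] by adding or subtracting 360) can be sketched with modulo arithmetic. Note one hedge: the modulo form maps exactly +180 to -180, which represents the same angle; the original may treat that boundary differently.

```python
import numpy as np

def reshift(angles):
    """Map angle differences into [-180, 180] by (repeatedly) adding or
    subtracting 360, expressed as a single modulo operation.

    A minimal sketch of the behaviour described above; +180 folds to
    -180, which denotes the same angle.
    """
    angles = np.asarray(angles, dtype=float)
    return ((angles + 180.0) % 360.0) - 180.0

print(reshift([270.0, -270.0, 45.0]))  # -> [-90.  90.  45.]
```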
def initializeData(self, fit = None, full_H1=None, max_length = 40, in_deg = True): """ Prepares the data to be replicated. Calculates the second-order length and angle dependencies between saccades and stores them in a fitted histogram. Parameters: ...
Prepares the data to be replicated. Calculates the second-order length and angle dependencies between saccades and stores them in a fitted histogram. Parameters: fit : function, optional The method to use for fitting the histogram full_H1 : twod...
entailment
def _calc_xy(self, xxx_todo_changeme, angle, length): """ Calculates the coordinates after a specific saccade was made. Parameters: (x,y) : tuple of floats or ints The coordinates before the saccade was made angle : float or int Th...
Calculates the coordinates after a specific saccade was made. Parameters: (x,y) : tuple of floats or ints The coordinates before the saccade was made angle : float or int The angle that the next saccade encloses with the horizonta...
entailment
def _draw(self, prev_angle = None, prev_length = None): """ Draws a new length- and angle-difference pair and calculates length and angle absolutes matching the last saccade drawn. Parameters: prev_angle : float, optional The last angle that was drawn in the ...
Draws a new length- and angle-difference pair and calculates length and angle absolutes matching the last saccade drawn. Parameters: prev_angle : float, optional The last angle that was drawn in the current trajectory prev_length : float, optional ...
entailment
def sample_many(self, num_samples = 2000): """ Generates a given number of trajectories, using the method sample(). Returns a fixmat with the generated data. Parameters: num_samples : int, optional The number of trajectories that shall be generated. ...
Generates a given number of trajectories, using the method sample(). Returns a fixmat with the generated data. Parameters: num_samples : int, optional The number of trajectories that shall be generated.
entailment
def sample(self): """ Draws a trajectory length, first coordinates, lengths, angles and length-angle-difference pairs according to the empirical distribution. Each call creates one complete trajectory. """ lenghts = [] angles = [] coordinates = [] ...
Draws a trajectory length, first coordinates, lengths, angles and length-angle-difference pairs according to the empirical distribution. Each call creates one complete trajectory.
entailment
def drawFrom(self, cumsum, r): """ Draws a value from a cumulative sum. Parameters: cumsum : array Cumulative sum from which shall be drawn. Returns: int : Index of the cumulative sum element drawn. """ a = cumsum.rsplit(...
Draws a value from a cumulative sum. Parameters: cumsum : array Cumulative sum from which shall be drawn. Returns: int : Index of the cumulative sum element drawn.
entailment
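The drawFrom docstring above describes inverse-CDF sampling: given a cumulative sum and a random number, return the index of the first element that the scaled random value does not exceed. A hypothetical re-implementation (the original body is truncated, so the exact scaling of `r` is an assumption):

```python
import numpy as np

def draw_from(cumsum, r):
    """Return the index of the first cumulative-sum element >= r * total.

    Hypothetical sketch of the drawFrom behaviour described above,
    assuming r is uniform in [0, 1); np.searchsorted does the bisection.
    """
    cumsum = np.asarray(cumsum, dtype=float)
    return int(np.searchsorted(cumsum, r * cumsum[-1]))

# With cumulative sum [1, 3, 6] and r = 0.5 the target is 3.0 -> index 1.
print(draw_from([1, 3, 6], 0.5))  # -> 1
```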
def load(path): """ Load fixmat at path. Parameters: path : string Absolute path of the file to load from. """ f = h5py.File(path,'r') if 'Fixmat' in f: fm_group = f['Fixmat'] else: fm_group = f['Datamat'] fields = {} params = {} for field, va...
Load fixmat at path. Parameters: path : string Absolute path of the file to load from.
entailment
def compute_fdm(fixmat, fwhm=2, scale_factor=1): """ Computes a fixation density map for the calling fixmat. Creates a map the size of the image fixations were recorded on. Every pixel contains the frequency of fixations for this image. The fixation map is smoothed by convolution with a ...
Computes a fixation density map for the calling fixmat. Creates a map the size of the image fixations were recorded on. Every pixel contains the frequency of fixations for this image. The fixation map is smoothed by convolution with a Gaussian kernel to approximate the area with highest processi...
entailment
def relative_bias(fm, scale_factor = 1, estimator = None): """ Computes the relative bias, i.e. the distribution of saccade angles and amplitudes. Parameters: fm : DataMat The fixation data to use scale_factor : double Returns: 2D probability distribution of s...
Computes the relative bias, i.e. the distribution of saccade angles and amplitudes. Parameters: fm : DataMat The fixation data to use scale_factor : double Returns: 2D probability distribution of saccade angles and amplitudes.
entailment
def DirectoryFixmatFactory(directory, categories = None, glob_str = '*.mat', var_name = 'fixmat'): """ Concatenates all fixmats in dir and returns the resulting single fixmat. Parameters: directory : string Path from which the fixmats should be loaded categories : instan...
Concatenates all fixmats in dir and returns the resulting single fixmat. Parameters: directory : string Path from which the fixmats should be loaded categories : instance of stimuli.Categories, optional If given, the resulting fixmat provides direct access ...
entailment
def FixmatFactory(fixmatfile, categories = None, var_name = 'fixmat', field_name='x'): """ Loads a single fixmat (fixmatfile). Parameters: fixmatfile : string The matlab fixmat that should be loaded. categories : instance of stimuli.Categories, optional Links dat...
Loads a single fixmat (fixmatfile). Parameters: fixmatfile : string The matlab fixmat that should be loaded. categories : instance of stimuli.Categories, optional Links data in categories to data in fixmat.
entailment
def add_feature_values(self, features): """ Adds feature values of feature 'feature' to all fixations in the calling fixmat. For fixations out of the image boundaries, NaNs are returned. The function generates a new attribute field named with the string in featu...
Adds feature values of feature 'feature' to all fixations in the calling fixmat. For fixations out of the image boundaries, NaNs are returned. The function generates a new attribute field named with the string in features that contains an np.array listing feature values...
entailment
def make_reg_data(self, feature_list=None, all_controls=False): """ Generates two M x N matrices with M feature values at fixations for N features. Controls are a random sample out of all non-fixated regions of an image or fixations of the same subject group on a randomly chosen ...
Generates two M x N matrices with M feature values at fixations for N features. Controls are a random sample out of all non-fixated regions of an image or fixations of the same subject group on a randomly chosen image. Fixations are pooled over all subjects in the calling fixmat. ...
entailment
def get_velocity(samplemat, Hz, blinks=None): ''' Compute velocity of eye-movements. Samplemat must contain fields 'x' and 'y', specifying the x,y coordinates of gaze location. The function assumes that the values in x,y are sampled continuously at a rate specified by 'Hz'. ''' Hz = float(Hz...
Compute velocity of eye-movements. Samplemat must contain fields 'x' and 'y', specifying the x,y coordinates of gaze location. The function assumes that the values in x,y are sampled continuously at a rate specified by 'Hz'.
entailment
def saccade_detection(samplemat, Hz=200, threshold=30, acc_thresh=2000, min_duration=21, min_movement=.35, ignore_blinks=False): ''' Detect saccades in a stream of gaze location samples. Coordinates in samplemat are assumed to be in degrees. Saccades are det...
Detect saccades in a stream of gaze location samples. Coordinates in samplemat are assumed to be in degrees. Saccades are detected by a velocity/acceleration threshold approach. A saccade starts when a) the velocity is above threshold, b) the acceleration is above acc_thresh at least once during the int...
entailment
def fixation_detection(samplemat, saccades, Hz=200, samples2fix=None, respect_trial_borders=False, sample_times=None): ''' Detect fixations from saccades. Fixations are defined as intervals between saccades. This function also calculates start and end times (in ms) for each fixatio...
Detect fixations from saccades. Fixations are defined as intervals between saccades. This function also calculates start and end times (in ms) for each fixation. Input: samplemat: datamat Contains the recorded samples and associated metadata. saccades: ndarray Logical ...
entailment
def parse(parse_obj, agent=None, etag=None, modified=None, inject=False): """Parse a subscription list and return a dict containing the results. :param parse_obj: A file-like object or a string containing a URL, an absolute or relative filename, or an XML document. :type parse_obj: str or file ...
Parse a subscription list and return a dict containing the results. :param parse_obj: A file-like object or a string containing a URL, an absolute or relative filename, or an XML document. :type parse_obj: str or file :param agent: User-Agent header to be sent when requesting a URL :type agent:...
entailment
def FixmatStimuliFactory(fm, loader): """ Constructs a categories object for all image / category combinations in the fixmat. Parameters: fm: FixMat Used for extracting valid category/image combinations. loader: loader Loader that accesses the stimuli for th...
Constructs a categories object for all image / category combinations in the fixmat. Parameters: fm: FixMat Used for extracting valid category/image combinations. loader: loader Loader that accesses the stimuli for this fixmat Returns: Categories object
entailment
def DirectoryStimuliFactory(loader): """ Takes an input path to the images folder of an experiment and automatically generates the category-filenumber list needed to construct an appropriate _categories object. Parameters : loader : Loader object which contains impath : s...
Takes an input path to the images folder of an experiment and automatically generates the category-filenumber list needed to construct an appropriate _categories object. Parameters : loader : Loader object which contains impath : string path to the input, i.e. ima...
entailment
def fixations(self): ''' Filter the fixmat such that it only contains fixations on images in categories that are also in the categories object''' if not self._fixations: raise RuntimeError('This Images object does not have' +' an associated fixmat') if len(lis...
Filter the fixmat such that it only contains fixations on images in categories that are also in the categories object
entailment
def data(self, value): """ Saves a new image to disk """ self.loader.save_image(self.category, self.image, value)
Saves a new image to disk
entailment
def fixations(self): """ Returns all fixations that are on this image. A precondition for this to work is that a fixmat is associated with this Image object. """ if not self._fixations: raise RuntimeError('This Images object does not have' +' ...
Returns all fixations that are on this image. A precondition for this to work is that a fixmat is associated with this Image object.
entailment
def generate(self): """ Generator for creating the cross-validation slices. Returns A tuple that contains two fixmats (training and test) and two Category objects (test and train). """ for _ in range(0, self.num_slices): #1. separate fixm...
Generator for creating the cross-validation slices. Returns A tuple that contains two fixmats (training and test) and two Category objects (test and train).
entailment
def prepare_data(fm, max_back, dur_cap=700): ''' Computes angle and length differences up to given order and deletes suspiciously long fixations. Input fm: Fixmat Fixmat for which to compute angle and length differences max_back: Int Computes delta angle and ampli...
Computes angle and length differences up to given order and deletes suspiciously long fixations. Input fm: Fixmat Fixmat for which to compute angle and length differences max_back: Int Computes delta angle and amplitude up to order max_back. dur_cap: Int ...
entailment
def saccadic_momentum_effect(durations, forward_angle, summary_stat=nanmean): """ Computes the mean fixation duration at forward angles. """ durations_per_da = np.nan * np.ones((len(e_angle) - 1,)) for i, (bo, b1) in enumerate(zip(e_angle[:-1], e_angle[1:])): idx...
Computes the mean fixation duration at forward angles.
entailment
def ior_effect(durations, angle_diffs, length_diffs, summary_stat=np.mean, parallel=True, min_samples=20): """ Computes a measure of fixation durations at delta angle and delta length combinations. """ raster = np.empty((len(e_dist) - 1, len(e_angle) - 1), dtype=object) for a, (a_...
Computes a measure of fixation durations at delta angle and delta length combinations.
entailment
def predict_fixation_duration( durations, angles, length_diffs, dataset=None, params=None): """ Fits a non-linear piecewise regression to fixation durations for a fixmat. Returns corrected fixation durations. """ if dataset is None: dataset = np.ones(durations.shape) corrected_d...
Fits a non-linear piecewise regression to fixation durations for a fixmat. Returns corrected fixation durations.
entailment
def subject_predictions(fm, field='SUBJECTINDEX', method=predict_fixation_duration, data=None): ''' Calculates the saccadic momentum effect for individual subjects. Removes any effect of amplitude differences. The parameters are fitted on unbinned data. The effects are comp...
Calculates the saccadic momentum effect for individual subjects. Removes any effect of amplitude differences. The parameters are fitted on unbinned data. The effects are computed on binned data. See e_dist and e_angle for the binning parameter.
entailment
def intersubject_scores(fm, category, predicting_filenumbers, predicting_subjects, predicted_filenumbers, predicted_subjects, controls = True, scale_factor = 1): """ Calculates how well the fixations from a set of subjects on a set of images can be predicted w...
Calculates how well the fixations from a set of subjects on a set of images can be predicted with the fixations from another set of subjects on another set of images. The prediction is carried out by computing a fixation density map from fixations of predicting_subjects subjects on predicting_images im...
entailment
def intersubject_scores_random_subjects(fm, category, filenumber, n_train, n_predict, controls=True, scale_factor = 1): """ Calculates how well the fixations of n random subjects on one image can be predicted with the fixations ...
Calculates how well the fixations of n random subjects on one image can be predicted with the fixations of m other random subjects. Notes Function that uses intersubject_auc for computing auc. Parameters fm : fixmat instance category : int Category from which the fixati...
entailment
def upper_bound(fm, nr_subs = None, scale_factor = 1): """ compute the inter-subject consistency upper bound for a fixmat. Input: fm : a fixmat instance nr_subs : the number of subjects used for the prediction. Defaults to the total number of subjects in the fixmat minus 1...
compute the inter-subject consistency upper bound for a fixmat. Input: fm : a fixmat instance nr_subs : the number of subjects used for the prediction. Defaults to the total number of subjects in the fixmat minus 1 scale_factor : the scale factor of the FDMs. Default is 1....
entailment
def lower_bound(fm, nr_subs = None, nr_imgs = None, scale_factor = 1): """ Compute the spatial bias lower bound for a fixmat. Input: fm : a fixmat instance nr_subs : the number of subjects used for the prediction. Defaults to the total number of subjects in the fixmat minu...
Compute the spatial bias lower bound for a fixmat. Input: fm : a fixmat instance nr_subs : the number of subjects used for the prediction. Defaults to the total number of subjects in the fixmat minus 1 nr_imgs : the number of images used for prediction. If given, the ...
entailment
def ind2sub(ind, dimensions): """ Calculates subscripts for indices into regularly spaced matrices. """ # check that the index is within range if ind >= np.prod(dimensions): raise RuntimeError("ind2sub: index exceeds array size") cum_dims = list(dimensions) cum_dims.reverse() m =...
Calculates subscripts for indices into regularly spaced matrices.
entailment
def sub2ind(indices, dimensions): """ An exemplary sub2ind implementation to create randomization scripts. This function calculates indices from subscripts into regularly spaced matrices. """ # check that none of the indices exceeds the size of the array if any([i > j for i, j in zip(...
An exemplary sub2ind implementation to create randomization scripts. This function calculates indices from subscripts into regularly spaced matrices.
entailment
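The ind2sub/sub2ind pair above converts between linear indices and per-dimension subscripts. A sketch of the idea, assuming row-major (C-order) layout; the originals may follow MATLAB's column-major convention instead, and the equivalent NumPy built-ins are np.ravel_multi_index and np.unravel_index:

```python
import numpy as np

def sub2ind(subscripts, dimensions):
    """Row-major linear index from per-dimension subscripts
    (equivalent to np.ravel_multi_index for in-range inputs)."""
    ind = 0
    for sub, dim in zip(subscripts, dimensions):
        if sub >= dim:
            raise RuntimeError("sub2ind: subscript exceeds array size")
        ind = ind * dim + sub
    return ind

def ind2sub(ind, dimensions):
    """Inverse of sub2ind (equivalent to np.unravel_index)."""
    if ind >= np.prod(dimensions):
        raise RuntimeError("ind2sub: index exceeds array size")
    subs = []
    for dim in reversed(dimensions):
        subs.append(ind % dim)
        ind //= dim
    return tuple(reversed(subs))

print(sub2ind((1, 2), (3, 4)))  # row-major: 1*4 + 2 = 6
print(ind2sub(6, (3, 4)))       # -> (1, 2)
```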
def RestoreTaskStoreFactory(store_class, chunk_size, restore_file, save_file): """ Restores a task store from file. """ intm_results = np.load(restore_file) intm = intm_results[intm_results.files[0]] idx = np.isnan(intm).flatten().nonzero()[0] partitions = math.ceil(len(idx) / float(chunk_si...
Restores a task store from file.
entailment
def xmlrpc_reschedule(self): """ Reschedule all running tasks. """ if not len(self.scheduled_tasks) == 0: self.reschedule = list(self.scheduled_tasks.items()) self.scheduled_tasks = {} return True
Reschedule all running tasks.
entailment
def xmlrpc_get_task(self): """ Return a new task description: ID and necessary parameters, all are given in a dictionary """ try: if len(self.reschedule) == 0: (task_id, cur_task) = next(self.task_iterator) else: (task_id, ...
Return a new task description: ID and necessary parameters, all are given in a dictionary
entailment
def xmlrpc_task_done(self, result): """ Take the results of a computation and put it into the results list. """ (task_id, task_results) = result del self.scheduled_tasks[task_id] self.task_store.update_results(task_id, task_results) self.results += 1 retur...
Take the results of a computation and put it into the results list.
entailment
def xmlrpc_status(self): """ Return a status message """ return (""" %i Jobs are still waiting for execution %i Jobs are being processed %i Jobs are done """ %(self.task_store.partitions - self.results - len(self.scheduled_...
Return a status message
entailment
def xmlrpc_save2file(self, filename): """ Save results and own state into file. """ savefile = open(filename,'wb') try: pickle.dump({'scheduled':self.scheduled_tasks, 'reschedule':self.reschedule},savefile) except pickle.PicklingError...
Save results and own state into file.
entailment
def run(self): """This function needs to be called to start the computation.""" (task_id, tasks) = self.server.get_task() self.task_store.from_dict(tasks) for (index, task) in self.task_store: result = self.compute(index, task) self.results.append(result) ...
This function needs to be called to start the computation.
entailment
def from_dict(self, description): """Configures the task store to be the task_store described in description""" assert(self.ident == description['ident']) self.partitions = description['partitions'] self.indices = description['indices']
Configures the task store to be the task_store described in description
entailment
def partition(self): """Partitions all tasks into groups of tasks. A group is represented by a task_store object that indexes a subset of tasks.""" step = int(math.ceil(self.num_tasks / float(self.partitions))) if self.indices == None: slice_ind = list(range(0...
Partitions all tasks into groups of tasks. A group is represented by a task_store object that indexes a subset of tasks.
entailment
def fit3d(samples, e_x, e_y, e_z, remove_zeros = False, **kw): """Fits a 3D distribution with splines. Input: samples: Array Array of samples from a probability distribution e_x: Array Edges that define the events in the probability distribution along the x ...
Fits a 3D distribution with splines. Input: samples: Array Array of samples from a probability distribution e_x: Array Edges that define the events in the probability distribution along the x direction. For example, e_x[0] < samples[0] <= e_x[1] pic...
entailment
def fit2d(samples,e_x, e_y, remove_zeros = False, p_est = None, **kw): """Fits a 2D distribution with splines. Input: samples: Matrix or list of arrays If matrix, it must be of size Nx2, where N is the number of observations. If list, it must contain two arrays of length ...
Fits a 2D distribution with splines. Input: samples: Matrix or list of arrays If matrix, it must be of size Nx2, where N is the number of observations. If list, it must contain two arrays of length N. e_x: Array Edges that define the events in the pr...
entailment
def fit1d(samples, e, remove_zeros = False, **kw): """Fits a 1D distribution with splines. Input: samples: Array Array of samples from a probability distribution e: Array Edges that define the events in the probability distribution. For example, e[0] < x <= ...
Fits a 1D distribution with splines. Input: samples: Array Array of samples from a probability distribution e: Array Edges that define the events in the probability distribution. For example, e[0] < x <= e[1] is the range of values that are associate...
entailment
def knots_from_marginal(marginal, nr_knots, spline_order): """ Determines knot placement based on a marginal distribution. It places knots such that each knot covers the same amount of probability mass. Two of the knots are reserved for the borders which are treated separately. For example, a uni...
Determines knot placement based on a marginal distribution. It places knots such that each knot covers the same amount of probability mass. Two of the knots are reserved for the borders which are treated separately. For example, a uniform distribution with 5 knots will cause the knots to be equally ...
entailment
def spline_base1d(length, nr_knots = 20, spline_order = 5, marginal = None): """Computes a 1D spline basis Input: length: int length of each basis nr_knots: int Number of knots, i.e. number of basis functions. spline_order: int Order of the splin...
Computes a 1D spline basis Input: length: int length of each basis nr_knots: int Number of knots, i.e. number of basis functions. spline_order: int Order of the splines. marginal: array, optional Estimate of the marginal distribut...
entailment
def spline_base2d(width, height, nr_knots_x = 20.0, nr_knots_y = 20.0, spline_order = 5, marginal_x = None, marginal_y = None): """Computes a set of 2D spline basis functions. The basis functions cover the entire space in height*width and can for example be used to create fixation density ma...
Computes a set of 2D spline basis functions. The basis functions cover the entire space in height*width and can for example be used to create fixation density maps. Input: width: int width of each basis height: int height of each basis nr_knots_x: i...
entailment
def spline_base3d( width, height, depth, nr_knots_x = 10.0, nr_knots_y = 10.0, nr_knots_z=10, spline_order = 3, marginal_x = None, marginal_y = None, marginal_z = None): """Computes a set of 3D spline basis functions. For a description of the parameters see spline_base2d. """ if...
Computes a set of 3D spline basis functions. For a description of the parameters see spline_base2d.
entailment
def spline(x,knots,p,i=0.0): """Evaluates the ith spline basis given by knots on points in x""" assert(p+1<len(knots)) return np.array([N(float(u),float(i),float(p),knots) for u in x])
Evaluates the ith spline basis given by knots on points in x
entailment
def spcol(x,knots,spline_order): """Computes the spline collocation matrix for knots in x. The spline collocation matrix contains all m-p-1 bases defined by knots. Specifically it contains the ith basis in the ith column. Input: x: vector to evaluate the bases on knots: vec...
Computes the spline collocation matrix for knots in x. The spline collocation matrix contains all m-p-1 bases defined by knots. Specifically it contains the ith basis in the ith column. Input: x: vector to evaluate the bases on knots: vector of knots spline_order: orde...
entailment
def augknt(knots,order): """Augment knot sequence such that some boundary conditions are met.""" a = [] [a.append(knots[0]) for t in range(0,order)] [a.append(k) for k in knots] [a.append(knots[-1]) for t in range(0,order)] return np.array(a)
Augment knot sequence such that some boundary conditions are met.
entailment
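The augknt body above (repeat the first and last knot `order` times via list appends) can be written as a single concatenation:

```python
import numpy as np

def augknt(knots, order):
    """Repeat the boundary knots `order` times at each end, as the
    augknt shown above does with list appends."""
    knots = np.asarray(knots, dtype=float)
    return np.concatenate([np.repeat(knots[0], order),
                           knots,
                           np.repeat(knots[-1], order)])

print(augknt([0, 1, 2], 2))  # -> [0. 0. 0. 1. 2. 2. 2.]
```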
def N(u,i,p,knots): """Compute Spline Basis Evaluates the spline basis of order p defined by knots at knot i and point u. """ if p == 0: if knots[i] < u and u <=knots[i+1]: return 1.0 else: return 0.0 else: try: k = (( float((u-kn...
Compute Spline Basis Evaluates the spline basis of order p defined by knots at knot i and point u.
entailment
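The recursion behind N is the Cox-de Boor recurrence. A self-contained sketch that mirrors the right-closed zero-order case visible above and treats divisions over zero-length knot spans as zero (the truncated original presumably does the same via its try/except):

```python
def N(u, i, p, knots):
    """Cox-de Boor recursion for the i-th B-spline basis of order p,
    evaluated at u. Zero-length knot spans contribute zero."""
    if p == 0:
        return 1.0 if knots[i] < u <= knots[i + 1] else 0.0
    left_den = knots[i + p] - knots[i]
    right_den = knots[i + p + 1] - knots[i + 1]
    left = 0.0 if left_den == 0 else \
        (u - knots[i]) / left_den * N(u, i, p - 1, knots)
    right = 0.0 if right_den == 0 else \
        (knots[i + p + 1] - u) / right_den * N(u, i + 1, p - 1, knots)
    return left + right

# Linear bases on uniform knots sum to 1 inside the valid span:
print(N(1.5, 0, 1, [0, 1, 2, 3]) + N(1.5, 1, 1, [0, 1, 2, 3]))  # -> 1.0
```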
def prediction_scores(prediction, fm, **kw): """ Evaluates a prediction against fixations in a fixmat with different measures. The default measures which are used are AUC, NSS and KL-divergence. This can be changed by setting the list of measures with set_scores. As different measures need potentia...
Evaluates a prediction against fixations in a fixmat with different measures. The default measures which are used are AUC, NSS and KL-divergence. This can be changed by setting the list of measures with set_scores. As different measures need potentially different parameters, the kw dictionary can be us...
entailment
def kldiv_model(prediction, fm): """ wraps kldiv functionality for model evaluation input: prediction: 2D matrix the model salience map fm : fixmat Should be filtered for the image corresponding to the prediction """ (_, r_x) = calc_resize_factor(prediction, ...
wraps kldiv functionality for model evaluation input: prediction: 2D matrix the model salience map fm : fixmat Should be filtered for the image corresponding to the prediction
entailment
def kldiv(p, q, distp = None, distq = None, scale_factor = 1): """ Computes the Kullback-Leibler divergence between two distributions. Parameters p : Matrix The first probability distribution q : Matrix The second probability distribution distp : fixmat ...
Computes the Kullback-Leibler divergence between two distributions. Parameters p : Matrix The first probability distribution q : Matrix The second probability distribution distp : fixmat If p is None, distp is used to compute a FDM which is th...
entailment
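The kldiv row above is truncated before its body, so its exact handling of zeros and scaling is unknown. A generic sketch of discrete KL divergence between two (normalized) maps, in bits, where empty bins in p contribute nothing:

```python
import numpy as np

def kldiv(p, q, eps=1e-12):
    """Discrete Kullback-Leibler divergence D(p || q) in bits.

    A generic sketch: both inputs are normalized to sum to 1; bins
    where p is zero contribute nothing; eps guards against q == 0.
    The truncated kldiv above may differ in these details.
    """
    p = np.asarray(p, dtype=float).ravel()
    q = np.asarray(q, dtype=float).ravel()
    p = p / p.sum()
    q = q / q.sum()
    mask = p > 0
    return float(np.sum(p[mask] * np.log2(p[mask] / (q[mask] + eps))))

print(kldiv([0.9, 0.1], [0.5, 0.5]))  # positive when p differs from q
```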
def kldiv_cs_model(prediction, fm): """ Computes Chao-Shen corrected KL-divergence between prediction and fdm made from fixations in fm. Parameters : prediction : np.ndarray a fixation density map fm : FixMat object """ # compute histogram of fixations needed for Cha...
Computes Chao-Shen corrected KL-divergence between prediction and fdm made from fixations in fm. Parameters : prediction : np.ndarray a fixation density map fm : FixMat object
entailment
def chao_shen(q): """ Computes some terms needed for the Chao-Shen KL correction. """ yx = q[q > 0] # remove bins with zero counts n = np.sum(yx) p = yx.astype(float)/n f1 = np.sum(yx == 1) # number of singletons in the sample if f1 == n: # avoid C == 0 f1 -= 1 C = 1 - (f1/n)...
Computes some terms needed for the Chao-Shen KL correction.
entailment
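The chao_shen row above cuts off after computing the coverage estimate C. The standard Chao-Shen correction continues by shrinking the maximum-likelihood probabilities by C and computing per-cell inclusion probabilities; the following sketch reconstructs that continuation, but the original's exact return values are truncated and therefore assumed:

```python
import numpy as np

def chao_shen(q):
    """Chao-Shen coverage-adjusted probabilities (a sketch).

    The visible part of the original is reproduced up to C; pa and la
    follow the standard estimator and are an assumption here.
    """
    yx = q[q > 0]                 # remove bins with zero counts
    n = np.sum(yx)
    p = yx.astype(float) / n
    f1 = np.sum(yx == 1)          # number of singletons in the sample
    if f1 == n:                   # avoid C == 0
        f1 -= 1
    C = 1 - (f1 / float(n))       # Good-Turing sample coverage
    pa = C * p                    # coverage-adjusted probabilities
    la = 1 - (1 - pa) ** n        # inclusion probabilities
    return pa, la

pa, la = chao_shen(np.array([2, 1, 1]))
print(pa)  # n=4, f1=2, C=0.5 -> [0.25  0.125 0.125]
```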
def correlation_model(prediction, fm): """ wraps numpy.corrcoef functionality for model evaluation input: prediction: 2D Matrix the model salience map fm: fixmat Used to compute a FDM to which the prediction is compared. """ (_, r_x) = calc_resize_factor(pred...
wraps numpy.corrcoef functionality for model evaluation input: prediction: 2D Matrix the model salience map fm: fixmat Used to compute a FDM to which the prediction is compared.
entailment
def nss_model(prediction, fm): """ wraps nss functionality for model evaluation input: prediction: 2D matrix the model salience map fm : fixmat Fixations that define the actuals """ (r_y, r_x) = calc_resize_factor(prediction, fm.image_size) fix = ((np.arr...
wraps nss functionality for model evaluation input: prediction: 2D matrix the model salience map fm : fixmat Fixations that define the actuals
entailment
def nss(prediction, fix): """ Compute the normalized scanpath salience input: fix : list, l[0] contains y, l[1] contains x """ prediction = prediction - np.mean(prediction) prediction = prediction / np.std(prediction) return np.mean(prediction[fix[0], fix[1]])
Compute the normalized scanpath salience input: fix : list, l[0] contains y, l[1] contains x
entailment
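The nss row above is short enough to reproduce in full: z-score the salience map, then average it at the fixated pixels. The toy map in the usage example is mine, not from the dataset:

```python
import numpy as np

def nss(prediction, fix):
    """Normalized scanpath saliency: z-score the map, then average it
    at the fixated pixels (fix[0] = y indices, fix[1] = x indices)."""
    prediction = prediction - np.mean(prediction)
    prediction = prediction / np.std(prediction)
    return np.mean(prediction[fix[0], fix[1]])

# When the only fixation lands on the single bright pixel, NSS is high:
salience = np.zeros((3, 3))
salience[1, 1] = 1.0
print(nss(salience, [np.array([1]), np.array([1])]))  # -> 2.828... (= 2*sqrt(2))
```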
def roc_model(prediction, fm, ctr_loc = None, ctr_size = None): """ wraps roc functionality for model evaluation Parameters: prediction: 2D array the model salience map fm : fixmat Fixations that define locations of the actuals ctr_loc : tuple of (y, x) coordi...
wraps roc functionality for model evaluation Parameters: prediction: 2D array the model salience map fm : fixmat Fixations that define locations of the actuals ctr_loc : tuple of (y, x) coordinates, optional Allows to specify control points for spatial ...
entailment
def fast_roc(actuals, controls): """ Approximates the area under the ROC curve for sets of actuals and controls. Uses all values appearing in actuals as thresholds and lower sum interpolation. Also returns arrays of the true positive rate and the false positive rate that can be used for plotting the...
Approximates the area under the ROC curve for sets of actuals and controls. Uses all values appearing in actuals as thresholds and lower sum interpolation. Also returns arrays of the true positive rate and the false positive rate that can be used for plotting the ROC curve. Parameters: actuals ...
entailment
def faster_roc(actuals, controls): """ Histogram-based implementation of AUC under the ROC curve. Parameters: actuals : list A list of numeric values for positive observations. controls : list A list of numeric values for negative observations. """ assert(type(actu...
Histogram-based implementation of AUC under the ROC curve. Parameters: actuals : list A list of numeric values for positive observations. controls : list A list of numeric values for negative observations.
entailment
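The quantity both fast_roc and faster_roc estimate is the probability that a random actual outscores a random control (ties counting half): the Mann-Whitney formulation of AUC. An O(N*M) sketch that makes the quantity concrete, not a reconstruction of the histogram implementation above:

```python
import numpy as np

def auc(actuals, controls):
    """AUC via the pairwise (Mann-Whitney) definition: the probability
    that a random actual outscores a random control, ties counting 1/2.
    Brute force; only meant to pin down the quantity being estimated."""
    a = np.asarray(actuals, dtype=float)[:, None]
    c = np.asarray(controls, dtype=float)[None, :]
    return float(np.mean((a > c) + 0.5 * (a == c)))

print(auc([1, 2, 3], [0, 0, 0]))  # perfectly separated -> 1.0
print(auc([1, 2], [1, 2]))        # identical sets -> 0.5
```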
def emd_model(prediction, fm): """ wraps emd functionality for model evaluation requires: OpenCV python bindings input: prediction: the model salience map fm : fixmat filtered for the image corresponding to the prediction """ (_, r_x) = calc_resize_factor(prediction, fm...
wraps emd functionality for model evaluation requires: OpenCV python bindings input: prediction: the model salience map fm : fixmat filtered for the image corresponding to the prediction
entailment
def emd(prediction, ground_truth): """ Compute the Earth Mover's Distance between prediction and model. This implementation uses OpenCV for doing the actual work. Unfortunately, at the time of implementation only the SWIG bindings were available and the numpy arrays have to be converted by hand. Th...
Compute the Earth Mover's Distance between prediction and model. This implementation uses OpenCV for doing the actual work. Unfortunately, at the time of implementation only the SWIG bindings were available and the numpy arrays have to be converted by hand. This changed with OpenCV 2.1.
entailment
def _rfc822(date): """Parse RFC 822 dates and times http://tools.ietf.org/html/rfc822#section-5 There are some formatting differences that are accounted for: 1. Years may be two or four digits. 2. The month and day can be swapped. 3. Additional timezone names are supported. 4. A default tim...
Parse RFC 822 dates and times http://tools.ietf.org/html/rfc822#section-5 There are some formatting differences that are accounted for: 1. Years may be two or four digits. 2. The month and day can be swapped. 3. Additional timezone names are supported. 4. A default time and timezone are assumed...
entailment
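For well-formed inputs, the standard library already parses RFC 822 / RFC 2822 dates; the tolerances listed above (two-digit years, swapped month/day, extra timezone names, default time and timezone) are what the custom `_rfc822` parser adds on top. The stdlib baseline for comparison:

```python
from datetime import timedelta
from email.utils import parsedate_to_datetime

# email.utils handles strictly formatted RFC 822 dates directly,
# including the numeric timezone offset.
dt = parsedate_to_datetime('Mon, 20 Nov 1995 19:12:08 -0500')
print(dt.isoformat())  # 1995-11-20T19:12:08-05:00
```

Inputs that bend the grammar (e.g. `20 Nov 95` or a bare date with no time) are exactly the cases a tolerant parser like the one above exists to cover.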
def _to_rfc822(date): """_to_rfc822(datetime.datetime) -> str The datetime `strftime` method is subject to locale-specific day and month names, so this function hardcodes the conversion.""" months = ['Jan', 'Feb', 'Mar', 'Apr', 'May', 'Jun', 'Jul', 'Aug', 'Sep', 'Oct', 'Nov', 'Dec'] da...
_to_rfc822(datetime.datetime) -> str The datetime `strftime` method is subject to locale-specific day and month names, so this function hardcodes the conversion.
entailment
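Hardcoding the English name tables sidesteps `strftime`'s locale dependence, as the docstring explains. A self-contained sketch of that approach (it assumes the datetime is already in UTC, hence the fixed `GMT` suffix):

```python
from datetime import datetime

MONTHS = ['Jan', 'Feb', 'Mar', 'Apr', 'May', 'Jun',
          'Jul', 'Aug', 'Sep', 'Oct', 'Nov', 'Dec']
DAYS = ['Mon', 'Tue', 'Wed', 'Thu', 'Fri', 'Sat', 'Sun']

def to_rfc822(date):
    # datetime.weekday() returns 0 for Monday, matching DAYS above,
    # so no locale lookup is involved anywhere.
    return '%s, %02d %s %04d %02d:%02d:%02d GMT' % (
        DAYS[date.weekday()], date.day, MONTHS[date.month - 1],
        date.year, date.hour, date.minute, date.second)

print(to_rfc822(datetime(1995, 11, 20, 19, 12, 8)))
# Mon, 20 Nov 1995 19:12:08 GMT
```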
def format(self, sql, params): """ Formats the SQL query to use ordinal parameters instead of named parameters. *sql* (|string|) is the SQL query. *params* (|dict|) maps each named parameter (|string|) to value (|object|). If |self.named| is "numeric", then *params* can be simply a |sequence| of values ...
Formats the SQL query to use ordinal parameters instead of named parameters. *sql* (|string|) is the SQL query. *params* (|dict|) maps each named parameter (|string|) to value (|object|). If |self.named| is "numeric", then *params* can be simply a |sequence| of values mapped by index. Returns a 2-|tuple|...
entailment
def formatmany(self, sql, many_params): """ Formats the SQL query to use ordinal parameters instead of named parameters. *sql* (|string|) is the SQL query. *many_params* (|iterable|) contains each *params* to format. - *params* (|dict|) maps each named parameter (|string|) to value (|object|). If |se...
Formats the SQL query to use ordinal parameters instead of named parameters. *sql* (|string|) is the SQL query. *many_params* (|iterable|) contains each *params* to format. - *params* (|dict|) maps each named parameter (|string|) to value (|object|). If |self.named| is "numeric", then *params* can be ...
entailment
def _get_parser(f): """ Gets the parser for the command f; if it does not exist, a new one is created """ _COMMAND_GROUPS[f.__module__].load() if f.__name__ not in _COMMAND_GROUPS[f.__module__].parsers: parser = _COMMAND_GROUPS[f.__module__].parser_generator.add_parser(f.__name__, help=f.__doc__...
Gets the parser for the command f; if it does not exist, a new one is created
entailment
def findMentions(sourceURL, targetURL=None, exclude_domains=[], content=None, test_urls=True, headers={}, timeout=None): """Find all <a /> elements in the given html for a post. Only scan html elements matching all criteria in look_in. Optionally, the content to be scanned can be given as an argument. If an...
Find all <a /> elements in the given html for a post. Only scan html elements matching all criteria in look_in. Optionally, the content to be scanned can be given as an argument. If any have an href attribute that is not from one of the items in exclude_domains, append it to our lists. :param sourc...
entailment
def findEndpoint(html): """Search the given html content for all <link /> elements and return any discovered WebMention URL. :param html: html content :rtype: WebMention URL """ poss_rels = ['webmention', 'http://webmention.org', 'http://webmention.org/', 'https://webmention.org', 'https://webm...
Search the given html content for all <link /> elements and return any discovered WebMention URL. :param html: html content :rtype: WebMention URL
entailment
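Scanning `<link />` elements for a webmention rel can be sketched with the standard-library HTML parser. The rel synonyms below mirror the list visible in the snippet; the class name and "first match wins" behavior are illustrative assumptions:

```python
from html.parser import HTMLParser

WEBMENTION_RELS = {'webmention',
                   'http://webmention.org', 'http://webmention.org/',
                   'https://webmention.org', 'https://webmention.org/'}

class _EndpointFinder(HTMLParser):
    """Record the href of the first <link> whose rel names a webmention."""
    def __init__(self):
        super().__init__()
        self.endpoint = None

    def handle_starttag(self, tag, attrs):
        if tag != 'link' or self.endpoint is not None:
            return
        attrs = dict(attrs)
        # rel is a space-separated token list; any recognized token counts.
        rels = set((attrs.get('rel') or '').split())
        if rels & WEBMENTION_RELS and attrs.get('href'):
            self.endpoint = attrs['href']

def findEndpoint(html):
    finder = _EndpointFinder()
    finder.feed(html)
    return finder.endpoint
```

Returns `None` when no matching `<link>` is present, leaving header-based discovery to the caller.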
def discoverEndpoint(url, test_urls=True, headers={}, timeout=None, request=None, debug=False): """Discover any WebMention endpoint for a given URL. :param url: URL to discover WebMention endpoint for :param test_urls: optional flag to test URLs for validation :param headers: optional headers to send with...
Discover any WebMention endpoint for a given URL. :param url: URL to discover WebMention endpoint for :param test_urls: optional flag to test URLs for validation :param headers: optional headers to send with any web requests :type headers: dict :param timeout: optional timeout for web requests :typ...
entailment
def sendWebmention(sourceURL, targetURL, webmention=None, test_urls=True, vouchDomain=None, headers={}, timeout=None, debug=False): """Send to the :targetURL: a WebMention for the :sourceURL: The WebMention will be discovered if not given in the :webmention: parameter. :param source...
Send to the :targetURL: a WebMention for the :sourceURL: The WebMention will be discovered if not given in the :webmention: parameter. :param sourceURL: URL that is referencing :targetURL: :param targetURL: URL of mentioned post :param webmention: optional WebMention endpoint :param test_urls:...
entailment
def parse_link_header(link): """takes the link header as a string and returns a dictionary with rel values as keys and urls as values :param link: link header as a string :rtype: dictionary {rel_name: rel_value} """ rel_dict = {} for rels in link.split(','): rel_break = quoted_split(rels...
takes the link header as a string and returns a dictionary with rel values as keys and urls as values :param link: link header as a string :rtype: dictionary {rel_name: rel_value}
entailment
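The snippet splits the Link header on commas with a quote-aware splitter; a simplified sketch of the same rel-to-url mapping, assuming no commas appear inside quoted parameter values:

```python
def parse_link_header(link):
    """Parse an HTTP Link header into {rel: url}.

    Naive sketch: each comma-separated part looks like
    <url>; rel="name", and quoted commas are not handled.
    """
    rel_dict = {}
    for part in link.split(','):
        pieces = part.split(';')
        # The URL itself is wrapped in angle brackets.
        url = pieces[0].strip().strip('<>')
        for param in pieces[1:]:
            name, _, value = param.partition('=')
            if name.strip().lower() == 'rel':
                rel_dict[value.strip().strip('"')] = url
    return rel_dict
```

For example, a header advertising a webmention endpoint and a self link yields a two-entry dict keyed by rel name.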
def findRelMe(sourceURL): """Find all <a /> elements in the given html for a post. If any have an href attribute that is rel="me" then include it in the result. :param sourceURL: the URL for the post we are scanning :rtype: dictionary of RelMe references """ r = requests.get(sourceURL) ...
Find all <a /> elements in the given html for a post. If any have an href attribute that is rel="me" then include it in the result. :param sourceURL: the URL for the post we are scanning :rtype: dictionary of RelMe references
entailment
def confirmRelMe(profileURL, resourceURL, profileRelMes=None, resourceRelMes=None): """Determine if a given :resourceURL: is authoritative for the :profileURL: TODO add https/http filtering for those who wish to limit/restrict urls to match fully TODO add code to ensure that each item in the redirect chain...
Determine if a given :resourceURL: is authoritative for the :profileURL: TODO add https/http filtering for those who wish to limit/restrict urls to match fully TODO add code to ensure that each item in the redirect chain is authoritative :param profileURL: URL of the user :param resourceURL: URL of th...
entailment
def indent_text(string, indent_level=2): """Indent every line of text in a newline-delimited string""" indented_lines = [] indent_spaces = ' ' * indent_level for line in string.split('\n'): indented_lines.append(indent_spaces + line) return '\n'.join(indented_lines)
Indent every line of text in a newline-delimited string
entailment
def download(url, target, headers=None, trackers=()): """Download a file using requests. This is like urllib.request.urlretrieve, but: - requests validates SSL certificates by default - you can pass tracker objects to e.g. display a progress bar or calculate a file hash. """ if headers i...
Download a file using requests. This is like urllib.request.urlretrieve, but: - requests validates SSL certificates by default - you can pass tracker objects to e.g. display a progress bar or calculate a file hash.
entailment
def write(parsed_obj, spec=None, filename=None): """Writes an object created by `parse` to either a file or a bytearray. If the object doesn't end on a byte boundary, zeroes are appended to it until it does. """ if not isinstance(parsed_obj, BreadStruct): raise ValueError( 'Obje...
Writes an object created by `parse` to either a file or a bytearray. If the object doesn't end on a byte boundary, zeroes are appended to it until it does.
entailment
def deploy_file(file_path, bucket): """ Uploads a file to an S3 bucket, as a public file. """ # Paths look like: # index.html # css/bootstrap.min.css logger.info("Deploying {0}".format(file_path)) # Upload the actual file to file_path k = Key(bucket) k.key = file_path try: ...
Uploads a file to an S3 bucket, as a public file.
entailment
def deploy(www_dir, bucket_name): """ Deploy to the configured S3 bucket. """ # Set up the connection to an S3 bucket. conn = boto.connect_s3() bucket = conn.get_bucket(bucket_name) # Deploy each changed file in www_dir os.chdir(www_dir) for root, dirs, files in os.walk('.'): for f...
Deploy to the configured S3 bucket.
entailment
def has_changed_since_last_deploy(file_path, bucket): """ Checks if a file has changed since the last time it was deployed. :param file_path: Path to file which should be checked. Should be relative from root of bucket. :param bucket_name: Name of S3 bucket to check against. :...
Checks if a file has changed since the last time it was deployed. :param file_path: Path to file which should be checked. Should be relative from root of bucket. :param bucket_name: Name of S3 bucket to check against. :returns: True if the file has changed, else False.
entailment
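A common way to implement such a change check is to compare a local MD5 against the object's S3 ETag, which for single-part uploads is the MD5 hex digest of the contents (an assumption worth flagging: multipart uploads produce composite ETags, and boto returns the ETag wrapped in quotes). A sketch of the local half of that comparison:

```python
import hashlib

def local_etag(file_path):
    """MD5 hex digest of a file, streamed in chunks.

    Comparable to a single-part S3 ETag once the surrounding
    quotes are stripped from the remote value.
    """
    md5 = hashlib.md5()
    with open(file_path, 'rb') as f:
        # Read in fixed-size chunks so large files don't load into memory.
        for chunk in iter(lambda: f.read(8192), b''):
            md5.update(chunk)
    return md5.hexdigest()
```

The remote side would then fetch the key's ETag and report "changed" when the two digests differ or the key is absent.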
def main(): """ Entry point for the package, as defined in setup.py. """ # Log info and above to console logging.basicConfig( format='%(levelname)s: %(message)s', level=logging.INFO) # Get command line input/output arguments msg = 'Instantly deploy static HTML sites to S3 at the command li...
Entry point for the package, as defined in setup.py.
entailment
def start_sikuli_process(self, port=None): """ This keyword is used to start the sikuli java process. If the library is initialized with mode "OLD", the sikuli java process is started automatically. If the library is initialized with mode "NEW", this keyword should be used. :param port: port of sikuli ja...
This keyword is used to start the sikuli java process. If the library is initialized with mode "OLD", the sikuli java process is started automatically. If the library is initialized with mode "NEW", this keyword should be used. :param port: port of sikuli java process, if value is None or 0, a random free port will be u...
entailment
def post(self, request): """Respond to POSTed username/password with token.""" serializer = AuthTokenSerializer(data=request.data) if serializer.is_valid(): token, _ = ExpiringToken.objects.get_or_create( user=serializer.validated_data['user'] ) ...
Respond to POSTed username/password with token.
entailment
def EXPIRING_TOKEN_LIFESPAN(self): """ Return the allowed lifespan of a token as a datetime.timedelta object. Defaults to 30 days. """ try: val = settings.EXPIRING_TOKEN_LIFESPAN except AttributeError: val = timedelta(days=30) return val
Return the allowed lifespan of a token as a datetime.timedelta object. Defaults to 30 days.
entailment
def expired(self): """Return boolean indicating token expiration.""" now = timezone.now() if self.created < now - token_settings.EXPIRING_TOKEN_LIFESPAN: return True return False
Return boolean indicating token expiration.
entailment
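The expiry rule in `expired` — a token is stale once its creation time falls before "now minus the lifespan" — can be stated framework-free. The function name and `now` parameter below are illustrative (the source reads the lifespan from Django settings with a 30-day default):

```python
from datetime import datetime, timedelta, timezone

LIFESPAN = timedelta(days=30)  # mirrors the EXPIRING_TOKEN_LIFESPAN default

def is_expired(created, now=None, lifespan=LIFESPAN):
    # Expired when the creation time predates the start of the
    # allowed window; both datetimes must be timezone-aware.
    now = now or datetime.now(timezone.utc)
    return created < now - lifespan
```

Injecting `now` keeps the check deterministic in tests instead of depending on the wall clock.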
def unicode_is_punctuation(text): """ Test if a token is made entirely of Unicode characters of the following classes: - P: punctuation - S: symbols - Z: separators - M: combining marks - C: control characters >>> unicode_is_punctuation('word') False >>> unicode_is_punctuat...
Test if a token is made entirely of Unicode characters of the following classes: - P: punctuation - S: symbols - Z: separators - M: combining marks - C: control characters >>> unicode_is_punctuation('word') False >>> unicode_is_punctuation('。') True >>> unicode_is_punctuati...
entailment
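The class test described in the docstring maps directly onto `unicodedata.category`, whose first letter is the Unicode general-category group. A sketch reproducing the documented examples:

```python
import unicodedata

def unicode_is_punctuation(text):
    """True when every character is punctuation (P), symbol (S),
    separator (Z), combining mark (M) or control (C)."""
    # category() returns e.g. 'Po' for '。'; only the leading letter matters.
    return all(unicodedata.category(ch)[0] in 'PSZMC' for ch in text)
```

Note that letters (L) and digits (N) are the only groups excluded, so a token like `'-of-'` fails the test because of its letters.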
def process(self): """ Store the actual process in _process. If it doesn't exist yet, create it. """ if hasattr(self, '_process'): return self._process else: self._process = self._get_process() return self._process
Store the actual process in _process. If it doesn't exist yet, create it.
entailment
def _get_process(self): """ Create the process by running the specified command. """ command = self._get_command() return subprocess.Popen(command, bufsize=-1, close_fds=True, stdout=subprocess.PIPE, stdin=subprocess...
Create the process by running the specified command.
entailment
def tokenize_list(self, text): """ Split a text into separate words. """ return [self.get_record_token(record) for record in self.analyze(text)]
Split a text into separate words.
entailment
def is_stopword(self, text): """ Determine whether a single word is a stopword, or whether a short phrase is made entirely of stopwords, disregarding context. Use of this function should be avoided; it's better to give the text in context and let the process determine which word...
Determine whether a single word is a stopword, or whether a short phrase is made entirely of stopwords, disregarding context. Use of this function should be avoided; it's better to give the text in context and let the process determine which words are the stopwords.
entailment
def normalize_list(self, text, cache=None): """ Get a canonical list representation of text, with words separated and reduced to their base forms. TODO: use the cache. """ words = [] analysis = self.analyze(text) for record in analysis: if not...
Get a canonical list representation of text, with words separated and reduced to their base forms. TODO: use the cache.
entailment